Racist, sexist tech: What should be done about technology that discriminates?
The Stream· Tue, Jul 21 2015 18:24:13
YouthSpark, a program launched by Microsoft to "empower young people through technology", published a series of portraits of young women who are "Changing the Face of Coding."
Critics of the tech world say women of colour are among the most underrepresented. Facebook’s black female headcount increased only slightly over 2013, to a total of 11 black women, and among Twitter's 49 black employees, 14 are women.
Tech entrepreneur and former Twitter manager Laura Gomez founded Atipica, a software platform designed to keep talented people from falling into the "black hole" of tech recruitment.
Introducing AtipicaLaura I. Gómez
Other critics note that one of the key reasons so many qualified, diverse candidates are left out of major tech companies is the weight placed on employee referrals.
Women of Colour in Tech
: "Facebook’s hiring practices are a case in point of where tech hiring remains flawed: they rely heavily on referrals, and mandate candidates have a Facebook account – imagine all the info! Pictures, data, friend groups all of these things trigger hundreds of biases."
Diversity in the tech world is one issue, but observers have noticed others, particularly cases where the Internet discriminates through an algorithm. Researchers at Carnegie Mellon found Google showed ads for high-paying jobs 1,852 times to men but just 318 times to women.
When Google's search algorithm surfaces racist suggestions, it's bringing the evil thoughts of its users into view. The code doesn't care.Mic Wright
The Stream produced a video on the "World White Web", an initiative to stop whiteness from being the norm on the Internet. The project asks people of colour to publish photos of their hands online, in an effort to diversify the white hands that dominate Google searches.
Face detection software is also subject to racism. A Google Photos error in late June showed a black woman and a black man being auto-tagged as gorillas. The company responded by apologising and removing the gorilla tag from the app.
The photo posted on Twitter also fired up a discussion between the social media community and Yonatan Zunger, the chief architect of social at Google. Below, Zunger says the error within the machine learning process is magnified because of the history of racism.
@julianpeeters @louisgray @jackyalcine The history of racism is what makes this error particularly bad.Yonatan Zunger
@yonatanzunger @julianpeeters @jackyalcine small edit: "continued history" of racism. Machines aren't biased. People are.Louis Gray
@julianpeeters @louisgray @jackyalcine But the error itself was just ordinary machine learning trouble.Yonatan Zunger
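Zunger's point that this was "ordinary machine learning trouble" can be illustrated with a toy sketch. Everything below is hypothetical and has nothing to do with Google's actual system: it is a tiny nearest-centroid classifier trained on data skewed toward one group, which estimates that group's features well and misclassifies the underrepresented group far more often.

```python
import random
import statistics

random.seed(0)

# Hypothetical 1-D "appearance" feature for a toy two-class problem:
# group A faces cluster near 0.2, group B faces near 0.8,
# and the non-face class clusters near 0.95.
def sample(center, n, spread=0.05):
    return [random.gauss(center, spread) for _ in range(n)]

# Skewed training set: 95 group-A faces but only 5 group-B faces.
train_faces = sample(0.2, 95) + sample(0.8, 5)
train_other = sample(0.95, 100)

# One centroid per class; the face centroid is pulled toward group A.
face_centroid = statistics.mean(train_faces)
other_centroid = statistics.mean(train_other)

def predict(x):
    return "face" if abs(x - face_centroid) < abs(x - other_centroid) else "other"

# Evaluate on balanced test sets for each group.
test_a = sample(0.2, 200)
test_b = sample(0.8, 200)
err_a = sum(predict(x) != "face" for x in test_a) / len(test_a)
err_b = sum(predict(x) != "face" for x in test_b) / len(test_b)
print(err_a, err_b)  # group B's error rate is far higher
```

The model has no notion of race; the disparity falls out of the training data alone, which is why who assembles and audits the data matters as much as the algorithm itself.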
The Apple Watch has also been criticized for potentially not being able to detect the heart rate of people with darker skin or tattoos.
I'm glad I can decide on not wanting an apple watch since it doesn't like tattoos or darker skin. Bloody racist.Figaro
And in a video that went viral a few years ago, with nearly three million views, two co-workers compare how the face-tracking camera on a Hewlett-Packard computer responds differently depending on the darkness of one's skin.
HP computers are racistwzamen01
We asked the Stream community if they have come across any examples of racist imagery on the Internet.
@DanMing @AJStream if u google Latina, or looked up the latina hashtag on tumblr, many of the posts used to sexualize latinas, so+Bruja Quisqueyana
Google ‘Beauty’ and see white women. Google ‘Asian’ and get porn. I think it’s as simple as that. #AJStreamJuliet Shen
I just searched for hnds in Google & am surprised the only black-looking hnds r blck & whte images of white pple's hnds.mohamud
Members of our community also discussed how Internet users and programmers influence the outcome of an algorithm.
@ajstream The application of an algorithm may be racist, but not the algorithm itself.Elamin
@AJStream it can. Algorithms are man-made and thus it somehow reflect the personality of the programmerAbdulmalik Taj-Liad
@Nuri_ibrahim Another question is does #unconsciousbias find its way into coding/algorithms & how can we detect it if it does? @AJStreamCaleph Wilson
@AJStream Diversity. Unlike the popular belief, user choices are often created and/or reinforced by the media itself. Not other way aroundAlper Ard