A major flaw in Google's algorithm allegedly tagged two black people's faces with the word 'gorillas'
Google has apologized to a black man who says the search giant's photo algorithms sorted pictures of him and a friend, also black, under the category "gorillas."
"Google Photos, y'all f**ked up," Jacky Alciné wrote on Twitter, accompanying a screengrab of the photo and the racially offensive tag. "My friend's not a gorilla."
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4
- diri noir avec banan (@jackyalcine) June 29, 2015
Google says it's now trying to figure out how this happened, the Wall Street Journal reports. The answer probably lies in an error in Google Photos' automatic image-labeling technology.
Google gave this statement to the WSJ: "We're appalled and genuinely sorry that this happened," a company spokeswoman said. "There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."
We've also contacted Google for comment.

Alciné wondered on Twitter how this could have happened.

What kind of sample image data you collected that would result in this son?
- diri noir avec banan (@jackyalcine) June 29, 2015
Like I understand HOW this happens; the problem is moreso on the WHY. This is how you determine someone's target market.
- diri noir avec banan (@jackyalcine) June 29, 2015
And it's only photos I have with her it's doing this with (results truncated b/c personal): pic.twitter.com/h7MTXd3wgo
- diri noir avec banan (@jackyalcine) June 29, 2015
@jackyalcine Can we have your permission to examine the data in your account in order to figure out how this happened?
- Yonatan Zunger (@yonatanzunger) June 29, 2015
@jackyalcine I would say I'm surprised but I'm not. Just disappointed with humanity.
- Yonatan Zunger (@yonatanzunger) June 30, 2015
@yonatanzunger Two photos show up under the term now (for both Gorilla and Gorillas). Sending a DM of image.
- diri noir avec banan (@jackyalcine) June 29, 2015
@jackyalcine Handing it to the team. Seriously, big thanks for helping us fix this: it makes a real difference.
- Yonatan Zunger (@yonatanzunger) June 29, 2015
@jackyalcine Quick update: we shouldn't be making piles with that label anymore, and searches are mostly fixed, but they can still turn up..
- Yonatan Zunger (@yonatanzunger) June 29, 2015
@jackyalcine ..photos where we failed to recognize that there was a face there at all. We're working on that issue now.
- Yonatan Zunger (@yonatanzunger) June 29, 2015
@jackyalcine We're also working on longer-term fixes around both linguistics (words to be careful about in photos of people [lang-dependent]
- Yonatan Zunger (@yonatanzunger) June 29, 2015
"We need to fundamentally change machine learning systems to feed in more context so they can understand cultural sensitivities that are important to humans," Babak Hodjat, chief scientist at Sentient Technologies, told WSJ. "Humans are very sensitive and zoom in on certain differences that are important to us culturally. Machines cannot do that."
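The stopgap Zunger describes, treating certain labels as off-limits whenever a photo might contain a person, can be sketched as a simple post-processing filter on a classifier's output. This is a hypothetical illustration, not Google's actual code; the function name, label list, and threshold are all assumptions made for the example.

```python
# Hypothetical sketch of a sensitive-label post-filter.
# A classifier returns (label, confidence) pairs plus an estimate that
# the photo contains a person; we suppress risky labels unless we are
# very confident no person is present.

SENSITIVE_LABELS = {"gorilla", "gorillas", "ape", "monkey"}

def filter_labels(predicted, person_likelihood, threshold=0.01):
    """Drop sensitive labels unless person_likelihood is below threshold.

    predicted: list of (label, confidence) pairs from an image classifier.
    person_likelihood: classifier's estimate that a person is in the photo.
    """
    if person_likelihood >= threshold:
        return [(label, conf) for (label, conf) in predicted
                if label.lower() not in SENSITIVE_LABELS]
    return predicted
```

A blocklist like this only masks the symptom, which is why the quote above argues for feeding more cultural context into the models themselves.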