A major flaw in Google's algorithm allegedly tagged two black people's faces with the word 'gorillas'
Google has apologized to a black man who says the search giant's photo algorithms sorted pictures of him and a friend, also black, under the category "gorillas."
"Google Photos, y'all f**ked up," Jacky Alciné wrote on Twitter, accompanying a screengrab of the photo and the racially offensive tag. "My friend's not a gorilla."
Google says it's now trying to figure out how this happened, the Wall Street Journal reports. The answer is probably an error within Google Photo's facial recognition technology.
"We're appalled and genuinely sorry that this happened," a company spokeswoman said. "There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."
We've also contacted Google for comment.
Alciné, a software developer, wondered on Twitter how this could have happened.
"What kind of sample image data you collected that would result in this son?" - diri noir avec banan (@jackyalcine) June 29, 2015
"Like I understand HOW this happens; the problem is moreso on the WHY. This is how you determine someone's target market." - diri noir avec banan (@jackyalcine) June 29, 2015
Also, apparently when Alciné searched for photos of "gorillas" within his Google Photo library, the problem persisted, with other photos of himself and the same friend populating the results:
Less than two hours after Alciné had sent his original tweet, Yonatan Zunger, Google's Chief Architect of Social, jumped in, tweeting at Alciné to try to correct the problem.
The Googler also expressed sympathy to Alciné over the racist Twitter trolls who had piled onto the thread.
Google had created a fix for the problem in about an hour. But the next morning, Alciné told Zunger that two of his photos were still surfacing under the offensive word.
Zunger then said he'd send the issue out for a more permanent fix.
It appears that in the short term, the company will simply stop using "gorillas" as a category label. Longer term, it says it will work on making its algorithms more sensitive to language that could be offensive.
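The short-term fix described above amounts to a label blocklist: no matter how confident the model is, certain tags are never auto-applied. Google has not published how its fix works, so the sketch below is purely illustrative; the function name, the label set, and the prediction format are all hypothetical.

```python
# Hypothetical sketch of a label-blocklist stopgap: suppress sensitive
# labels from a classifier's output before they become user-facing tags.
# All names and values here are assumptions, not Google's actual code.

# Labels the product should never auto-apply, regardless of model confidence.
BLOCKED_LABELS = {"gorilla", "gorillas", "chimpanzee", "monkey"}

def filter_labels(predictions, threshold=0.5):
    """Keep confident predictions whose labels are not blocklisted.

    predictions: list of (label, confidence) pairs from the classifier.
    """
    return [
        (label, score)
        for label, score in predictions
        if score >= threshold and label.lower() not in BLOCKED_LABELS
    ]

# Example: raw model output for one photo. The blocked label is dropped
# even though the model scored it highly; the low-confidence label is
# dropped by the threshold.
raw = [("person", 0.92), ("gorilla", 0.71), ("outdoors", 0.40)]
print(filter_labels(raw))
```

A blocklist like this trades recall for safety: photos of actual gorillas also lose the tag, which is consistent with reports that Google simply stopped surfacing the category altogether.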
As the Wall Street Journal pointed out, the Google flub illuminates the shortcomings of facial recognition technology.
"We need to fundamentally change machine learning systems to feed in more context so they can understand cultural sensitivities that are important to humans," Babak Hodjat, chief scientist at Sentient Technologies, told WSJ. "Humans are very sensitive and zoom in on certain differences that are important to us culturally. Machines cannot do that."