A major flaw in Google's algorithm allegedly tagged two black people's faces with the word 'gorillas'


Google has apologized to a black man who says the search giant's photo algorithms sorted pictures of him and a friend, also black, under the category "gorillas."


"Google Photos, y'all f**ked up," Jacky Alciné wrote on Twitter, accompanying a screengrab of the photo and the racially offensive tag. "My friend's not a gorilla."


Google says it's now trying to figure out how this happened, the Wall Street Journal reports. The answer is probably an error in Google Photos' automatic image-labeling technology.


Google gave this statement to the WSJ:

"We're appalled and genuinely sorry that this happened," a company spokeswoman said. "There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."

We've also contacted Google for comment.

Alciné, a developer, wondered on Twitter how this could have happened.


When Alciné searched for photos of "gorillas" within his Google Photos library, the problem persisted, with other photos of him and the same friend populating the results:

Less than two hours after Alciné sent his original tweet, Yonatan Zunger, Google's chief architect of social, jumped in, tweeting at Alciné to try to correct the problem.


Zunger also offered Alciné his sympathies in response to racist Twitter trolls.


Google created a fix for the problem within about an hour. But the next morning, Alciné told Zunger that two of his photos were still surfacing under the offensive label.

Zunger then said he'd send the issue out for a more permanent fix.


It appears that in the short term, the company will stop using "gorillas" as a category altogether. Later, it will work out how to make its algorithms more sensitive to labels that could be offensive.
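Google hasn't described how that stopgap works, but the simplest version is a blocklist applied to the classifier's output before a photo is tagged. The sketch below is purely illustrative: the function name filter_labels, the blocklist, and the confidence threshold are all invented for this example, not Google's actual code.

```python
# Hypothetical sketch of a label blocklist, not Google's actual fix.
# Assumes a classifier that returns (label, confidence) pairs per photo.
BLOCKED_LABELS = {"gorilla", "gorillas"}  # labels suppressed outright

def filter_labels(predictions, confidence_threshold=0.5):
    """Drop blocked labels and low-confidence guesses before tagging a photo."""
    return [
        (label, score)
        for label, score in predictions
        if label.lower() not in BLOCKED_LABELS and score >= confidence_threshold
    ]

# Example: raw classifier output for one photo
raw = [("person", 0.92), ("gorilla", 0.61), ("outdoors", 0.40)]
print(filter_labels(raw))  # [('person', 0.92)]
```

The trade-off is blunt: the offensive mislabeling stops, but genuine photos of gorillas go untagged too, which is why this reads as a short-term measure rather than a real solution.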

As the Wall Street Journal pointed out, the Google flub illuminates the shortcomings of image recognition technology.


"We need to fundamentally change machine learning systems to feed in more context so they can understand cultural sensitivities that are important to humans," Babak Hodjat, chief scientist at Sentient Technologies, told WSJ. "Humans are very sensitive and zoom in on certain differences that are important to us culturally. Machines cannot do that."
