Artificial intelligence experts from Facebook, Google, and Microsoft called on Amazon not to sell its facial recognition software to police

Amazon CEO Jeff Bezos.

  • A group of AI experts penned an open letter to Amazon calling for the company to stop selling its facial recognition software, called Rekognition, to law enforcement.
  • The letter says there are currently no safeguards in place to stop misuse of the technology, which has shown particular levels of inaccuracy when given images of women and people with darker skin tones.
  • Several of the signatories work on AI at other major tech firms, including Google, Microsoft, and Facebook. A former principal scientist at Amazon also signed.
  • Separately, the ACLU raised concerns about Rekognition last year after it ran a test on the software and found it misidentified 28 members of Congress as people who had previously been arrested.

A group of 55 AI experts has signed an open letter to Amazon calling on the company to stop selling its facial recognition software to police.

The signatories included researchers from tech giants like Facebook, Microsoft, and Google as well as AI powerhouses DeepMind and OpenAI. A former principal scientist at Amazon Web Services, Anima Anandkumar, also signed the letter.

"We call on Amazon to stop selling Rekognition to law enforcement as legislation and safeguards to prevent misuse are not in place," the signatories wrote, citing infringement on civil liberties as a potential consequence.

The letter is part of an ongoing back and forth between external experts worried about whether Amazon's Rekognition facial recognition software shows bias, and Amazon scientists who deny these claims.

The debate began when researchers at MIT and the University of Toronto published a paper in January which found that Amazon's Rekognition software, when tasked with deciding whether an image showed a man or a woman, was more inaccurate when recognising women and people with darker skin tones.

In response, Amazon Web Services' general manager of artificial intelligence, Dr. Matt Wood, and its VP of global public policy, Michael Punke, wrote a series of blog posts claiming that the paper was "misleading."

Wednesday's open letter dissects Wood and Punke's objections to the paper, concluding that it finds their response to the peer-reviewed paper "disappointing."

The signatories point out that, according to the study, Amazon's software had an error rate of approximately 31% when given images of women of color. They also took issue with binary gender classification methods, which exclude non-binary genders. And they noted that Amazon has not disclosed who its Rekognition customers are or what its error rates are across different intersectional demographics, and that the software is not subject to any kind of external audit.


Separately, the civil rights organisation the ACLU has raised concerns about Rekognition in the past, having run a test of the software in which it mistakenly identified 28 members of Congress as people who had previously been arrested. The false matches disproportionately affected people of color.

"If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification," the ACLU wrote in a blog post at the time.

Amazon's response to the ACLU's finding was that the organisation had miscalibrated the software's settings, in particular by misinterpreting the "confidence" levels the software displays.

The open letter picked up on this argument, as a similar criticism was levelled by Amazon towards the original paper. The signatories write that systems like Rekognition have to be tested in real-world scenarios, and that police forces may not use or even be trained to understand Rekognition's "confidence" ratings.

The letter cites a Gizmodo article in which a Washington County Sheriff's Office public information officer said that police there, "do not set nor [...] utilize a confidence threshold."


Amazon was not immediately available for comment when contacted by Business Insider.