British police scanned 8,600 people's faces in London without their consent, resulting in just 1 arrest and 7 false positives

Oxford Circus, London. William Perugini/Shutterstock

  • Stats released by the UK's biggest police force show it used live facial recognition to scan the faces of 8,600 people in a busy shopping area in London last week.
  • The system threw up eight alerts from the Metropolitan Police database, but only one of these was a correct identification.
  • The Met Police announced early this year that it would start rolling out facial recognition in London despite pushback from advocacy groups.

British police used facial recognition to scan more than 8,000 people's faces in one of London's busiest shopping districts, and the technology produced seven false positives against just one correct match.


As spotted by British privacy advocacy group Big Brother Watch, statistics released by London's Metropolitan Police show it deployed Live Facial Recognition (LFR) technology on February 27 at Oxford Circus, a busy area in the center of London frequented by tourists and shoppers.

The figures show the police scanned roughly 8,600 people, and the technology flagged eight matches from its database of faces. Seven of these, however, were false positives, resulting in five incorrect "engagements."


One of the faces flagged by the system did result in an arrest, although the Met gave no details about the individual or the charges on which they were arrested.

Big Brother Watch wrote on Twitter that this meant 86% of the alerts the system generated were false, and that 71% of those misidentifications resulted in the police stopping and questioning someone. The Met Police was not immediately available for comment when asked by Business Insider what constituted an "engagement."
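For readers who want to check the arithmetic, here is a minimal Python sketch of the rates implied by the counts reported above. The variable names are ours, not the Met's or Big Brother Watch's, and the exact rounding behind the group's quoted 86% figure is not stated.

    # Counts from the Met's released figures for the February 27
    # deployment at Oxford Circus, as reported above.
    faces_scanned = 8600
    alerts = 8
    false_positives = 7
    incorrect_engagements = 5   # stops resulting from misidentifications
    arrests = 1

    # Share of alerts that were misidentifications: 7/8 = 87.5%.
    # (Big Brother Watch cited 86%; the denominator behind that
    # figure is not given in the article.)
    false_alert_rate = false_positives / alerts

    # Share of misidentifications that led to a stop: 5/7 = 71.4%,
    # consistent with the 71% Big Brother Watch quoted.
    stop_rate = incorrect_engagements / false_positives

    print(f"False alert rate: {false_alert_rate:.1%}")
    print(f"Misidentifications leading to stops: {stop_rate:.1%}")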

Big Brother Watch added: "This blows apart the Met's defence that facial recognition surveillance is in any way proportionate or that the staggering inaccuracy is mitigated by human checks.

"This is a disaster for human rights, a breach of our most basic liberties and an embarrassment for our capital city."

The Met announced in January this year that it would start rolling out live facial recognition technology in London, and last month Commissioner Cressida Dick pushed back against criticisms by advocacy groups that the technology poses threats to privacy and civil liberties.


Criticisms of police use of facial recognition center on its inaccuracy and invasion of privacy. In particular, AI experts have pointed to racial and gender bias in facial recognition systems, which more frequently misidentify women and people of color and could contribute to overpolicing of those groups. The Met Police has said its system has been proven not to have any ethnic bias.

Some US cities, including San Francisco and Oakland in California, and Portland, Oregon, have passed laws banning police use of facial recognition.

