Microsoft took an ethical stand on facial recognition just days after being blasted for a sinister AI project in China
Pedro Fiúza/NurPhoto via Getty Images
- Microsoft announced on Tuesday that it rebuffed a request from a US police agency to install its facial recognition software on officers' car and body cameras.
- President Brad Smith cited human rights concerns, saying that AI bias could mean a disproportionate number of women and ethnic minorities would end up being held for questioning.
- While Microsoft is taking an ethical stand on AI, less than a week ago it was accused of being complicit in helping China develop facial analysis AI, which could be used to oppress its Uighur Muslim population.
- Visit BusinessInsider.com for more stories.
Microsoft President Brad Smith announced on Tuesday that the company refused a request from a US police department to install its facial recognition software, citing human rights concerns, Reuters reports.
Speaking at a Stanford University conference on ethical AI, Smith said Microsoft had received the request from a California law enforcement agency to install the technology in officers' cars and body cameras.
"Anytime they pulled anyone over, they wanted to run a face scan," Smith said, adding the officer would check the person's face against a database.
He said the company concluded that the inherent bias in facial recognition - which is largely trained on white male faces - meant it would be less accurate at identifying the faces of women and ethnic minorities, who would therefore end up being held for questioning more frequently.
Smith called for tighter regulation on facial recognition and AI in general, warning that data-hungry companies could end up in a "race to the bottom." His comments come as pressure is building on Amazon to stop selling its facial recognition "Rekognition" software to law enforcement. Amazon shareholders are due to hold a vote on the issue on May 22.
However, Smith said Microsoft had provided the software to a US prison. Smith also told Business Insider in February that an all-out ban on selling facial recognition software to government agencies would be "cruel in its humanitarian effect."
Less than a week ago, the company's reputation took a bruising when it was accused of being complicit in helping a Chinese military-run university develop AI facial analysis, which critics said China could then use to oppress its citizens - specifically the country's Uighur Muslim minority.
Sen. Marco Rubio of Florida, one of the US government's most vocal China critics, described Microsoft's partnership with the Chinese military as "deeply disturbing" and "an act that makes them complicit" in China's human rights abuses.
Microsoft did not immediately respond to Business Insider's request for comment.