Amazon, Microsoft, and IBM are under pressure to follow Google and drop gender labels like 'man' and 'woman' from their AI
Google's API no longer uses gendered labels for photos
Microsoft, Amazon, and IBM are under pressure to stop automatically applying gendered labels such as "man" or "woman" to images of people, after Google announced in February that it would stop using such tags.
All four companies offer powerful artificial intelligence tools that can classify objects and people in an image. The tools can variously describe famous landmarks, facial expressions, logos, and gender, and have many applications, including content moderation, scientific research, and identity verification.

Now the AI researchers who helped bring about the change say Amazon's Rekognition, IBM's Watson, and Microsoft's Azure facial recognition should follow suit.
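Broadly, these tools all work the same way: a client sends an image to a cloud endpoint and receives back a list of labels with confidence scores. As a minimal sketch, assuming the google-cloud-vision Python client with credentials already configured and a hypothetical local file photo.jpg, this is roughly how a caller would see Google's change, with person-level results now labeled "Person" rather than "Man" or "Woman":

```python
# Minimal sketch using the google-cloud-vision client (assumed installed
# and authenticated). "photo.jpg" is a hypothetical local image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Since Google's February change, person-level labels come back as the
# neutral "Person" rather than gendered tags such as "Man" or "Woman".
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```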
Joy Buolamwini, a computer scientist at MIT and an expert in AI bias, told Business Insider: "Google's move sends a message that design choices can be changed. With technology it is easy to think some things cannot be changed or are inevitable. This isn't necessarily true."
Microsoft's AI continues to classify people in images by binary gender.
Buolamwini continued: "I would encourage all companies including the ones we've audited (IBM, Microsoft, Amazon, and others) to reexamine the identity labels they are using as demographic markers."
Sasha Costanza-Chock, an associate professor at MIT, added that firms should reconsider classification tags for people entirely.
"All classification tags on humans should be opt-in, consensual, and revokable," she told Business Insider.This would essentially involve dropping tags that identify people's race, class, and whether they have disabilities.
Costanza-Chock added that "binary" gender classifications were more likely to harm trans people and dark-skinned women, since both groups are more likely to be misclassified. As one example, she pointed to transgender Uber drivers being locked out of the ride-hailing app because their physical appearance no longer matched photos on file.
Asked about potential critics who might read Google's decision as a political one, she added: "If someone has never thought about the potential negative consequences of nonconsensual gender classification, this change might provide a good opportunity for them to learn more about why and how this can be harmful."

Amazon pointed to its guidelines around gender classification on Rekognition, which state: "A gender binary (male/female) prediction is based on the physical appearance of a face in a particular image. It doesn't indicate a person's gender identity, and you shouldn't use Amazon Rekognition to make such a determination. We don't recommend using gender binary predictions to make decisions that impact an individual's rights, privacy, or access to services."
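To show concretely what that guidance applies to, here is a minimal sketch, assuming the boto3 SDK with AWS credentials configured and a hypothetical local file face.jpg, of how Rekognition's DetectFaces call surfaces its binary gender prediction, and how a caller following Amazon's guidelines might simply discard the field:

```python
# Minimal sketch using boto3 (assumed installed and configured with AWS
# credentials). "face.jpg" is a hypothetical local image.
import boto3

rekognition = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" includes the Gender attribute
    )

for face in response["FaceDetails"]:
    # Rekognition returns an appearance-based binary prediction, e.g.
    # {"Value": "Female", "Confidence": 99.1}. Per Amazon's guidance it
    # should not be treated as gender identity; a caller can drop it.
    face.pop("Gender", None)
    print(face["BoundingBox"])
```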