Leaked Apple documents reveal that Siri was designed to deflect questions about feminism and #MeToo, report says

  • Apple's Siri has been rewritten to "deflect" questions about feminism and the #MeToo movement, according to The Guardian.
  • The project instructs Siri's developers to handle such requests by disengaging, deflecting, or informing, because Apple wants its virtual helper to appear "guarded" and "neutral," the report says.
  • The revelation comes after Apple and other large tech firms have come under fire for the way their digital helpers respond to sexually explicit insults.
  • The leaked information came from a former Siri grader, one of the contract workers tasked with evaluating Siri's accuracy. Apple recently halted the program after it came under scrutiny following a Guardian report revealing that contractors overheard private conversations.

Apple has instructed those working on its Siri digital assistant to design it to "deflect" questions about hot-button issues such as feminism and the #MeToo movement, according to The Guardian. The revelation comes after the iPhone maker and other tech firms were criticized for how their virtual helpers respond to queries about sexual harassment.

The internal project reported by The Guardian tells Siri's development team to handle such questions in one of three ways: "don't engage," "deflect," or "inform." Siri's responses should be written to suggest that it is in favor of equality while avoiding the word "feminism," says The Guardian, which obtained leaked documents that had last been updated in June 2018.

That's because Apple wants Siri to remain "guarded" and "neutral" when dealing with sensitive topics, according to the report. When asked whether it's a feminist or what it thinks of feminism, the digital assistant will offer a response like: "I am a believer in equality, and treating people with respect."
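To make the reported policy concrete, here is a minimal, purely illustrative sketch of how a three-tier "don't engage / deflect / inform" scheme could be structured. Nothing here comes from Apple's code or the leaked documents; every name (SENSITIVE_TOPICS, choose_strategy, respond) is invented for illustration, and only the "deflect" reply quotes the Siri response cited in the report.

```python
# Hypothetical sketch of a "don't engage / deflect / inform" response
# policy. All names and rules here are invented for illustration; this
# is not Apple's implementation.

SENSITIVE_TOPICS = {"feminism", "#metoo"}

# One canned reply per strategy; the "deflect" line is the Siri
# response quoted in The Guardian's report.
RESPONSES = {
    "dont_engage": "I don't have an answer for that.",
    "deflect": "I am a believer in equality, and treating people with respect.",
    "inform": "You can find information about that on the web.",
}

def choose_strategy(query: str) -> str:
    """Pick a response strategy for a user query (illustrative only)."""
    text = query.lower()
    if any(topic in text for topic in SENSITIVE_TOPICS):
        # The guidelines reportedly favor guarded, neutral answers
        # that avoid taking a side (e.g. avoiding the word "feminism").
        return "deflect"
    return "inform"

def respond(query: str) -> str:
    return RESPONSES[choose_strategy(query)]

if __name__ == "__main__":
    print(respond("Are you a feminist?"))
    # -> "I am a believer in equality, and treating people with respect."
```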

Apple did not immediately respond to Business Insider's request for comment.

The publication says it received the documents from a former Siri grader, a contractor who was hired to evaluate Siri's responses in order to improve the assistant's accuracy.

Apple scrapped its grading program for Siri after a previous report from The Guardian revealed that human workers regularly overheard private conversations. The company will reinstate the program in the fall after making several changes and issuing software updates. Apple will stop recording Siri conversations by default and instead allow users to opt in to share recordings to improve Siri.

The internal project leaked to The Guardian comes after Apple and other tech firms like Amazon, Microsoft, and Google came under fire for the way their respective voice-enabled assistants respond to requests involving insults and sexual comments.

An experiment conducted by Quartz in 2017 found that such comments sometimes resulted in evasive or flirtatious answers or jokes. Following the report, a petition asking tech companies to change the way their digital assistants respond to such requests appeared on the website Care2, garnering nearly 17,000 signatures.

Amazon has also changed the way Alexa responds to such queries, creating a "disengagement mode" in which the assistant says something like "I'm not going to respond to that" when presented with sexually explicit requests, according to Quartz.

Apple's guidelines for Siri that were leaked to The Guardian also note that Siri is "non human" and "doesn't have a point of view," describing the digital helper's presence as "genderless" and "playful." That echoes the approach Google has taken with the Google Assistant: the search giant's personality team seeks to craft responses that feel human without pretending to be human, as TIME reported in 2017.

Do you work, or have you ever worked, as a Siri grader for Apple? If so, we want to hear from you. Contact this reporter securely at lisaeadicicco@protonmail.com.
