Apple employees fear that repressive governments could exploit a new feature that scans iPhones for child sex abuse images, a report says

Apple CEO Tim Cook. Karl Mondon/Digital First Media/The Mercury News via Getty Images
  • Apple employees are worried repressive governments may exploit a new iPhone feature.
  • The software, rolling out first in the US, can scan iPhones for child sex abuse images.
  • Some staff fear governments could use it to censor or arrest people, employees told Reuters.

Apple employees have raised concerns that repressive governments could exploit upcoming software that can scan iPhones for child sex abuse images, Reuters reported on Thursday.

The tech giant is planning to roll out software on iPhones, starting in the US later this year, that can detect photos of child sexual abuse, the Financial Times first reported on August 5.

Since then, Apple employees have created an internal Slack channel about the feature, Reuters reported. More than 800 messages were sent on the channel, staff who wanted to remain anonymous told Reuters.


Workers are worried that repressive governments could exploit the feature to censor or arrest people in their countries, employees who saw the channel told Reuters.

In the Slack channel about the scanning feature, some employees pushed back against the criticism, while others said Slack wasn't the proper forum for such discussions, workers told Reuters.


Apple declined to comment on the issue to Reuters.

When asked for comment, Apple directed Insider to a FAQ about the software. The FAQ says the feature only applies to photos uploaded to iCloud, and that Apple won't share any information with law enforcement or the US National Center for Missing & Exploited Children before Apple conducts a human review.

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands," Apple wrote in its FAQ.

"We will continue to refuse them in the future."

Past security changes at Apple have also sparked concern among employees, but the volume and duration of the new debate are surprising, staff told Reuters. Some workers Reuters spoke to worried that Apple was damaging its reputation for protecting privacy.


Around 5,000 organizations and individuals signed an open letter last week asking Apple to rethink its rollout of the photo-scanning feature. The letter said the software opened "a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products."

Will Cathcart, head of WhatsApp, also said in a Twitter thread on August 6 that the software was "a setback for people's privacy all over the world."
