Apple is delaying its controversial plan to scan iPhones for images of child sex abuse after privacy backlash

Apple will install software on American iPhones that will look for child abuse imagery, the Financial Times reported. MLADEN ANTONOV/Getty Images
  • Apple's controversial safety feature that scans iCloud photos on iPhones for images of child sex abuse is being delayed.
  • The company originally announced the plan in August, with a rollout intended for the fall with iOS 15.
  • Several tech personalities and advocacy groups criticized the plan for its implications on user privacy.

Apple is delaying its much-criticized plan to scan users' iPhones for child sexual abuse material (CSAM), CNBC first reported.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple told CNBC in a statement. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The proposed software update would include an algorithm that could scan the images in a user's iCloud account and compare them against a database of known CSAM. Flagged matches would then be reviewed by a human and reported to the authorities.
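As a rough illustration of the matching step described above, the sketch below checks an image against a set of known hashes. This is a simplification with hypothetical data: Apple's actual system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, plus cryptographic threshold techniques, not the exact cryptographic hash used here.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Exact cryptographic hash for illustration only; Apple's system
    # instead uses a perceptual hash (NeuralHash) so that minor edits
    # to an image still produce a match.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known images, as would be supplied
# by an organization such as NCMEC (only hashes are shared, never images).
known_hashes = {image_hash(b"known-flagged-image-bytes")}

def matches_database(image_bytes: bytes) -> bool:
    """True if this image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in known_hashes

print(matches_database(b"known-flagged-image-bytes"))  # True
print(matches_database(b"ordinary-vacation-photo"))    # False
```

In the system Apple described, a positive match would not be reported automatically; it would first go to human review, and only above a threshold number of matches.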


The company said in August that it intended to roll out the feature with iOS 15 this fall, but it has opted to delay that part of the software update without indicating when it would be released.

An Apple spokesperson did not immediately respond to a request for comment.


A controversial update that ignited a debate over privacy

While some politicians and law enforcement entities have long asked Apple to do more to aid authorities in their pursuit of criminals, the tech giant drew criticism from users, digital privacy advocates, and tech personalities over the proposed plan.

NSA whistleblower Edward Snowden likened the plan to "mass surveillance," saying it would turn iPhones into "iNarcs." Epic Games CEO Tim Sweeney described the software as "government spyware." WhatsApp founder Will Cathcart also chimed in, saying the plan marked a "setback for people's privacy all over the world."

An open letter sent to Apple in August, asking the company to reconsider the plan, was signed by about 5,000 organizations and individuals, including the Electronic Frontier Foundation and the Freedom of the Press Foundation, of which Snowden serves as president.

Some Apple employees also expressed hesitation, worrying that the software could give repressive governments a basis for using the collected information to exploit, censor, or arrest individuals. Employees reportedly set up a private Slack channel to voice their concerns, sending more than 800 messages debating the technology's potential consequences.

Apple, which says its services have long championed user privacy, defended the software push, with Craig Federighi, the company's senior vice president of software engineering, saying the technology and its purpose had been "widely misunderstood." The company reiterated that the scanning is limited to photos backed up to iCloud, which are accessible to the company, and that the data will only be compared against images provided by the National Center for Missing and Exploited Children.