Apple exec defends the company's much-criticized plan to scan iPhones for child abuse images, saying the feature has been misunderstood
Kevin Shalvey
Aug 14, 2021, 20:23 IST
An Apple Store in Manhattan. (Mike Segar/Reuters)
Apple's child-safety features have been "misunderstood," an exec told The Wall Street Journal.
Earlier this month, the company announced two features that would scan iPhone and iCloud images.
"I think in no way is this a backdoor," SVP Craig Federighi told the Journal.
Apple's Craig Federighi, senior vice president of software engineering, said the company's plan to scan iPhone users' photos for child sexual abuse material has been "widely misunderstood."
"We wish that this had come out a little more clearly for everyone, because we feel very positive and strongly about what we're doing, and we can see that it's been widely misunderstood," Federighi said in a video interview with The Wall Street Journal's Joanna Stern published on Friday.
Apple earlier this month announced a feature that would create digital hashes of images as they're uploaded from iPhones to iCloud accounts. Those hashes would be compared to databases of known child sexual abuse material held by anti-abuse organizations, Apple said.
"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple's technical summary said.
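The "hash and compare" step Apple describes can be illustrated with a deliberately simplified sketch. Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic blinding, so near-duplicate images still match and the device never directly learns the result; the SHA-256 lookup below, with made-up example data, is only a stand-in for the basic idea of checking an image's hash against a database of known hashes before upload.

```python
import hashlib

# Hypothetical database of known-CSAM hashes supplied by an
# anti-abuse organization (the values here are invented).
known_hashes = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash an image on-device and check it against the known database.

    Simplification: a cryptographic hash like SHA-256 only matches
    byte-identical files, whereas Apple's perceptual NeuralHash is
    designed to survive resizing and re-encoding.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# An unrelated image produces a different digest and does not match.
print(matches_known_hash(b"vacation-photo-bytes"))         # False
print(matches_known_hash(b"example-flagged-image-bytes"))  # True
```

The point of the on-device design is that only the comparison's outcome (wrapped in further cryptography) accompanies the upload, rather than Apple scanning every photo server-side.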
Critics of the plan said it was a misstep for a company that has long made privacy a selling point. The Electronic Frontier Foundation (EFF) last Thursday called the hashing-and-matching update a privacy "backdoor" that could be expanded or exploited. Some Apple employees were worried the feature could be abused by governments, Reuters reported.
"I think in no way is this a backdoor," Federighi said in the Journal interview. "I don't understand - I really don't understand that characterization."
The CSAM feature was one of two photo-scanning updates the company announced. The other would scan children's incoming iMessage photos for nudity, alerting parents when children under 12 years old viewed pornographic content.
"I do believe the sound-bite that got out early was, 'Oh my god, Apple is scanning my phone for images.' This is not what is happening," Federighi told the Journal via video. "This is about images stored in the cloud."
Eva Galperin, EFF's director of cybersecurity, said via Twitter that the rollout had not been misunderstood by privacy experts.
"I'd like to take this moment to make it clear to poor Craig that no, I don't misunderstand Apple's plans to check photos in iCloud against NCMEC's database of CSAM," Galperin said.
"It's well-meaning but it's also creating a mechanism that Apple will be forced to use for other things," she added.