
Protests planned against Apple’s controversial child safety features on the eve of iPhone 13 launch
  • Multiple protests are planned at Apple retail stores on September 13.
  • The protests target Apple’s child safety features, which would scan images in iCloud and messages in iMessage.
  • Apple planned to release these features with iOS 15 but delayed them after a backlash.
Apple last night announced a new event for September 14, where it is expected to launch the new iPhone 13 series and will most likely announce the release date of iOS 15. Now there are reports that something unusual is planned for the day before the iPhone launch: protests at Apple retail stores in the US. The protests are directed at Apple’s new child safety features in iOS 15.

The protests are being organised by ‘Fight for the Future’, which has launched a website called “No Spy Phone” with the title “Tell Apple: No Spyware on my Phone”. Its description reads, “Apple is abandoning its commitment to privacy with iOS 15 by creating an unprecedented backdoor it can use to scan everything on your Apple devices, including photos and messages. They say that this is to stop child sex abuse material, but once the backdoor exists, it will be used to surveil & censor people. Don’t let Apple throw away the privacy and security of billions.”

Apple’s child safety features

Apple plans to use on-device machine learning to warn users about sensitive content in iMessage. It also announced a new tool to detect known child sexual abuse material (CSAM) images in iCloud. Apple will use an on-device matching process against a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. Apple transforms this database into an unreadable set of hashes that is “securely stored on users’ devices.” The goal is to detect any uploaded image whose hash matches a known CSAM image.
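The core idea, matching an image’s hash against a database of known hashes on the device, can be illustrated with a minimal sketch. Note the assumptions: Apple’s real system uses a perceptual hash (NeuralHash) that matches visually similar images, wrapped in cryptographic protocols; the SHA-256 lookup below is a deliberately simplified stand-in, and the database entry is a made-up example.

```python
import hashlib

# Hypothetical stand-in for the hash database derived from NCMEC data.
# (This entry is just the SHA-256 of the bytes b"test", used for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for an on-device image hash. Apple's system uses a
    perceptual hash, not a cryptographic one like SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check: does this image's hash appear in the database?"""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_database(b"test"))   # True  - hash is in the database
print(matches_known_database(b"other"))  # False - no match
```

A perceptual hash differs from SHA-256 in that small edits to an image (resizing, cropping, recompression) still produce a matching hash, which is what makes this approach workable for photos.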

Apple also said that the on-device tool will only flag an iCloud account if it detects at least 30 potential child sexual abuse images. Apple will first review the detected images itself before reporting the account to NCMEC. Users who believe their account has been mistakenly flagged can file an appeal to have it reinstated, and Apple said the system ensures a “one in one trillion chance per year of incorrectly flagging a given account.” The company also stressed that it does not scan the content of images directly; it is only notified if a user’s iCloud account contains a collection of known CSAM photos.
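The threshold rule described above can be sketched in a few lines. This is an illustration of the reported policy only, assuming a simple match counter; the names and structure are hypothetical, and Apple’s actual implementation uses cryptographic threshold secret sharing so the server learns nothing until the threshold is crossed.

```python
# Reported threshold: at least 30 matching images before an account is flagged.
MATCH_THRESHOLD = 30

def should_flag_account(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Flag an account for human review only once the number of images
    matching the hash database reaches the reported 30-image threshold."""
    return match_count >= threshold

print(should_flag_account(29))  # False - below threshold, nothing is reported
print(should_flag_account(30))  # True  - Apple reviews, then reports to NCMEC
```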

Apple originally planned to release these features with iOS 15 and iPadOS 15. The tools, however, drew backlash over concerns that they could violate users’ privacy on iOS devices and that governments could pressure Apple to scan for other types of content. In response to the backlash from users, advocacy groups and researchers, Apple decided to pause the rollout and make improvements.

But the organisation is determined to make Apple reverse its decision to introduce the tool. The website, first spotted by Input Mag, has a registration section where interested people can sign a petition; it asks for a first name, last name, email address and ZIP or postal code. It also urges people to email Apple’s leadership asking them not to go through with the rollout. The site prominently displays a list of Apple executives, including Craig Federighi and Dan Riccio, with their titles and email addresses, along with a map of all the protest locations, their start times and an option to RSVP.
