Apple will soon start scanning iPhone photos for child abuse material

  • Apple will scan iPhone photos to find any sexually explicit content involving children.
  • It will use cryptographic technology to detect such images on a user's iPhone and report it to child protection agencies.
  • The protections will also extend to Siri, Search, and the Messages app.
Apple has announced that it will scan photos on the iPhone for child sexual abuse material (CSAM), which refers to sexually explicit content involving a child. Apple will use a new technology to detect any CSAM images stored in iCloud Photos.

In a statement, Apple said that it is introducing the new measures to help protect children from predators and prevent the sharing of CSAM images. The CSAM detection tools will work in three areas: Photos, Siri and Search, and Messages.

Here’s how the iPhone maker will use its technology to detect and prevent the spread of CSAM.

Here’s how Apple’s CSAM detection tech works

For detection, Apple will use a hashing technology that performs an on-device comparison against a database of known CSAM images provided by the National Center for Missing and Exploited Children (NCMEC). That database is first converted into an "unreadable set of hashes" and stored on the user's iPhone, where the comparison takes place. If CSAM images are found on a user's device, Apple will report them to NCMEC.

A cryptographic technology will determine whether an image on a user's device matches any of the database hashes. If there is a match, Apple will manually review the image; if the match is confirmed, the account will be disabled and a report shared with NCMEC. Users who believe their account was mistakenly disabled can file an appeal to have it reinstated.
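At a very high level, the matching step works by reducing each image to a fixed digest and checking it against a set of known digests. The following is a minimal sketch of that idea only: it uses a plain SHA-256 hash and invented placeholder values, whereas Apple's actual system uses a perceptual "NeuralHash" (which tolerates resizing and re-encoding) combined with cryptographic private set intersection, so the device never learns the database contents.

```python
import hashlib

# Hypothetical database of known-image hashes (illustrative placeholders,
# not real NeuralHash values).
known_hashes = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length digest for comparison.

    A plain SHA-256 only matches byte-identical files; it stands in here
    for the perceptual hash the real system uses.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def check_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    return hash_image(image_bytes) in known_hashes

print(check_image(b"known-image-1"))   # matches the database -> True
print(check_image(b"vacation-photo"))  # no match -> False
```

In the real design, a positive match alone does not trigger a report: only after a threshold number of matches can Apple decrypt and manually review the flagged images.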

Prevention of CSAM image sharing on Messages
If a child sends or receives a sexually explicit image via the Messages app, both the child and their parents will be warned. Received images of this kind will be blurred and accompanied by a warning that, for the child's safety, a message will be sent to their parents if the image is viewed.
If a child attempts to send sexually explicit content, they will be warned and the parents will be notified if the child chooses to send it.

Siri and Search
As an expansion of its guidance, Apple will use Siri to inform users about staying safe online and how to get help in unsafe situations. If a user performs a CSAM-related search, Siri and Search will intervene and explain that the searched topic is harmful and problematic.
These updates will roll out later this year with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. The features will first be available on devices in the US and could be expanded to other countries in the future.

The move has drawn scrutiny from cybersecurity experts, who warn that it could breach user privacy and that governments could misuse the technology.

Cryptography expert Matthew Green raised concerns about the new development on Twitter, writing in one tweet: “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”
