
Apple confirms CSAM detection only applies to photos, defends its method against other solutions

Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not videos.

The company also continues to defend its implementation of CSAM detection as more privacy-preserving than the solutions used by other companies.

Apple confirmed today that, at launch, CSAM detection will only apply to photos that are stored in iCloud Photos, not videos. Given the proliferation of video in CSAM content, however, the company acknowledged that there is more it could do in the future, and said it can expand and evolve its plans over time.

This makes sense when you step back and look at how Apple’s CSAM detection scanning works. All of the matching is done on device, with Apple transforming a database of photos from the National Center for Missing and Exploited Children into an “unreadable set of hashes that is securely stored on users’ devices.” The on-device database is then checked against a user’s photos, and if there is an on-device match, the device creates a cryptographic safety voucher that encodes the match result.
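To make that flow concrete, here is a minimal Swift sketch of the sequence described above: hash a photo on device, check it against the local database, and produce a safety voucher on a match. It is illustrative only; the SHA-256 digest, the database loader, and the voucher type are hypothetical stand-ins, not Apple’s actual NeuralHash or private set intersection implementation.

```swift
import Foundation
import CryptoKit

// Minimal sketch of the on-device matching flow described above.
// Apple's real system uses a perceptual NeuralHash and private set
// intersection; every name below is an illustrative stand-in, not Apple's API.

/// Hypothetical loader for the "unreadable set of hashes" shipped to the device.
func loadOnDeviceHashDatabase() -> Set<Data> {
    // In the real design this database is encrypted and not readable by the user.
    return []
}

/// Hypothetical safety voucher that encodes a match result for one photo.
struct SafetyVoucher {
    let photoID: UUID
    let matchPayload: Data
}

let knownCSAMHashes = loadOnDeviceHashDatabase()

/// Runs on device, and only for photos being uploaded to iCloud Photos.
func makeSafetyVoucher(for photoData: Data, photoID: UUID) -> SafetyVoucher? {
    // Stand-in for the perceptual hash Apple computes on device.
    let digest = Data(SHA256.hash(data: photoData))

    // Check the photo's hash against the on-device database of known CSAM hashes.
    guard knownCSAMHashes.contains(digest) else {
        return nil // no match: nothing about the photo's content is revealed
    }

    // On a match, create a cryptographic safety voucher encoding the result.
    return SafetyVoucher(photoID: photoID, matchPayload: digest)
}
```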

Apple is also reinforcing that if a user does not use iCloud Photos, then no part of the CSAM detection process runs. This means that users who want to opt out of CSAM detection can do so by disabling iCloud Photos.

Finally, Apple is also doubling down on why it believes its on-device implementation of CSAM detection is far better than the server-side implementations used by other companies. Those implementations, Apple explains, require that a company scan every single photo stored by a user on its servers, the majority of which are not CSAM.

Apple’s implementation of CSAM detection does not require that Apple servers scan every photo. By moving the process on device, Apple says its method is more secure and is designed to check only the hashes of images against the NCMEC database of known CSAM images, as opposed to server-side scanning of all images.
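A rough sketch of the distinction Apple is drawing, with hypothetical function names: a server-side system must run some analysis over every stored photo, while Apple’s described approach reduces the check to an on-device hash lookup against the known-CSAM database.

```swift
import Foundation

// Illustrative contrast only; the functions and names below are hypothetical.

// Server-side approach Apple is criticizing: the provider inspects every photo
// a user stores, whether or not it is CSAM.
func serverSideScan(allUserPhotos: [Data]) -> [Data] {
    allUserPhotos.filter { analyzeImageContent($0) }
}

// Placeholder for a server-side content analysis step; not part of Apple's design.
func analyzeImageContent(_ photo: Data) -> Bool {
    false
}

// Apple's described approach: the check is a hash lookup against the known-CSAM
// database, performed on device, so servers never inspect photo content directly.
func onDeviceHashCheck(photoHash: Data, knownHashes: Set<Data>) -> Bool {
    knownHashes.contains(photoHash)
}
```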
