
Apple delays rollout of CSAM detection system and child safety features

Last month, Apple announced a handful of new child safety features that proved to be controversial, including CSAM detection for iCloud Photos. Now, Apple says it will “take additional time” to refine the features before launching them to the public.

In a statement to 9to5Mac, Apple said:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s new child safety features were set to launch as part of updates to iOS 15, iPadOS 15, and macOS Monterey later this year. There is now no word on when the company plans to roll out the features. Apple’s statement today does not provide any details on what changes the company could make to improve the system.

As a refresher, here are the basics of how the CSAM detection system would work as currently designed:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
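To make that flow a little more concrete, here is a minimal Swift sketch of the steps described above. Every name in it (SafetyVoucher, privateSetIntersection, prepareUpload, and so on) is a hypothetical stand-in rather than Apple’s actual API, and the private set intersection step is reduced to a placeholder. The point is only to illustrate the shape of the design: the hash is computed on the device, the match result is never visible on the device, and the encrypted voucher travels to iCloud Photos with the image.

```swift
import Foundation

// Conceptual sketch only — assumed names throughout, not Apple's NeuralHash/PSI code.

struct SafetyVoucher {
    let payload: Data  // encrypted match result + associated image data; unreadable on device
}

/// Stand-in for the private set intersection step: combines the image hash with the
/// blinded hash database so that the device never learns whether a match occurred.
func privateSetIntersection(_ imageHash: Data, against blindedDatabase: [Data]) -> Data {
    // Real PSI is a cryptographic protocol; this placeholder just returns opaque bytes.
    return Data(imageHash.reversed())
}

func prepareUpload(photo: Data,
                   blindedDatabase: [Data],
                   perceptualHash: (Data) -> Data,
                   encrypt: (Data) -> Data) -> (image: Data, voucher: SafetyVoucher) {
    let imageHash = perceptualHash(photo)                                        // 1. on-device hash
    let psiOutput = privateSetIntersection(imageHash, against: blindedDatabase)  // 2. PSI step
    let voucher = SafetyVoucher(payload: encrypt(psiOutput))                     // 3. safety voucher
    return (photo, voucher)                                                      // 4. uploaded together
}
```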

Upon announcement, the new CSAM detection technology received quite a bit of pushback and criticism from privacy advocates. Apple, however, doubled down on the feature multiple times, and said that its implementation would actually be more privacy-preserving than technology used by other companies like Google and Facebook.

It also came to light during this process that Apple already scans iCloud Mail for CSAM; the new system would have expanded that detection to iCloud Photos.

Other child safety features announced by Apple last month, and also now delayed, include communication safety features in Messages and expanded guidance in Siri and Search.

What do you make of Apple’s decision to delay the rollout of its new child safety features? Is it the right decision, or should the company have stuck to its initial plan?


