
Apple Privacy exec details system to detect CSAM in new interview

Last week, Apple announced three new features that target child safety on its devices. While the intentions are good, the features have drawn scrutiny, with some organizations and Big Tech CEOs speaking out against Apple’s announcement.

The company published an FAQ explaining all of these new features and how they will work. Now, trying to head off further controversy, Apple Privacy head Erik Neuenschwander addressed concerns about the new CSAM detection system in an interview with TechCrunch.

These child safety features are CSAM detection in iCloud Photos, Communication Safety in Messages, and interventions in Siri and Search. Although the measures were announced together and are related, they each serve a different purpose:

  • CSAM detection in iCloud Photos: a detection system called NeuralHash identifies image fingerprints and compares them against a database of hashes from the National Center for Missing and Exploited Children to detect known CSAM content in iCloud photo libraries (a conceptual sketch follows this list);
  • Communication Safety in Messages: a parent can turn this feature on for a child under 13, and the child’s device will warn them when an image they are about to view has been detected as sexually explicit;
  • Interventions in Siri and Search: when a user tries to search for CSAM-related terms through Siri and Search, they will be informed of the intervention and offered resources.
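
To make the first item more concrete, here is a minimal Swift sketch of the idea of comparing an on-device image fingerprint against a list of known hashes. The function names, the use of a cryptographic digest in place of Apple’s perceptual NeuralHash model, and the empty placeholder list are all illustrative assumptions, not Apple’s actual code or API.

```swift
import Foundation
import CryptoKit

// Illustrative stand-ins only; Apple’s NeuralHash is a perceptual neural-network
// hash designed so visually similar images map to the same value, not the
// cryptographic digest used here for simplicity.
typealias ImageHash = String

/// Hypothetical stand-in for NeuralHash: derive a fixed-length fingerprint for an image.
func imageFingerprint(_ imageData: Data) -> ImageHash {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Per Apple’s description, the known-hash list ships as part of the operating
/// system image, so every device checks against the same list.
let knownHashList: Set<ImageHash> = []   // empty placeholder; the real list is blinded

/// Returns true if a photo’s fingerprint matches an entry in the known list.
func matchesKnownHashList(_ imageData: Data) -> Bool {
    knownHashList.contains(imageFingerprint(imageData))
}
```

The point of the sketch is the shape of the check: a fingerprint computed on-device is compared against a fixed list distributed with the operating system, rather than the photo itself being inspected.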

You can learn more about all of this here.

In the interview, TechCrunch raises some of these user concerns with Neuenschwander. For example, he explains why Apple announced the Communication Safety feature in Messages alongside CSAM detection in iCloud Photos:

As important as it is to identify collections of known CSAM where they are stored in Apple’s iCloud Photos service, it’s also important to try to get upstream of that already horrible situation. (…) It is also important to do things to intervene earlier on when people are beginning to enter into this problematic and harmful area, or if there are already abusers trying to groom or to bring children into situations where abuse can take place, and Communication Safety in Messages and our interventions in Siri and search actually strike at those parts of the process. So we’re really trying to disrupt the cycles that lead to CSAM that then ultimately might get detected by our system.


Another concern centers on governments and agencies trying to use this measure as a backdoor. Neuenschwander explains that Apple is going to “leave privacy undisturbed for everyone not engaged in illegal activity.”

Asked whether Apple should be trusted if a government tries to compromise this new system, the Apple Privacy head says:

Well first, that is launching only for US iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren’t the US when they speak in that way, and therefore it seems to be the case that people agree US law doesn’t offer these kinds of capabilities to our government.

But even in the case where we’re talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

Neuenschwander also emphasizes that if iCloud Photos is disabled, NeuralHash will not run and will not generate any vouchers:

If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image. None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you’re not using iCloud Photos.
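
As a rough illustration of that gating, here is a short Swift sketch that ties the whole detection path to the iCloud Photos upload path, so none of it runs when the feature is off. The types, the stubbed hash check, and the voucher structure are assumptions for explanation, not Apple’s implementation.

```swift
import Foundation

/// Stub standing in for the hash comparison shown in the earlier sketch.
func matchesKnownHashList(_ imageData: Data) -> Bool { false }

/// Hypothetical voucher type; in Apple’s design the payload is encrypted and
/// only becomes readable once a threshold of matches is crossed server-side.
struct SafetyVoucher {
    let payload: Data
}

/// Called only for photos queued for upload to iCloud Photos.
func prepareUpload(photo: Data, iCloudPhotosEnabled: Bool) -> SafetyVoucher? {
    // If iCloud Photos is off, we return immediately: no hash is computed,
    // no voucher is created, and nothing is uploaded.
    guard iCloudPhotosEnabled else { return nil }

    // Compute and compare the on-device fingerprint, then attach the result
    // (encrypted, in the real system) to the upload as a safety voucher.
    let matched = matchesKnownHashList(photo)
    let byte: UInt8 = matched ? 1 : 0
    return SafetyVoucher(payload: Data([byte]))
}
```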



