9to5Mac

Apple Privacy exec details system to detect CSAM in new interview

By José Adorno

August 10, 2021

In the interview, TechCrunch puts several of users’ concerns to Neuenschwander. For example, Neuenschwander explains why Apple announced the Communication Safety feature in Messages alongside the CSAM detection feature in iCloud Photos:

As important as it is to identify collections of known CSAM where they are stored in Apple’s iCloud Photos service, it’s also important to try to get upstream of that already horrible situation. (…) It is also important to do things to intervene earlier on when people are beginning to enter into this problematic and harmful area, or if there are already abusers trying to groom or to bring children into situations where abuse can take place, and Communication Safety in Messages and our interventions in Siri and search actually strike at those parts of the process. So we’re really trying to disrupt the cycles that lead to CSAM that then ultimately might get detected by our system.

Asked whether Apple should be trusted if a government tries to compromise this new system, the Apple privacy head says: Well first, that is launching only for US iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren’t the US when they speak in that way, and therefore it seems to be the case that people agree US law doesn’t offer these kinds of capabilities to our government.

But even in the case where we’re talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.
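To picture what that hash list argument means in practice, here is a minimal sketch in Swift. It is not Apple’s code (the type names and loading path are assumptions), but it captures the claim: the blinded hash database is a read-only part of the operating system image, keyed to the OS build rather than to any individual user, so every device running that build carries the same list.

```swift
import Foundation

// Illustrative only, not Apple's implementation. The point from the quote
// above: the blinded hash database ships inside the OS image, so every device
// on a given build carries exactly the same list, with no per-user targeting.
struct CSAMHashDatabase {
    let osBuild: String      // tied to the OS build, not to any individual user
    let blindedHashes: Data  // opaque, blinded hash list bundled with the system

    // Loaded from a read-only system resource shared by all users of this
    // build; in this model there is no per-user download or update path.
    static func loadFromSystemImage() -> CSAMHashDatabase {
        CSAMHashDatabase(
            osBuild: ProcessInfo.processInfo.operatingSystemVersionString,
            blindedHashes: Data() // placeholder for the bundled, signed database
        )
    }
}
```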

Neuenschwander also reinforces that the system does nothing when iCloud Photos is turned off: If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image. None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you’re not using iCloud Photos.
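Neuenschwander’s description maps onto a simple gate in the upload path. The sketch below is purely illustrative (the pipeline class, the type names, and the plain lookup are assumptions, not Apple’s implementation), but it shows where that guard sits: if iCloud Photos is off, the NeuralHash and safety voucher steps never execute at all. In Apple’s actual design the comparison is also cryptographically blinded, so the device never learns whether an image matched.

```swift
import Foundation

// Conceptual sketch of the gating Neuenschwander describes, not Apple's API.
// All type and function names are hypothetical. The key behavior: when iCloud
// Photos is off, no NeuralHash is computed and no safety voucher is created.
struct SafetyVoucher {
    let payload: Data  // in Apple's design this is encrypted and unreadable below a match threshold
}

struct Photo {
    let pixelData: Data
}

final class PhotoUploadPipeline {
    let iCloudPhotosEnabled: Bool
    let hashDatabase: Set<Data>  // blinded known-CSAM hashes from the OS image

    init(iCloudPhotosEnabled: Bool, hashDatabase: Set<Data>) {
        self.iCloudPhotosEnabled = iCloudPhotosEnabled
        self.hashDatabase = hashDatabase
    }

    // Placeholder for NeuralHash, a learned perceptual hash of the image.
    private func neuralHash(of photo: Photo) -> Data {
        Data(photo.pixelData.prefix(32))
    }

    // Returns a voucher to attach to the upload, or nil when nothing runs.
    func prepareVoucher(for photo: Photo) -> SafetyVoucher? {
        // If the user is not using iCloud Photos, this entire path is skipped:
        // no hashing, no comparison, no voucher, no upload.
        guard iCloudPhotosEnabled else { return nil }

        let hash = neuralHash(of: photo)
        // In the real system the comparison is blinded (private set
        // intersection), so the device never learns whether a photo matched;
        // this plain lookup only marks where that step sits in the flow.
        let matched = hashDatabase.contains(hash)
        return SafetyVoucher(payload: matched ? hash : Data())
    }
}
```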

Apple’s implementation of this CSAM detection feature is highly technical; Apple has published a more detailed technical summary for readers who want to dig deeper.