
Apple finally admits the CSAM scanning flaw we all pointed out at the time

Almost nine months after Apple confirmed that it had abandoned plans to carry out CSAM scanning, the company has finally admitted the flaw which so many of us pointed out at the time.

The company explained the reason it decided against scanning devices for child sexual abuse material (CSAM) in a statement to Wired.

A quick history of Apple’s CSAM scanning mess

We first learned of Apple’s CSAM scanning plans when they were leaked shortly before the company announced them in August 2021. Cryptography and security expert Matthew Green tweeted about the plans, calling them a bad idea.

I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.

The leak – which didn’t include details of the protections Apple had in place against false positives – meant that four concerns were raised ahead of the announcement.

Security experts continued to raise concerns even after the announcement, as did many of Apple’s own employees. Apple responded both on and off the record, before announcing that plans to roll out the feature had been paused.

Things then went very quiet for a very long time, and more than a year passed before the company finally said it had abandoned its plans.

The biggest CSAM scanning flaw

Apple addressed some of the concerns raised, but not the biggest problem of all: the potential for misuse of the feature by authoritarian governments.

As we noted at the time, a digital fingerprint can be created for any type of material, not just CSAM. There’s nothing to stop an authoritarian government from adding images of political campaign posters, or anything similar, to the database.
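To illustrate the point, here’s a minimal, hypothetical sketch – not Apple’s actual NeuralHash system – showing that fingerprint matching is just a lookup against a supplied set of hashes; the matching code has no way of knowing what those hashes represent. The fingerprint function below is a stand-in (a real system would use a perceptual hash that survives resizing and re-encoding), and all the names here are illustrative.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a perceptual fingerprint. Apple's real system used
// NeuralHash, which tolerates resizing and re-encoding; a cryptographic digest
// does not. The point is only that a fingerprint is data derived from an image.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Matching is a simple set lookup. Nothing here can tell whether the supplied
// database contains hashes of CSAM or hashes of political campaign posters.
func matches(_ imageData: Data, against database: Set<String>) -> Bool {
    database.contains(fingerprint(of: imageData))
}

// Whoever supplies the hash set decides what gets flagged.
let suppliedDatabase: Set<String> = [
    fingerprint(of: Data("bytes-of-a-known-image".utf8))
]
let photoOnDevice = Data("bytes-of-a-known-image".utf8)
print(matches(photoOnDevice, against: suppliedDatabase)) // true
```

The device-side code is identical whichever database it is handed; only the party supplying the hashes knows what they actually represent.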

A tool designed to target serious criminals could be trivially adapted to detect those who oppose a government or one or more of its policies. Apple – which would receive the fingerprint database from governments – would find itself unwittingly aiding the repression, or worse, of political activists.

Apple claimed that it would never have allowed this, but the promise was predicated on Apple having the legal freedom to refuse, which simply would not have been the case. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

There was no realistic way for Apple to promise that it would not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protestors. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.

Apple has now admitted this problem

In a statement to Wired, Apple now acknowledges this.

Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote […]

“It would […] inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

9to5Mac’s Take

We’ve previously argued that the whole mess was entirely predictable, and that if Apple wanted to do this at all, it should have taken a different approach.

Ironically, Apple’s best bet would have been to do something that is actually less transparent and less private, but also less controversial: scan photos on iCloud against the CSAM hashes, and simply note that in the iCloud privacy policy (observing that all cloud services do this).

That would be less controversial because everyone else does it, and because security experts already knew that iCloud isn’t private: the fact that Apple doesn’t use end-to-end encryption means it holds the key, and we already knew that it cooperates with law enforcement by handing over iCloud data on receipt of a court order.

If it had started doing this, I think most security experts would simply have shrugged – nothing new here – and it wouldn’t have come to the attention of mainstream media.

We acknowledged that this might not have worked in the longer term, but it would have given the company the opportunity to better explain its plans for a different approach. Still, at least Apple has finally admitted what so many of us said all along.


