
CSAM scanning would be abused, says Apple – using argument it originally rejected

When Apple announced its own approach to CSAM scanning, many of us warned that the process used to check for child sexual abuse material would ultimately be abused by repressive governments to scan for things like political protest plans.

The Cupertino company rejected that reasoning at the time, but in an ironic twist is now using precisely this argument in response to the Australian government …

Apple’s original CSAM scanning plans

Apple originally planned to carry out on-device scanning for CSAM, using a digital fingerprinting technique.

These fingerprints are a way to match particular images without anyone having to view them, and are designed to be sufficiently fuzzy to continue to match images which have been cropped or otherwise edited, while generating very few false positives.
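To make the fingerprinting idea concrete, here is a minimal sketch using the well-known "difference hash" (dHash) technique. This is purely illustrative and is not Apple's actual algorithm (Apple's NeuralHash derived fingerprints from a neural network); it just shows how an image can be reduced to a compact hash that survives small edits, so two images can be compared without anyone viewing them.

```python
# Minimal sketch of a perceptual "difference hash" (dHash) on a grayscale
# pixel grid (a list of rows of 0-255 values). Illustrative only -- this is
# NOT Apple's NeuralHash, which used a neural network to derive fingerprints.

def dhash(pixels, hash_size=8):
    """Downscale to (hash_size+1) x hash_size by sampling, then record one
    bit per pixel: is it brighter than its right-hand neighbour?"""
    h = len(pixels)
    w = len(pixels[0])
    small = [
        [
            pixels[row * h // hash_size][col * w // (hash_size + 1)]
            for col in range(hash_size + 1)
        ]
        for row in range(hash_size)
    ]
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits -- a small distance means 'probably a match'."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 gradient image, and a slightly brightened "edited" copy:
img = [[(x + y) % 256 for x in range(64)] for y in range(64)]
edited = [[min(p + 10, 255) for p in row] for row in img]

# The edit barely changes the fingerprint, so the two images still match:
assert hamming(dhash(img), dhash(edited)) <= 4
```

Because only the hash is compared, a scanning system never needs the original image on file, and minor edits (brightness changes, mild crops) leave most bits intact, which is what makes the matching "fuzzy."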

To be clear, Apple’s proposal was a privacy-respecting approach, as scanning would be performed on our own devices, and nobody would ever look at any of our photos unless multiple matches were flagged.

The repressive government problem

The problem, as many of us observed, was the potential for abuse by repressive governments.

A digital fingerprint can be created for any type of material, not just CSAM. There’s nothing to stop an authoritarian government adding to the database images of political campaign posters or similar.

A tool designed to target serious criminals could be trivially adapted to detect those who oppose a government or one or more of its policies. Apple – which would receive the fingerprint databases from governments – would find itself unwittingly aiding the repression, or worse, of political activists.

Apple claimed that it would never have allowed this, but the promise was predicated on Apple having the legal freedom to refuse, which would simply not be the case. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

There was no realistic way for Apple to promise that it would not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protestors. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.

Apple’s three-stage U-turn

Apple initially rejected this argument but, in response to widespread concern, later said it had decided to abandon its plans anyway.

The company subsequently shifted its stance, admitting that the problem existed.

Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote: “It would […] inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

We’ve now reached stage three: Apple itself using the argument it initially rejected.

Apple uses argument against Australian government

The Australian government is proposing to force tech companies to scan for CSAM, and The Guardian reports that Apple is now using the slippery slope argument to fight the plan.

Apple has warned an Australian proposal to force tech companies to scan cloud and messaging services for child-abuse material risks “undermining fundamental privacy and security protections” and could lead to mass surveillance with global repercussions […]

“Scanning for particular content opens the door for bulk surveillance of communications and storage systems that hold data pertaining to the most private affairs of many Australians,” Apple said.

“Such capabilities, history shows, will inevitably expand to other content types (such as images, videos, text, or audio) and content categories.”

Apple said such surveillance tools could be reconfigured to search for other content, such as a person’s political, religious, health, sexual or reproductive activities.

The government says that it has listened to “a lot of good feedback” from Apple and others, and will “incorporate what we can” into a revised version of the plan.

Photo by FlyD on Unsplash




Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!
