
Governments planned to misuse CSAM scanning tech even before Apple’s announcement

Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.

The biggest concern raised when Apple said it would scan iPhones for child sexual abuse material (CSAM) was that there would be scope creep, with governments insisting the company scan for other types of images, and there now appears to be good evidence for this …

Background

Apple insisted that it had solid safeguards in place to protect privacy and prevent misuse. It would match images only against known CSAM databases; it would check at least two databases and require an image to appear in both; action would be triggered only once 30 images matched; and there would be a manual review before law enforcement was alerted.
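
Purely to illustrate the shape of that policy, here is a minimal Swift sketch. It is not Apple's implementation, which involves NeuralHash, blinded databases, and threshold secret sharing on-device; all of the names below are hypothetical, and only the stated rules are modeled: a hash counts only if it appears in both databases, and nothing is escalated for review until at least 30 matches accumulate.

```swift
// Illustrative sketch only; Apple's real system uses NeuralHash,
// blinded hash databases, and threshold secret sharing on-device.
// This toy model captures just the stated policy: an image counts
// only if its hash appears in BOTH databases, and at least 30 such
// matches are needed before anything is flagged for manual review.

struct ScanPolicy {
    let databaseA: Set<String>      // hypothetical hash list #1
    let databaseB: Set<String>      // hypothetical hash list #2
    let reviewThreshold: Int = 30   // matches required before review

    // A hash only counts as a known image if both databases contain
    // it, so an entry added to a single database has no effect.
    func isKnownImage(_ hash: String) -> Bool {
        databaseA.contains(hash) && databaseB.contains(hash)
    }

    // True only when the number of matching images reaches the
    // threshold that would trigger the manual-review step.
    func shouldEscalateForReview(deviceHashes: [String]) -> Bool {
        deviceHashes.filter { isKnownImage($0) }.count >= reviewThreshold
    }
}
```

Every constant and check in that sketch is a policy choice rather than a technical constraint, which is why the concern that follows is about legal pressure rather than engineering.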

However, I and others were quick to point out that such promises are impossible to keep.

As the company has said in regard to previous controversies, Apple complies with local laws in each of the countries in which it operates. Indeed, it recently gave in to blackmail before a new law had even come into force.

Any government could pass a law requiring tech companies to use their available capabilities (e.g., the CSAM scanning system) to look for images they say are associated with terrorism, or any type of political opposition.

Governments planned to misuse CSAM scanning tech

A new report today shows that this is far from a theoretical concern. A group of security researchers says that the European Union was already planning to use this type of technology to scan for other types of images even before Apple revealed that it had developed its own system.

The New York Times reports:

More than a dozen prominent cybersecurity experts on Thursday criticized plans by Apple and the European Union to monitor people’s phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

In a 46-page study, the researchers wrote that the proposal by Apple, aimed at detecting images of child sexual abuse on iPhones, as well as an idea forwarded by members of the European Union to detect similar abuse and terrorist imagery on encrypted devices in Europe, used “dangerous technology” […]

The cybersecurity researchers said they had begun their study before Apple’s announcement. Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.

A proposal to allow the photo scanning in the European Union could come as soon as this year, the researchers believe.

While the EU proposal was an independent initiative to use the same type of technology as Apple, it is not exactly a giant leap to imagine that, now that the EU knows Apple possesses this capability, it might simply pass a law requiring the iPhone maker to expand the scope of its scanning. Why reinvent the wheel when a few strokes of a pen can get the job done in 27 countries?

Image databases used within the EU may well be trustworthy, but once this precedent has been set, it would be a very small step for less enlightened governments to pass equivalent laws.

The researchers say that Apple’s approach is incredibly dangerous.

“It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote […]

“Expansion of the surveillance powers of the state really is passing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group […]

“It’s allowing scanning of a personal private device without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”

Photo: Peter Forster/Unsplash



