We learned yesterday that a proposed new EU CSAM scanning law for tech giants would force Apple to revisit its own plans for detecting child sexual abuse materials. The company had quietly set those plans aside after its proposed approach drew considerable controversy.
Many had feared that the proposed law would involve yet another assault on end-to-end encrypted messaging, and this has now been confirmed by the wording of the document …
Background
There’s no question that there is a large-scale problem with child sexual abuse materials. The National Center for Missing & Exploited Children (NCMEC) said it received 29.3 million reports last year, almost all of which came from ISPs and cloud companies as a result of CSAM scanning on their servers.
The question is how best to tackle this without invading the privacy of innocent users. Apple thought it had solved this problem last year, announcing plans for on-device scanning designed so that only confirmed matches would ever be seen by a human moderator.
However, experts and privacy campaigners quickly pointed out four problems with Apple’s approach. The company addressed one of these, accidental false positives, by requiring a threshold of 30+ matching images before any report is filed. The other three problems remain.
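To give a sense of how such a threshold works, here is a minimal sketch of threshold-based matching. It is an illustration only, not Apple’s actual NeuralHash and private set intersection protocol: the hash list, the matching struct, and the reporting logic are all assumed for the example.

```swift
import Foundation

// Illustration only – not Apple's actual protocol. Assumes image hashes have already
// been produced by some perceptual hashing step, and that a list of known CSAM
// hashes has been supplied by a child-safety organization.
struct ThresholdMatcher {
    let knownHashes: Set<String>   // hashes of previously identified material
    let reportThreshold = 30       // no report is filed until at least 30 images match

    /// Counts matches against the known-hash list and only signals a report
    /// once the count reaches the threshold, limiting accidental false positives.
    func shouldFileReport(for imageHashes: [String]) -> Bool {
        let matchCount = imageHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= reportThreshold
    }
}
```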
Proposed EU CSAM scanning law
The European Union yesterday published the draft of a new law that would require tech giants to conduct CSAM scanning. This requirement extends not just to detecting CSAM images but also to grooming attempts, which would require scanning text.
This is not currently possible with iMessage, nor with other end-to-end encrypted apps like WhatsApp.
Wired reports:
All of your WhatsApp photos, iMessage texts, and Snapchat videos could be scanned to check for child sexual abuse images and videos under newly proposed European rules. The plans, experts warn, may undermine the end-to-end encryption that protects billions of messages sent every day and hamper people’s online privacy […]
Under the plans, tech companies—ranging from web hosting services to messaging platforms—can be ordered to “detect” both new and previously discovered CSAM, as well as potential instances of “grooming.” The detection could take place in chat messages, files uploaded to online services, or on websites that host abusive material […]
The European proposal to scan people’s messages has been met with frustration from civil rights groups and security experts, who say it’s likely to undermine the end-to-end encryption that’s become the default on messaging apps such as iMessage, WhatsApp, and Signal.
“Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption,” WhatsApp head Will Cathcart tweeted. “This proposal would force companies to scan every person’s messages and put EU citizens’ privacy and security at serious risk.” Any system that weakens end-to-end encryption could be abused or expanded to look for other types of content, researchers say.
Legislators have persisted in calling for backdoors into E2E encrypted messages, consistently failing to understand that a backdoor which preserves end-to-end encryption is a technological impossibility. As University of Surrey cybersecurity professor Alan Woodward puts it: “You either have E2EE or you don’t.”
Woodward does note that there is a possible workaround: on-device scanning after the message has been decrypted. But that is precisely the same approach Apple proposed to use for CSAM scanning, which led to such a furor about the potential for abuse by repressive governments.
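To illustrate why this workaround leaves the encryption itself intact while still alarming privacy advocates, here is a hedged sketch of where such a check would sit in a messaging client. Every name here is hypothetical; it simply shows that the scan runs on the recipient’s device after normal decryption, so the same hook could in principle be pointed at any other content a government demands.

```swift
// Hypothetical sketch of "scan after decryption" – not any real app's implementation.
func deliver(encryptedMessage: Data,
             decrypt: (Data) -> String,               // normal end-to-end decryption on the device
             matchesKnownMaterial: (String) -> Bool,  // on-device check, e.g. against a hash list
             report: (String) -> Void) -> String {
    let plaintext = decrypt(encryptedMessage)         // E2EE in transit is untouched
    if matchesKnownMaterial(plaintext) {
        report(plaintext)                             // the reporting hook critics fear could be broadened
    }
    return plaintext                                  // the message is still shown to the recipient
}
```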
Photo: Ali Abdul Rahman/Unsplash