
Apple’s CSAM troubles may be back, as EU announces a law requiring detection [U]

Update: The EU has now announced the proposed new law. More details at the bottom.

Apple’s CSAM troubles may be back, after controversy over scanning iPhones for child sexual abuse materials led the company to suspend its plans.

A report today says that the European Union is planning a law that would require tech giants like Apple to detect, report, and remove CSAM, and that we’ll see a draft of the new law as early as this week …

Apple’s CSAM troubles

Most cloud services already scan for child sexual abuse materials. Any examples detected are reported to law enforcement.

Apple wanted to do the same, but in a manner that protected user privacy. It therefore announced plans for on-device scanning, designed so that only confirmed matches would ever be viewed by a human moderator. The process works like this (and is sketched in code after the list below):

  • Apple downloads the CSAM database hashes to your iPhone
    (digital signatures of CSAM images, not actual images, obviously).
  • An on-device process looks for matches with hashes of your photos.
  • If fewer than 30 matches are found, no action is taken.
  • If 30 or more matches are found, low-resolution versions of your photos are manually examined.
  • If the photos are found to be innocent, no further action is taken.
  • If manual review confirms them as CSAM, law enforcement is informed.
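
Here is that sketch: a minimal illustration of the match-and-threshold flow. It is not Apple’s implementation. Apple’s system uses a perceptual “NeuralHash” and a private set intersection protocol, so SHA-256, the plain Set, and the names used below are stand-ins for illustration only.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's real system uses a perceptual "NeuralHash"
// and private set intersection; SHA-256 and a plain Set stand in here purely
// to show the match-and-threshold flow described above.

let knownHashes: Set<String> = []   // hypothetical on-device copy of the hash database
let matchThreshold = 30             // the review threshold Apple described

// Hash an image's raw bytes (a stand-in for the real perceptual hash).
func hexHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count matches against the database. Below the threshold, nothing happens;
// at or above it, low-resolution derivatives would go to human review, and
// law enforcement would be informed only if that review confirms CSAM.
func needsHumanReview(photos: [Data]) -> Bool {
    let matchCount = photos.filter { knownHashes.contains(hexHash(of: $0)) }.count
    return matchCount >= matchThreshold
}
```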

However, experts and campaigners immediately pointed out potential flaws in the approach – something Apple should have expected, but apparently didn’t.

Concerns have been raised by cybersecurity experts, human rights organizations, governments, and Apple’s own employees. There are four main concerns, explained here:

  • Accidental false positives could ruin someone’s reputation.
    (Apple addressed this by setting a threshold of 30+ matches; the rough calculation after this list shows why that helps.)
  • Deliberate false positives (aka collision attacks) could be created to achieve the same goal.
  • Authoritarian governments could add political posters and similar images to the database.
  • The same hash-based on-device searches could be later applied to iMessage.
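
On the first of those concerns, here is the rough calculation mentioned above, showing why a high threshold matters. Every number in it is an assumption for illustration (the per-image false-positive rate and the library size are made up, not Apple’s figures); it simply evaluates the upper tail of a binomial distribution.

```swift
import Foundation

// Back-of-envelope only. The per-image false-positive rate and the library
// size below are assumptions for illustration, not Apple's published figures.
let p = 1e-6         // assumed chance any one photo accidentally matches a database hash
let n = 50_000       // assumed number of photos in a large library
let threshold = 30   // Apple's stated review threshold

// Precompute ln(k!) for k = 0...n so the binomial terms stay in log space.
var logFactorial = [0.0]
for k in 1...n {
    logFactorial.append(logFactorial[k - 1] + log(Double(k)))
}

// P(at least `threshold` accidental matches): the upper tail of Binomial(n, p).
var tail = 0.0
for k in threshold...n {
    let logChoose = logFactorial[n] - logFactorial[k] - logFactorial[n - k]
    tail += exp(logChoose + Double(k) * log(p) + Double(n - k) * log(1 - p))
}
print(tail)   // on the order of 1e-72 with these made-up numbers, versus roughly 5% for a threshold of 1
```

With those assumptions, a single accidental match somewhere in a large photo library is not implausible (around 5 percent here), but 30 independent accidental matches on one account is effectively impossible, which is the point of the threshold.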

The company then said that it was going to take some time to rethink its plans. That was in September of last year, and eight months have passed without a single word on the subject from Apple, leading some to suspect that the company intended to simply pretend it had never happened for as long as it could. But that may not be possible for much longer …

Planned European law on CSAM detection

Politico reports that the European Union is planning on announcing a new law requiring tech giants to scan for CSAM. That would leave Apple having to figure out how to comply without reigniting the controversy.

The Commission is expected to release a draft law this week that could require digital companies like Meta Platforms, Google and Apple to detect, remove and report illegal images of abuse to law enforcement under threat of fines.

According to a leak of the proposal obtained by POLITICO on Tuesday, the Commission said voluntary measures taken by some platforms have so far “proven insufficient” to address the misuse of online services for the purposes of child sexual abuse.

The rulebook comes as child protection hotlines report a record amount of disturbing content circulating online during the coronavirus pandemic. Europe is a hot spot for hosting such content, with 62 percent of the world’s illegal images located on European data servers in 2021.

The situation is likely to get messy, as one of the key proponents of the new law appears to be opposed to end-to-end encryption. Home Affairs Commissioner Ylva Johansson said:

Abusers hide behind the end-to-end encryption; it’s easy to use but nearly impossible to crack, making it difficult for law enforcement to investigate and prosecute crimes.

We’ve been pointing out for many years that it is impossible to simultaneously protect user privacy with end-to-end encryption while also creating backdoors for law enforcement.

Update: Proposal now official

The EU has now formally announced the measure:

Today, the Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.

To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.

A copy of the proposed law can be downloaded here. We’ll of course see privacy, tech, and legal experts weighing in, and will report on reactions.

Photo: Christina @ wocintechchat.com/Unsplash


