
EARN IT bill to tackle CSAM made it through committee; threat to encryption remains

The ridiculously named EARN IT bill – the Eliminating Abuse and Rampant Neglect of Interactive Technologies Act – has made it through committee stage.

Like many measures designed to tackle Child Sexual Abuse Material (CSAM), the bill is well intentioned, but could have some very significant unintended consequences. These may include making it harder to prosecute offenders, and making end-to-end encryption less available …

Background

The bill was first proposed back in 2019, and concerns were raised that it could threaten end-to-end encryption.

The acronym is intended to suggest that tech companies should be required to “earn” the right to Section 230 protections, which mean that companies providing communication platforms can’t be held legally liable for things posted by users.

Reuters reports that the bill seeks to impose conditions on this protection, and that providing a backdoor to encryption is believed to be one of them.

Bill co-sponsor Senator Lindsey Graham has previously called on Apple to break the strong encryption used to protect iPhones.

Three concerns about the EARN IT bill

CNN reports that the bill has now made it through committee, despite concerns.

A controversial bill targeting how tech platforms and websites handle child sexual abuse material cleared a key hurdle on Thursday as a Senate panel voted to approve the legislation despite vocal objections from civil rights groups who say the proposal as written will backfire and harm all internet users.

Three separate concerns have been raised:

Threat to encryption

If the law says that online platforms are responsible for CSAM stored on them unless they proactively take steps to remove it, it could be argued that allowing end-to-end encryption would prevent them complying with the law – and therefore oblige them to cease offering messaging services that use it.

The bill has been slightly amended in an attempt to allay this concern, but critics say it doesn’t go nearly far enough (emphasis ours).

In response to concerns that the EARN IT Act could allow government officials to ban encryption, the bill’s co-authors added a provision that prohibits encryption from being used as the sole justification for CSAM lawsuits.

But that won’t be enough, the civil society groups wrote in their Tuesday letter. A platform’s support for encryption could easily be cited in a lawsuit in a guilt-by-association manner, with its mere existence serving as suggestive evidence of a website’s wrongdoing. And that, the groups said, “will serve as a strong disincentive to deploying encrypted services in the first place.”

Could make prosecuting offenders more difficult

A letter Tuesday by dozens of groups including the American Civil Liberties Union, Human Rights Campaign and the Wikimedia Foundation (the organization behind Wikipedia) said the bill could actually result in a lack of accountability for people spreading CSAM online.

If a state law forces websites “to monitor or filter their users’ content so it can be turned over to the government for criminal prosecution, the provider becomes an agent of the government and any CSAM it finds could become the fruit of an unconstitutional warrantless search,” the letter said.

The law could be different in every state

As written, the bill doesn’t tell platform providers what they need to do. Rather, it leaves the exact standard they have to meet to be determined at state level. This could result in a chaotic patchwork of 50 different requirements, and could put at risk anyone with a website or blog that allows comments.

If a state, let’s say New York, enacted a law saying that any interactive computer service … doesn’t remove CSAM within one hour of it being posted is liable for damages up to $1 million, a church with that type of interactive website could be strictly liable under that New York law.

9to5Mac’s Take

CSAM is utterly abhorrent, so it’s natural to want to take whatever measures we can to limit its spread. However, history is littered with examples of unintended consequences to well-intentioned laws.

For example, three-strikes laws have been linked to increases in homicides: if an offender risks life imprisonment for a relatively minor offense, killing a witness or a cop reduces their chances of getting caught without lengthening their jail sentence if they do.

Lawmakers are often frighteningly ignorant when it comes to technology, so the risk of unintended consequences is particularly high in the case of tech law. A specific clause stating that the use of end-to-end encryption shall have no bearing on liability would remove that concern (though the other two issues would also need to be addressed).

Apple famously hit similar risks of unintended consequences when it announced its own CSAM measures, which it later had to put on hold. Currently the company’s approach appears to be to say nothing and hope the issue goes away.

Photo: Victor Grigas




Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!

