The ridiculously named EARN IT bill – the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act – has made it through the committee stage.
Like many measures designed to tackle Child Sexual Abuse Material (CSAM), the bill is well intentioned, but could have some very significant unintended consequences. These may include making it harder to prosecute offenders, and making end-to-end encryption less available …
Background
The bill was first proposed back in 2019, and concerns were raised that it could threaten end-to-end encryption.
The acronym is intended to suggest that tech companies should be required to “earn” the right to Section 230 protections, which mean that companies providing communication platforms can’t be held legally liable for things posted by users.
Reuters reports that the bill seeks to impose conditions on this protection, and that providing a backdoor to encryption is believed to be one of them.
Bill co-sponsor Senator Lindsey Graham has previously called on Apple to break the strong encryption used to protect iPhones.
Three concerns about the EARN IT bill
CNN reports that the bill has now made it through committee, despite concerns.
A controversial bill targeting how tech platforms and websites handle child sexual abuse material cleared a key hurdle on Thursday as a Senate panel voted to approve the legislation despite vocal objections from civil rights groups who say the proposal as written will backfire and harm all internet users.
Three separate concerns have been raised:
Threat to encryption
If the law says that online platforms are responsible for CSAM stored on them unless they proactively take steps to remove it, it could be argued that allowing end-to-end encryption would prevent them from complying with the law – and therefore oblige them to cease offering messaging services that use it.
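To illustrate the tension, here’s a minimal, hypothetical Python sketch (the hash list and message are placeholders, and the PyNaCl library stands in for whatever encryption a real platform would use). It shows why hash-matching against known material only works when a provider can read content, and tells the provider nothing once it merely relays ciphertext.

```python
# Hypothetical sketch: provider-side hash-matching vs. end-to-end encryption.
# The hash database and message content are placeholders, not real data.
import hashlib
from nacl.public import PrivateKey, Box  # PyNaCl: pip install pynacl

# A provider with plaintext access can compare content hashes
# against a database of known illegal material.
KNOWN_BAD_HASHES = {"0" * 64}  # placeholder entries

def provider_scan(content: bytes) -> bool:
    """Return True if the content matches a known-bad hash."""
    return hashlib.sha256(content).hexdigest() in KNOWN_BAD_HASHES

# With end-to-end encryption, only the sender and recipient hold the keys...
sender = PrivateKey.generate()
recipient = PrivateKey.generate()
ciphertext = Box(sender, recipient.public_key).encrypt(b"any message at all")

# ...so the provider only ever sees ciphertext, and the same scan is meaningless:
print(provider_scan(bytes(ciphertext)))  # always False, whatever the message was
```

That’s the crux of the concern: a platform can’t both offer genuine end-to-end encryption and proactively scan the messages it carries.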
The bill has been slightly amended in an attempt to allay this concern, but critics say it doesn’t go nearly far enough (emphasis ours).
In response to concerns that the EARN IT Act could allow government officials to ban encryption, the bill’s co-authors added a provision that prohibits encryption from being used as the sole justification for CSAM lawsuits.
But that won’t be enough, the civil society groups wrote in their Tuesday letter. A platform’s support for encryption could easily be cited in a lawsuit in a guilt-by-association manner, with its mere existence serving as suggestive evidence of a website’s wrongdoing. And that, the groups said, “will serve as a strong disincentive to deploying encrypted services in the first place.”
Could make prosecuting offenders more difficult
A letter sent Tuesday by dozens of groups including the American Civil Liberties Union, Human Rights Campaign, and the Wikimedia Foundation (the organization behind Wikipedia) said the bill could actually result in a lack of accountability for people spreading CSAM online.
If a state law forces websites “to monitor or filter their users’ content so it can be turned over to the government for criminal prosecution, the provider becomes an agent of the government and any CSAM it finds could become the fruit of an unconstitutional warrantless search,” the letter said.
The law could be different in every state
As written, the bill doesn’t tell platform providers what they need to do. Rather, it leaves the exact standard they have to meet to be determined at state level. This could result in a chaotic patchwork of 50 different requirements, and could put at risk anyone with a website or blog that allows comments.
If a state, let’s say New York, enacted a law saying that any interactive computer service … doesn’t remove CSAM within one hour of it being posted is liable for damages up to $1 million, a church with that type of interactive website could be strictly liable under that New York law.
9to5Mac’s Take
CSAM is utterly abhorrent, so it’s natural to want to take whatever measures we can to limit its spread. However, history is littered with examples of well-intentioned laws having unintended consequences.
For example, three-strikes laws have been shown to dramatically increase homicides, because an offender who already risks life imprisonment for a relatively minor offense can kill a witness or a cop to reduce the chance of getting caught, without facing a longer sentence if they do.
Lawmakers are often frighteningly ignorant when it comes to technology, so the risk of unintended consequences is particularly high in the case of tech law. A specific clause stating that the use of end-to-end encryption shall have no bearing on liability would remove that concern (though the other two issues would also need to be addressed).
Apple famously hit similar risks of unintended consequences when it announced its own CSAM measures, which it later had to put on hold. Currently the company’s approach appears to be to say nothing and hope the issue goes away.
Photo: Victor Grigas