Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials.
In addition to facing more than $1.2B in penalties, the company could be forced to reinstate the plans it dropped after many of us pointed to the risk of misuse by repressive regimes …
The story so far
Most cloud computing services routinely scan user accounts for child sexual abuse materials (CSAM), using a digital fingerprinting method.
These fingerprints are a way to match known CSAM images without anyone having to view them, and are designed to be sufficiently fuzzy to continue to match images which have been cropped or otherwise edited, while generating very few false positives. When a positive match is found, the photo is then manually checked by a human being. If that confirms the photo is CSAM, a report is filed and passed on to law enforcement.
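To make the fuzzy-matching idea concrete, here is a minimal sketch in Swift of how a perceptual fingerprint and a tolerance-based comparison can work. It uses a simple average hash and Hamming distance purely for illustration; it is not Apple's algorithm or any production CSAM-detection system, and every name in it is hypothetical.

```swift
// Illustrative only: a toy perceptual fingerprint, not a production CSAM hash.
typealias Fingerprint = UInt64

// Compute a simple 8x8 "average hash": each bit records whether a pixel is
// brighter than the image's mean brightness. Small edits (recompression,
// slight brightness shifts) flip only a few bits; real systems use far more
// robust features to survive crops and heavier edits.
func averageHash(grayscalePixels: [UInt8]) -> Fingerprint {
    precondition(grayscalePixels.count == 64, "expects an 8x8 downscaled image")
    let mean = grayscalePixels.reduce(0) { $0 + Int($1) } / 64
    var hash: Fingerprint = 0
    for (index, pixel) in grayscalePixels.enumerated() where Int(pixel) > mean {
        hash |= Fingerprint(1) << index
    }
    return hash
}

// Two images "match" when their fingerprints differ in only a few bits.
func isMatch(_ a: Fingerprint, _ b: Fingerprint, maxHammingDistance: Int = 5) -> Bool {
    (a ^ b).nonzeroBitCount <= maxHammingDistance
}

// Compare a photo's fingerprint against a database of known fingerprints.
func matchesKnownDatabase(_ photo: Fingerprint, database: Set<Fingerprint>) -> Bool {
    database.contains { isMatch(photo, $0) }
}

// Example: a fingerprint that differs by a single bit still matches.
let known: Set<Fingerprint> = [0b1011_0001]           // pretend database entry
let edited: Fingerprint     = 0b1011_0011             // one bit flipped
print(matchesKnownDatabase(edited, database: known))  // true
```

The point of the tolerance is that visually similar images produce fingerprints that differ in only a few bits, which is how matching can survive minor edits while keeping false positives rare.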
iCloud is one of the very few cloud services which doesn’t do this scanning, with Apple citing privacy as the reason.
In an attempt to introduce CSAM scanning in a privacy-respecting fashion, Apple proposed running the fingerprinting tool on-device, arguing that this would be less intrusive than scanning photos in iCloud. Only if multiple matches were found would a human review the photos, further reducing the risk of a false positive.
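A simplified sketch of that threshold idea follows, assuming nothing more than a per-account match counter. Apple's actual proposal layered cryptography on top of this so that matches below the threshold were not visible even to Apple; the snippet below ignores that and shows only the counting logic, with hypothetical names.

```swift
// Illustrative threshold gate (hypothetical; not Apple's protocol, which used
// cryptography so sub-threshold matches remained unreadable to the server).
struct MatchThresholdGate {
    let reviewThreshold: Int   // number of matches required before human review
    var matchCount = 0

    // Records one more fingerprint match for this account and returns true
    // once the account has crossed the review threshold.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}

// Usage: with a threshold of, say, 30 matches, isolated false positives on a
// handful of photos never reach a human reviewer.
var gate = MatchThresholdGate(reviewThreshold: 30)
let shouldEscalate = gate.recordMatch()   // false until the 30th match
```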
The problem, as many of us observed, was the potential for abuse by repressive governments.
A digital fingerprint can be created for any type of material, not just CSAM. There's nothing to stop an authoritarian government adding images of political campaign posters, or similar material, to the database.
A tool designed to target serious criminals could be trivially adapted to detect those who oppose a government or one or more of its policies. Apple, which would receive the fingerprint databases from governments, would find itself unwittingly aiding the repression, or worse, of political activists.
Apple initially said it would never agree to this, but many of us again pointed out that it would have no choice. As the company famously says every time it has to do something sketchy to comply with a law, “Apple complies with the law in each of the countries in which it operates.”
The iPhone maker eventually abandoned its CSAM scanning plans, belatedly acknowledging the reality of the problem, and subsequently used this exact argument to oppose proposed legislation.
CSAM victims sue
Ars Technica reports that CSAM victims are now suing Apple for its failure to conduct scanning.
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM) […]
Child sex abuse survivors suing have accused Apple of using the cybersecurity defense to ignore the tech giant’s mandatory CSAM reporting duties. If they win over a jury, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to “identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services.” That could mean a court order to implement the controversial tool or an alternative that meets industry standards for mass-detecting CSAM.
Apple is accused of directly profiting from its policy.
As survivors see it, Apple profits from allowing CSAM on iCloud, as child predators view its products as a safe haven to store CSAM that most other Big Tech companies mass report. Where Apple only reported 267 known instances of CSAM in 2023, four other “leading tech companies submitted over 32 million reports,” the lawsuit noted. And if Apple’s allegedly lax approach to CSAM continues unchecked, survivors fear that AI could spike the amount of CSAM that goes unreported exponentially.
The company said in response that it does take proactive steps to address the problem.
Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.
9to5Mac’s Take
The issue is a no-win situation for all involved. There is an unavoidable conflict between detecting a truly abhorrent crime and the risk that a repressive government could exploit the same capability.
If Apple had adopted the standard practice of scanning iCloud photos from the start, the likelihood is that this would never have turned into a controversial issue. Ironically, it was the company’s attempt to achieve the same goal in a more privacy-respecting way which led to the controversy.
At this point, it would probably be in Apple's own interests for a court to rule on this. If it is forced to implement scanning, and a future government were to exploit that, the company could at least point out that it had no choice. Conversely, if Apple wins the case, it may set a legal precedent that relieves the continued pressure to implement scanning.