CSAM

Apple’s efforts to detect Child Sexual Abuse Materials (CSAM).

What is CSAM?

While US federal law uses the term child pornography, the National Center for Missing and Exploited Children (NCMEC) explains why the term CSAM is preferred.

NCMEC chooses to refer to these images as Child Sexual Abuse Material (CSAM) to most accurately reflect what is depicted – the sexual abuse and exploitation of children. Not only do these images and videos document victims’ exploitation and abuse, but when these files are shared across the internet, child victims suffer re-victimization each time the image of their sexual abuse is viewed […]

While CSAM is seen and transmitted on computers and through other technology, these images and videos depict actual crimes being committed against children. The human element, children at risk, must always be considered when talking about this offense that is based in a high-tech world.

How is it usually detected?

The usual way to detect CSAM is when cloud services like Google Photos scan uploaded photos and compare them against a database of known CSAM images. This database is provided by NCMEC and similar organizations around the world.

The actual matching process uses what’s known as a hash, or digital fingerprint. This is derived from key elements of the image, and is deliberately fuzzy so that it will continue to work when images are resized, cropped, or otherwise processed. This means there will sometimes be false positives: an innocent image whose hash happens to be a close enough match to a CSAM one.
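To make the idea concrete, here is a minimal sketch of fuzzy image matching using a simple "average hash" – a deliberately toy stand-in for the perceptual hashing systems real scanners use, with illustrative function names and an arbitrary distance threshold:

```python
# Minimal sketch only: a toy "average hash", not the production perceptual
# hashes used for real CSAM detection. Assumes Pillow is installed.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to size x size greyscale, then set one bit per pixel
    depending on whether it is brighter than the average pixel."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")

def is_match(hash_a: int, hash_b: int, threshold: int = 5) -> bool:
    """Treat two images as 'the same' if their hashes differ by only a few bits.
    That tolerance is what lets matching survive resizing or re-compression –
    and also why an unlucky innocent image can occasionally false-positive."""
    return hamming_distance(hash_a, hash_b) <= threshold
```

In this scheme, a cloud service would compute a fingerprint for each uploaded photo and compare it against the database of fingerprints of known CSAM supplied by NCMEC and similar bodies.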

How is Apple detecting CSAM?

In early August 2021, Apple announced its own plans to begin scanning for CSAM.

Apple has chosen to take a somewhat different approach, which it says better protects privacy. The process works like this (a simplified sketch of the decision logic follows below):

  • Apple downloads the CSAM database hashes to your iPhone
  • An on-device process looks for matches against the hashes of photos being uploaded to iCloud Photos
  • If fewer than 30* matches are found, no action is taken
  • If 30 or more matches are found, low-resolution versions of the matched photos are manually reviewed by Apple
  • If the photos are found to be innocent, no further action is taken
  • If the manual review confirms they are CSAM, law enforcement is informed

*Apple initially said only that there was a threshold of matching images, without revealing what it was, but Craig Federighi implied in an interview that this is 30 images.
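The threshold step can be pictured with a much-simplified sketch. The names and data structures below are illustrative, not Apple's – the real design uses encrypted "safety vouchers" rather than a plain list of hashes – but it captures the decision logic described in the list above:

```python
# Simplified illustration of the decision flow described above.
# Names and data structures are hypothetical, not Apple's.
MATCH_THRESHOLD = 30  # the figure Craig Federighi implied in an interview

def evaluate_library(photo_hashes: list[int], csam_hashes: set[int]) -> str:
    matches = [h for h in photo_hashes if h in csam_hashes]
    if len(matches) < MATCH_THRESHOLD:
        return "no action"        # below the threshold, nothing is flagged
    # At or above the threshold: low-resolution copies of the matched photos
    # go to human review; only confirmed CSAM is reported to law enforcement.
    return "manual review"
```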

What concerns have been raised?

Concerns have been raised by cybersecurity experts, human rights organizations, governments, and Apple’s own employees. There are four main ones:

  • Accidental false positives could ruin someone’s reputation
  • Deliberate false positives (aka collision attacks) could be created to achieve the same goal
  • Authoritarian governments could add political posters and similar material to the database
  • The same hash-based on-device searches could be later applied to iMessage

Additionally, because Apple simultaneously announced an entirely separate feature designed to detect nude photos in iMessages sent or received by children, many non-technical people conflated the two, thinking Apple was scanning our photos for nudes.

How has Apple responded?

Apple engaged in a flurry of rapid-fire PR activity designed to correct misapprehensions and address genuine concerns. This included a leaked internal memo, a series of background briefings, interviews, and a six-page FAQ.

Apple said that images were only scanned if they were synced to iCloud Photos, so customers could opt out if they wished. It added that the risk of either accidental or deliberate false positives was statistically insignificant, as multiple matches were required before an account was flagged. Even then, an Apple employee would review the images before any report to law enforcement.
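The arithmetic behind "statistically insignificant" is easy to sketch. The per-photo false-match rate and library size below are assumed numbers chosen purely for illustration, not figures from Apple; the point is how sharply a 30-match threshold suppresses accidental flags compared with flagging on a single match:

```python
# Illustrative only: assumed false-match rate and library size, not Apple's figures.
from math import exp, factorial

def poisson_tail(k: int, lam: float, terms: int = 60) -> float:
    """P(X >= k) for a Poisson(lam) variable – here, the chance of k or more
    accidental matches when the expected number across a library is lam."""
    term = exp(-lam) * lam**k / factorial(k)
    total = 0.0
    for i in range(k, k + terms):
        total += term
        term *= lam / (i + 1)
    return total

p = 1e-6            # assumed chance that one innocent photo falsely matches
n = 100_000         # assumed photo library size
expected = n * p    # 0.1 expected accidental matches per library

print(poisson_tail(1, expected))   # ~0.095: flagging on a single match would hit ~1 in 10 such libraries
print(poisson_tail(30, expected))  # ~3e-63: with a 30-match threshold, effectively never
```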

The company said it would roll out the feature on a country-by-country basis, and would refuse any government demand to add political images to the database – a promise it cannot realistically make.

Since then, things have gone completely quiet, with no sign of any move by Apple to actually launch CSAM scanning.

Why has this proven so controversial?

Google, Amazon, Facebook and many other tech giants already routinely scan for CSAM and report instances to law enforcement. Apple is merely joining in, and trying to use a more privacy-focused approach, by performing the actual comparison on-device. So why so much controversy?

In part, for the reason explained earlier: Apple’s mistake in simultaneously announcing two different features.

But the outrage was entirely predictable, given the years Apple has spent touting its privacy credentials.

The company has put up huge billboards. It has run amusing ads. It has an entire privacy microsite. Its CEO talks about privacy in every interview and public appearance. The company attacks other tech giants over privacy. It fought the entire ad industry over a new privacy feature.

Any risk that customer privacy could be compromised, however small the likelihood and however well-intentioned the reason, was bound to raise eyebrows.

Apple may not be able to keep its head down much longer, however, as a UK CSAM law could force the issue.

CSAM scanning would be abused, says Apple – using argument it originally rejected

When Apple announced its own approach to CSAM scanning, many of us warned that the process used to check for child sexual abuse materials would ultimately be abused by repressive governments to scan for things like political protest plans.

The Cupertino company rejected that reasoning at the time, but in an ironic twist is now using precisely this argument in response to the Australian government …

CSAM scanning in chat apps would echo communist surveillance, and put children at risk

A planned law to require CSAM scanning in chat apps would be illegal, disproportionate, and could increase rather than decrease the risks to children, say experts. It could also see Apple withdraw iMessage from EU countries.

The warning was given by more than 20 speakers at a privacy seminar, as the European Union continues to press for a CSAM measure which would effectively outlaw end-to-end encryption in chat apps like iMessage, WhatsApp, and Signal.

Social media roundup: CSAM on Mastodon; Zuckerberg faces contempt charge; Twitter’s bank plans

An investigation by the Stanford Internet Observatory has found worrying volumes of CSAM on Mastodon. Of particular concern is that the child sexual abuse material included many known examples that should have been automatically detected by digital fingerprinting.

Researchers say that a large part of the issue is the open and decentralized nature of the social media platform …

Revenge porn blocked in similar way to Apple’s abandoned CSAM scans

Facebook has been working on ways to prevent the posting of so-called “revenge porn” – aka non-consensual sharing of explicit photos by former partners and others – and sextortion attempts, especially those against teenagers.

Its first tool, back in 2017, was so scary that few were willing to use it: You had to upload nudes to Facebook in order to have them tagged for blocking! But a new tool, geared initially to teenagers, will allow them to create digital fingerprints, right on their own device …

Banned Twitter accounts: 62K restored, only one CSAM monitor left, COVID-19 misinformation welcome

After Musk announced that most banned Twitter accounts would be reinstated, a report says that around 62,000 accounts with more than 10K followers each have so far been restored.

Other Twitter news includes a claim that the layoffs and resignations have left the company with just one person responsible for removing child sexual abuse materials (CSAM) …

Apple’s CSAM approach is the right one, says British government, as it attacks Facebook

The British government has backed a call by the country’s security services for client-side scanning for child sexual abuse material – aka Apple’s CSAM approach.

Home Secretary Priti Patel has written an op-ed in which she indicates government support for the stance, while also attacking Facebook’s plans to make all Messenger chats end-to-end encrypted by default …

CSAM law could force all encrypted messaging services to use Apple-style client-side scanning [U: Delayed]

Update: The vote on the bill is now expected to be delayed until the fall – see end for more details.

A proposed new CSAM law in the UK could force all messaging companies to use the type of client-side scanning approach that Apple planned to launch to detect child sexual abuse material (CSAM) on iPhones.

An amendment to the Online Safety Bill has been put forward that would require tech companies to identify and remove CSAM, even in end-to-end encrypted private messages …

As feared, EU CSAM scanning law could outlaw end-to-end encryption of messages

We learned yesterday that a proposed new EU CSAM scanning law for tech giants would force Apple to revisit its own plans for detecting child sexual abuse materials. The company had quietly set these aside in response to a huge amount of controversy over its proposed approach.

Many had feared that the proposed law would involve yet another assault on end-to-end encrypted messaging, and this has now been confirmed by wording in the document …

Apple’s CSAM troubles may be back, as EU announces a law requiring detection [U]

Update: The EU has now announced the proposed new law. More details at the bottom.

Apple’s CSAM troubles may be back, after controversy over the issue of scanning iPhones for child sexual abuse materials led to the company suspending its plans.

A report today says that the European Union is planning a law that would require tech giants like Apple to detect, report, and remove CSAM, and that we’ll see a draft of the new law as early as this week …

TikTok CSAM investigation underway by Dept. of Homeland Security; privacy feature exploited

The Department of Homeland Security has opened a TikTok CSAM investigation, after child sexual abuse material was posted both publicly and privately on the video sharing network.

Additionally, the platform is being heavily used by abusers for grooming – the practice of befriending a child online with the intention of later abusing them, either online or offline …

Apple quietly removes all references to CSAM scanning, but says nothing has changed [U]

Update: As we suspected, nothing has changed. An Apple spokesperson told The Verge that the feature is still delayed, not cancelled.

Apple has quietly removed its website references to CSAM scanning in the past few days.

The company’s child safety microsite previously described the company’s plans for scanning iPhones for Child Sexual Abuse Materials, alongside the Communication Safety in Messages feature, and warnings when someone searches for CSAM. However, the section on CSAM scanning has now been removed …

Governments planned to misuse CSAM scanning tech even before Apple’s announcement

Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.

The biggest concern raised when Apple said it would scan iPhones for child sexual abuse materials (CSAM) is that there would be scope creep, with governments insisting the company scan for other types of images – and there now seems to be good evidence for this …

Comment: Here’s how Apple could resolve its CSAM no-win situation

Apple has really gotten itself into a CSAM no-win situation. If it presses ahead, then it will be condemned by civil rights groups and security professionals. If it doesn’t, it will be condemned by child protection groups.

The company has currently bought itself some time by delaying the rollout while it tries to think of additional safeguards, but the question remains: What could those be? …

Security expert says Apple giving in to Russia proves CSAM assurances cannot be trusted

Apple giving in to Russia twice this week on key civil liberties issues proves that the company’s CSAM misuse assurances cannot be trusted, argues a high-profile security expert.

Apple today pulled an opposition tactical voting app from the App Store after the Russian government threatened specific local company employees with “punishment” if they refused. It turns out that Apple also turned off its Private Relay service in Russia just yesterday, likely also in response to government pressure…

UK government backs Apple, and wants to scan encrypted messages for CSAM

The British government has expressed support for Apple’s now-delayed CSAM scanning plans, and says that it wants the ability to scan encrypted messages for CSAM, even where end-to-end encryption is used.

The country is offering to pay anyone who can find a way “to keep children safe in environments such as online messaging platforms with end-to-end encryption” …

Apple already scans iCloud Mail for CSAM, but not iCloud Photos

Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups.

The clarification came after I queried a rather odd statement by the company’s anti-fraud chief: that Apple was “the greatest platform for distributing child porn.” That immediately raised the question: If the company wasn’t scanning iCloud Photos, how could it know this?

Apple’s anti-fraud chief said company was ‘the greatest platform for distributing child porn’

Update: A likely explanation for this comment has now emerged.

An explanation for Apple’s controversial decision to begin scanning iPhones for CSAM has been found in a 2020 statement by Apple’s anti-fraud chief.

Eric Friedman stated, in so many words, that “we are the greatest platform for distributing child porn.” The revelation does, however, raise the question: How could Apple have known this if it wasn’t scanning iCloud accounts… ?
