CSAM

Apple’s efforts to detect Child Sexual Abuse Materials (CSAM).

What is CSAM?

While US federal law uses the term child pornography, the National Center for Missing and Exploited Children (NCMEC) explains why the term CSAM is preferred.

NCMEC chooses to refer to these images as Child Sexual Abuse Material (CSAM) to most accurately reflect what is depicted – the sexual abuse and exploitation of children. Not only do these images and videos document victims’ exploitation and abuse, but when these files are shared across the internet, child victims suffer re-victimization each time the image of their sexual abuse is viewed […]

While CSAM is seen and transmitted on computers and through other technology, these images and videos depict actual crimes being committed against children. The human element, children at risk, must always be considered when talking about this offense that is based in a high-tech world.

How is it usually detected?

The usual way to detect CSAM is when cloud services like Google Photos scan uploaded photos and compare them against a database of known CSAM images. This database is provided by NCMEC and similar organizations around the world.

The actual matching process uses what’s known as a hash, or digital fingerprint. This is derived from key elements of the image, and is deliberately fuzzy so that it will continue to work when images are resized, cropped, or otherwise processed. This means there will sometimes be false positives: an innocent image whose hash happens to be a close enough match to a CSAM one.
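
To make the matching idea concrete, here is a minimal sketch in Python. It is not PhotoDNA, NeuralHash, or any real CSAM-detection algorithm; it simply assumes each image has already been reduced to a fixed-length bit string, and treats two hashes as a match when they differ in only a few bits. The function names and the max_distance threshold are illustrative assumptions.

# Minimal sketch of fuzzy (perceptual) hash matching -- not a real
# CSAM-detection algorithm, just the "close enough" idea described above.

def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count the bits in which two fixed-length hashes differ."""
    return bin(hash_a ^ hash_b).count("1")

def matches_known_hash(image_hash: int, known_hashes, max_distance: int = 4) -> bool:
    """Treat an image as a match if its hash is within a few bits of any
    hash in the database. A larger max_distance tolerates more resizing
    and cropping, but also produces more false positives."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in known_hashes)

# A hash that differs from a database entry by a single bit still matches.
print(matches_known_hash(0b10110110, {0b10110111}))  # True

That single-bit example is also why false positives are possible: an unrelated image can land within the same distance purely by chance.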

How is Apple detecting CSAM?

In early August 2021, Apple announced its own plans to begin scanning for CSAM.

Apple has chosen to take a somewhat different approach, which it says better protects privacy. The process works like this (a simplified sketch of the threshold step follows the list):

  • Apple downloads the CSAM database hashes to your iPhone
  • An on-device process looks for matches with hashes of your photos
  • If fewer than 30* are found, no action is taken
  • If 30+ matches are found, low-resolution versions of your photos are manually examined by Apple
  • If the photos are found to be innocent, no further action is taken
  • If manual review confirms them as CSAM, law enforcement is informed

*Apple initially said only that there was a threshold of matching images, without revealing what it was, but Craig Federighi implied in an interview that this is 30 images.
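
As a rough illustration, the threshold step amounts to the Python below. This shows only the counting logic: Apple's published design wraps each match in a cryptographic "safety voucher" and uses threshold secret sharing, so the server cannot decrypt or even tally matches until the threshold is crossed. None of that cryptography is shown here, and the function name is an assumption.

# Illustrative sketch of the reporting threshold only. The safety-voucher
# and threshold-secret-sharing machinery that enforces this server-side
# is not shown.

MATCH_THRESHOLD = 30  # the figure Federighi implied in his interview

def escalate_for_human_review(matching_photo_count: int) -> bool:
    """Escalate to Apple's manual review only once 30+ matches exist."""
    return matching_photo_count >= MATCH_THRESHOLD

print(escalate_for_human_review(29))  # False: no action taken
print(escalate_for_human_review(30))  # True: low-res copies reviewed by Apple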

What concerns have been raised?

Concerns have been raised by cybersecurity experts, human rights organizations, governments, and Apple’s own employees. The four main ones are explained here:

  • Accidental false positives could ruin someone’s reputation
  • Deliberate false positives (aka collision attacks) could be created to achieve the same goal
  • Authoritarian governments could add political posters and similar images to the database
  • The same hash-based on-device searches could later be applied to iMessage

Additionally, because Apple simultaneously announced an entirely separate feature designed to detect nude photos in iMessages sent or received by children, many non-technical people conflated the two, thinking Apple would be scanning their photos for nudes.

How has Apple responded?

Apple engaged in a flurry of rapid-fire PR activity designed to correct misapprehensions and address genuine concerns. This included a leaked internal memo, a series of background briefings, interviews, and a six-page FAQ.

Apple said that images were only scanned if they were synced with iCloud, so customers could opt out if they wished. It added that the risk of either accidental or deliberate false positives was statistically insignificant, as it required multiple matches before an account was flagged. Even then, an Apple employee would review images before any report to law enforcement.

The company said it would roll out the feature on a country-by-country basis, and would refuse any government demand to add political images to the database – a promise it cannot realistically make.

Since then, things have gone completely quiet, with no sign of any move by Apple to actually launch CSAM scanning.

Why has this proven so controversial?

Google, Amazon, Facebook and many other tech giants already routinely scan for CSAM and report instances to law enforcement. Apple is merely joining in, and trying to use a more privacy-focused approach, by performing the actual comparison on-device. So why so much controversy?

In part, for the reason explained earlier: Apple’s mistake in simultaneously announcing two different features.

But the outrage was entirely predictable, given the years Apple has spent touting its privacy credentials.

The company has put up huge billboards. It has run amusing ads. It has an entire privacy microsite. Its CEO talks about privacy in every interview and public appearance. The company attacks other tech giants over privacy. It fought the entire ad industry over a new privacy feature.

Any risk that customer privacy might be compromised, however small the likelihood and however well-intentioned the reason, was bound to raise eyebrows.

Apple may not be able to keep its head down much longer, however, as a UK CSAM law could force the issue.

Apple CSAM system tricked, but easy to guard against [U]

Update: Apple mentions a second check on the server, and a specialist computer vision company has outlined one possibility of what this might be – described below under ‘How the second check might work.’

An early version of the Apple CSAM system has effectively been tricked into flagging an innocent image, after a developer reverse-engineered part of it. Apple, however, says that it has additional protections to guard against this happening in real-life use.

The latest development occurred after the NeuralHash algorithm was posted to the open-source developer site GitHub, enabling anyone to experiment with it…
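
Apple has not said what the second server-side check is, but the general idea outlined by the computer vision company is that an image forged to collide with one perceptual hash is very unlikely to also collide with a second, independent hash that runs only on Apple’s servers and that an attacker cannot query. A hypothetical sketch of that idea (the function, hash values, and threshold here are assumptions, not Apple’s published design):

# Hypothetical sketch of a second, independent server-side check.
# Apple has not published this algorithm; the point is only that a forged
# NeuralHash collision would also have to fool a different hash function
# that the attacker cannot run or query on-device.

def passes_second_check(server_side_hash: int, expected_hash: int,
                        max_distance: int = 2) -> bool:
    """Re-verify a flagged image's visual derivative with a separate hash
    computed only on the server (illustrative threshold)."""
    return bin(server_side_hash ^ expected_hash).count("1") <= max_distance

# A forged collision against NeuralHash alone would almost certainly fail
# this second check and be discarded before any human review.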

Corellium will pay for security researchers to check Apple CSAM claims

Security company Corellium is offering to pay security researchers to check Apple CSAM claims, after concerns were raised about both privacy and the potential for the system to be misused by repressive governments.

The company says that there are any number of areas in which weaknesses could exist, and it would like independent researchers to look for them…

Apple SVP Craig Federighi responds to confusion over iOS 15 iCloud child safety policies in new interview

In a video interview with the Wall Street Journal, Apple SVP Craig Federighi discusses the reaction to the iCloud Child Safety features announced last week.

Federighi admits that the simultaneous announcement of the Messages protections for children and CSAM scanning, two features that sound similar but work in very different ways, has caused customer confusion, and that Apple could have done a better job of communicating the new initiative.

Opinion: The Apple CSAM scanning controversy was entirely predictable

Update: Within minutes of writing this piece, an interview was posted where Craig Federighi admitted that Apple should have handled things differently.

One thing about the CSAM scanning controversy is now abundantly clear: It took Apple completely by surprise. Which is a surprise.

Ever since the original announcement, Apple has been on a PR blitz to correct misapprehensions, and to try to address the very real privacy and human rights concerns raised by the move …

Apple Privacy exec details system to detect CSAM in new interview

Last week, Apple announced three new features that target child safety on its devices. While the intentions are good, the new features have not escaped scrutiny, with some organizations and Big Tech CEOs coming out against Apple’s announcement.

The company published a FAQ about all of these new features and how they will work. Now, trying to avoid more controversy, Apple Privacy head Erik Neuenschwander has addressed concerns about the company’s new CSAM detection system in an interview with TechCrunch.

Misusing CSAM scanning in US prevented by Fourth Amendment, argues Corellium

While most of the concerns about governments misusing CSAM scanning to detect things like political opposition have related to foreign governments, some have suggested that it could become an issue in the US, too.

Matt Tait, COO of security company Corellium and a former analyst at GCHQ, the British equivalent of the NSA, says the Fourth Amendment means that this could not happen in the US …

Facebook’s former security chief weighs in on Apple child protection controversy

Five days after Apple’s child protection measures were announced, there has been no let-up in the controversy surrounding the upcoming new features. Latest to comment is Facebook’s former security chief and now Stanford cybersecurity professor Alex Stamos.

Stamos says that there are no easy answers here, and calls for more nuanced discussion than the prevailing narratives that this is either a great move or an unacceptable one …

Apple confirms CSAM detection only applies to photos, defends its method against other solutions

Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not videos.

The company also continues to defend its implementation of CSAM detection as more privacy-friendly and privacy-preserving than the approaches taken by other companies.

Apple CSAM FAQ addresses misconceptions and concerns about photo scanning

Apple has responded to misconceptions and concerns about its photo scanning announcements by publishing a CSAM FAQ – answering frequently asked questions about the features.

While child safety organizations welcomed Apple’s plans to help detect possession of child sexual abuse materials (CSAM), and to protect children from predators, there has been a mix of informed and uninformed criticism …

WhatsApp CEO calls out Apple over Child Safety tools announcement

Ever since Apple introduced its new child safety protection tools this week, opinion has been divided. While some think this is a major step toward protecting children, others believe it will simply create a backdoor for governments to access people’s iPhones.

Now, WhatsApp CEO Will Cathcart is the latest to join those who think Apple’s new Child Safety tools could be a bad move.

Apple says any expansion of CSAM detection outside of the US will occur on a per-country basis

Apple’s new feature for detection of Child Sexual Abuse Material (CSAM) content in iCloud Photos will launch first in the United States, as 9to5Mac reported yesterday. Apple confirmed today, however, that any expansion outside of the United States will occur on a country-by-country basis depending on local laws and regulations.

In internal memo, Apple addresses concerns around new Photo scanning features, doubles down on the need to protect children

Apple yesterday officially announced a range of new features coming later this year, dubbed Expanded Protections for Children. The new features include protections for sensitive images in iMessage, iCloud Photo scanning for child sexual abuse material (CSAM) content, and new knowledge for Siri and Search.

In an internal memo distributed to the teams that worked on this project and obtained by 9to5Mac, Apple acknowledges the “misunderstandings” around the new features, but doubles down on its belief that these features are part of an “important mission” for keeping children safe.

Comment: Apple’s child protection measures get mixed reactions from experts

The announcement yesterday of Apple’s child protection measures confirmed an earlier report that the company would begin scanning for child abuse photos on iPhones. The news has seen mixed reactions from experts in both cybersecurity and child safety.

Four concerns had already been raised before the details were known, and Apple’s announcement addressed two of them …

Opinion: Four problems with Apple’s reported approach to scanning for child abuse images

A report this morning said that Apple is set to announce that it will begin scanning for child abuse images on iPhones. Update: Apple later confirmed the report.

The method Apple is expected to use is one that maximizes privacy, but we noted earlier that there are still a number of ways in which this could go badly wrong …
