
Opinion: Four problems with Apple’s reported approach to scanning for child abuse images

A report this morning said that Apple is set to announce that it will begin scanning for child abuse images on iPhones. Update: Apple later confirmed the report.

The method Apple is expected to use is one that maximizes privacy, but we noted earlier that there are still a number of ways in which this could go badly wrong …

Update: Apple’s announcement says that it will require multiple image matches before the issue is flagged, mitigating the first two risks described below.

The problems with CSAM fingerprints

Johns Hopkins cryptographer Matthew Green outlined some of the problematic aspects of scanning for child sexual abuse material (CSAM) fingerprints.

False positives

CSAM fingerprints are deliberately not bit-perfect. If they only detected an exact copy of an image, then all someone would have to do is a one-pixel crop to ensure that the file no longer matched the fingerprint.

For this reason, fingerprints are designed so that they can still match images and videos that have been cropped, resized, rotated, and so on. By definition, that means the fingerprints are fuzzy, and will sometimes flag perfectly innocent files. That creates two issues.

First, even if all that happens is someone at Apple – or an independent monitoring body – reviews the photo or video and declares it innocent, a privacy breach has already occurred. A potentially very private photo or video has been viewed by a third party.

Second, and more worryingly, even suspicion of such a serious offense can cause serious disruption to someone’s life. Phones, tablets, and computers can be seized, and may not be returned for some considerable time. If anyone finds out the police are investigating, that can put someone’s job, relationship, and reputation at risk – even if they are later found completely innocent.
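To make that fuzziness concrete, here’s a minimal sketch of a perceptual hash in Python – a crude “average hash,” nothing like the far more sophisticated NeuralHash system Apple is reported to use, and with a made-up matching threshold – just to illustrate why a cropped or re-encoded copy still matches, and why an unrelated image occasionally can too.

```python
# A toy perceptual hash ("average hash"); a stand-in for illustration only.
from PIL import Image

HASH_SIZE = 8  # 8x8 pixels = a 64-bit fingerprint

def average_hash(img: Image.Image) -> int:
    """Shrink to 8x8 grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image mean, else 0."""
    small = img.convert("L").resize((HASH_SIZE, HASH_SIZE), Image.Resampling.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Fuzzy matching: anything within a few bits of a database entry counts as
# a match. A cropped, resized, or re-encoded copy lands close to the
# original -- but occasionally a perfectly innocent image lands close too.
THRESHOLD = 5  # hypothetical value, invented for this sketch

def matches(candidate: int, database: set[int]) -> bool:
    return any(hamming(candidate, known) <= THRESHOLD for known in database)
```

This is also why requiring several distinct matches before anything gets flagged – as Apple’s announcement describes – helps: one chance collision is plausible, a whole run of them far less so.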

Collision attacks

We’ve already noted that innocent images can sometimes match CSAM fingerprints by chance, but Green points to a paper showing that it’s entirely possible to deliberately create images that will generate a matching hash.

Someone who wants to create trouble for an enemy could arrange for them to be sent innocent-looking material – which may look absolutely nothing like a problematic photo – whose fingerprints match known CSAM entries. That then opens the target up to all the risks just covered.
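Continuing the toy example above (published collision attacks on real perceptual hashes use gradient-based optimization, but the principle is the same), here’s a hedged sketch in which an attacker nudges the brightness of each block of an innocent cover image, a little per round, until its fingerprint exactly equals a chosen target. The function name, step size, and round limit are all invented for illustration.

```python
# Forging a collision against the toy average_hash() above (reuses
# average_hash and HASH_SIZE from the previous sketch).
from PIL import Image

def collide(cover: Image.Image, target: int, max_rounds: int = 64) -> Image.Image:
    img = cover.convert("L")            # assume dimensions divisible by 8
    w, h = img.size
    bw, bh = w // HASH_SIZE, h // HASH_SIZE
    for _ in range(max_rounds):
        if average_hash(img) == target:
            return img                  # fingerprint now matches exactly
        small = img.resize((HASH_SIZE, HASH_SIZE), Image.Resampling.LANCZOS)
        pixels = list(small.getdata())
        mean = sum(pixels) / len(pixels)
        px = img.load()
        for i, p in enumerate(pixels):
            want = (target >> (HASH_SIZE * HASH_SIZE - 1 - i)) & 1
            if (1 if p > mean else 0) == want:
                continue
            # Push the offending block a few brightness levels toward the
            # right side of the mean; small steps keep the change subtle.
            delta = 4 if want else -4
            bx, by = (i % HASH_SIZE) * bw, (i // HASH_SIZE) * bh
            for x in range(bx, bx + bw):
                for y in range(by, by + bh):
                    px[x, y] = max(0, min(255, px[x, y] + delta))
    raise RuntimeError("no collision found within max_rounds")
```

To a human, the output still looks like the original cover image; to the matching system, it’s indistinguishable from the targeted file.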

Misuse by authoritarian governments

A digital fingerprint can be created for any type of material, not just CSAM. What’s to stop an authoritarian government from adding images of political campaign posters, or similar material, to the database?

So a tool that is designed to target serious criminals could be trivially adapted to detect those who oppose a government or one or more of its policies.

Apple – which would receive the fingerprint database from governments – could find itself unwittingly aiding the repression, or worse, of political activists.

Potential expansion into messaging

Right now, this type of fingerprinting is primarily used for images – photos and videos. But the same approach can just as easily be used to match particular text. This is how most passwords are checked: The server doesn’t store your actual password, but rather a hashed version of it – which is to say, a digital fingerprint.
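As a concrete illustration of that pattern – a minimal sketch of the standard salted store-and-compare approach, not any particular server’s implementation, with typical names and iteration count chosen for the example:

```python
# Store only a salted hash (a "fingerprint") of the password, never the
# password itself; verify a login by re-hashing and comparing.
import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest                 # this is all the server keeps

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)   # constant-time compare
```

Note one difference: a cryptographic hash like this matches only exact input, whereas image fingerprints are deliberately fuzzy. Matching message text would therefore typically normalize the text first, as sketched below.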

Here, Apple’s approach of running the fingerprint check on your device could actually turn a privacy strength into a privacy vulnerability.

If you use an end-to-end encrypted messaging service like iMessage, Apple has no way to see the content of those messages. If a government arrives with a court order, Apple can simply shrug and say it doesn’t know what was said.

But if a government adds fingerprints for types of text – let’s say the date, time, and location of a planned protest – then it could easily create a database of political opponents.
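Here’s what such a text-fingerprint check could look like in principle – purely hypothetical, and emphatically not something Apple has announced. The database contents and normalization rules are invented for illustration.

```python
# Hypothetical on-device text matching against a fingerprint database.
import hashlib

# A database of fingerprints supplied by some authority -- here, a hash of
# invented protest details, purely for illustration.
FLAGGED_FINGERPRINTS: set[str] = {
    hashlib.sha256(b"main square 6pm saturday").hexdigest(),
}

def fingerprint(message: str) -> str:
    # Normalize case and whitespace so trivial variations still match.
    canonical = " ".join(message.lower().split())
    return hashlib.sha256(canonical.encode()).hexdigest()

def is_flagged(message: str) -> bool:
    return fingerprint(message) in FLAGGED_FINGERPRINTS
```

Because the check would run on-device, it would sidestep end-to-end encryption entirely: the message is fingerprinted before it’s ever encrypted in transit.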

Privacy is always a balancing act

All societies have decided that both 0% privacy and 100% privacy are bad ideas. Where they differ is in the point they choose on the scale between the two.

For example, the Fourth Amendment protects the privacy of US citizens against cops carrying out random searches of their person, vehicle, or home. However, it also sets out limitations to that right to privacy. A cop who has a reasonable basis to suspect that you have committed a crime, and that evidence can be found in your home, for example, can ask a judge to grant a search warrant – and that search is then legal. Without that exception, it would never be possible to enter a home to find stolen property or a kidnap victim.

Finding the right balance between the individual right to privacy on the one hand, and the ability of law enforcement to detect and prosecute crime on the other, can be extremely challenging. It’s particularly tough when it comes to the two hot-button issues of child abuse and terrorism.

We face exactly the same challenges with digital privacy as with physical privacy. Apple, by taking a strong stance on privacy, and using this as a marketing tool, has placed itself in a particularly tricky position when it comes to finding this balance.

Apple has to walk a privacy tightrope

Green acknowledges that Apple might put in place safeguards. For example, the company might want to generate its own fingerprints from the source images, and it might refuse any proposal to scan text messages. But this development undoubtedly creates risks for innocent users.

It’s another example of the privacy tightrope Apple has to walk. For example, it has steadfastly refused to create any kind of government backdoor into iPhones, and uses end-to-end encryption for both iMessage and FaceTime. But iCloud backups do not use end-to-end encryption – so if the government turns up with a court order, Apple can hand over any data backed up by iCloud, and that’s most of the personal data on a phone.

Apple could easily use end-to-end encryption for iCloud backups, but chooses not to in what I’m certain is a carefully calculated decision. The company figures that this protects most users – as the company has strong internal safeguards, and only releases data when ordered to do so by a court – while at the same time limiting the pressures it faces from governments. It’s a pragmatic position, which allows it to cooperate with law enforcement while still being able to say that its devices are secure.

Could end-to-end encryption for iCloud backups be on the way?

Apple’s strong privacy messaging has meant that failing to use end-to-end encryption for iCloud backups is looking increasingly anomalous – most especially in China, where a government-owned company has access to the servers on which the backups are stored.

So one possibility is that this is the first step in a new compromise by Apple: a future switch to E2E encryption for iCloud backups – which include photos and videos – combined with a mechanism by which governments can scan user photo libraries. And, potentially, messages too.

What’s your view? Is Apple right to adopt this approach to scanning for child abuse images, or is this a dangerous path to tread? Please take our poll, and share your thoughts in the comments.

Photo: Pietro Jeng/Unsplash


