
Comment: Here’s how Apple could resolve its CSAM no-win situation

Apple has really gotten itself into a CSAM no-win situation. If it presses ahead, then it will be condemned by civil rights groups and security professionals. If it doesn’t, it will be condemned by child protection groups.

The company has bought itself some time for now by delaying the rollout while it tries to come up with additional safeguards, but the question remains: What could those be? …

Apple has spent years touting the privacy credentials of its devices and its policies, including the famous “What happens on your iPhone, stays on your iPhone” slogan, so it should have come as no surprise that the announcement proved controversial.

The company has put up huge billboards. It has run amusing ads. It has an entire privacy microsite. Its CEO talks about privacy in every interview and public appearance. The company attacks other tech giants over privacy. It fought the entire ad industry over a new privacy feature.

But for whatever reason, Apple failed to anticipate the PR disaster it was creating for itself, and now needs to find some way out of the mess.

The arguments for and against, in a nutshell

The single biggest concern is that a repressive government could force Apple to add political material to the CSAM databases it uses, or could simply provide a database and insist that Apple use that one.

Apple responded by saying it will only use “safe” databases.

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM? Our process is designed to prevent that from happening.

CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by at least two child safety organizations.

There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos.
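In practical terms, that “at least two child safety organizations” safeguard boils down to a set intersection: only image hashes that appear in both organizations’ lists ever ship on the device. The following is a minimal Swift sketch of the idea, purely for illustration and not Apple’s actual implementation; the file names, the String hash type, and the loader are assumptions, and the real system works on NeuralHash values held in a blinded on-device database.

import Foundation

// Illustrative sketch only; not Apple's code. A hash is shipped on-device
// only if at least two independent child safety organizations have each
// validated the underlying image.

typealias ImageHash = String  // stand-in for a perceptual hash value

// Reads a newline-separated list of hash values (hypothetical file format).
func loadHashList(from path: String) -> Set<ImageHash> {
    guard let text = try? String(contentsOfFile: path, encoding: .utf8) else {
        return []
    }
    return Set(text.split(separator: "\n").map(String.init))
}

// Hash lists supplied by two separate organizations in different jurisdictions.
let listA = loadHashList(from: "org-a-hashes.txt")
let listB = loadHashList(from: "org-b-hashes.txt")

// Only hashes vouched for by both sources make it into the shipped database.
let shippedDatabase = listA.intersection(listB)

// A photo is treated as a known match only if its hash is in that intersection.
func isKnownMatch(_ photoHash: ImageHash) -> Bool {
    shippedDatabase.contains(photoHash)
}

The point of the intersection is that no single organization, however closely aligned with its government, can unilaterally insert a hash; that is precisely the safeguard the counter-argument below calls into question.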

The counter-argument is that this is a promise Apple cannot make. As the company has said in regard to previous controversies, Apple complies with local laws in each of the countries in which it operates. Indeed, it recently gave in to blackmail before a new law had even come into force.

Some reject this argument, saying that Apple can already be forced or blackmailed into compromising user privacy in other ways – as we’ve seen in China – and that CSAM scanning therefore changes nothing. My own view remains that the CSAM scanning setup is still a particularly efficient route for a government wanting to identify political opponents, and one that could be exploited without the visibility of large-scale changes like the Chinese servers.

How could Apple resolve its CSAM no-win situation?

I can see two potential ways forward for Apple.

The first is simply to continue to delay the rollout indefinitely. That way, it doesn’t reignite all the civil liberties objections by activating the feature, nor does it anger child protection groups by announcing a U-turn. Any time it is quizzed, it can simply say that it continues to work on developing additional safeguards, and hope that people eventually get bored with asking.

I do think that could work for some considerable time – but not indefinitely. At some point, child protection groups are going to stand up and demand to know when the system is launching. Apple could not possibly get to iOS 16, for example, without either launching the feature or abandoning its plans, and it’s unlikely it would get away with stalling for that long.

The second, and better, route would be to announce a Facebook-style independent oversight board. The job of that board would be to verify the contents of every CSAM database Apple uses around the world. The smart move would be for Apple to invite onto that board the most vocal critics of its CSAM plans, such as cybersecurity academic Matthew Green.

That way, we wouldn’t need to take Apple’s word that it is not using compromised databases; we would have independent security experts saying so. If those independent experts include the people who have been most skeptical about Apple’s safeguards, then we could be extremely confident in their findings.

The first job of that oversight board would be to review Apple’s existing safeguards, and to make recommendations for additional ones. Like Facebook, Apple would commit to following the rulings of the board.

I think this could resolve the issue to everyone’s satisfaction – and if it doesn’t, Apple can at least insulate itself by pointing out that it is simply doing what the independent oversight board has asked of it.

What’s your view? Do you think this could be a good way for Apple to dig itself out of its current CSAM no-win situation? Please take our poll, and share your thoughts in the comments.

https://poll.fm/10923819 

