Apple quietly removes all references to CSAM scanning, but says nothing has changed [U]

Update: As we suspected, nothing has changed. An Apple spokesperson told The Verge that the feature is still delayed, not cancelled.

Apple has quietly removed all references to CSAM scanning from its website over the past few days.

The company’s child safety microsite previously described its plans for scanning iPhones for Child Sexual Abuse Material (CSAM), alongside the Communication Safety in Messages feature and warnings when someone searches for CSAM. However, the section on CSAM scanning has now been removed …

9to5Mac reader Florian Schimanke spotted the change.

I just found out that after releasing iOS 15.2 on Monday, Apple removed mention of its planned CSAM scans from its child safety website. Last Friday, the plans were still present on the website, as can be seen on the Web Archive: https://web.archive.org/web/20211210163051/https://www.apple.com/child-safety/.

Some are speculating that this means Apple has abandoned the plan, and is just going to hope people forget all about it. However, while I did acknowledge this possibility back in September, I also explained why I considered it unlikely.

I can see two potential ways forward for Apple. The first is simply to continue to delay the rollout indefinitely. That way, it doesn’t reignite all the civil liberties objections by activating the feature, nor does it anger child protection groups by announcing a U-turn. Any time it is quizzed, it can simply say that it continues to work on developing additional safeguards, and hope that people eventually get bored with asking.

I do think that could work for some considerable time – but not indefinitely. At some point, child protection groups are going to stand up and demand to know when the system is launching. Apple could not possibly get to iOS 16, for example, without either launching the feature or abandoning its plans, and it’s unlikely it would get away with stalling for that long.

I therefore proposed a second approach:

The second, and better, route would be to announce a Facebook-style independent oversight board. The job of that board would be to verify the contents of every CSAM database used by Apple around the world. The smart thing would be for Apple to invite onto that board the most vocal critics of its CSAM scanning plans, like cybersecurity academic Matthew Green.

Whether Apple takes my advice remains to be seen, but I don’t think that removal of the plan from the company’s website means that CSAM scanning is dead – only that the company wants more time to consider its options.

Photo: Kelly-Ann Tan/Unsplash

Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!

