Update: As we suspected, nothing has changed. An Apple spokesperson told The Verge that the feature is still delayed, not cancelled.
Apple has quietly removed references to CSAM scanning from its website over the past few days.
The company’s child safety microsite previously described its plans to scan iPhones for Child Sexual Abuse Material, alongside the Communication Safety in Messages feature and warnings when someone searches for CSAM. However, the section on CSAM scanning has now been removed …
9to5Mac reader Florian Schimanke spotted the change.
I just found out that after releasing iOS 15.2 on Monday, Apple has removed the mention of its planned CSAM scans from its child safety website. Last Friday, the plans were still present on the website, as can be seen on the web archive: https://web.archive.org/web/20211210163051/https://www.apple.com/child-safety/.
Some are speculating that this means Apple has abandoned the plan and is simply hoping people forget all about it. However, while I did acknowledge this possibility back in September, I also explained why I considered it unlikely.
I can see two potential ways forward for Apple. The first is simply to continue to delay the rollout indefinitely. That way, it doesn’t reignite all the civil liberties objections by activating the feature, nor does it anger child protection groups by announcing a U-turn. Any time it is quizzed, it can simply say that it continues to work on developing additional safeguards, and hope that people eventually get bored with asking.
I do think that could work for some considerable time – but not indefinitely. At some point, child protection groups are going to stand up and demand to know when the system is launching. Apple could not possibly get to iOS 16, for example, without either launching the feature or abandoning its plans, and it’s unlikely it would get away with stalling for that long.
I therefore proposed a second approach:
The second, and better, route would be to announce a Facebook-style independent oversight board. The job of that board would be to verify the contents of every CSAM database used by Apple around the world. The smart thing would be for Apple to invite onto that board the most vocal critics of its CSAM plans, like cybersecurity academic Matthew Green.
Whether Apple takes my advice remains to be seen, but I don’t think that removal of the plan from the company’s website means that CSAM scanning is dead – only that the company wants more time to consider its options.
Photo: Kelly-Ann Tan/Unsplash