
Apple faces renewed pressure to protect child safety: ‘Child sexual abuse is stored on iCloud. Apple allows it.’

Two years ago, Apple announced a number of new child safety features, including a system that would use on-device processing to scan for child sexual abuse material (CSAM). Despite the privacy-focused design of the feature, Apple faced enough backlash that it ultimately abandoned its plans.

Now, Apple faces renewed pressure from advocacy groups and activist investors to take stronger action against CSAM.

As first reported by Wired, the child safety advocacy group Heat Initiative is launching a multi-million-dollar campaign pressing Apple on the issue. We reported this morning on Apple’s response to the campaign, in which the company acknowledged the precedent that a CSAM detection feature could set.

“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” Erik Neuenschwander, Apple’s director of user privacy and child safety, said in an interview with Wired.

Heat Initiative’s full campaign has now officially launched. On the campaign’s website, the group takes an aggressive stance against Apple, with language such as: “Child sexual abuse material is stored on iCloud. Apple allows it.”

The campaign explains:

Apple’s landmark announcement to detect child sexual abuse images and videos in 2021 was silently rolled back, impacting the lives of children worldwide. With every day that passes, there are kids suffering because of this inaction, which is why we’re calling on Apple to deliver on their commitment.

The advocacy group says that it is calling on Apple to “detect, report, and remove child sexual abuse images and videos from iCloud.” It also wants the company to “create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple.”

The campaign’s website also includes several “Case Studies” that graphically detail instances in which iCloud was used to store sexual abuse photos and videos. The site also includes a button to “Email Apple leadership directly.” This button opens an email form for a mass email sent to Apple’s entire executive team.

Heat Initiative has also sent a letter addressed to Tim Cook in which the group says Apple’s inaction puts “children in harm’s way.”

In our recent research, we have come across hundreds of cases of child sexual abuse that have been documented and spread specifically on Apple devices and stored in iCloud. Had Apple been detecting these images and videos, many of these children would have been removed from their abusive situations far sooner.

That is why the day you make the choice to start detecting such harmful content, children will be identified and will no longer have to endure sexual abuse. Waiting continues to put children in harm’s way, and prevents survivors, or those with lived experience, from healing.

Shareholder pressure

In addition to the pressure from Heat Initiative’s advertising campaign, Apple will also soon face pressure from investors on this matter. 9to5Mac has learned that Christian Brothers Investment Services plans to file a shareholder resolution calling on the company to improve CSAM detection.


Christian Brothers Investment Services describes itself as a “Catholic, socially responsible investment management firm.” The proposal is also believed to be tied to Heat Initiative’s advertising campaign. Meanwhile, the New York Times reports that Degroof Petercam, a Belgian investment firm, will back the resolution as well.

As we’ve explained in the past, this puts Apple between a rock and a hard place. Privacy advocates view the company’s initial implementation of CSAM detection as a dangerous precedent. Child safety advocates, meanwhile, say the company isn’t doing enough.

While Apple abandoned its plans to detect known CSAM images stored in iCloud, the company has implemented a number of other child safety features.

