
Apple says any expansion of CSAM detection outside of the US will occur on a per-country basis

Apple’s new feature for detection of Child Sexual Abuse Material (CSAM) content in iCloud Photos will launch first in the United States, as 9to5Mac reported yesterday. Apple confirmed today, however, that any expansion outside of the United States will occur on a country-by-country basis depending on local laws and regulations.

As 9to5Mac reported yesterday, the new feature will allow Apple to detect known CSAM images when they are stored in iCloud Photos. The feature has faced considerable pushback from security researchers and privacy advocates, but Apple maintains that it is necessary to protect children and that everything is being done with privacy in mind.
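At a high level, the system matches hashes of photos against a database of hashes of known CSAM, rather than trying to interpret new images. The sketch below is only a minimal illustration of that hash-matching idea and not Apple's actual implementation: Apple uses its NeuralHash perceptual hash and an on-device private set intersection protocol, neither of which is a public API, so a plain SHA-256 digest stands in here, and the known-hash set and function names are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. A SHA-256 digest stands in for Apple's
// NeuralHash perceptual hash, and this "known hash" set is a placeholder.
let knownHashes: Set<String> = [
    "placeholder-known-digest-1",
    "placeholder-known-digest-2",
]

/// Returns a hex digest of the image bytes (stand-in for a perceptual hash).
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Checks whether a photo's digest appears in the known-hash database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}

// Usage: only photos whose digests match the database would be flagged.
let photoData = Data("sample image bytes".utf8)
print(matchesKnownImage(photoData)) // prints "false" for this placeholder data
```

Because matching is done against a fixed list of known images, the scope of what the system can flag is determined by whoever controls that hash database, which is exactly the concern raised by critics below.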

While one of the most common concerns has been about what might happen if other governments try to take advantage of this system for other purposes, the feature will only be available in the United States at launch. The concerns have come from a variety of notable sources, including Edward Snowden and the Electronic Frontier Foundation.

The EFF wrote:

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

Apple confirmed to 9to5Mac today that any expansion of the CSAM detection feature outside of the United States will take place on a country-by-country basis depending on local laws and regulations. The company did not provide a specific timetable on when, or if, it will expand CSAM detection to additional countries.

Apple traditionally launches features first in the United States because the US is the company’s largest market and the one where it is most familiar with local laws and regulations. That is again the case with the new CSAM detection system.

Apple’s implementation of this CSAM detection feature is highly technical, and more details can be found at the links below.


