9to5Mac
By Chance Miller
August 6, 2021
As 9to5Mac reported yesterday, the new feature will allow Apple to detect known CSAM images when they are stored in iCloud Photos. The feature has faced considerable pushback from privacy advocates and security researchers, but Apple maintains that the system is necessary to protect children and that it has been designed with user privacy in mind.
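Conceptually, detecting "known" images comes down to comparing a fingerprint of each photo against a database of fingerprints of previously identified CSAM, rather than analyzing what a photo depicts. The following Swift sketch is a deliberately simplified, hypothetical illustration of that matching step; it is not Apple's implementation, which uses a perceptual hash called NeuralHash along with cryptographic techniques such as private set intersection and a threshold before any match is revealed. All names and hashes below are made up for illustration.

```swift
import Foundation

// Hypothetical stand-in for a perceptual fingerprint of an image.
typealias ImageHash = String

// Hashes of known CSAM images, supplied by child-safety organizations.
// These placeholder values are invented for this sketch.
let knownHashes: Set<ImageHash> = ["a3f1c9...", "9bc204..."]

// Placeholder hash function. A real perceptual hash is derived from the
// image's visual content, so that the same picture produces the same
// fingerprint even after resizing or re-encoding; this toy version does not.
func perceptualHash(of imageData: Data) -> ImageHash {
    String(imageData.hashValue)
}

// Returns true when an image's fingerprint matches a known entry.
// Apple's actual design additionally requires a threshold number of
// matches on an account before anything can be flagged for review.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}
```

The key design point is that only fingerprints of already-known images can match; the system described by Apple does not classify or interpret new photo content.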
The EFF wrote: "All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change."
Apple confirmed to 9to5Mac today that any expansion of the CSAM detection feature outside of the United States will take place on a country-by-country basis depending on local laws and regulations. The company did not provide a specific timeline for when, or whether, it will expand CSAM detection to additional countries.
Apple traditionally launches new features first in the United States, which is both the company’s largest market and the one whose laws and regulations it knows best. That is again the case with the new CSAM detection system.
Apple’s implementation of this CSAM detection feature is highly technical; the company has published a detailed technical summary for those who want to dig deeper.