
German parliament pens letter to Tim Cook with concerns over CSAM detection system

Since Apple presented the new CSAM scan feature to protect children, the announcement has generated a lot of concerns and controversies about users’ privacy. Now, the Digital Agenda committee chief of the German parliament wants Apple to reconsider its CSAM plans in a letter to Tim Cook.

As reported by Heise Online, Manuel Hoferlin, the Digital Agenda committee chairman, thinks Apple is heading down a “dangerous path” and undermining “safe and confidential communication.”

Although the CSAM scan will only be available in the US at launch, Hoferlin says it would be “the largest surveillance instrument in history” and could cost Apple access to large markets if the company sticks with this strategy.

Apple, on the other hand, has tried to explain that the CSAM scan will not analyze every photo on users’ iPhones. Beyond that, the company announced last week that the system can be audited by third parties.

Apple explained that it will publish a Knowledge Base article with the root hash of the encrypted CSAM hash database. Apple will also allow users to inspect the root hash of the database on their device and compare it against the value in the Knowledge Base article.
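Apple has not published the exact hashing scheme it uses, but the verification idea can be sketched as computing a digest of the on-device database blob and comparing it against the published value. The function names below are hypothetical, and SHA-256 here is an illustrative stand-in for whatever root-hash construction Apple actually uses:

```python
import hashlib

def root_hash(db_bytes: bytes) -> str:
    """Compute a digest of the encrypted database blob (illustrative only)."""
    return hashlib.sha256(db_bytes).hexdigest()

def matches_published(db_bytes: bytes, published_hex: str) -> bool:
    """True if the on-device database matches the published root hash."""
    return root_hash(db_bytes) == published_hex

# Example: any change to the database produces a different root hash,
# so a user-visible comparison would catch a swapped-in database.
blob = b"encrypted CSAM hash database (placeholder)"
published = root_hash(blob)  # stand-in for the Knowledge Base value
print(matches_published(blob, published))            # same blob -> True
print(matches_published(b"tampered blob", published))  # altered blob -> False
```

The point of publishing a single root hash is that users (and auditors) can verify the whole database with one comparison, without Apple revealing the database contents.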


Apart from that, the company continues to offer clarity around the CSAM detection feature. In addition to the detailed frequently asked questions document published last week, Apple also confirmed that CSAM detection applies only to photos stored in iCloud Photos, not videos.

Apple also reinforced that if a user does not use iCloud Photos, no part of the CSAM detection process runs. This means users who want to opt out of CSAM detection can do so by disabling iCloud Photos.

Apple believes its on-device implementation of CSAM detection is far better than the server-side implementations used by other companies. Those implementations, Apple explains, require a company to scan every single photo a user stores on its servers, the vast majority of which are not CSAM.
