Since Apple announced its new CSAM detection feature to protect children, the move has generated considerable concern and controversy over user privacy. Now the chairman of the German parliament's Digital Agenda committee has asked Apple, in a letter to Tim Cook, to reconsider its CSAM plans.
As reported by Heise Online, Manuel Hoferlin, the Digital Agenda committee chairman, believes Apple is heading down a "dangerous path" and undermining "safe and confidential communication."
Although CSAM detection will only be available in the US at launch, Hoferlin says it would be "the largest surveillance instrument of history" and could cost Apple access to large markets if the company sticks with this strategy.
Apple, for its part, has tried to explain that CSAM detection does not analyze all of the photos on users' iPhones. The company also announced last week that the system can be audited by third parties.
Apple said it will publish a Knowledge Base article containing the root hash of the encrypted CSAM hash database. Users will be able to inspect the root hash of the database on their device and compare it against the value in the Knowledge Base article.
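To make that comparison concrete, here is a minimal Swift sketch of the kind of check being described, assuming a plain SHA-256 digest over a local database file. The file path, the placeholder constant, and the digest construction are all illustrative assumptions; Apple has not documented the actual on-device format or hashing scheme.

```swift
import Foundation
import CryptoKit

/// Root hash string as it might appear in the Knowledge Base article (hypothetical placeholder).
let publishedRootHash = "ROOT_HASH_FROM_KNOWLEDGE_BASE_ARTICLE"

/// Computes a hex-encoded SHA-256 digest of the database file.
/// (Assumption: the real system may use a different construction, e.g. a Merkle root.)
func rootHash(ofDatabaseAt url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical on-device location of the encrypted CSAM hash database.
let databaseURL = URL(fileURLWithPath: "/path/to/encrypted-csam-hash-database")

do {
    let localHash = try rootHash(ofDatabaseAt: databaseURL)
    print(localHash == publishedRootHash
        ? "Match: the on-device database is the one Apple published."
        : "Mismatch: the on-device database differs from the published one.")
} catch {
    print("Could not read the database: \(error)")
}
```

The point of such a check is that any substitution of the hash database, whether for a region-specific or expanded list, would change the root hash and be detectable by comparing against the published value.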
Beyond that, the company continues to clarify how the CSAM detection feature works. In addition to the detailed FAQ document published last week, Apple confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos.
Apple also reiterated that if a user does not use iCloud Photos, no part of the CSAM detection process runs. In other words, users who want to opt out of CSAM detection can do so by disabling iCloud Photos.
Apple argues that its on-device implementation of CSAM detection is far better than the server-side implementations used by other companies, which, as Apple explains, require scanning every photo a user stores on their servers, the vast majority of which are not CSAM.
Read more
- Opinion: The Apple CSAM scanning controversy was entirely predictable
- Apple announces new protections for child safety: iMessage features, iCloud Photo scanning, more
- Apple confirms CSAM detection only applies to photos, defends its method against other solutions
- Apple CSAM FAQ addresses misconceptions and concerns about photo scanning
- Apple Privacy exec details system to detect CSAM in new interview