
Report: Apple to announce client-side photo hashing system to detect child abuse images in users’ photo libraries

Update: This has now been officially announced. Notably, your phone will only scan photos that are uploaded to iCloud, in line with the policies of all major social networks and web services. (Original story below for context.)

Apple is reportedly set to announce new photo identification features that will use hashing algorithms to match the content of photos in users’ photo libraries with known child abuse materials, such as child pornography.

Apple’s matching will happen on the client — on the user’s device — in the name of privacy: the iPhone would download a set of fingerprints representing illegal content and then check each photo in the user’s camera roll against that list. Presumably, any matches would then be reported for human review.
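For illustration, here is a minimal Swift sketch of that kind of on-device matching, assuming a set of known fingerprints has already been downloaded to the device. The `FingerprintMatcher` type is hypothetical, and the use of an exact SHA-256 digest is purely illustrative; real systems use perceptual hashes that survive resizing and re-encoding, whereas an exact hash breaks on any pixel change.

```swift
import Foundation
import CryptoKit

// A minimal sketch of client-side fingerprint matching, assuming a
// downloaded set of known-bad hashes. SHA-256 is illustrative only:
// production systems use perceptual hashes robust to re-encoding.
struct FingerprintMatcher {
    let knownHashes: Set<Data>  // fingerprints downloaded to the device

    // Returns true if the photo's digest appears in the known set.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}

// Hypothetical usage: flag camera-roll photos before upload.
// let matcher = FingerprintMatcher(knownHashes: downloadedSet)
// let flagged = cameraRollPhotos.filter { matcher.matches(photoData: $0) }
```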

Apple has previously said it employs hashing techniques as photos are uploaded to iCloud; this new system would move that check onto the device itself. Apple has yet to officially announce the initiative, and the details will matter.

At a high level, this kind of system is similar to the machine learning features for object and scene identification already present in Apple Photos. Analysis happens on-device, and users can take advantage of better search functionality.

However, cryptography and security expert Matthew Green notes that the implications of such a rollout are complicated. Hashing algorithms are not foolproof and may turn up false positives. And if Apple allows governments to control the fingerprint database, they could perhaps use the system to detect images of things other than clearly illegal child content, for instance, to suppress political activism.
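To see where false positives come from, here is a toy Swift example of perceptual-hash comparison by Hamming distance. The 64-bit hash values and the threshold are hypothetical, not Apple’s actual parameters; the point is that the hash space is vastly smaller than image space, so unrelated images can occasionally land within the match threshold.

```swift
// A toy illustration of why perceptual hashing admits false positives,
// assuming 64-bit hashes compared by Hamming distance. All values are
// hypothetical.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount  // count of differing bits
}

let photoHash: UInt64 = 0b1011_0110_0010  // hypothetical device photo
let knownHash: UInt64 = 0b1011_0100_0010  // hypothetical database entry
let threshold = 3

// These two hashes differ in one bit, so they "match" even if the
// underlying images are visually unrelated.
let isMatch = hammingDistance(photoHash, knownHash) <= threshold  // true
```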

However, note that photos uploaded to iCloud Photos for backup and sync are not stored end-to-end encrypted anyway. Photos are stored in an encrypted form on Apple’s server farms, but the decryption keys are also held by Apple. This means that law enforcement agencies can subpoena Apple and see all of a user’s uploaded photos. (This is not unusual; all third-party photo services work this way.)

It is possible that in the future, Apple could roll out similar systems to scan content on the client side before it is stored on a server in an end-to-end encrypted manner. Many governments have campaigned for such a system from E2E private messaging apps like iMessage and WhatsApp, as they worry that the increasing shift to encrypted communications will make it harder for law enforcement to find and prosecute child abuse cases.

Green speculates that Apple wouldn’t have invested in developing this system if applying it to end-to-end encrypted content weren’t a long-term goal.
