
In internal memo, Apple addresses concerns around new photo scanning features, doubles down on the need to protect children

Apple yesterday officially announced a range of new features coming later this year, dubbed Expanded Protections for Children. The new features include protections for sensitive images in iMessage, iCloud Photos scanning for child sexual abuse material (CSAM), and expanded guidance in Siri and Search.

In an internal memo distributed to the teams that worked on this project and obtained by 9to5Mac, Apple acknowledges the “misunderstandings” around the new features, but doubles down on its belief that these features are part of an “important mission” for keeping children safe.

Apple has faced a significant amount of pushback over these features, including from notable voices such as Edward Snowden and the Electronic Frontier Foundation. The criticism centers primarily on Apple’s plan to scan iCloud Photos for matches against a database of known CSAM and the potential implications of such a feature.
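At a high level, that matching works by comparing a fingerprint of each photo against a list of fingerprints derived from known CSAM. The snippet below is a deliberately simplified Swift sketch of that idea, using a plain SHA-256 digest and a placeholder set of known fingerprints; Apple’s announced system relies on an on-device perceptual hash and additional cryptographic safeguards, so treat this as a toy illustration of database matching, not Apple’s implementation.

import Foundation
import CryptoKit

// Placeholder set of known fingerprints (dummy value, for illustration only).
let knownFingerprints: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Computes a SHA-256 hex digest of a file's bytes.
// (A byte-level digest, unlike the perceptual hash Apple describes.)
func fingerprint(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// A photo is "flagged" only if its fingerprint appears in the known database.
func matchesKnownDatabase(_ url: URL) throws -> Bool {
    return try knownFingerprints.contains(fingerprint(of: url))
}

An exact byte-level digest like this would miss even a re-saved copy of the same image, which is part of why Apple describes using a perceptual hash and requiring a threshold of matches before anything is surfaced for review.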

The memo, distributed internally late last night, was written by Sebastien Marineau-Mes, a software vice president at Apple. Marineau-Mes says that Apple will continue to “explain and detail the features” included in the Expanded Protections for Children suite.

Marineau-Mes writes that while Apple has seen “many positive responses” to these new features, it is aware that “some people have misunderstandings” about how the features will work and that “more than a few are worried about the implications.” Nonetheless, he reiterates Apple’s belief that these features are necessary to “protect children” while maintaining the company’s “deep commitment to user privacy.”

Here is the memo in full:

Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.

Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy. 

We’ve seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built. And while a lot of hard work lays ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.

I am proud to work at Apple with such an amazing team. Thank you!

The memo also includes a message from the National Center for Missing and Exploited Children, signed by Marita Rodriguez, executive director of strategic partnerships. Apple is working closely with NCMEC on the new iCloud scanning features.

Here is the full text of the note from NCMEC sent to the team at Apple working on these features:

Team Apple,

I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.

It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.

I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

Our voices will be louder.

Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.

During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.

Thank you for finding a path forward for child protection while preserving privacy.

What do you think of Apple’s announcements around expanded protections for child safety? Let us know down in the comments!

