Five days after Apple’s child protection measures were announced, there has been no let-up in the controversy surrounding the upcoming new features. Latest to comment is Facebook’s former security chief and now Stanford cybersecurity professor Alex Stamos.

Stamos says that there are no easy answers here, and calls for more nuanced discussion than the prevailing narratives that this is either a great move or an unacceptable one …

Stamos used a Twitter thread to make his points, starting with the need for a balanced understanding of the issues.

In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies. Nuanced opinions are ok on this.

He said those stating that the measures are completely unacceptable need a better understanding of the horrific nature and terrifying scale of the problem of online child sexual abuse.

First off, a lot of security/privacy people are verbally rolling their eyes at the invocation of child safety as a reason for these changes. Don’t do that. The scale of abuse that happens to kids online and the impact on those families is unfathomable.

Earlier this year I sat in a courtroom with a dozen teenage girls and women who were victimized on my watch. I heard them explain the years of psychological terror they experienced. I saw the self-cutting scars on their arms and legs. Don’t minimize their pain. Just don’t.

He said that Facebook – which scans uploaded photos – caught 4.5 million users posting known CSAM matches, and that this is likely only a fraction of the total number of offenders.

But he says that those dismissing privacy concerns are being “both harmful and unfair.”

I have friends at both the EFF and NCMEC, and I am disappointed with both NGOs at the moment. Their public/leaked statements leave very little room for conversation, and Apple’s public move has pushed them to advocate for their equities to the extreme.

He is also critical of Apple unilaterally announcing its own approach after declining to participate in discussions between child safety experts, tech platforms, and cybersecurity academics.

For the last couple of years, our team at @stanfordio has been hosting a series of conferences on how to balance the safety and privacy aspects of E2EE products. We have seen really productive conversations between advocates, platforms and academics.

Apple was invited but declined to participate in these discussions, and with this announcement they just busted into the balancing debate and pushed everybody into the furthest corners with no public consultation or debate.

He said this was frustrating and damaging.

I am both happy to see Apple finally take some responsibility for the impacts of their massive communication platform, and frustrated with the way they went about it. They both moved the ball forward technically while hurting the overall effort to find policy balance.

Apple yesterday published an FAQ intended to address some of the misconceptions and concerns about the photo scanning, but it failed to adequately address the biggest remaining issue: that Apple has no way to guard against future misuse by governments.

You can listen below to the first reactions of Stamos, two of his Stanford colleagues, and Johns Hopkins cryptographer Matthew Green.


