More than 90 civil rights groups around the world have signed an open letter objecting to what they call iPhone surveillance capabilities, asking Apple to abandon its plans for CSAM scanning.
They would also like the iPhone maker to drop its plans for iMessage nude detection, as this could place LGBTQ+ youths at risk.
Signatories to the letter include the American Civil Liberties Union (ACLU), the Canadian Civil Liberties Association, Australia’s Digital Rights Watch, the UK’s Liberty, and the global Privacy International.
The letter highlights the primary risk many have raised: misuse by repressive governments.
Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable.
Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them.
And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.
The letter also argues that the separate scanning of children’s iMessage accounts for nudes, another form of iPhone surveillance, could itself put children at risk.
The system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.
The letter says the organizations respect Apple’s intent, but argues that the company should stand by its privacy values.
We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.
The open letter follows a similar one sent to Apple by the German parliament a couple of days ago.
Via Reuters