I’m a passionate privacy advocate who’s written a lot about the topic, because it’s a massively important issue.
It’s important for two reasons: First, because the kind of technology we have available to us today poses privacy risks never before imagined. China, for example, has demonstrated the ability of its massive network of facial-recognition cameras to track one individual among millions as they travel from one side of a city to the other.
Second, because once you allow something to happen, it is very, very hard to roll it back. Crises are particularly dangerous in this respect, because it’s easier to justify extreme measures at extreme times – like the coronavirus – but once a government goes down a particular road, it’s vanishingly rare that they abandon the approach once the crisis has passed…
That’s why I’ve consistently and repeatedly argued in favor of privacy protections even when an exception might, to some, appear to be justified.
I’m a strong advocate of end-to-end encryption, for example.
You want to ban civilian use of end-to-end encrypted messaging, you say? Think about the impact on journalism. Think about the massive criminal opportunities you would be creating for identity theft and other forms of fraud. Above all, please think about the fact that you are telling your citizens they are no longer entitled to have private conversations using any electronic means, nor to privately share their photos with their partner, friends, or family. Think about what kind of regime wants that.
Both before and after the San Bernardino case, I argued that Apple should resist any government calls to compromise the privacy protections built into iOS. In particular, I’ve cautioned against the slippery slope of a knee-jerk response to an emergency.
Again, however, we need to look beyond what is being asked for in the short term to what is likely to follow. In this particular case, the FBI wants Apple to unlock the phone. (Technically, Apple would remove the safeguards and the FBI would unlock the phone, but that’s a semantic argument.) Apple continues to hold the key.
But it is an extremely short distance from there to arguing that there will be some very time-critical cases where the delay involved in knocking on Apple’s door is too damaging. The classic ‘time-bomber in custody’ scenario. That the FBI needs to hold the key to prevent delay. It still wouldn’t do so without a court order, so where’s the harm? It would simply be cutting out the middleman.
So soon, the FBI would hold the key. Then other law enforcement agencies. In time, that key would be held in every police precinct house. We would then be trusting more than a million people with access to that key to abide by the rules. Government agencies don’t always have the best of track records in doing that.
I could go on. The reason I’m stressing my stance on this is because I don’t want my position here to be misunderstood. I am categorically not arguing that the coronavirus crisis is a reason to sacrifice privacy.
I do not support contact tracing apps that collect personal information, track location, or store data on a central server, for example: I instead favor the Apple/Google approach, with its eight safeguards.
I also favor sanity. Right now, some privacy advocates seem to be adopting positions that defy sanity.
I see a lot of people saying that they would refuse to install contact tracing apps even if they use the Apple/Google API. That is, frankly, nuts. The API has no fewer than eight privacy safeguards:
- You choose whether or not to participate
- No personally identifiable data is used
- No location data is captured or stored
- No data goes to your government without your permission
- No one will know who infected them
- Only official government apps can access the data
- Apple and Google can disable the system at any time
- All of these claims are independently verifiable
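The design that makes those safeguards possible can be illustrated with a deliberately simplified sketch. To be clear, this is my own toy model of a decentralized exposure-notification scheme, not Apple and Google’s actual specification, and every function name in it is hypothetical; the real API uses its own key-derivation scheme. The point it demonstrates is the same, though: phones broadcast short-lived random identifiers, each phone stores only what it has heard, and matching happens entirely on-device, so no server ever learns who you are, where you were, or whom you met.

```python
import secrets
import hashlib

# Toy model (NOT the real Exposure Notification spec): each device
# generates a fresh random daily key, never tied to its owner's identity.
def daily_key() -> bytes:
    return secrets.token_bytes(16)

# A short-lived broadcast identifier derived from the daily key.
# Because it changes every interval, passers-by can't track a device over time.
def rolling_id(key: bytes, interval: int) -> bytes:
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

# Phone A broadcasts its rolling identifiers over Bluetooth;
# Phone B records what it hears. No location, no names, no server involved.
key_a = daily_key()
heard_by_b = {rolling_id(key_a, i) for i in range(10)}

# If A later tests positive, A consents to upload only its daily keys.
# B downloads the keys, re-derives the identifiers locally, and checks
# for overlap on-device. Only B ever learns that B was exposed.
uploaded_keys = [key_a]
exposed = any(
    rolling_id(k, i) in heard_by_b
    for k in uploaded_keys
    for i in range(10)
)
print(exposed)  # True: B was near A, determined without any server seeing either party
```

Notice what never leaves the device in this design: no name, no location, and no record of who infected whom. That is why the safeguards above are structural properties of the protocol, not just policy promises.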
I’m a privacy nut, yet I would happily install a contact tracing app using this API without any qualms at all.
We’re seeing some privacy advocates — like Human Rights Watch — argue that we shouldn’t be promoting “untested technology.” Or that, because it won’t be available to everyone, we should hesitate to make it available to anyone.
While protecting human life and public health is a paramount concern of policymakers everywhere, Human Rights Watch warned that governments and the private sector should not promote or use unproven and untested technology […]
Human Rights Watch also cautioned that over-reliance on mobile location tracking for COVID-19 responses could exclude marginalized groups who may not have reliable access to the internet and mobile technology, putting their health and livelihoods at risk.
And yesterday we had the absurdity of the German government expressing concerns that taking people’s temperature before they enter an Apple Store may violate privacy rules.
We need a sense of perspective here. No, crises don’t justify invasion of privacy. No, we shouldn’t give powers to a government for use during an emergency if we wouldn’t want them to be able to use those powers in normal times. No, we shouldn’t be tracking anyone’s location.
But we also shouldn’t be crying wolf over an API that poses no threat to anyone’s privacy — and we shouldn’t be stopping a retailer taking a simple, sensible step to make sure people aren’t feverish before letting them mix with staff and other customers.
Taking an extremist position not only makes it harder to implement completely sensible and innocuous responses to the pandemic, but also makes it harder for us to be taken seriously when making legitimate privacy-protection arguments.
That’s my view — what’s yours? Would you object even to the Apple/Google API, and to having your temperature taken when entering an Apple Store? Or do you agree that these are perfectly sensible measures that pose no threat to privacy? Please share your thoughts in the comments.