The UK’s National Health Service has announced plans to adopt the Apple-Google coronavirus API jointly announced last week, but in a weakened form.
Even without the UK’s planned compromise, the idea of this type of contact tracing has come under criticism from a University of Cambridge computing professor …
Apple-Google coronavirus API: UK plan
The API is designed to allow apps to detect when you have been in close proximity to someone who subsequently tested positive for COVID-19, but a severe lack of testing in the UK means that the NHS version will adopt a weak compromise.
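For context, the Apple-Google design works roughly like this: each phone broadcasts frequently rotating random identifiers over Bluetooth and records the identifiers it hears nearby; when a user reports a positive test, identifiers derived from their recent keys are published, and every phone checks locally whether anything it recorded matches. A simplified Python sketch of that matching step — all function names here are hypothetical, and the real protocol derives identifiers from daily keys rather than publishing them directly:

```python
import secrets

def new_rolling_id() -> bytes:
    """A fresh random identifier; rotated every few minutes in the real protocol."""
    return secrets.token_bytes(16)

def exposure_detected(heard_ids: set, published_positive_ids: set) -> bool:
    """Matching happens on the device itself — identities never leave the phone."""
    return bool(heard_ids & published_positive_ids)
```

The key privacy property is in the second function: the server only ever sees the anonymous identifiers of people who chose to report, and the comparison against your own contact history runs locally.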
In the UK, COVID-19 tests are only administered to those in a sufficiently serious condition to be admitted to hospital – as well as politicians and celebrities, who appear able to get tested with only mild symptoms.
The BBC reports that the NHS app will therefore allow people to self-diagnose.
At present, the idea is that people who have self-diagnosed as having coronavirus will be able to declare their status in the app.
The software will then send the equivalent of a yellow alert to any other users they have recently been close to for an extended period of time.
If a medical test confirms that the original user is indeed infected, then a stronger warning – effectively a red alert – will be sent instead, signalling that the other users should go into quarantine.
To report testing positive, the user would have to enter a verification code, which they would have received alongside their COVID-19 status.
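The two-tier escalation described above can be sketched in a few lines of Python. This is an illustrative sketch of the reported design, not the NHS implementation; the names (`AlertLevel`, `report_status`, the code format) are all hypothetical:

```python
from enum import Enum
from typing import Optional

class AlertLevel(Enum):
    YELLOW = "self-diagnosed contact: be cautious"
    RED = "confirmed contact: self-isolate"

def report_status(self_diagnosed: bool,
                  verification_code: Optional[str],
                  valid_codes: set) -> Optional[AlertLevel]:
    """Decide which alert a user's recent contacts should receive.

    A bare self-diagnosis triggers only the weaker yellow alert; a
    positive test, proven by the verification code issued with the
    result, escalates it to red.
    """
    if verification_code is not None:
        if verification_code not in valid_codes:
            return None  # reject forged reports
        return AlertLevel.RED
    if self_diagnosed:
        return AlertLevel.YELLOW
    return None
```

The verification code is what stops an unverified self-report from ever producing the stronger red alert — which matters for the trolling concern raised below.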
This will result in both under- and over-reporting. Under-reporting because many of those infected can be either asymptomatic, or have such mild symptoms that they do not suspect the coronavirus. Over-reporting because online and telephone diagnoses are erring on the side of caution, and telling anyone with possible symptoms that they are likely to have it, in order to ensure maximum compliance with self-isolation regimes.
Criticisms of contact tracing apps
The University of Cambridge’s Professor Ross Anderson has published a blog post outlining what he believes are seven problems with the approach.
Some of the criticisms don’t relate directly to the API. For example, he says it would be wrong for an app to promise anonymity for what is a ‘notifiable disease.’ This means that while the app won’t reveal your identity, health officials will still ask you about your contacts, and will call those people.
Another generic criticism is the possibility of trolling.
Anyone who’s worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling. The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.
But he argues that there are also inherent weaknesses in a system which relies on being within Bluetooth range.
On Friday, when I was coming back from walking the dogs, I stopped to chat for ten minutes to a neighbour. She stood halfway between her gate and her front door, so we were about 3 metres apart, and the wind was blowing from the side. The risk that either of us would infect the other was negligible. If we’d been carrying bluetooth apps, we’d have been flagged as mutual contacts […]
Bluetooth also goes through plasterboard. If undergraduates return to Cambridge in October, I assume there will still be small-group teaching, but with protocols for distancing, self-isolation and quarantine. A supervisor might sit in a teaching room with two or three students, all more than 2m apart and maybe wearing masks, and the window open. The bluetooth app will flag up not just the others in the room but people in the next room too.
It’s also clear from my own conversations with non-tech friends that there is a huge public education job to be done here. Non-techies don’t understand how these apps work, and are unlikely to trust them with their privacy unless the protections built into the Apple-Google coronavirus API are very carefully explained in non-technical terms.
A poll we ran prior to the Apple/Google announcement, proposing the idea, found that many people trusted the two tech giants more than they did their own government.