The latest Apple/FBI war of words in the Pensacola case has once again highlighted the huge challenge Apple has in communicating the reality of the debate, in a world in which most people have no understanding of the core issue.
To a non-technical person, the debate appears to be a moral one. The FBI says that it needs access to data from terrorists and criminals, and Apple wants to prevent this. FBI, good; Apple, bad.
To anyone who understands the technology, the debate is very different …
The debate is this:
Do you want secure iPhones, which will sometimes mean that law enforcement agencies cannot access all the data they want (but will usually be able to access most of it)? Or do you want iPhones to contain a deliberate insecurity which would allow them to be accessed by the FBI – but which will inevitably be discovered and used by criminals and foreign states?
In simpler terms, do you want your iPhone to be secure or not?
That is the entire Apple/FBI debate.
How can Apple successfully communicate the reality of the debate in terms anyone can understand? Just as I did for the Apple/Google contact tracing API, here is my attempt at it.
Apple already helps the FBI
Every time the FBI has a court order allowing it to access personal data belonging to a suspected terrorist or criminal, Apple offers all of the assistance it can.
It will hand over a copy of the iCloud backup of the iPhone. This doesn’t contain all of the data stored on the phone, but it does have most of it – including the Notes that enabled the FBI to link the Pensacola shooter to Al-Qaeda (top photo).
Apple will also provide account information, transactional data, and so on. In short, Apple will – when given legal authority – hand over all of the information it possesses about an iPhone and its owner.
For most iPhones, the FBI doesn’t need Apple’s help
The FBI can break into most iPhones without Apple’s help.
Apple is engaged in a constant game of cat-and-mouse with hackers – some of them state entities operating with essentially unlimited resources. The hackers try to find vulnerabilities in iOS, and Apple tries to block them.
Keeping pace requires Apple to regularly update iOS with new security patches. Sometimes hardware vulnerabilities are discovered, which means that hackers can get access even when the iPhones are running the latest version of iOS. At any given moment, there will be vulnerabilities Apple doesn’t yet know about and so can’t yet fix.
Many of the vulnerabilities discovered are sold to companies like Zerodium, which will, in turn, make them available to anyone willing to pay for them – including the FBI.
This was true in both high-profile cases
In both cases – San Bernardino and Pensacola – Apple handed over all of the data it had. And in both cases, the FBI was able to use commercial services to crack the phones to get the rest of the data.
Apple cannot just unlock phones for the FBI
There is no magic way to unlock an iPhone. The safeguards Apple builds into iOS are designed to stop anyone unlocking an iPhone without the owner’s permission. Apple is no more able to unlock your iPhone than I am.
Why not provide a backdoor?
It is impossible to provide a backdoor that can be used only by the good guys. Any weakness Apple builds into iOS for use by the FBI will inevitably be discovered by others, and used by the bad guys.
An iPhone can be secure against everyone, or secure against no-one; there is no in-between option here.
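To see why, consider a deliberately simplistic sketch in Python. This is nothing like real iOS code – every name here is hypothetical – but it illustrates the structural problem with any built-in master key: the key has to ship inside every device, so extracting it from one device compromises them all.

```python
# Toy illustration only (NOT real iOS code): why a backdoor that works
# for "the good guys" necessarily works for everyone.

FIRMWARE_BACKDOOR_KEY = "hypothetical-master-key"  # the deliberate weakness

def unlock(device_passcode: str, attempt: str) -> bool:
    """Unlock if the attempt matches the owner's passcode
    OR the master key baked into the firmware."""
    return attempt == device_passcode or attempt == FIRMWARE_BACKDOOR_KEY

# The owner unlocks normally:
assert unlock("owner-secret", "owner-secret")

# Law enforcement unlocks with the master key:
assert unlock("owner-secret", FIRMWARE_BACKDOOR_KEY)

# But the key ships inside every phone. Anyone who reverse engineers
# one firmware image recovers it and can unlock *every* phone:
extracted_by_hacker = FIRMWARE_BACKDOOR_KEY  # e.g. pulled from a firmware dump
assert unlock("someone-else-entirely", extracted_by_hacker)
```

The sketch glosses over every real-world detail, but the logic holds regardless of the cryptography involved: a secret that must exist on a billion devices is not a secret that can be kept from well-resourced attackers.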
Apple/FBI debate in summary
Your iPhone can be secure, or insecure. The FBI wants it to be insecure. Apple wants it to be secure. What do you want?
That, then, is my attempt to explain the debate in terms normal people can understand. Perhaps you could test it on some of your non-technical friends and let me know whether it does the job?