While discussing the FBI case with a friend yesterday, one distinction struck me as worth addressing: the difference between a backdoor into iPhones – which is what law enforcement agencies have so far been calling for – and what we might term a master key, which is what the FBI is seeking in this particular case.

Law enforcement agencies have so far been calling for Apple to abandon its use of strong encryption. Technically, they want Apple to build a backdoor into that encryption for use by law enforcement, but that amounts to the same thing: strong encryption with a built-in flaw is not strong encryption. It’s only a matter of time before hackers find and exploit it.

What the FBI is asking for in the San Bernardino case is quite different. Instead of asking Apple to weaken the encryption itself, it wants Apple to weaken the lock guarding access to the phone by removing the auto-wipe and time-delay functions. That would leave the phone vulnerable to a brute-force attack on its passcode.
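The arithmetic behind that vulnerability is straightforward. Here is a rough sketch – the ~80 ms per-attempt figure is the key-derivation cost Apple has described in its security documentation, and the rest is illustrative assumption – showing why, once the escalating delays and the ten-attempt auto-wipe are gone, exhausting every passcode becomes practical:

```python
# Back-of-envelope estimate: time to brute-force an iPhone passcode
# once the time-delay and auto-wipe safeguards are removed.
# Assumptions (illustrative):
#   - ~80 ms per attempt, the hardware-bound key-derivation cost
#     Apple has cited; an attacker cannot speed this part up
#   - every combination must be tried in the worst case

ATTEMPT_COST_S = 0.08  # assumed per-try cost in seconds

def worst_case_hours(combinations: int, per_try_s: float = ATTEMPT_COST_S) -> float:
    """Hours to exhaust every passcode with no delays or wipe in the way."""
    return combinations * per_try_s / 3600

for digits in (4, 6):
    hours = worst_case_hours(10 ** digits)
    print(f"{digits}-digit passcode: {hours:.2f} hours worst case")
```

A four-digit code falls in well under an hour; even a six-digit code takes only about a day. With the safeguards in place, by contrast, an attacker gets roughly ten tries before the data is wiped – which is the whole point of them.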

And, the FBI assures us, it isn’t asking Apple to do this for all iPhones, just this one specific device. It’s a very different scenario, and one that sounds superficially harmless …

Again, I emphasize that I fully sympathize with the FBI’s intentions in the San Bernardino case. If there are other terrorists out there waiting to launch similar attacks, we want to know about them. If this iPhone could lead to those possible terrorists, it doesn’t seem unreasonable to agree to this one request.

The problem, however, is that it is never ‘just this once.’ Any time we give up one of our civil liberties in a good cause, the legal precedent is set. There will be a second exception, and a third and … Effectively, if we permit just a single exception to a Constitutional right, we give up that right forever.

Some are arguing that there are sufficient safeguards in this case such that we don’t need to worry. Only Apple would hold the key, and it would only use it when presented with a court order. The FBI and other agencies would have no ability to carry out warrantless searches, and it would likely only seek court orders in the most serious of cases.

Again, however, we need to look beyond what is being asked for in the short term to what is likely to follow. In this particular case, the FBI wants Apple to unlock the phone. (Technically, Apple would remove the safeguards and the FBI would unlock the phone, but that’s a semantic distinction.) Apple continues to hold the key.

But it is an extremely short distance from there to arguing that there will be some very time-critical cases where the delay involved in knocking on Apple’s door is too damaging – the classic ticking-time-bomb scenario – and that the FBI therefore needs to hold the key itself to prevent delay. It still wouldn’t use it without a court order, so where’s the harm? It would simply be cutting out the middleman.

So soon, the FBI would hold the key. Then other law enforcement agencies. In time, that key would be held in every police precinct house. We would then be trusting more than a million people with access to that key to abide by the rules. Government agencies don’t always have the best of track records in doing that.

OK, you could argue, but where’s the harm? This is the ‘nothing to hide’ argument: if we’re all law-abiding people, why should we fear the government snooping in our phones? As I’ve argued before, however, that’s a silly argument. If you take that line, then you could equally argue that everyone should be fitted with a GPS chip embedded in their skin to track their movements, and that we should have blanket CCTV coverage on every street and in every building.

And lots of people have perfectly legitimate things to hide, from a partner sending intimate photos to cheer up a soldier serving overseas to journalists with contact details for confidential sources.

But even if we would trust our government with that much power, it isn’t just one government we have to consider. The USA almost certainly has agreements with friendly governments to share certain technologies, and the iPhone master key could well join the list. History shows that a country considered an ally today may well be an enemy tomorrow.

Even if you were prepared to risk that, there is no getting away from the fact that terrorists may be evil, but they are generally not stupid. In the San Bernardino case, it appears the shooters destroyed their own phones and hard drives, and the FBI is somehow hoping they might still have left incriminating evidence on a work phone. Do we doubt that they had the wit to use a strong passcode too? Terrorists and major criminals use burner phones to plan their attacks, not their own iPhones, registered in their own names and using their own Apple IDs.

So the arguments I made before this all happened haven’t changed. We would still be asked to sacrifice our right to privacy. We would still have no control over who ends up with the ability to access our devices. And it would still achieve nothing worthwhile.


About the Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!
