See italicised updates below, with statements from both the Department of Justice and Apple.
The battle between the FBI and Apple over accessing a work phone used by one of the San Bernardino terrorists started as headline news and ended in a rather anti-climactic fashion.
The high-profile congressional hearing was due to be followed by a big showdown in court. Instead, the FBI asked that the hearing be vacated, and later quietly announced that it had, with help, managed to gain access to the phone. Nothing to see here, move along.
But while this particular case may be settled, it’s extremely unlikely that this will be the end of the matter – for two reasons …
First, CNN reported today that the method used works only on the particular model in question, an iPhone 5c. While we may or may not get to find out what that method was, the smart money seems to be on the approach described by Edward Snowden.
Apple protects iPhones against brute-force passcode attacks by limiting the number of attempts to ten. But the passcode attempt counter is stored in NAND flash memory. If you copy the contents of that memory, make your ten attempts and then copy it back again before the next round of attempts, you can repeat that process as many times as needed to reach the correct passcode. Provided only a 4- or 6-digit numeric code has been used, it’s a relatively trivial process.
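The process described above can be sketched as a short simulation. All names here are hypothetical; this models the logic of the attack, not real iPhone hardware:

```python
# A toy simulation of the NAND-mirroring attack described above.
# All names are hypothetical; this models the logic, not real hardware.

ATTEMPT_LIMIT = 10  # the phone refuses further guesses after ten failures

class PhoneNAND:
    """Minimal model of a phone whose attempt counter sits in ordinary
    NAND flash, where an attacker can copy and restore it at will."""

    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0  # lives in mirrorable flash

    def try_passcode(self, guess):
        if self.failed_attempts >= ATTEMPT_LIMIT:
            raise RuntimeError("device locked")
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        return False

def nand_mirror_attack(phone, digits=4):
    """Try every numeric code, writing the saved NAND image back
    whenever the counter hits the limit so it is never enforced."""
    for code in range(10 ** digits):
        if phone.failed_attempts >= ATTEMPT_LIMIT:
            phone.failed_attempts = 0  # restore the mirrored flash image
        if phone.try_passcode(f"{code:0{digits}d}"):
            return f"{code:0{digits}d}"
    return None
```

A 4-digit code allows only 10,000 possibilities, so at worst the attacker needs around a thousand copy-and-restore cycles – which is why the approach is relatively trivial.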
But the approach relies on the fact that the iPhone has no way of knowing that the counter has been reset. This is true of the iPhone 5c, but may not be true of later models. The reason? The Secure Enclave.
Nobody seems to know for sure at this stage, but it appears that the Secure Enclave may have the ability to log the number of passcode attempts in its own non-volatile storage – which would prevent that method from working on the iPhone 6 and up. From ArsTechnica:
Apple implies that the A7 processor—the first to include the “Secure Enclave” function—does have some form of non-volatile storage of its own. On the A6 processor and below, the time delay between PIN attempts resets every time the phone is rebooted. On the A7 and above, it does not; the Secure Enclave somehow remembers that there has been some number of bad PIN attempts earlier on. Apple also vaguely describes the Secure Enclave as having an “anti-replay counter” for data that is “saved to the file system.” It’s not impossible that this is also used to protect the effaceable storage in some way, allowing the phone to detect that it has been tampered with.
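If the speculation above is right, the reason a Secure Enclave counter defeats NAND mirroring is simple: the counter no longer lives in storage the attacker can rewrite. A self-contained sketch (again with hypothetical names) makes the point:

```python
# A toy model (hypothetical names) of a Secure Enclave that keeps the
# passcode-attempt counter in its own non-volatile storage.

ATTEMPT_LIMIT = 10

class SecureEnclave:
    """The counter is private to the enclave: rewriting the phone's
    NAND flash gives an attacker no way to reset it."""

    def __init__(self, passcode):
        self._passcode = passcode
        self._counter = 0  # enclave-internal NVRAM, not mirrorable

    def try_passcode(self, guess):
        if self._counter >= ATTEMPT_LIMIT:
            raise RuntimeError("device locked")
        if guess == self._passcode:
            self._counter = 0
            return True
        self._counter += 1
        return False

# Ten bad guesses exhaust the limit; with no external counter to
# restore, the eleventh attempt is simply refused.
enclave = SecureEnclave("4971")
for code in range(ATTEMPT_LIMIT):
    enclave.try_passcode(f"{code:04d}")

locked = False
try:
    enclave.try_passcode("0010")
except RuntimeError:
    locked = True
```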
So if the FBI or any other U.S. law enforcement agency wants to access a more recent iPhone, it may very well head straight back to court to make the same demand that Apple write a ‘GovtOS’ to bypass that protection.
I don’t expect that to happen anytime soon. The FBI clearly thought that, in a high-profile terrorism case, it would have overwhelming public support. That turned out not to be the case – in large part because it seems incredibly unlikely that there’s any incriminating data on the phone in question. Law enforcement agencies are likely to keep their heads down for a while, and ensure that the next case they bring is a far more compelling one. But it’s only a matter of time before that happens.
Update: The Department of Justice has confirmed my view in a statement to ArsTechnica:
“It remains a priority for the government to ensure that law enforcement can obtain crucial digital information to protect national security and public safety, either with cooperation from relevant parties, or through the court system when cooperation fails,” Melanie Newman, a Justice Department spokesman, wrote in an e-mail to Ars. “We will continue to pursue all available options for this mission, including seeking the cooperation of manufacturers and relying upon the creativity of both the public and private sectors.”
Apple responded by saying:
This case raised issues which deserve a national conversation about our civil liberties, and our collective security and privacy. Apple remains committed to participating in that discussion.
If the Secure Enclave does work in the way ArsTechnica speculates, I’m not clear whether Apple could even load new firmware onto an existing locked phone. If it can, you can be sure that’s a hole Apple will seek to plug next time around.
The second reason the matter is unlikely to end here is that a national debate has now begun, and Congress has taken an interest. Even if another court case isn’t brought, it’s almost certain that Congress will be pressed to pass legislation to address the issue.
This legislation could take the form of affirming that device manufacturers can be required to assist law enforcement agencies in accessing devices when presented with a court order. It could go even further, and ban manufacturers from making devices that cannot be breached – in the FBI’s language, outlawing the creation of ‘warrant-proof spaces.’
We’re already seeing the UK head in this direction, using very similar phrasing. Prime Minister David Cameron has said that the government wants to ban messaging services that use end-to-end encryption because they provide terrorists with a “safe space to communicate.” The government has put forward proposals in an Investigatory Powers Bill which, if passed, would outlaw services like iMessage and FaceTime. The bill is commonly referred to in the UK as the Snooper’s Charter.
Whether either country will go so far as to pass such legislation is as yet unknown, but the fact remains that the attempt is being made in the UK, and we can expect a similar attempt in the U.S.
So one way or another, this issue will be back. Apple will, I’m sure, be making whatever hardware and software changes are necessary to ensure that it is simply unable to load a ‘GovtOS’ onto future devices, but it can only do what is permissible within the law. If the law is changed to say that manufacturers are not allowed to make hack-proof devices, Apple would have no choice but to comply.
There is one piece of good news in all this: the debate is now a very public one. I’ve made my own views known, both before and after the San Bernardino shootings. For me, the bottom line is that any time you build in a vulnerability designed to be used by the good guys, it’s only a matter of time before it is discovered and exploited by the bad guys.
But while my own view is clear, I do still see the other side. I can absolutely appreciate the frustration law enforcement officials feel if placed in a situation where not even court orders will allow them to unlock a device they believe to be key to either solving a case or – more urgently – preventing a future attack. There are no absolute rights and wrongs here. It’s right that we should debate the pros and cons of the two sides, and it’s absolutely right that the debate take place out in the open, rather than behind the closed doors of secret courts.
When the issue is finally decided – either by the Supreme Court or by Congress – which way do you think it will go? Take our poll, and please share your thoughts in the comments.