While discussing the FBI case with a friend yesterday, one distinction struck me as worth addressing: the difference between a backdoor into iPhones – which is what law enforcement agencies have so far been calling for – and what we might term a master key, which is what the FBI is seeking in this particular case.
Law enforcement agencies have so far been calling for Apple to abandon its use of strong encryption. Technically, they want Apple to build a backdoor into that encryption for use by law enforcement agencies, but that amounts to the same thing: strong encryption with a built-in flaw is not strong encryption. It’s only a matter of time before hackers find and exploit it.
What the FBI is asking for in the San Bernardino case is quite different. Instead of asking Apple to weaken the encryption, it wants Apple to weaken the lock guarding access to the phone, by removing the auto-wipe and time-delay functions. That would leave the phone vulnerable to a brute-force attack.
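To get a feel for what that means in practice, here is a back-of-the-envelope sketch. It assumes roughly 80ms per passcode attempt – the figure Apple has cited for its hardware-bound key derivation – so treat the numbers as illustrative rather than exact.

```python
# Worst-case brute-force times once retry delays and auto-wipe are gone.
# Assumes ~80 ms per attempt, the figure Apple has cited for its
# hardware-bound key derivation; illustrative only.
ATTEMPT_SECONDS = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits
    hours = keyspace * ATTEMPT_SECONDS / 3600
    print(f"{digits}-digit passcode: {keyspace:,} combinations, "
          f"~{hours:,.1f} hours worst case")
# 4 digits: ~0.2 hours; 6 digits: ~22.2 hours
```

In other words, with those two protections stripped away, a four-digit passcode falls in minutes and a six-digit one in under a day.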
And, the FBI assures us, it isn’t asking Apple to do this for all iPhones, just this one specific device. It’s a very different scenario, and one that sounds superficially harmless …
Again, I emphasise that I fully sympathize with the FBI’s intentions in the San Bernardino case. If there are other terrorists out there waiting to launch similar attacks, we want to know about them. If this iPhone could lead to those possible terrorists, it doesn’t seem unreasonable to agree to this one request.
The problem, however, is that it is never ‘just this once.’ Any time we give up one of our civil liberties in a good cause, the legal precedent is set. There will be a second exception, and a third and … Effectively, if we permit just a single exception to a Constitutional right, we give up that right forever.
Some are arguing that there are sufficient safeguards in this case such that we don’t need to worry. Only Apple would hold the key, and it would only use it when presented with a court order. The FBI and other agencies would have no ability to carry out warrantless searches, and it would likely only seek court orders in the most serious of cases.
Again, however, we need to look beyond what is being asked for in the short term to what is likely to follow. In this particular case, the FBI wants Apple to unlock the phone. (Technically, Apple would remove the safeguards and the FBI would unlock the phone, but that’s a semantic argument.) Apple continues to hold the key.
But it is an extremely short distance from there to arguing that there will be some very time-critical cases where the delay involved in knocking on Apple’s door is too damaging – the classic ‘time-bomber in custody’ scenario – and that the FBI therefore needs to hold the key itself. It still wouldn’t use it without a court order, so where’s the harm? It would simply be cutting out the middleman.
So soon, the FBI would hold the key. Then other law enforcement agencies. In time, that key would be held in every police precinct house. We would then be trusting more than a million people with access to that key to abide by the rules. Government agencies don’t always have the best track record of doing that.
OK, you could argue, but where’s the harm? This is the ‘nothing to hide’ argument: if we’re all law-abiding people, why should we fear the government snooping in our phones? As I’ve argued before, however, that’s a silly argument. If you take that line, then you could equally argue that everyone should have a GPS chip embedded in their skin to track their movements, and that we should have blanket CCTV coverage on every street and in every building.
And lots of people have perfectly legitimate things to hide, from a partner sending intimate photos to cheer up a soldier serving overseas to journalists with contact details for confidential sources.
But even if we would trust our government with that much power, it isn’t just one government we have to consider. The USA almost certainly has agreements with friendly governments to share certain technologies, and the iPhone master key could well join the list. History shows that a country considered an ally today may well be an enemy tomorrow.
Even if you were prepared to risk that, there is no getting away from the fact that terrorists may be evil, but they are generally not stupid. In the San Bernardino case, it appears the shooters destroyed their own phones and hard drives, and the FBI is somehow hoping they might still have left incriminating evidence on a work phone. Do we doubt that they had the wit to use a strong passcode too? Terrorists and major criminals use burner phones to plan their attacks, not their own iPhones, registered in their own names and using their own Apple IDs.
So the arguments I made before this all happened haven’t changed. We would still be asked to sacrifice our right to privacy. We would still have no control over who ends up with the ability to access our devices. And it would still achieve nothing worthwhile.
There is no evidence that this phone, which belongs to the gunman’s employer, contains anything useful to the FBI. That the gunman destroyed his other phones and the drive from his computer suggests that this phone was not used for any conspiratorial communication.
There is no evidence that the FBI will be able to get into this phone. If the gunman was smart enough to destroy his other phones, and remove the drive from his computer, I’d bet money on his having used more than a four-digit passcode. That makes brute-forcing it a project that would take far too long for the FBI to complete.
Even if Apple is forced to open up this phone, and therefore all phones, the bad guys – who already use burner phones and spoof e-mail drops – will start using encrypted and coded communications. They likely already are.
This case is only about one thing: The government wants to spy on its citizens. You mention the FBI sharing the key with other law enforcement agencies. I suspect a copy would go to the NSA the day it was made. They have several decades of history with illegal spying on Americans.
I agree. This is sounding less like a back door into the iPhone, and more like a back door around privacy rights.
The Gubbermint has misused, abused, and engaged in some of the most OUTRAGEOUS illegal conduct over the last 10 years, to wit – NSA, DEA, FBI, IRS, Lerner, Attkisson, DOJ, Tea Party suppression, etc. etc. etc. ……….
Their unrepentant, unabated conduct has now earned them the universal “Hell NO” response……..you know that if the tech industry crumbles then the “me-too” BS will start unabated….Never mind the Chinese, Russians, and assorted other 3-4th world dictators, despots, and mutants….!!!!
Encryption and “burner phones” have been around for YEARS!!
And for the TSA–“it’s for our safety” crowd I say BS.
Feds get off your azzzzzzzzzzzz and go recruit sources/informants and metadata sources; get out of the office and away from your desktop….
Use your GS-1811 taxpayer provided G-car for something other than stopping for groceries on the way home or dropping kids at school…..
Why can’t the government just hand the phone over to Apple and let Apple extract the data (even if under the supervision of FBI agents/specialists)? Then the government gets its data, and the software tools Apple uses don’t get out into the public domain. Granted, Apple would have to create the software to make this work, but the government appears to be willing to pay for that.
Because once it exists and has been used once, everything I described follows. There is no such thing as a one-off exception in law.
There’s rarely a “just this once” in any facet of life. Whether it’s having just one drink, or just one affair, or just one potato chip, it is in our nature that once we do something, it makes it easier for us to do it again, and the more we do it, the easier it is to continue doing it.
Sure – but as long as Apple kept it in house and was the sole provider of the service, which Apple would obviously only do when ordered to by a court, then it solves the problem (other than courts and the government snooping into people’s lives). Apple already provides data it can access when required to do so by a court order. Apple’s complaint here is that turning over to the government any tools that can crack iPhone security puts them at risk of abuse by the government, and risks them entering the public domain.
There are also chain of evidence issues in that the evidence was handed over to a third party, and the FBI might have trouble authenticating anything that was pulled from the phone.
Were I a lawyer, that’s what I’d argue. 😜
Hypothetically, let’s say you were on the dev team asked to build this signed, compromised firmware. There might be 10 people on the team who would need access to it in order to build, test, and ship it. Knowing such a thing exists, all of them would know how valuable it would be on the black market – and also to the NSA, which we may as well consider the same thing, given its proven track record.
Anyway, we’d be asking them to generate an .ipsw file, a couple of gigs, which would, in the wrong hands, render every iPhone on the planet using a four- or six-digit passcode trivially crackable. And we’d be asking all of them to promise not to accidentally let a copy walk out the door and make its way onto the darknet to be sold.
Can you think of anything which could go wrong?
Meanwhile, again, anyone with something criminal to actually hide can still use a few minutes with OpenSSL to create encrypted data that the NSA couldn’t crack in ten lifetimes, no matter how they transmit it. The only victims are normal citizens, who can be hacked trivially to steal their identities (or their private photos).
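For illustration, this is the sort of thing meant here – a sketch that just shells out to OpenSSL (the file names are hypothetical, the openssl binary must be on the PATH, and the -pbkdf2 flag needs OpenSSL 1.1.1 or later):

```python
# A sketch of the point above: strong, off-the-shelf encryption is a
# one-liner away for anyone. File names are hypothetical.
import subprocess

subprocess.run(
    [
        "openssl", "enc", "-aes-256-cbc", "-salt", "-pbkdf2",
        "-in", "notes.txt",        # hypothetical plaintext
        "-out", "notes.txt.enc",   # encrypted output
    ],
    check=True,
)
# openssl prompts for a passphrase; with a long random one, no third
# party -- Apple included -- can decrypt the result.
```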
There is nothing that could be said that would make me support a master key or backdoor option. Plain and simple, this case is the perfect one for the FBI to try to use to accomplish its goal. Then it will go from ‘just this once’ to ‘we will follow the right procedures’, and on and on. FBI agents are human, and as long as human nature is a factor there will be abuses of the technology, guaranteed. Only in this case no prosecution will come from the abuse: an FBI agent doing the same thing a hacker does – but which one goes to jail?
I can foresee the Master Key for sale on eBay.
You can’t put the genie back in the bottle – Once it’s out, It’s OUT!
The good of the many versus the good of the one.
I vote that Apple fights this all the way to the Supreme Court. Hopefully by then we’ll have 9 Justices!
Would you REALLY be concerned to debate “should a terrorist’s iPhone be allowed to wipe itself or not? Should it be allowed to destroy evidence and hide possible conspirators?”
I don’t think so.
Here a judge asked the company to create a tool to overwrite the iPhone firmware to prevent the auto-wipe. That is the matter at hand, the subject of the request.
All the rest is easy unconstrained libertarian babbling.
It is so sad to see the Facebook-who-cares-about-privacy generation writing tons of articles debating a request to help the justice system gather evidence on terrorist crimes.
The violation of your own home was and still is FAR more dangerous and intrusive than the violation of your Tinder activity on your phone.
But even under the Constitution, for the sake of public safety and order, this right can be temporarily suspended by the order of a judge.
You can’t be such a hypocrite, come on.
If the FBI can brute-force phones, terrorists will just use strong passwords to protect sensitive data. The only people who will give up something (privacy, in this case) are honest citizens. Whoever gives up freedom for security ends up with neither.
Yes I would argue that a Terrorist’s phone should not be cracked. I wouldn’t want Osama Bin Ladin’s phone to be cracked. They can get everything they need in other ways and the damage to all of us is FAR too great. Terrorism is whatever government doesn’t want people doing. Today’s psycho school shooter may or may not be a terrorist depending on what he prays to and the color of his skin. Don’t bore me with the “it’s to stop terrorism” argument. Never was. Never will be. Freedom is freedom. Tyranny is tyranny.
And don’t you call me Libertarian. I find that insulting.
Amen Doug. I would add Fear is Fear, and is the only tool ever used to effectively destroy freedom.
Well I am a frequent poster – and I don’t have a Facebook account or, for that matter, any other “social interaction” web site accounts. It’s not that I’m hiding anything – it’s that most of my life is too boring and who the-hell-would-be-interested in what I do day-to-day? I don’t feel the need to post up my daily bathroom routine. So, that precludes much of your post about the “Sharing Generation.”
I do, however, have a great stake in seeing the Constitution and Bill of Rights followed, and if you take the time to read and understand much of what has been published over the last 4 or 5 years, you will see that the US Government has either ridden roughshod over, pulled legal loopholes around, or just flat-out circumvented those governing rules in its attempts to “Know All – See All – Predict All” under the guise of National Security or some other name designed to make us feel safer.
I don’t feel it has worked, and I am unwilling to just give up what those very documents that form the basis of my government’s operating charter have promised, so that my government can know all my personal details and “protect me.”
I think you’re missing the bigger point here. It’s not just personal data like Tinder and Facebook activity on the phone that would be in jeopardy. If someone were able to access my phone via a master key or backdoor, they would immediately gain access to: credit cards I use with Apple Pay, my iCloud storage containing my tax records and other private data, and my email accounts, which could be used to gain access to my bank accounts. Pretty much, if someone can get into your cell phone, they have access to almost everything about you and your identity. THAT is what I don’t want anyone to have access to except for me.
Exactly. It’s what I’ve been saying all along. It’s the issue of our devices (including our cloud accounts) becoming vulnerable to hackers, especially ones from other countries. The last thing in the world I need is my identity stolen. I’m sure all the banks are getting with their lawyers to cover their asses in the event it becomes real easy to hack our accounts.
“The problem, however, is that it is never ‘just this once.’ Any time we give up one of our civil liberties in a good cause, the legal precedent is set.”
/end
And only Timmy can carry the master key..around his neck.
“The problem, however, is that it is never ‘just this once.’ Any time we give up one of our civil liberties in a good cause, the legal precedent is set. There will be a second exception, and a third and … Effectively, if we permit just a single exception to a Constitutional right, we give up that right forever.”
Enough said. If you want to tackle terrorism, go find and fix the root cause, rather than simply trying to patch up the “symptoms” in ways that only weaken everyone’s (the world’s) privacy.
Every step we take towards making the State our Caretaker of our lives, by that much we move toward making the State our Master.
Dwight D. Eisenhower
They who would give up an essential liberty for temporary security deserve neither liberty nor security.
Benjamin Franklin
If Tyranny and Oppression come to this land, it will be in the guise of fighting a foreign enemy.
James Madison
.. Should I go on? These are only a few quotes from many that warn us of an unchecked government.
So we are supposed to believe this one phone holds the key to uncovering homegrown terrorist plots – something the FBI itself has deemed practically impossible? So the 300–400 pieces of evidence and a live, cooperative co-conspirator aren’t enough to provide clues or tips to other plots, but this single iPhone is? Seriously, the judge should be removed for granting the order! This is clearly BS, and an attempt to piggyback the FBI’s goals on a tragic but poorly planned and executed event.
Ben, good to hear someone finally talking about master keys, not the rather silly idea of a ‘back door’.
But here’s how I differ …
Firstly, you shouldn’t give up any of your constitutional rights – nor should we on the other side of the Atlantic give up our Art. 8 ECHR right to privacy. Any access should be on lawful court order. No warrant, no access. And I mean a warrant to access a specific phone or phones, not some general warrant to access whatever they want. That is clearly covered in our ‘human rights convention’, and I understand search warrants are covered in the US Constitution.
Second, any master key should be in the hands of Apple, not the FBI. If they want access to a specific phone, then send the phone and the warrant to Apple. No warrant, no access. I’d like to see an act of Congress, or an act of Parliament over here, saying the key stays with Apple, so the FBI/NSA can’t demand it in the future.
The NSA etc. were wrong when they were mass-monitoring every phone. Now Apple is wrong in saying access to zero phones – even on court order. Both sides need their heads knocking together until they find a middle ground which protects ordinary people’s privacy whilst allowing access to the phones of criminals.
How long do you think that master key would remain in one place only (“safely on file at Apple”), after it exists?
Files can be copied. How many employees work at Apple? How much would organized crime or the NSA (roughly the same idea) pay for a master key that enables nearly any iPhone to be trivially cracked into?
It’s about as irresponsible as you can get to create this. And the number of people with access and familiarity will only go up as they continue to maintain this branch of the code through iOS upgrades. The only thing that makes Apple any better than the FBI to hold this is its corporate incentive not to want this out there; however, the personal incentive to leak it is still there, and these aren’t people with security clearances and background checks.
If this question has already been raised and answered, I apologize in advance. If we go back in time to before the shooting, didn’t Apple software engineers already have the ability to create this master key? Didn’t they also have the opportunity to sell it on the darknet or to another country? No disrespect, but maybe this has already happened. So long as humans write the software, the security of that software will always be a problem. I hate to admit this, but I may be arguing for a benevolent AI to take over…
Your headline shows a lack of understanding. An Apple master key is the same thing as having a back door.
Any key means there is a door that anyone can try to hack. Eventually hackers find a way. Hence Apple’s approach to security.
First of all, I should say that I think that Tim Cook’s statement and course of action with respect to this demand is the right way to go. However, I disagree with the author about why it is the right way to go. The author of this article, as well as many of the commentators, are basing their arguments on flawed principles about justice and government process.
First of all, while the importance of rights cannot be emphasized enough, they are not trumps (lowercase – not “Trumps”) and they never have been. Rights are expectations about our lives and liberty that should be met, ABSENT a good reason. The more compelling the right is under the circumstances, and the more drastic the potential consequences, the more compelling the reason must be. In this way, the law does not care about outcome as much as it cares about process. That is, what were the reasons for the action (e.g., demanding Apple create a master key), and how careful were the deliberations? Many rights expressly stated in the Constitution have built-in hedge words (e.g., “unreasonable” search). Other rights, which are at least partially a product of judicial creativity, such as privacy and “fundamental rights”, are implicitly “hedged” by judicial practice. In that sense, as long as the judiciary approves of the government action, no right has been “infringed.” The expectation of privacy was respected, but superseded (allegedly, at least) by compelling circumstances. Often people confuse the term “rights” with their own personal ideals they think a nation should observe. I can’t blame them; I was probably saying the same stuff a few years ago. However, the more I learn about rights, the more I understand how unworkable and flawed a truly unyielding right could be.
Second, the article uses a flawed narrative device commonly employed by politicians, and even sometimes by legal scholars. That device is known as the “slippery slope.” The idea is that, while the immediate proposal may not be extremely offensive, it marks the beginning of the end. It’s an effective tool, but logically unsound, a little lazy, and unrealistic. If the Supreme Court used the same reasoning to this degree, any change to the status quo would be unacceptable. Any opponent could argue that slightly increased taxes on the wealthiest Americans is unacceptable because it will start a pattern of burdensome tax increases on wealthy Americans until there are none left. Others may use the slippery slope argument to rebut proposals to ease the sentences for non-violent drug offenders, because it symbolizes the beginning of an era in which the United States embraces and encourages substance abuse and addiction. The fact is, the government is called upon to make choices with controversial overtones, and we can only ask that they apply their best judgment.
Here’s why I agree with Tim Cook. To quote his message: “We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone TO STEP BACK AND CONSIDER THE IMPLICATIONS.” (emphasis added). He is not saying he will never comply with such an order; he is appealing for more deliberation, commentary, and oversight. He believes that the right at stake is important enough to challenge in order to provoke such deliberation and oversight. And he’s absolutely right. The heightened expectation of privacy for digital devices was recently discussed by the Supreme Court in Riley v. California. I’m sure Tim Cook will continue to challenge either until he is comfortable with the increased attention and oversight, or until he, or the company, faces potential consequences that he cannot afford to accept. Explicit in his message is an acknowledgement of, and respect for, the compelling circumstances at hand. Sure, there’s no evidence yet that the phone has anything useful. But the fact that no such evidence has been produced yet does not necessarily mean that the government’s interest in the device is not compelling. To claim the contrary, from an armchair, is fair… but weak.
Personally, I would be happy to see some more judicial oversight. I would be happy if a judicial body struck down the FBI’s demand. But let’s not defame the FBI for making an open request subject to public scrutiny.
I’m certainly not defaming the FBI. As I say, I am in sympathy with its position, and believe it has good intentions in this case. But the slippery slope argument absolutely applies here because there is no realistic prospect at all that this would be a one-off case, nor that it would not lead in the long run to the tool filtering out to other law-enforcement bodies.
On the wider point, I agree that we often have to make difficult decisions. This is one of them, and I absolutely appreciate that it’s possible for someone else to come down on the other side of the argument. I wouldn’t call their argument logically unsound, unrealistic or lazy, and I’d appreciate the same courtesy in return.
Ben, you’re right–my criticism was a tad harsh. And I don’t think that you are defaming the FBI, but a number of commentators on this article are.
To be fair, the slippery slope argument ALWAYS applies. That’s not the problem. The problem is that it is a largely discredited logical tool that is more a product of a knee-jerk reaction than of reasoned judgment. When disfavored policies are proposed, the emotion often comes before the reasoning. We feel icky, and our brain scrambles for a justification for our icky feeling. I don’t want to be a “Wikipedia scholar,” but just look up “slippery slope.” You will see what I mean. It says it’s a device to incite fear, supported, at most, by a few historical anecdotes. The Wikipedia article also has some great citations to UCLA and Harvard law review articles that point out the same flaw with the slippery slope. It’s not just my opinion that slippery slope arguments are invalid; it’s widely accepted that they are generally invalid. All I’m saying is, the opinion article would be much better if it relied on more credible principles. I think you are a balanced writer.
Again, I agree with you on the larger issue. I fully support Apple’s challenge. And I also completely sympathize with others who think this is a reasonable exception that is not likely to lead to large-scale policies that erode our privacy. But I think you (and I) should reserve the right to point out flaws in argument. Donald Trump’s argument was, stated simply, “Who do they think they are?” I would call that lazy and unreasonable. Lots of other people would, too. Some of his supporters may think that his argument is completely called for, and possibly the MOST reasonable position. It’s not that I think that you can’t be on the other side and still be a highly intelligent, compassionate person. As I said in my earlier reply – it’s about process.
Thanks. I absolutely agree the slippery slope argument can often be wildly exaggerated, but I don’t think that applies in this case, as the sequence of events I suggest is, in my view, entirely plausible.
And yes, I can absolutely respect views on both sides of the argument.
They’re always plausible, Ben. But compounding a series of plausible events becomes a shot in the dark. And you can’t support a statement like “everything I said will follow” with a shot in the dark. I understand the impulse, but I can guarantee that this slippery slope argument is nothing special. It’s pretty text book.
Legal writing can be obnoxious and painstaking, but using good argumentative tactics creates more powerful arguments and ideas. Of course, the slippery slope argument will never die.
Which step or steps do you consider unlikely?
Let me start by saying that I agree with you that the scenario wherein the FBI has possession of a locked phone belonging to a terrorist, or a suspected terrorist, is probably an inevitability. So let’s assume that this case goes to the Supreme Court and they uphold the FBI’s demand, and Apple complies…
The biggest jump you gloss over is that the FBI will certainly (eventually) ‘hold the key’ to unlocking devices to facilitate administrative convenience. There are a lot of missing steps here. Although privacy rights yield to circumstances of emergency, the law still favors privacy over administrative convenience. Urging Apple to unlock the device is not the same as requiring the ‘master key’ to be in possession of the FBI. This would be a long up-hill battle for the FBI, even if that was their goal. And although the FBI is interested in efficient investigations, I don’t think it’s fair, at this point, to assume that they would even ask for indefinite possession of the ‘master key.’ Yes, everyone will point out the dragnet, etc. to show that government agencies have no respect for privacy rights, but that’s a bit prejudicial and overbroad. The FBI has more respect for the legal process and civil liberties than some think. Even if the FBI did eventually ask to have possession of a master key without the assistance of the company in question, that is a whole other legal battle of its own. The FBI’s unbridled possession of the master key is not an inconsequential revision of the initial decision to unlock a single phone. It’s not impossible, and it may be proposed. But that ‘step’ alone is actually a big series of steps, each one of which I’m fine calling ‘plausible.’
Then the possibility of other FEDERAL law enforcement having possession of the key. Not implausible, but far from a done deal. This would be another uphill battle. Then state and municipal agencies having possession of the key. Another uphill battle.
Then you deviate slightly from your main point to posit that the “nothing to hide” argument automatically favors ubiquitous monitoring of every citizen, through the use of tracking devices, etc. I don’t like the “nothing to hide” argument either, but there are much more valid arguments against it that can come out with a little more thought. You achieve this in your article with a few hypotheticals about how law abiding citizens sometimes DO have something worth hiding.
It takes too many assumptions with HUGE margins of error to come to the conclusion that this action means your privacy, and my privacy, will be affected. Those margins of error accumulate, and that’s what I mean by “shot in the dark.” I’m not saying that any single one of the steps is absurd. They are all plausible.
The immediate case is not really about privacy. The government has been able to get warrants to search electronic devices for a long time. The government has a warrant for the iPhone, and will not violate a privacy right by inspecting its contents. The right to privacy is an amalgam of other rights, either expressly written in the Constitution or created by Judges, that is hard to define, even for law professors. Of course, lots of people have heard the phrase “right to privacy” and have devised their own conclusion about what it means. It’s not hard to understand why those individuals would be upset when government action violates what they thought was part of the right to privacy. I’m not saying that our nation’s idea of privacy is ideal. It was made up… by lots of smart people, of course. And it seems pretty critical to good civic life.
The immediate case is more about whether the government can force a company (or individual) to assist it with the inspection of an article within its possession. And whether that ability has limits, especially where the company’s assistance forces it to disobey its own standard of conduct towards its consumers, and where the assistance could potentially pose a risk to other consumers’ data.
Thanks for your thoughtful response. It’s Friday night here, so I’ll just address one point briefly …
“Yes, everyone will point out the dragnet, etc. to show that government agencies have no respect for privacy rights, but that’s a bit prejudicial and overbroad.”
But that is *precisely* what the government has repeatedly demonstrated. PRISM, Stingray, etc. Give a government the technical means to employ a dragnet approach and it will do exactly that.
One more thing about Stingray, PRISM, etc. There is a significant difference between those secret uses of government power, and a demand made through legal process, subject to judicial discretion and public scrutiny. Yes, some branches of the government have grossly abused power and resources without complying with the legal requirements (e.g. obtaining warrants). Here, the government has a warrant.
I look forward to an opinion that either strikes down the FBI demand, or, if the demand is allowed, makes the process so unpleasant for the FBI that it’s not likely to use it often.
There are four probabilities we should be thinking about first:
1. Likelihood that this hacked firmware will leak to general law enforcement, to be used unsupervised in the sort of dragnet operations the NSA would love to run
2. Likelihood that this hacked firmware will leak outside the government too, for actual criminals or corrupt foreign governments to use against law-abiding citizens
3. Likelihood that creating this back door will actually, on its own, save any lives or be the deciding factor allowing us to convict a dangerous criminal
4. Likelihood that if we do NOT have this back door, not having it will be the deciding factor causing something super scary to happen
My estimates of those probabilities: 1: 100%, 2: 100%, 3: 2%, 4: 0.05%
It just seems to me that the danger of this leaking (and it WILL leak if it’s made) is far greater than the likelihood of it hypothetically helping ANYONE. Only a stupid criminal would use a 4–6 digit passcode to protect a phone with anything incriminating or useful on it, when he could use a complex alphanumeric one – or, better yet, use the easily available tools to encrypt his important communications himself on a desktop computer, or simply in a third-party app, where no third party will ever be able to decrypt them.
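To put rough numbers on that, here is the search-space arithmetic (illustrative figures only):

```python
# Search-space sizes in bits of entropy: why a complex passcode, or
# do-it-yourself encryption, puts data beyond any brute-force effort.
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    # log2 of the number of possible passcodes
    return length * math.log2(alphabet_size)

print(f"6-digit PIN:          {entropy_bits(10, 6):.1f} bits")   # ~19.9
print(f"10-char alphanumeric: {entropy_bits(62, 10):.1f} bits")  # ~59.5
# Every extra bit doubles the work; the gap here is a factor of
# roughly a trillion.
```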
The basic premise of the FBI is that we need to do this thing which WILL dramatically weaken all phones to protect against some hypothetical threat, when we have a very real threat (hacking, voyeurism, and identity theft) already, which will only get worse after that hacked firmware leaks.
“We the People” petition has started over at https://petitions.whitehouse.gov/petition/apple-privacy-petition
I predict that Apple will lose this battle. If not in the US then in China or some other important market.
There are two (actually more than two) extreme and legitimate sides to this debate:
1) A terrorist’s captured phone (upon exercise of a legal warrant) may provide access to a network that is planning an imminent attack
2) A phone confiscated from a political dissident or anti-corruption whistleblower can give an oppressive government “legal” access to the dissident’s network of sympathizers
The technical issues are not difficult. Apple could maintain a highly secure list of passwords hidden behind several layers of anonymized unique identifiers.
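To make the idea concrete, here is a minimal sketch of that kind of escrow scheme – every name and layer in it is hypothetical, not anything Apple has described:

```python
# Hypothetical sketch of the commenter's escrow idea: per-device unlock
# keys stored behind anonymized identifiers, released only on a warrant.
# Nothing here reflects any real Apple system.
import hashlib
import secrets

ESCROW: dict[str, bytes] = {}        # anonymized id -> escrowed key
SALT = b"separately-held-secret"     # hypothetical anonymization layer

def anon_id(device_serial: str) -> str:
    # The table alone reveals nothing about which entry is which device.
    return hashlib.sha256(SALT + device_serial.encode()).hexdigest()

def enroll(device_serial: str) -> None:
    ESCROW[anon_id(device_serial)] = secrets.token_bytes(32)

def retrieve(device_serial: str, has_warrant: bool) -> bytes:
    if not has_warrant:
        raise PermissionError("no warrant, no access")
    return ESCROW[anon_id(device_serial)]
```

Of course, as other commenters point out, the table and its anonymizing secret then become a single point of failure: whoever copies both holds every phone.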
Data protected by this “key” mechanism would be significantly more secure than any of our medical or financial information, and much easier to protect from government abuse than our homes or even our bodies. And, by appointing Apple as our gatekeeper, there is the possibility of holding government entities accountable for abuses after the fact (in the event that fraud is used to obtain a warrant). As we all know, with “terrorism” as an excuse, we would need new laws to prohibit blanket and perpetually “classified” warrants. Obviously, once Apple admits that it can be a “gatekeeper”, this will be demanded by all governments, including those with no due-process barriers. However, since Apple can clearly perform this function (better than any other custodian of our private data), it will be forced to.
There are some practical benefits to making Apple a gatekeeper of legal access to the private data on our phones:
Heirs could retrieve their digital property (e.g. the deceased’s last photos) from an inherited phone
Health care proxies could access vital medical information to help an incapacitated iPhone owner
Both of these objectives would be achieved more securely than they can be now (we can distribute passwords to a list of “trusted” friends, but this forces us to broadcast these passwords and discourages frequent password changes)
In addition, victims of crime (victims of peeping toms or industrial spies) would have a better chance of retrieving information that was obtained illegally (and tracing who it may have been distributed to)
The irony is that more sophisticated criminals will use custom encryption methods once the “retail” option has been removed, so government abuses will be as likely to happen as legitimate government successes in preventing major crimes.
I predict that Apple will lose.
This will have practical benefits as outlined above (more convenient access to our own information and thwarting small-scale crime)
Terrorist masterminds and privacy absolutists will have to accept that Apple can’t promise to keep their secrets and they will have to store them outside the Apple ecosystem. It will be a bit more work, and the UI may not be as esthetically pleasing… The rest of us are so bad at protecting our data that it will have no impact
And when the day comes that a terrorist sets off a dirty bomb in the middle of a big city, we can all find comfort in knowing Apple protected our privacy.
Your sarcastic statement implies that Apple protecting our privacy would be the cause of a bomb going off in the middle of a big city…
Why doesn’t the FBI give Apple the iPhone in question and let Apple itself retrieve the data, if any? That way the deed is done in house and the method is kept in the family.
That question is answered in the piece.
There may be a middle ground on the issue of phone encryption. Instead of a master key or back door, the encryption itself could be adjusted such that it has a known level of quality. The encryption level would be selected such that only a few extremely powerful government computers could decrypt it in a reasonable amount of time. This way a government could decrypt a cell phone’s data but it would only be able to do so for a limited number of devices per year. Over time the encryption level would increase. I don’t mind the idea that the government could decrypt my data if it really wanted to. I just don’t want it to decrypt all data nor do I want anyone else to be able to do it on their personal computers, or at least not for a long time.
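One way to see the calibration problem with this proposal: each added bit of key length doubles the brute-force cost, so the window between ‘only a government can break it’ and ‘nobody can’ is narrow and keeps moving as hardware improves. A quick sketch with made-up attacker speeds:

```python
# Time to exhaust a keyspace at two hypothetical attacker speeds.
# All numbers are made up for illustration; real costs vary enormously.
SECONDS_PER_YEAR = 86400 * 365

def years_to_crack(key_bits: int, guesses_per_second: float) -> float:
    return 2 ** key_bits / guesses_per_second / SECONDS_PER_YEAR

for bits in (56, 64, 80):
    state = years_to_crack(bits, 1e12)   # state-scale cluster (assumed)
    desktop = years_to_crack(bits, 1e7)  # single desktop (assumed)
    print(f"{bits}-bit key: state ~{state:.3g} years, "
          f"desktop ~{desktop:.3g} years")
# 56-bit: under a day for the state; 64-bit: months; 80-bit: millennia
```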
Correct me if I’m wrong, but this method would only work on iOS 7. I heard on a security podcast that it would not (and could not) apply to iOS 8 and later. If that’s the case, what’s the harm? How many iPhones out there are still on iOS 7? The danger to privacy is negligible.
This method would work with any version of iOS. iOS 8 and up are encrypted, but this is allowing access via the passcode.
Pretending that there is a practical difference between a “backdoor” and a “master key” is unhelpful; that just makes the FBI’s case. Encryption with a built-in way to stop the device from wiping itself is not encryption you can trust. And there is no way to ensure that such a key is not stolen or misused (not to mention the fact that once it exists, other countries will demand the same thing). And, while this was written before it was confirmed, it is now very clear that this is *NOT* about a single device being bypassed.
I agree entirely that neither are acceptable – which is the case I make in the piece – but it’s also important to recognise the distinction between the two. Otherwise you can simply argue that this wouldn’t create a backdoor and is therefore fine.