
Security firm shows how Apple could bypass iPhone security to comply with FBI request

A security firm says that while Apple may fight hard to resist a California court order requiring it to help the FBI break into an iPhone, it would be technically able to do so.

Apple has so far seemed to hold the ultimate trump card in this situation: since iOS 8, it has been able to simply shrug and say that iPhones are encrypted and Apple doesn’t have the key, so even if a court ordered it to break into an iPhone, it would be unable to do so.

But while this is correct, security company Trail of Bits has described in a blog post how Apple could still make it possible for the FBI to hack into the phone …

It’s already possible to hook up an iPhone to a device which tries to brute-force the passcode by simply starting at 0000 and working through to 9999. The problem for the FBI is that iOS has a couple of security systems designed to defeat this.

First, you can set your iPhone to automatically erase all data after 10 failed passcode attempts (Settings > Touch ID & Passcode > Erase Data). Any tech-savvy terrorist or criminal is going to have this turned on.

Second, iOS enforces increasing delays between failed passcode attempts:

  • 1-4 attempts: no delay
  • 5 attempts: 1 minute
  • 6 attempts: 5 minutes
  • 7-8 attempts: 15 minutes
  • 9 attempts: 1 hour

This explains why the FBI’s attempts to gain access in this way have still not succeeded some two months after they began.
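
To put those delays in perspective, here is a rough back-of-the-envelope sketch. The numbers are assumptions rather than Apple's exact implementation: roughly 80 ms of hardware key-derivation work per attempt, an hour's delay for every attempt from the ninth onwards, and the 10-attempt auto-erase ignored. It compares a full sweep of all 10,000 four-digit codes with the delays intact versus stripped out of the firmware.

```python
# Rough illustration only: the per-attempt cost and the "one hour from the
# ninth attempt onwards" rule are assumptions, not Apple's exact behaviour.

TRY_COST = 0.08  # assumed seconds of hardware key derivation per passcode attempt


def delay_before(attempt: int) -> int:
    """Enforced delay in seconds before a given attempt, per the list above."""
    if attempt <= 4:
        return 0
    if attempt == 5:
        return 60
    if attempt == 6:
        return 5 * 60
    if attempt in (7, 8):
        return 15 * 60
    return 60 * 60  # assume every attempt from the 9th onwards waits an hour


def sweep_seconds(with_delays: bool, codes: int = 10_000) -> float:
    """Worst-case time to try every possible 4-digit passcode."""
    total = 0.0
    for attempt in range(1, codes + 1):
        if with_delays:
            total += delay_before(attempt)
        total += TRY_COST
    return total


print(f"delays intact:  {sweep_seconds(True) / 86_400:.0f} days")   # ~416 days
print(f"delays removed: {sweep_seconds(False) / 60:.0f} minutes")   # ~13 minutes
```

On these assumptions, an exhaustive sweep goes from over a year to a lunch break once the delays are gone, which is the whole point of the proposed firmware change.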

But, argues Trail of Bits, it would be possible to put the iPhone into DFU mode and then overwrite the firmware with a version that has neither the auto-erase mode nor delays between passcode attempts. The FBI could then trivially brute-force its way into the phone.

The FBI can’t overwrite the firmware because the device checks for a valid Apple signature. The FBI doesn’t have this. But Apple does. Apple could thus create signed firmware without the protections designed to defeat brute-force attacks, and hand the phone back to the FBI.
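
The mechanism here is ordinary code signing: the boot chain only accepts firmware whose signature verifies against a public key it already trusts, and only the holder of the matching private key – Apple, in this case – can produce that signature. Below is a minimal, purely illustrative sketch of such a check, using Ed25519 via the Python cryptography package; it does not reflect Apple’s actual key formats, certificates or boot process.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The "vendor" signs a firmware image offline with its private key.
vendor_key = Ed25519PrivateKey.generate()
firmware = b"...firmware image bytes..."
signature = vendor_key.sign(firmware)

# The device ships with only the corresponding public key baked into its boot ROM.
trusted_public_key = vendor_key.public_key()


def device_will_boot(image: bytes, sig: bytes) -> bool:
    """Accept an image only if its signature verifies against the trusted key."""
    try:
        trusted_public_key.verify(sig, image)  # raises if the signature doesn't match
        return True
    except InvalidSignature:
        return False


print(device_will_boot(firmware, signature))                # True: vendor-signed image
print(device_will_boot(firmware + b"patched", signature))   # False: tampered image
```

Without the private key, the FBI cannot produce a signature the device will accept; with it, Apple could sign a modified image that the phone would happily install.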

All this supposes that the iPhone is protected only by a 4-digit passcode, however. If a complex password was used, no-one in the FBI would live long enough to gain access.
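
The difference is simply the size of the search space. As a rough illustration, assuming the same figure of about 80 ms per attempt as above (roughly 12.5 guesses per second once the artificial delays are removed), the worst-case exhaustive-search times work out like this:

```python
ATTEMPTS_PER_SECOND = 12.5  # assumed: ~80 ms of key derivation per guess, delays removed
SECONDS_PER_YEAR = 31_536_000

print(f"4-digit PIN:          {10 ** 4 / ATTEMPTS_PER_SECOND / 60:.0f} minutes")   # ~13 minutes
print(f"6-digit PIN:          {10 ** 6 / ATTEMPTS_PER_SECOND / 3600:.0f} hours")   # ~22 hours
print(f"10-char alphanumeric: {62 ** 10 / ATTEMPTS_PER_SECOND / SECONDS_PER_YEAR:.1e} years")  # ~2 billion years
```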

Trail of Bits goes on to argue that the Secure Enclave would further complicate things on some devices. This wouldn’t apply in this particular case – as the iPhone 5c doesn’t have a Secure Enclave – but the company suggests that on later devices this would prevent Apple changing the firmware on a locked phone, and that the Secure Enclave itself cannot be overwritten without effectively erasing the device.

Others, however, have said this part of the blog post isn’t accurate. John Kelly, head of information security at Square, who previously worked for Apple on embedded security and thus presumably knows his stuff, says that it is perfectly possible for Apple to overwrite the Secure Enclave firmware without this preventing access.

All of which means that Apple can no longer rely on claiming that it cannot assist law enforcement agencies to break into iPhones: it will instead have to fight in court on the merits of the argument that it should not do so. That’s going to be a tough argument to win, but Apple appears determined – and I think the company is absolutely right in its stance.

Image: GSM Hosting/sadewphone via Business Insider


Comments

  1. LOL they simply could create a new iTunes version which accepts the connected device even if it has a passcode, then they can use … (and make the iPhone connect normally to a PC/Mac), then use iTools or iFunbox to get the data, or even jailbreak it to gain full access 😘

    • degraevesofie - 8 years ago

      Access to the phone is not up to iTunes; it’s up to the device. That’s (among other things) why when you have a password-locked iPhone and try to update it using iTunes, iTunes will ask you to unlock the device.

    • andy o (@ao9news) - 8 years ago

      Hmmm… tough choice on who to believe on how difficult it is to break into the iPhone. Pretty much every security expert and Apple themselves, or random internet commenter that starts his argument with “LOL they simply could…”

      • Max - 8 years ago

        How dare you doubt that Random Internet Commenter knows better than the FBI? :P

    • rnc - 8 years ago

      It’s not iTunes that decides whether the phone allows access or not.

    • srgmac - 8 years ago

      iTunes? Really? Lol.

    • Greg Nice (@gnice3d) - 8 years ago

      You literally just made all that up in your head based upon things you never read. That is not how exploiting a phone works.
      Also: The question at hand is should Apple do it, not could Apple do it… and Apple should not do this. If they were to give such a signed firmware over to the government, it would jeopardize the security of millions of iOS devices. The NSA has abused every power provided to them, and even powers they were not, with zero ambiguity or accountability. This would essentially be like handing a serial rapist the keys to your home.

      Tim Cook is doing the right thing… It’s time to revoke the “Because terrorism” card that the government uses on a daily basis.

  2. Catherine Rot - 8 years ago

    The weather tomorrow could deliver snow and heavy fog, but it’s gonna be tough since the forecast points to warm, clear and sunny. Meteorologist Chance Winterkommen, experienced forecaster, told us that you never know for sure, thus acknowledging the possibility of snowfall and foggy unpleasantry.

  3. 89p13 - 8 years ago

    Looking at the device in that picture – it shows a 4-digit passcode – it wouldn’t help with my iPhone, as my password is 12 characters. Yes, it could be cracked, but it would sure take a while.

    1000% behind Tim Cook and Apple on their reluctance to submit to some federal justice who is just a lapdog for our overreaching government!

    • srgmac - 8 years ago

      The article says if it has a complex passcode then it’s going to take a lot longer to brute force.

  4. nickman55 - 8 years ago

    My understanding of Apple’s argument isn’t that they CAN’T make a custom version of iOS, it’s that they WON’T. In their open letter, Apple never says that they are not capable of making a backdoor… Apple has pretty much admitted that they are technically capable of complying with the court order using the exact method that Trail of Bits suggests; they just aren’t going to.

  5. example8009 - 8 years ago

    Thank you Tim Cook and Apple for making it easier for the terrorists to kill us and harder for the government to protect us. Last year I switched from Microsoft Windows 8 to a Mac. I will be going back to MSFT and have bought my last Apple device.

    • 89p13 - 8 years ago

      And goodbye – you vote with your dollars, and it appears that you already have. See if it helps to “protect and keep you safe.”

    • Jake Becker - 8 years ago

      Good, gotta cull the herd. Keep worrying about them there ‘terists, “our” government is good and loves you and the other nations of the world. All the conflict arises completely out of nothing.

    • morganhighley - 8 years ago

      Bye! Happy trolling!

    • just-a-random-dude - 8 years ago

      You’re far more likely to be killed by Americans/cops than terrorists; more Americans have been killed by other Americans in the last 5 years than in the entire history of all terrorist attacks combined.

      Even if Apple makes it easy to break into the iPhone, terrorists are using their own encryption tools that Apple has nothing to do with, so this will not prevent anything. The US government will never be able to get away from encryption; so many encryption tools have been made available outside the US that no government on the planet can stop their usage.

      The moment we give up our right to privacy (via encryption) is the moment we give up our rights to be protected. Sorry, neither giving up our encryption nor our privacy is ever going to justify backdoors.

      The FBI and the court are absolutely in the wrong here.

    • I distinctly remember the terrorists using guns to kill people. You should be saying, “Thank you gun manufacturers and gun merchants for helping the terrorists to murder those people.”

    • srgmac - 8 years ago

      Bye bye don’t let the door hit you on your scared p***y coward ass on the way out.

    • Jon G. - 8 years ago

      You’re right – Tim Cook is a friend to terrorists, and he has done nothing but laugh in the face of the United States of America (the inventors of nuclear weapons that could destroy every human being on the face of the Earth) by not allowing them to hack into the iPhone, thus preventing our most noble and honest government from protecting us from the enemy… Burn him at the stake I say, and anyone that advocates privacy should be labeled as a traitor and convicted of treason!

    • mytawalbeh - 8 years ago

      Happy trolling LOL!

  6. Rich Davis (@RichDavis9) - 8 years ago

    I still don’t know why the FBI doesn’t use one of the fingers of the people whose phone they want to hack into with the fingerprint sensor. They probably used one of a couple of fingers, so that wouldn’t take that long. They did have access to their bodies, right?

    • Mario Gaucher - 8 years ago

      The iPhone 5c in question does not have a fingerprint sensor.
      Also, if the device has rebooted for any reason, you have to enter the passcode at least once before using the fingerprint sensor again.

      • kyle3lias - 8 years ago

        And after 48h of standby.

  7. To answer the previous point — the phone in question is a 5c, so there’s no TouchID. Beyond that, I love how this security firm is like ‘hey, this is how it could be done’. Apple probably has several different theoretical ways to do it. The point is that it’s not the right thing to do, and once done you cannot un-create what has been created. Have we learned nothing from Pandora’s Box?

    • Jake Becker - 8 years ago

      It’s America, so no.

    • alanaudio - 8 years ago

      If the security company Trail of Bits thinks they know how to crack this iPhone, then why hasn’t the FBI asked them to do it instead of taking Apple to court?

      There might be two reasons. One is that they don’t believe that Trail of Bits can really do it. The more important one is that the FBI want to create a precedent using a clear-cut, high-profile case where everybody agrees that the suspect is without doubt a bad guy and where it’s in the public interest to know his associates. Of course, if Apple complied in this exceptional case, the FBI would then insist that Apple does the same whenever they make a subsequent request concerning a much lesser crime.

      • Ben Lovejoy - 8 years ago

        Trail of Bits can’t do it – they are arguing that Apple can.

      • Mario Gaucher - 8 years ago

        What if all the terrorist did was put his porn pics and vids on his phone?
        If I was really worried about not being tracked or anything, I wouldn’t use an iPhone or any other smartphone.

      • George Pollen - 8 years ago

        Modified software can’t be installed on the iPhone without Apple’s confidential signing certificate.

      • srgmac - 8 years ago

        Trail of Bits can’t create a custom iOS firmware that is signed with a valid certificate; Apple would have to do that (assuming what they suggest is even possible: removing the auto-erase and the brute-force prevention timeouts).

    • JBDragon - 8 years ago

      Apple can break into this iPhone because it’s running iOS 7, I believe. They are just refusing to do so. Apple can’t do anything with iOS 8 and newer, court order or not.

      • bipolarsojourner - 8 years ago

        Latest news is that it is running iOS 9.

      • Tim LeVier - 8 years ago

        You’re thinking of the other case in New York, which was making the headlines until this one popped up out of the blue.

  8. lofye - 8 years ago

    If you had read the letter Tim Cook wrote, you would have seen him specifically address that very issue. He said Apple doesn’t want to create a custom version of iOS, because it would endanger their customers.

  9. I’m wondering what Apple would do if the Chinese government asked for the same thing after a terrorist attack, with the threat that if Apple doesn’t play along, iPhones will be banned from the Chinese market. Considering how big that market is, I think Apple would think twice before refusing, and communist governments are not really good at negotiating an agreement; they are more like “I gave you an order, so do it”. Google used to be banned there in the past, and Russia is capable of the same thing as well. What would Apple do in such a case…

  10. viciosodiego - 8 years ago

    Lol.
    They are assuming that Apple will do this…
    Bunch of idiots.

  11. Doug Aalseth - 8 years ago

    Nobody is asking the big question: WHY does the FBI want to get into the phone?
    To find out who they called? They already have all the phone records.
    To see their e-mail? Those are on servers they already have access to.
    To see texts and IMs? There are other ways.
    This is just a power play by the FBI and the Justice Department to establish a precedent. They want it on record that Apple will unlock devices on request. Then they have the right to demand any access to anyone’s data.

    • Bardi Jonssen - 8 years ago

      Doug Aalseth : Exactly. This has little to do with the “terrorist” case and everything to do with hacking into any phone, at will.

    • George Pollen - 8 years ago

      Agree with everything… except the ability to access iMessage texts through any other means.

      • kyle3lias - 8 years ago

        Apple can and does extract them from iCloud backups… And the FBI can extract them from iTunes backups. Of course only if those exist.

      • Scott (@ScooterComputer) - 8 years ago

        This was my thought too…and as @kyle3lias also mentions “only if those exist”. I’d imagine that even IF iCloud Backup was turned on, it would have fired that morning and the FBI is interested in knowing who the San Bernardino shooter talked to immediately before and after the shooting. Also, possibly getting any geo-location data FROM the phone, since there is a missing chunk of location/time.

      • George Pollen - 8 years ago

        Anyone savvy about privacy won’t use iCloud for much of anything, if anything at all.

    • srgmac - 8 years ago

      Best comment :)

    • alanaudio - 8 years ago

      Why are the FBI so keen to hack this particular iPhone? On the face of it, it was used by the terrorists and the FBI would have us believe that it contains all the details of known sympathisers, but this particular iPhone was actually provided by their employers. Their personal cellphones were destroyed prior to the attack and are unreadable. The hard drive on their computer was also removed and has never been found.

      These people were sufficiently tech savvy to understand that forensic techniques can recover deleted data on phones and hard drives, so they destroyed or discarded them. The fact that they didn’t do the same with the company iPhone very strongly suggests that they knew there was no sensitive information on it. If I were planning a terrorist outrage, I certainly would not use a company phone where I could have no idea of what records and backups the company might keep.

      That then puts a big question mark over why the FBI is using its biggest guns in this case. There is very little likelihood of finding any useful information on this iPhone relating to terrorism, therefore it’s clear that the true motivation is to create a precedent which can then be exploited in future with regard to less emotive cases.

  12. airmanchairman - 8 years ago

    A fair number of extremely powerful forensic tools already exist to do the work required to unlock the phone in question.

    The authorities are merely seeking the power to transfer the onus, responsibility and cost of doing so on to the device manufacturers, so that all that is required to unlock a device is a warrant, or if challenged a court order using the San Bernardino precedent, to coerce compliance any time it is required.

  13. Paul Van Obberghen - 8 years ago

    All this has potentially gigantic financial implications for Apple. Because, if they make it possible to grant access to the FBI only (?), Apple would become legally responsible for making sure that no one else could access it. But that is impossible: one day someone else will find a way to do it too, and it will be on the Internet right away. If so, the original owners of the iPhones could sue Apple for not having protected access to personal data effectively enough. In the wake of which could ensue legal actions that could cost Apple billions, if not kill the company completely.
    Also, it’s not only the US. Imagine that US-unfriendly governments start asking Apple to unlock iPhones for legal actions in their respective countries (for reasons that may be very distant from democracy) and threaten to bar Apple from selling its devices in those lands if it doesn’t comply. I’m speaking of China in particular, but also Russia.
    If Apple complies and provides the FBI with a backdoor, it will demonstrate that it is possible, and these governments could use this argument to demand that Apple provide the backdoor to them as well – and if not, have a good reason to stop Apple from selling its iPhones in the country, with the “valid” argument that by not doing so Apple is acting against said country’s security interests.

    • airmanchairman - 8 years ago

      “But that is impossible: one day someone will find the way to do it also, and it will be on the Internet right away. ”

      Well, at least (definitely) on the Dark Web…

  14. jelockwood - 8 years ago

    If your iPhone has a recent iOS version and you have turned on ‘Find my iPhone’ then you cannot wipe the iPhone by putting it in DFU mode unless you first unlock the iPhone and turn off Find my iPhone. As per this article “Any tech-savvy terrorist or criminal is going to have this turned on.”

    • kyle3lias - 8 years ago

      Nope, you can wipe a phone by putting it in recovery mode. But then the iPhone cannot be activated unless you provide the correct Apple ID creds.

  15. George Pollen - 8 years ago

    In a bionic future, will the government be allowed to interrogate our thoughts?

  16. John Smith - 8 years ago

    Very important.

    I’ve now seen opinions from multiple credible people saying it should be possible.

    I don’t see Cook claiming this CANNOT be done – he’s apparently just refusing to do it.

    FBI picked this case well. It’s not something Apple cannot do, it’s something they are refusing to do.

    Apple has the right to seek a stay on this and to contest it in court. But if they continue to refuse, I say treat them like anyone else who deliberately defies a court order: jail them.

    Greedy corporations with incomes greater than those of most countries on earth can’t be allowed to be above the law – even if they think they are.

  17. Jirka Stejskal - 8 years ago

    And the picture shows iPhone 3G – maybe 3GS. Interesting

    • Ben Lovejoy - 8 years ago

      That’s just to show what the kit looks like.

    • alanaudio - 8 years ago

      The picture I’m seeing doesn’t have the rounded edges of a 3 or 3GS. It has the straight edges as introduced on the 4 series. You could easily stand a 4 series iPhone upright on a table, but couldn’t do that with a 3 series iPhone.

  18. Scott (@ScooterComputer) - 8 years ago

    I still haven’t seen any OFFICIAL report on the version of iOS the iPhone is running. I find it odd that the court’s order goes into such specific detail like serial number and IMEI/ESN/MEID, but didn’t mention the iOS version. And I’m pretty sure that’s discernible from DFU mode. Knowing the version of iOS currently ON the device would offer a lot of clues as to WHAT can be done to it, even by Apple. It is a 5c, which is good for the Feds. (If a 5s…oooh, that would hurt.)

  19. I think the FBI could:
    a. disassemble the iPhone and take the NAND flash storage off it
    b. clone this disk, untouched
    c. create an image of the cloned disk and then replicate it to several virtual appliances
    d. run a brute-force attack on each appliance

    • Swordmaker - 8 years ago

      Won’t work. If it’s encrypted, it would take several quadrillion years to go through even a part of the potential UID-entangled passcode keys.

  20. “it would be possible to put the iPhone into DFU mode and then overwrite the firmware with a version that has neither the auto-erase mode nor delays between passcode attempts.”

    And if that firmware ever got out (which it will eventually) then EVERYONE’S personal information is at risk of being exposed to anyone… Duh!

    Apple is correct. If you create a backdoor, it will be discovered and used by others.

    • focher - 8 years ago

      Apple could sign the firmware and then revoke the certificate after its use on that particular phone.

      Don’t be surprised if iOS security is ultimately adapted to close that option. As it stands, there’s a high degree of likelihood that Apple has the technical means to bypass the passcode on this particular iPhone (5c running iOS 7). Policy merits aside, I don’t foresee Apple winning unless they can demonstrate this is significantly burdensome to them.

      • Swordmaker - 8 years ago

        Latest reports are the iPhone 5C is running iOS 9.

  21. Kris Hall - 8 years ago

    This is also assuming that the owner of the device isn’t using an alphanumeric password which you can do on the iPhone.

    • Ben Lovejoy - 8 years ago

      “All this supposes that the iPhone is protected only by a 4-digit passcode, however. If a complex password was used, no-one in the FBI would live long enough to gain access.”

  22. Ah thanks for the technical explanation – I must have missed this earlier. Apple could also presumably open a hacker API to allow the FBI to try millions of iCloud passwords against its servers.

  23. Thomas Farral - 8 years ago

    Just to weigh in here – being from the computer security world – this issue has never been if Apple can or cannot. We already know they can. Here are the issues:

    1. The precedent of the FBI or any federal agency coercing a private company to do its bidding.
    2. Even if they made the new software, used it, and removed it, the software has been created. In this day and age, I know as well as many that once it’s created, it will always be out there. So effectively, we’re talking about a universal key to break into iPhones, not just this one iPhone. Also, if you think the government is good at keeping important information and software away from attackers, please read this article:
    http://www.bloomberg.com/news/articles/2015-07-09/hackers-stole-government-data-on-25-7-million-people-u-s-says

    Being from the security field, I know first hand how many attacks happen daily to various companies (they hire my company to watch their traffic) and I know how much our federal government lacks security. Apple has helped the FBI on numerous occasions, and they have assisted plenty in this case. However, is this phone – of which the FBI already has a backup from 44 days prior to the shooting – worth risking the security of anybody who uses an iPhone? Apple wants to keep its integrity, and I’m sure that is very money-driven; however, the effects are still very real. Once a tool is created, anybody can get their hands on it, especially if our federal government ever gets a copy of it. But even then, Apple has the possibility of losing this software as well.

    Data exfiltration happens often, and if the attacker is smart enough, it can be undetected, or hard to detect. The FBI could bypass the TPM on the iPhone and create copies to brute-force; however, they apparently would not like to go that route. The FBI has chosen to fight Apple, just as much as Apple has chosen to fight the FBI. We’re in the middle of a pissing contest between privacy and “security”.

    I am on the side of Apple in this case, because, being from the security industry, I know how important this actually is. Please take a moment to stop and think about the impact if Apple does decide to help.

    • AbsarokaSheriff - 8 years ago

      I love Ben Franklin’s quote “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”

      To coerce any company to doctor its products in the interest of security is a tool that knows no bounds.

      A person in Oklahoma plowed into a parade of people, killing four. Why couldn’t car manufacturers be compelled to put in a remote-controlled kill switch? We have had incidents in Colorado with people using cars as weapons against police to escape. Even municipalities directing police not to shoot at such cars. A kill switch would be great.

      Many guns have fingerprint guards that could be remotely triggered to allow no fingerprints. It’s just a firmware modification. Then all guns could be mandated to have these guards.

      With a cell phone, we pretty much have a geographic tracker, a payment tracker all built into one device. Even if I’m not doing anything wrong I don’t want that information exposed.

      Yes, law enforcement wants all of its capabilities. That is its job. But as a society that doesn’t want the end product, a police state, there has to be a balance struck between privacy and law enforcement. We have to be able to say no.

Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!

