One of the changes Apple has made in iOS 12 is much tighter protection against devices designed to brute-force iPhone passcodes. Unless the device has been unlocked within the past hour, the USB port will be restricted to charging, requiring the phone to be unlocked before it will permit data access.
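The rule described above amounts to a simple timer check. Here is a toy sketch in Python of that one-hour logic; it is purely illustrative, and all of the names (`Device`, `usb_data_allowed`, and so on) are invented for this example, not Apple's actual implementation.

```python
from datetime import datetime, timedelta

# Toy model of the iOS 12 USB restriction described above: if the device
# hasn't been unlocked within the past hour, the USB port only charges,
# and data access requires an unlock first.
# (Illustrative only - names and structure are invented, not Apple's code.)

RESTRICTION_WINDOW = timedelta(hours=1)

class Device:
    def __init__(self):
        self.last_unlock = None  # never unlocked

    def unlock(self, now):
        self.last_unlock = now

    def usb_data_allowed(self, now):
        """USB data access is permitted only within an hour of the last unlock."""
        if self.last_unlock is None:
            return False
        return now - self.last_unlock <= RESTRICTION_WINDOW

phone = Device()
t0 = datetime(2018, 6, 13, 12, 0)
phone.unlock(t0)
print(phone.usb_data_allowed(t0 + timedelta(minutes=30)))  # True: within an hour
print(phone.usb_data_allowed(t0 + timedelta(hours=2)))     # False: charging only
```

The point of the design is that a brute-forcing box plugged into a locked phone never satisfies the check, so it gets power but no data connection.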
From much of the reporting on this, you could easily get the impression that Apple’s aim here is to thwart law enforcement investigations – and that simply isn’t the case …
Let’s look at a few examples of the coverage this is getting …
The New York Times: Apple to Close iPhone Security Hole That Police Use to Crack Devices
Apple is closing the technological loophole that let authorities hack into iPhones, angering police and other officials and reigniting a debate over whether the government has a right to get into the personal devices that are at the center of modern life.
Reuters: Apple to undercut popular law-enforcement tool for cracking iPhones
Apple said on Wednesday it will change its iPhone settings to undercut the most popular means for law enforcement to break into the devices.
The Verge: Apple will update iOS to block police hacking tool
For months, police across the country have been using a device called a GrayKey to unlock dormant iPhones, using an undisclosed technique to sidestep Apple’s default disk encryption. The devices are currently in use in at least five states and five federal agencies, seen as a breakthrough in collecting evidence from encrypted devices.
But according to a new Reuters report, Apple is planning to release a new feature to iOS that would make those devices useless in the majority of cases, potentially sparking a return to the encryption standoff between law enforcement and device manufacturers.
Mashable: Apple’s officially making it harder for cops to bust into your iPhone
Apple intends to update its iOS with a new feature that will make it significantly more difficult for law enforcement agencies to access data on locked iPhones.
I could go on (and on), but you’ve probably seen other headlines yourself.
To be fair, many pieces that start in this vein do go on to point out that tools like GrayKey are used by criminals as well as law enforcement. But the overwhelming impression given is that Apple is out to make life hard for law enforcement.
The reality is that Apple is dealing with one simple fact known to every security professional but seemingly not to the law enforcement agencies that are complaining about the move: you cannot have a security hole that is used only by the good guys. Anything law enforcement can use with good intentions, criminals can use with bad intentions.
You could argue that Apple could have a special law enforcement mode, but again: any backdoor into iOS intended for use by the good guys will inevitably fall into the wrong hands.
Some persist, suggesting Apple could have this mode require a special device available only in a locked strongroom at Apple Park, with law enforcement agencies having to go there (with a court order) to access it. But, as I’ve said before, this simply isn’t a realistic scenario.
It is an extremely short distance from there to arguing that there will be some very time-critical cases where the delay involved in knocking on Apple’s door is too damaging: the classic ‘time-bomber in custody’ scenario. The argument goes that the FBI therefore needs to hold the key itself to avoid that delay. It still wouldn’t use it without a court order, so where’s the harm? It would simply be cutting out the middleman.
So soon, the FBI would hold the key. Then other law enforcement agencies. In time, that key would be held in every police precinct house. We would then be trusting more than a million people with access to that key to abide by the rules. Government agencies don’t always have the best of track records in doing that.
And even if we assumed not one single bad apple among those million people, you’d also be trusting every courier not to lose a key in transit – and if you trust that, I want access to your courier companies!
But it’s worse than this. The very fact that a backdoor exists means that hackers know it can be done. Sooner or later, they are going to figure out how, and then they can create their own devices.
So no, this cannot be safely done. Apple has no desire to hinder the work of law enforcement agencies, but it’s not them the company is trying to thwart: it’s the bad actors who would use the same vulnerability for nefarious ends. That’s why Apple is doing this – not to make life harder for cops.