A new report today in the MIT Technology Review dives into Apple’s continued work on device and software security and its potential unintended consequences. While almost all experts agree that the walled-garden approach has solved major security issues for the iPhone, some share the concern that it also gives the world’s top hackers a better place to hide.
“It’s a double-edged sword,” says Bill Marczak, a senior researcher at the cybersecurity watchdog Citizen Lab. “You’re going to keep out a lot of the riffraff by making it harder to break iPhones. But the 1% of top hackers are going to find a way in and, once they’re inside, the impenetrable fortress of the iPhone protects them.”
His main concern with the direction of increasingly locked-down Apple devices is that it’s becoming more and more difficult for security researchers to discover malicious activity.
He argues that while the iPhone’s security is getting tighter as Apple invests millions to raise the wall, the best hackers have their own millions to buy or develop zero-click exploits that let them take over iPhones invisibly. These allow attackers to burrow into the restricted parts of the phone without ever giving the target any indication of having been compromised. And once they’re that deep inside, the security becomes a barrier that keeps investigators from spotting or understanding nefarious behavior—to the point where Marczak suspects they’re missing all but a small fraction of attacks because they cannot see behind the curtain.
While Apple’s updates fix security flaws and bugs, they can also break the tools used by researchers.
Sometimes the locked-down system can backfire even more directly. When Apple released a new version of iOS last summer in the middle of Marczak’s investigation, the phone’s new security features killed an unauthorized “jailbreak” tool Citizen Lab used to open up the iPhone. The update locked him out of the private areas of the phone, including a folder for new updates—which turned out to be exactly where hackers were hiding.
Faced with these blocks, “we just kind of threw our hands up,” says Marczak. “We can’t get anything from this—there’s just no way.”
MIT Technology Review also talked with a security researcher who has something far rarer: access through an Apple-approved research app called iVerify.
Ryan Stortz is a security engineer at the firm Trail of Bits. He leads development of iVerify, a rare Apple-approved security app that does its best to peer inside iPhones while still playing by the rules set in Cupertino. iVerify looks for security anomalies on the iPhone, such as unexplained file modifications—the sort of indirect clues that can point to a deeper problem. Installing the app is a little like setting up trip wires in the castle that is the iPhone: if something doesn’t look the way you expect it to, you know a problem exists.
But like the systems used by Marczak and others, the app can’t directly observe unknown malware that breaks the rules, and it is blocked from reading through the iPhone’s memory in the same way that security apps on other devices do. The trip wire is useful, but it isn’t the same as a guard who can walk through every room to look for invaders.
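The “trip wire” idea described above amounts to classic file-integrity monitoring: record a fingerprint of known files, then flag anything that later differs. Trail of Bits has not published iVerify’s internals, so the snippet below is only a minimal Python sketch of the general technique, with hypothetical `snapshot` and `tripped` helpers, not Apple’s or iVerify’s actual code.

```python
import hashlib
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 fingerprint for each file (set the 'trip wires')."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def tripped(baseline, paths):
    """Return files whose contents no longer match the baseline fingerprints."""
    current = snapshot(paths)
    return [p for p in paths if current.get(p) != baseline.get(p)]
```

A monitor like this never observes malware directly; it only notices that a watched file changed when nothing legitimate should have touched it, which is exactly the kind of indirect clue the article describes.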
While Stortz acknowledges the challenges of discovering vulnerabilities on Apple devices, he thinks the locked-down approach is the right one. “As we lock these things down, you reduce the damage of malware and spying,” he says.
And as we saw last fall with the arrival of the first Apple Silicon M1 Macs, the company’s notebooks and desktops have leveled up security as well.
“iOS is incredibly secure. Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction,” says security researcher Patrick Wardle.
Macs were moving in this direction for years before the new hardware, Wardle adds. For example, Apple doesn’t allow Mac security tools to analyze the memory of other processes—preventing apps from checking any room in the castle aside from their own.
“Security tools are completely blind, and adversaries know this,” Wardle adds, which means the high-stakes game of hide-and-seek between Apple and hackers continues to evolve.
Others expect Android and Windows to follow Apple’s locked-down device approach to security.
It’s not just Apple, says Aaron Cockerill, chief strategy officer at the mobile security firm Lookout: “Android is increasingly locked down. We expect both Macs and ultimately Windows will increasingly look like the opaque iPhone model.”
Finally, the report discusses an approach in which Apple could theoretically grant limited entitlements to researchers, giving them more access to discover hidden flaws or malicious exploits. But the trouble there is the same one Apple has raised since the San Bernardino case: if it creates an exception or back door for researchers, that opening will eventually be exploited by nefarious hackers.
Apple and independent security experts are in agreement here: there is no neat fix. Apple strongly believes it is making the correct trade-offs, a spokesperson said recently in a phone interview. Cupertino argues that no one has convincingly demonstrated that loosening security enforcement or making exceptions will ultimately serve the greater good.
As for the future, Ryan Stortz from Trail of Bits believes we’re shifting toward average users sticking with mobile devices:
“We are going to a place where only outliers will have computers—people who need them, like developers. The general population will have mobile devices which are already in the walled-garden paradigm. That will expand. You’ll be an outlier if you’re not in the walled garden.”