Earlier this week, Apple denied claims that it had hidden secret backdoors in its iOS platform that could allow the government or malicious users to extract a variety of critical and personal details about a device’s user from an iPhone or iPad, some of it stored in unencrypted form.
Today, the company published a new document on its support website explaining the diagnostic tools that iOS uses to collect data for troubleshooting and other purposes. According to the document:
1. com.apple.mobile.pcapd
pcapd supports diagnostic packet capture from an iOS device to a trusted computer. This is useful for troubleshooting and diagnosing issues with apps on the device as well as enterprise VPN connections. You can find more information at developer.apple.com/library/ios/qa/qa1176.
2. com.apple.mobile.file_relay
file_relay supports limited copying of diagnostic data from a device. This service is separate from user-generated backups, does not have access to all data on the device, and respects iOS Data Protection. Apple engineering uses file_relay on internal devices to qualify customer configurations. AppleCare, with user consent, can also use this tool to gather relevant diagnostic data from users’ devices.
3. com.apple.mobile.house_arrest
house_arrest is used by iTunes to transfer documents to and from an iOS device for apps that support this functionality. This is also used by Xcode to assist in the transfer of test data to a device while an app is in development.
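One detail worth adding on the house_arrest entry: the document transfer it describes is opt-in. An app only exposes its Documents folder to iTunes File Sharing if its Info.plist declares the UIFileSharingEnabled key. The short sketch below (ours, not Apple’s tooling) checks a built app bundle for that key; the bundle path you pass in is an illustrative assumption.

    import plistlib
    import sys
    from pathlib import Path

    def supports_file_sharing(app_bundle):
        # An app opts into iTunes document transfer by setting
        # UIFileSharingEnabled to true in its Info.plist.
        info = Path(app_bundle) / "Info.plist"
        with open(info, "rb") as f:
            plist = plistlib.load(f)  # handles both XML and binary plists
        return bool(plist.get("UIFileSharingEnabled", False))

    if __name__ == "__main__":
        # Pass the path to a built .app bundle, e.g. one produced by Xcode.
        print(supports_file_sharing(sys.argv[1]))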
As can be gathered from the text of the note, the services in question are used for troubleshooting issues with iOS devices, interoperating with iTunes for Mac and PC, app development through Xcode, AppleCare support calls, and internal testing of unreleased software.
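On the packet-capture point specifically, the QA1176 note Apple links to describes creating a remote virtual interface for a connected device with rvictl and then capturing on it with an ordinary tool such as tcpdump. Here is a rough sketch of that workflow as a script; it assumes a Mac with Xcode’s command-line tools installed and takes the device UDID as an argument, and the rvi0 interface name, output file, and capture duration are illustrative assumptions rather than anything Apple prescribes.

    import subprocess
    import sys
    import time

    def capture(udid, out_path="trace.pcap", seconds=30):
        # Create a remote virtual interface for the attached device (per QA1176).
        subprocess.run(["rvictl", "-s", udid], check=True)
        try:
            # Capture on the new rvi0 interface; tcpdump needs root privileges.
            dump = subprocess.Popen(["tcpdump", "-i", "rvi0", "-w", out_path])
            time.sleep(seconds)
            dump.terminate()
            dump.wait()
        finally:
            # Tear the virtual interface back down when the capture is done.
            subprocess.run(["rvictl", "-x", udid], check=True)

    if __name__ == "__main__":
        capture(sys.argv[1])

Note that this only works from a computer the device has already been paired with, which lines up with Apple’s point that these services are reachable only from a trusted computer.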
The document notes that if a device has been enabled for wireless syncing with an iTunes library, these services can be activated over a Wi-Fi network, but only by the trusted computer that the device is set to sync with. The data is encrypted during the transfer between the device and the computer, and not even Apple is able to access the data in transit.
No one sane is questioning the legitimate purpose of these services, but that is not the issue here. The issue is that these services are poorly designed and provide a backdoor for hackers and government agencies to access personal data.
Trust pairing offers very little security at all. It is simply something on your Mac that gives unrestricted access to your iOS device. It can be had as easily as copying one file from your Mac.
Don’t get me wrong. I don’t think there is some great conspiracy here. Apple isn’t in bed with the NSA or the Illuminati or whatever nonsense crackpots want to believe. But there is a design flaw that allows for unintended access to private data and that needs to be addressed. I’m sure it is.
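To be concrete about the “one file” point: on a Mac, the pairing record for a device is just a property list, commonly reported in public research (not Apple documentation, so treat the path and key names as assumptions) to live under /var/db/lockdown. Something as simple as the sketch below will list what each record carries, given root access:

    import plistlib
    from pathlib import Path

    # Reported location of host-side pairing records (assumption based on
    # public research); reading it requires root.
    LOCKDOWN_DIR = Path("/var/db/lockdown")

    for record in LOCKDOWN_DIR.glob("*.plist"):
        with open(record, "rb") as f:
            data = plistlib.load(f)
        # Show which certificate/key material each record contains.
        interesting = sorted(k for k in data if "Certificate" in k or "Key" in k)
        print(record.name, interesting)

If copying one of those files really is enough to let another machine act as a trusted computer, as that research claims, that is exactly the design point I’m objecting to.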
Can you back up ‘poorly designed’ with technical proof? And, you mention ‘trust pairing’ is unsafe. In what ways?
“Hacking” a device in this way (even if I believe your unsupported statements about how easy it would be to put a computer in the “trusted” list) would still require physical access to the computer. That seems like a valid line in the sand to me.
All this means (IMO, of course, and based on what has been said) is that if you have a “work iPhone” that syncs with the work network or has been accessed by IT, then you should assume that IT has access to what you are doing on it. In other words, normal behaviour.
It also means that if someone comes into your house and has unfettered access to your home PC, this person *could* nefariously hack your other devices. Again, normal behaviour.
OK, I totally understand the fuss behind all stories like this one, but I have to ask this: does anyone really believe that a networked device (from a pager to a laptop to a phone to a whatever) is ever going to be “safe”? No. For me, once a device gets an IP, it’s game over for privacy.
The only thing companies can do is make it harder for outsiders to get my data. And it’s a cat-and-mouse game, too. Remember how Apple would patch the iPhone and the jailbreak community would find another way? Well, no matter what companies do, the NSA or other organizations or mere hackers WILL always find ANOTHER way to get my data if they want it badly enough.
Some might say it’s just a problem of modern times: we carry around “beacons” that are capable of recording too much of our lives, and that info is easily grabbed or intercepted. Well, even a damned Nokia 3210 in 1999 could be location-monitored (cell tower triangulation, anyone?) or eavesdropped on. And don’t get me started on the SMS service.
I don’t like it, you don’t, and I’m willing to believe Tim Cook doesn’t like it either. Though what I find MORE alarming is that our KIDS will grow up in a world where being monitored is the norm. And they won’t even care. If you are the parent of a teenager, just look at what YOU do online and how much info you give away, then look at your kids and what they give away. Whole different picture.
I don’t care.
Until it can be shown that pairing is easily spoofed without prior knowledge of the pair key or physical access, this is not a concern to me. Even if that is possible, what info would a hacker have on me? If it’s only non-personally-identifying info, I probably don’t care. Without knowing that, it’s hard to put this in perspective. Could be no big deal, could be a real concern, but in any case it’s not deliberate, which is what some people are implying.
Sometimes when I open my mouth... all my privacy goes out the backdoor. I trust that Apple will provide me with more privacy than I provide to myself.
If you have concerns about data such as your address book, try an app called ContactShield, which encrypts your address book in such a way that even Apple cannot read it. http://www.infoshields.com/contactshield.html