
Apple Intelligence privacy can be independently verified thanks to an ‘extraordinary step’

Apple Intelligence privacy is a key differentiator for the company’s AI initiative, which takes a three-step approach to safeguarding personal data.

But Apple says we won’t have to take the company’s word for it: It is taking an “extraordinary step” to enable third-party security researchers to fully and independently verify the privacy protections in place …

Apple Intelligence privacy starts here

Apple applies a three-stage hierarchy when running AI features, as sketched in code below:

  1. As much processing as possible is done on-device, with no data sent to servers
  2. If external processing power is needed, Apple’s own servers are the next resort
  3. If they can’t help, users are asked for permission to use ChatGPT
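Roughly speaking, that hierarchy amounts to a simple fallback chain. The sketch below is purely illustrative – the type and property names are invented for this article, not Apple API – but it captures the ordering:

```swift
// Hypothetical sketch of the three-stage fallback described above.
// AIRequest, fitsOnDevice, and supportedByPCC are illustrative names, not Apple API.
struct AIRequest {
    let fitsOnDevice: Bool      // small enough for the local model
    let supportedByPCC: Bool    // something Apple's own servers can handle
}

enum AIRoute {
    case onDevice              // stage 1: nothing leaves the device
    case privateCloudCompute   // stage 2: Apple's PCC servers
    case chatGPT               // stage 3: only with explicit user permission
    case declined              // user said no to ChatGPT; the request stops here
}

func route(_ request: AIRequest, userApprovesChatGPT: () -> Bool) -> AIRoute {
    if request.fitsOnDevice { return .onDevice }
    if request.supportedByPCC { return .privateCloudCompute }
    return userApprovesChatGPT() ? .chatGPT : .declined
}
```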

Apple’s own AI servers have five protections

When Apple’s own servers are used, this is done using an approach Apple has dubbed Private Cloud Compute (PCC), a cloud-based AI system the company says is built around five safeguards.

Personal data has the strongest possible protections

Any personal data sent to PCC uses end-to-end encryption, so that not even Apple has access to it – but the company goes further than this. It uses an approach known as ‘stateless computation,’ meaning the personal data is completely wiped from the system the moment processing is complete – as if it never existed in the first place.
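Conceptually, ‘stateless computation’ means a node decrypts a request in memory, produces a response, and retains nothing afterwards. The sketch below only illustrates that idea – the CryptoKit-based scheme and the function shown are assumptions for illustration, not PCC’s actual implementation:

```swift
import CryptoKit
import Foundation

// Illustrative sketch of "stateless" handling — not PCC's actual implementation.
// The request is decrypted only for the duration of processing; nothing is
// written to disk, logs, or any other persistent store.
func handle(sealedRequest: AES.GCM.SealedBox,
            requestKey: SymmetricKey,
            process: (Data) -> Data) throws -> AES.GCM.SealedBox {
    // Decrypt in memory only.
    let plaintext = try AES.GCM.open(sealedRequest, using: requestKey)

    // Run the model over the request; the encrypted result is all that survives.
    let result = process(plaintext)

    // Encrypt the response back to the requester; no copy of the input
    // or output is retained once this function returns.
    return try AES.GCM.seal(result, using: requestKey)
}
```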

Enforceable guarantees

Apple doesn’t just rely on privacy policies; it ensures that the technology it uses is not technically capable of leaking personal data. For example, certain load-balancing and troubleshooting tools can capture user data in some circumstances, so PCC doesn’t use them at all. Independent security researchers can verify this.

No privileged runtime access

Another potential security hole in cloud servers is the ability of on-site engineers to escalate their privileges or bypass protections in order to resolve a problem. PCC doesn’t include capabilities which could be used in this way.

No targetability

Even if an attacker gained physical access to an Apple PCC facility, there is no technical means by which they could target an individual user’s data.

But it’s the fifth step where Apple goes way beyond anything anyone has ever done before …

The ‘extraordinary step’ of verifiable transparency

Apple says that, in principle, the “enforceable guarantees” step already allows independent security researchers to verify the company’s claims. They can see for themselves what capabilities PCC does and doesn’t have, and therefore determine what an attacker would or wouldn’t be able to achieve.

But it wants to do more, so it is making its software totally transparent.

When we launch Private Cloud Compute, we’ll take the extraordinary step of making software images of every production build of PCC publicly available for security research. This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software […]

Every production Private Cloud Compute software image will be published for independent binary inspection — including the OS, applications, and all relevant executables, which researchers can verify against the measurements in the transparency log […]

In a first for any Apple platform, PCC images will include the sepOS firmware and the iBoot bootloader in plaintext, making it easier than ever for researchers to study these critical components.
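In practical terms, the client-side check amounts to: refuse to send anything to a node unless it can prove it is running a build whose measurement appears in the public transparency log. The sketch below is heavily simplified and every name in it is hypothetical – Apple’s real attestation protocol is far more involved:

```swift
import Foundation

// Simplified illustration of "send only to nodes running publicly listed software".
// All names here are hypothetical; Apple's real attestation flow is more involved.
struct NodeAttestation {
    let softwareMeasurement: Data   // hash of the software image the node claims to run
    let signatureIsValid: Bool      // stand-in for verifying the attestation signature chain
}

/// Returns true only if the node proves it runs a build published in the transparency log.
func maySendData(to node: NodeAttestation,
                 publishedMeasurements: Set<Data>) -> Bool {
    guard node.signatureIsValid else { return false }
    return publishedMeasurements.contains(node.softwareMeasurement)
}
```

The important property is that this check happens on the user’s device before any personal data leaves it, so a node running unpublished software never receives the request.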

Apple’s security blog post goes into a lot more detail, and security researchers will no doubt welcome the opportunity to put all the company’s claims to the test.

Photo by Matthew Henry on Unsplash


