Apple software SVP Craig Federighi says that the Private Cloud Compute servers used for Apple Intelligence features are really basic – and with good reason.
The exec says it’s one of a number of decisions the company made to ensure that its AI cloud servers form a “hermetically sealed privacy bubble” with your iPhone …
Apple’s Private Cloud Compute servers
We’ve talked before about Apple’s three-stage hierarchy when it comes to Apple Intelligence features:
- As much processing as possible is done on-device, with no data sent to servers
- If external processing power is needed, Apple’s own servers are the next resort
- If they can’t help, users are asked for permission to use ChatGPT
We’ve also discussed the five safeguards Apple applies to its own servers, which include the “extraordinary step” of verifiable transparency.
Deliberately bare-bones
In an interview with Wired, Federighi says that part of the privacy protection is achieved by making the PCC servers really basic (even if the chips aren’t). It’s hard to imagine a data center server without hard drives or SSDs for storing user data, but that’s exactly what Apple has created.
PCC servers are as bare-bones as possible. For example, they don’t include “persistent storage,” meaning that they don’t have a hard drive that can keep processed data long-term.
Additional features further ensure there is no way for data to survive a reboot.
They do incorporate Apple’s dedicated hardware encryption key manager, known as the Secure Enclave, and they randomize each file system’s encryption key at every boot. This means that once a PCC server is rebooted, no data is retained and, as an additional precaution, the entire system volume is cryptographically unrecoverable. At that point, all the server can do is start fresh with a new encryption key.
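The per-boot key rotation can be illustrated with a toy model. The class below (a sketch with invented names, using a deliberately simplified XOR keystream in place of real cryptography, since Apple's actual implementation is not public) holds its volume key only in memory: after a reboot the key is regenerated, so any ciphertext that somehow survived is permanently unreadable.

```python
import hashlib
import os


class TinyPCCNode:
    """Toy model (NOT real cryptography) of a server whose volume
    encryption key lives only in memory and is re-randomized at boot."""

    def __init__(self):
        self.reboot()

    def reboot(self):
        # A fresh random key every boot; the old key is gone forever.
        self._key = os.urandom(32)
        self._volume = {}  # "system volume": name -> (nonce, ciphertext)

    def _keystream(self, length, nonce):
        # Simplified counter-mode keystream derived from the in-memory key.
        out = b""
        counter = 0
        while len(out) < length:
            out += hashlib.sha256(
                self._key + nonce + counter.to_bytes(8, "big")
            ).digest()
            counter += 1
        return out[:length]

    def write(self, name, plaintext):
        nonce = os.urandom(16)
        stream = self._keystream(len(plaintext), nonce)
        self._volume[name] = (nonce, bytes(a ^ b for a, b in zip(plaintext, stream)))

    def read(self, name):
        nonce, ct = self._volume[name]
        stream = self._keystream(len(ct), nonce)
        return bytes(a ^ b for a, b in zip(ct, stream))


node = TinyPCCNode()
node.write("request", b"user prompt")
assert node.read("request") == b"user prompt"

# Even if an attacker copied the raw encrypted volume before a reboot...
stolen = dict(node._volume)
node.reboot()
node._volume = stolen
# ...the key it was encrypted under no longer exists, so nothing decrypts.
assert node.read("request") != b"user prompt"
```

The point of the sketch: no key recovery step exists at all, so "delete the key" and "destroy the data" are the same operation.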
How Apple solved the end-to-end encryption problem
One weakness that used to exist with iCloud is that data was encrypted, but did not use end-to-end encryption – meaning that Apple, or a hacker who gained access to Apple servers, could read the data.
Apple has been gradually rolling out E2E encryption for more and more iCloud data (though you do need to enable it), but that posed a problem for PCC servers.
“What was really unique about the problem of doing large language model inference in the cloud was that the data had to at some level be readable by the server so it could perform the inference. And yet, we needed to make sure that that processing was hermetically sealed inside of a privacy bubble with your phone,” Federighi says. “So we had to do something new there. The technique of end-to-end encryption—where the server knows nothing—wasn’t possible here, so we had to come up with another solution to achieve a similar level of security.”
The company’s solution was two-fold. First, all the usual server tools that might allow an administrator (or hacker) access to your data, like load balancers and data loggers, sit outside of the protected area, so cannot decrypt the data. Second, that lack of persistent storage: once the response is sent back to your phone, it is deleted and can never be recovered.
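A rough sketch of that boundary, with invented names and a toy XOR cipher standing in for real encryption: the load balancer only ever handles opaque blobs it has no key for, and the node discards the plaintext as soon as it has produced a response.

```python
import os
from itertools import cycle


def xor(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "cipher" for illustration only (XOR keystream reuse
    # is insecure; a real system would use authenticated encryption).
    return bytes(a ^ b for a, b in zip(data, cycle(key)))


class Node:
    """Inside the privacy boundary: the only component holding the key."""

    def __init__(self, key: bytes):
        self._key = key

    def handle(self, blob: bytes) -> bytes:
        plaintext = xor(blob, self._key)  # decrypted only inside the node
        response = plaintext.upper()      # stand-in for model inference
        del plaintext                     # nothing retained after the reply
        return xor(response, self._key)


def load_balancer(nodes, blob):
    # Outside the boundary: routes the opaque blob untouched.
    # It has no key, so it can never see the request or the response.
    return nodes[hash(blob) % len(nodes)].handle(blob)


key = os.urandom(16)          # shared between the user's device and the node
nodes = [Node(key), Node(key)]
reply = load_balancer(nodes, xor(b"hello", key))
assert xor(reply, key) == b"HELLO"
```

In the real system the administrative tooling sits outside the boundary by construction, not by policy, which is what makes the claim auditable rather than a promise.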
Anyone can check this
The “extraordinary step” Apple referenced previously is that absolutely anyone can check that the system works the way the company says it does.
Apple is making every production PCC server build publicly available for inspection so people unaffiliated with Apple can verify that PCC is doing (and not doing) what the company claims, and that everything is implemented correctly.
All of the PCC server images are recorded in a cryptographic attestation log, essentially an indelible record of signed claims, and each entry includes a URL for where to download that individual build. PCC is designed so Apple can’t put a server into production without logging it.
And in addition to offering transparency, the system works as a crucial enforcement mechanism to prevent bad actors from setting up rogue PCC nodes and diverting traffic. If a server build hasn’t been logged, iPhones will not send Apple Intelligence queries or data to it.
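The enforcement side of that trust model can be sketched in a few lines (all names here are invented for illustration; Apple's actual attestation protocol is more involved): the client checks a server's software measurement against the published log and simply refuses to send anything to an unlogged build.

```python
import hashlib

# Toy transparency log: an append-only record of published build hashes.
transparency_log = set()


def publish_build(image: bytes) -> str:
    """Record a production build's measurement in the public log."""
    digest = hashlib.sha256(image).hexdigest()
    transparency_log.add(digest)
    return digest


def client_send(server_measurement: str, query: str) -> str:
    """The device-side check: no log entry, no query."""
    if server_measurement not in transparency_log:
        raise PermissionError("unlogged build: refusing to send query")
    return f"sent: {query}"


measurement = publish_build(b"pcc-build-image")
assert client_send(measurement, "hello") == "sent: hello"

# A rogue node that was never logged gets nothing.
rogue = hashlib.sha256(b"rogue-build-image").hexdigest()
try:
    client_send(rogue, "hello")
except PermissionError:
    pass  # query was never sent
```

Because the check happens on the device, diverting traffic to a rogue node fails even if the attacker controls the network path.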
That’s an unprecedented step for any cloud company, says Apple.
“Creating the trust model where your device will refuse to issue a request to a server unless the signature of all the software the server is running has been published to a transparency log was certainly one of the most unique elements of the solution—and totally critical to the trust model.”
While the interview mostly recapped information already known, the iPhone 16 launch naturally means a lot more people will be paying attention.