One of the main new features announced with iOS 18 and macOS 15 at WWDC 2024 is Apple Intelligence – a set of AI-based tools. While these features won’t be available to users until later this year, Apple’s head of software engineering Craig Federighi discussed the future of Apple Intelligence in an interview with Fast Company.
Craig Federighi on Apple’s partnership with OpenAI
With Apple Intelligence, users can now ask the system to rephrase text, summarize messages or emails, and generate images and even emoji. Siri has also been updated with AI, so it now understands the context of what’s on the screen and lets users control more aspects of the device.
Apple is using both on-device and cloud processing for Apple Intelligence features. The company has also partnered with OpenAI to integrate ChatGPT into Siri. Although Apple has put a lot of effort into creating its own language models, Federighi acknowledges that there are other good LLMs available out there.
“These very large frontier models have interesting capabilities that some users appreciate, and we saw that integration into our experiences could make [those capabilities] much more accessible than [they are] today,” the executive said. According to Federighi, GPT-4o is one of the best LLMs currently available, which is why Apple wanted to integrate it into Siri.
To make things transparent to users, Siri will ask for permission to send requests to ChatGPT every time Apple Intelligence is unable to answer a question. Federighi also says that Apple plans to add support for more third-party language models in the future, so that users can choose what works best for them.
Apple Intelligence in China
China is one of Apple’s most important markets, and also one of the most heavily regulated. These regulations make it much more difficult for non-Chinese companies to offer their AI models there – but how does this affect Apple? Without providing too many details, Federighi confirmed that the company is trying to “find a way” to bring Apple Intelligence to China.
“We don’t have timing to announce right now, but it’s certainly something we want to do,” Federighi said. Apple Intelligence will first be available in US English, and it’s unclear when users in other regions will be able to experience it.
More from Craig Federighi’s interview
Some of the requests sent to Apple Intelligence are processed on Apple’s AI servers, called Private Cloud Compute, or PCC. The company has found a way to handle request data and cryptographically destroy it once processing is complete, so that none of it is ever seen by Apple.
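Apple hasn’t publicly detailed the exact mechanism, but the general idea, often called cryptographic erasure, can be sketched: encrypt request data with a key that only ever lives in memory, and discard that key as soon as processing finishes, leaving any stored copy of the data unreadable. The following is a minimal Swift sketch of that concept using CryptoKit; processRequest and handleRequestEphemerally are hypothetical names for illustration, not Apple APIs.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for whatever work a server performs on a request.
func processRequest(_ plaintext: Data) -> Data {
    // ... model inference or other processing would happen here ...
    return plaintext
}

// Handle a request with an ephemeral key that exists only for this call.
// Once the key goes out of scope, any retained ciphertext is unrecoverable,
// which is the basic idea behind "cryptographically destroying" the data.
func handleRequestEphemerally(_ request: Data) throws -> Data {
    let ephemeralKey = SymmetricKey(size: .bits256) // never persisted

    // Encrypt the request before it touches any storage.
    let sealed = try AES.GCM.seal(request, using: ephemeralKey)

    // Decrypt only in memory, run the actual work, and return the result.
    let plaintext = try AES.GCM.open(sealed, using: ephemeralKey)
    let response = processRequest(plaintext)

    // The key is discarded when this function returns; the sealed copy of
    // the request can no longer be decrypted by anyone.
    return response
}
```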
Although Federighi hopes that future chips will be able to run larger language models, he notes that having an online model is still important for providing updated information.
“I couldn’t rule it out,” he says, “…but even in that world, I think that you would expect that at times your device is going to, in servicing your request, reach out at least to knowledge stores that are outside the device. So even in that future, I think there’s going to be a role for contacting external services.”
Be sure to check out the full interview on the Fast Company website.