The iPhone 16 line-up will be unveiled on Monday, and Apple’s Glowtime event name and graphics seem to be pretty clear pointers to a focus on Apple Intelligence. Anyone who is running the current iOS 18.1 beta will have instantly recognised the event graphics as replicating the colorful glow seen when using Siri.
The problem for Apple, however, is that the cool animation is almost the only thing that’s new about Siri, and that isn’t going to change for quite some time …
The beta Apple Intelligence features are limited
If you’re not currently using the beta, you’ve probably still seen video footage of the new Siri animation.
But as both Fernando and I noted, while invoking Siri looks completely different, the actual performance isn't very different at all.
I suggested at the time that this could create unrealistic expectations.
First, I absolutely love the new UI. It’s honestly one of the most beautiful pieces of software design I’ve seen in a long time.
However … it’s also confusing. We’re seeing a brand-new animation for what is, currently, mostly the old Siri. Effectively, the UI change signals a big change but delivers only a small one. It would be far better, in my view, for the new UI to wait for the full capabilities of the new Siri.
To be fair to Apple, the Apple Intelligence features we’ve been able to test so far do represent a solid start. Siri doesn’t yet have many new capabilities, but it is way better at handling verbal stumbles, and it’s now great at answering support questions about Apple products.
The new writing tools seriously impressed me; the call transcription tools far less so, but I’m sure these will improve rapidly.
How will Apple handle this?
If Apple weren’t going all-in on Apple Intelligence in the keynote, the limited features iPhone owners will get at the launch of the iPhone 16 line-up wouldn’t be such an issue. It would be fine to point to cool new hardware features, and note that the phones will get smarter over time.
But there don’t seem to be many other ways to interpret the event name and graphics. If Apple were going to do something wild, such as putting a glowing Apple logo on the back of the iPhone as on MacBooks of old, it’s hard to imagine that wouldn’t have leaked by now. (Though full marks to Apple if it has managed to keep that secret!)
So assuming we’re not all missing something, and the event does rely heavily on previewing Apple Intelligence, the company is going to have to manage expectations very carefully. Otherwise, a lot of people are going to unbox their shiny new iPhone 16, find that it doesn’t look notably different to the iPhone 15 and doesn’t behave significantly differently, and wonder what it is they’re getting for their money.
AI camera features could be the answer
We haven’t yet seen any clues to significant new camera functionality in the iPhone 16 line-up, nor anything dramatic in the way of AI features.
While the new Photos app claims to be smarter, in my experience the differences aren’t huge, and the new AI memories feature isn’t yet generating worthwhile results.
But computational photography is one area where Apple has been using machine learning for literally a decade or more, and has consistently delivered really impressive results from near-invisible technology.
So perhaps this is one surprise Apple has managed to keep: significant new camera capabilities based on Apple Intelligence, good enough to wow users at launch and leave them happy to wait for the new Siri.
What are your thoughts and expectations for Monday’s keynote? Please share them in the comments.