Tim Cook recently called out one specific AI feature—visual intelligence—as being among the most popular Apple Intelligence capabilities so far. And rumors indicate that later this year, it could get a lot more powerful.
Visual intelligence is an early hit among Apple users, says Tim Cook

Visual intelligence debuted on the iPhone 16, accessed through the Camera Control button.
On any AI-enabled iPhone, you can long-press Camera Control to enter visual intelligence mode, or set up a Control Center or Lock Screen button to do the same.
Visual intelligence essentially combines your camera with AI to provide new features, such as:
- translating a street sign into your native language
- adding an event from a flyer to your calendar
- seeing restaurant reviews, photos, and more
When iOS 26 launched, it expanded visual intelligence in a big way. No longer is it limited to the camera. Now, anything you see on your iPhone can benefit from AI via screenshots.
Take a screenshot in iOS 26, and as part of the redesigned screenshot viewer, you’ll now find visual intelligence options like the ones above.

One of my go-to uses has been taking screenshots of plain-text URLs to turn them into tappable links.
Last week on Apple’s quarterly earnings call, Tim Cook specifically called out visual intelligence as one of the most popular Apple Intelligence features so far. He said:
One of our most popular features is visual intelligence, which helps users learn and do more than ever with the content on their iPhone screen, making it faster to search, take action, and answer questions across their apps.
Why did Cook choose to specifically tout visual intelligence’s success? We can’t know for sure.
However, one reason might be that Cook knows Apple has big plans to expand the feature soon.
Rumors say new AirPods Pro and Apple Glasses will use visual intelligence

Visual intelligence is nice as a camera feature, and I’ve especially found it useful as a screenshot option.
But rumors indicate the feature will soon expand to two new platforms that could make it truly shine.
A new high-end AirPods Pro 3 model and Apple Glasses are both expected to be unveiled later this year.
Both products are rumored to have built-in cameras tied to a key AI feature. You guessed it: visual intelligence.
Here’s Mark Gurman writing about the new AirPods:
Apple’s ultimate plan for Visual Intelligence goes far beyond the iPhone. The company wants to put the feature at the core of future devices, including the camera-equipped AirPods that I’ve been writing about for several months.
Similarly, he says about Apple Glasses:
the idea is to turn glasses into an Apple Intelligence device. The product will analyze the surrounding environment and feed information to the wearer
In other words, Apple plans for visual intelligence to be a big part of its wearable offerings in the not-too-distant future.
So if there was any AI feature that Cook might want to draw extra attention to, it’s no surprise visual intelligence was it.
What do you use visual intelligence for on your iPhone today? Let us know in the comments.