
Stanford tech may help the transition from Vision Pro to Apple Glasses

While an expensive, bulky Vision Pro headset undoubtedly has its place for some, Apple’s longer-term goal is believed to be a product dubbed Apple Glasses: bringing AR capabilities into something with a similar form-factor and weight to conventional eyeglasses or sunglasses.

Squeezing that much tech into a much smaller device is, of course, a huge challenge – but researchers at Stanford’s Computational Imaging Lab may have come up with at least part of the solution …

Apple has already been working on a technological approach known as waveguides to change the way in which images are perceived by the eye.

The light engine includes a series of optical waveguides with holographic or diffractive gratings that move the light from the light sources to generate beams at the appropriate angles and positions to illuminate the scanning mirrors. The light is then directed into additional optical waveguides with holographic film layers recorded with diffraction gratings to expand the projector aperture and to maneuver the light to the projection positions required by the holographic combiner.
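
For readers who want the underlying optics: the way a diffraction grating redirects light is governed by the standard grating equation – textbook physics, not anything proprietary to Apple’s patent. In LaTeX notation:

    \sin\theta_m = \sin\theta_i + \frac{m\lambda}{d}

Here \theta_i is the angle at which light hits the grating, \theta_m is the angle of the m-th diffracted order, \lambda is the wavelength, and d is the grating period. Choosing d sets the angles at which light is coupled into and out of the waveguide, which is how the gratings “maneuver” the light as described above.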

What Stanford has developed is a version of this tech – referred to as inverse-designed metasurface waveguides – which fits into a much smaller space.

The result, reports The Verge, is a thin stack of holographic components capable of projecting realistic, full-color 3D images from a unit not much larger than a standard pair of glasses frames.

Almost inevitably, part of the key to this is AI.

Researchers say they’ve developed a unique “nanophotonic metasurface waveguide” that can “eliminate the need for bulky collimation optics,” and a “learned physical waveguide model” that uses AI algorithms to drastically improve image quality. The study says the models “are automatically calibrated using camera feedback”.
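
To make “automatically calibrated using camera feedback” concrete, here’s a deliberately minimal Python sketch of the general camera-in-the-loop idea: a toy model with just two parameters is fitted by gradient descent until its predictions match what a (simulated) camera captures. This is my own illustration of the technique, not the Stanford code – their learned waveguide model combines wave-propagation physics with neural networks, and camera_capture here merely stands in for a real photograph taken through the hardware.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "true" optics standing in for the physical waveguide:
    # the displayed pattern is attenuated, offset by stray light, and noisy.
    TRUE_GAIN, TRUE_OFFSET = 0.8, 0.1

    def camera_capture(pattern):
        # Stand-in for photographing the waveguide's output with a camera.
        noise = rng.normal(0.0, 0.01, pattern.shape)
        return TRUE_GAIN * pattern + TRUE_OFFSET + noise

    # Learned "waveguide model": two parameters, calibrated from camera feedback.
    gain, offset = 1.0, 0.0  # initial guess
    lr = 0.5                 # gradient-descent step size

    for step in range(200):
        pattern = rng.random((32, 32))       # random calibration pattern to display
        captured = camera_capture(pattern)   # camera feedback
        predicted = gain * pattern + offset  # model prediction
        err = predicted - captured
        # Gradient of the mean-squared error with respect to each parameter
        gain -= lr * 2.0 * np.mean(err * pattern)
        offset -= lr * 2.0 * np.mean(err)

    print(f"calibrated gain={gain:.3f}, offset={offset:.3f} (true: 0.8, 0.1)")

The real system does the same thing at vastly higher dimensionality: instead of two scalars, a large set of model parameters is tuned until the simulated image and the photographed image agree.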

Although the Stanford tech is currently just a prototype – the working models appear to be mounted on a bench and on 3D-printed frames – the researchers are looking to disrupt a spatial computing market currently built around bulky passthrough mixed-reality headsets like Apple’s Vision Pro and Meta’s Quest 3.

I’m currently testing the latest version of Ray-Ban Meta glasses, which recently got a software upgrade to offer AI-based scene recognition. I’ll give a full report on these shortly, but one of the things that most impresses me is that they both look and feel like absolutely standard sunglasses. There’s no feeling of being weighed down by them, and few friends who’ve seen them realised they were anything out of the ordinary.

This, to me, is the holy grail of vision tech – squeezing as much Vision Pro tech as possible into something we can wear as casually as a pair of sunglasses – and it does sound like Stanford just brought us closer to that.

Photo: Andrew Brodhead/Stanford Computational Imaging Lab
