
Vision Pro display resolution shown off in Sony video

The Vision Pro display resolution is one of the device’s many advantages over its rivals, and yesterday we explained the technology that makes it possible – and how it differs from the AR/VR headset displays used by other companies.

Sony actually showed off the displays a full year ago – though we didn’t know at the time that they would be used in Apple’s spatial computer – and it turns out I was being a little unfair to the Cupertino company…

I pointed to the fact that Apple is using a rather confusing term for the display tech. The company refers to it as “micro-OLED,” which has led some to mistake it for microLED. I said it would be better to use the correct term, which is OLED on Silicon, aka OLEDoS.

Apple must take much of the blame for this. First, you have to delve into the small print to find that the displays use a technology known as OLEDoS. Second, the company says the displays use “micro-OLED technology” – a name that differs from “microLED” by nothing more than a hyphen and a single O!

However, it turns out that Sony – which makes the displays – also engages in this marketing-name stuff. Sony refers to it as “OLED Microdisplay.” That doesn’t create the same confusion as Apple’s term, but still…

Vision Pro display resolution

Sony showcased the tech in a promo video shared at a technology day last year.

Our Head-Mounted Display achieves 4K per eye, and 8K with both eyes […] From detailed lettering and material textures, you can tell the image is remarkably close to reality.

If individual pixel dots can be made out, users feel like they are looking at a display, because of the pixelization. A realistic visual experience requires so many dots that they remain invisible even when the image is enlarged. The panel also needs to be small enough to fit into the limited form factor.

To meet these two requirements we developed a 4K OLED Microdisplay with an ultra-high resolution of 4K per inch. It packs more than double the number of dots of a smartphone OLED into a panel almost 20 times smaller.

To fit such a high number of pixels into a smaller display, we use Sony’s fine processing and advanced packaging technologies gained from our CMOS image sensors.
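As a rough sanity check of those quoted figures, here’s a short Python sketch comparing an assumed ~1.3-inch 4K microdisplay with an assumed ~6-inch, roughly 2,600 × 1,200-pixel smartphone OLED – both panel specs are my own illustrative assumptions, not numbers from Sony or Apple.

```python
# Rough sanity check of the quoted figures.
# Assumed specs (illustrative only, not from Sony or Apple):
#   microdisplay:    3840 x 2160 px on a panel ~1.3 inches diagonal
#   smartphone OLED: 2600 x 1200 px on a panel ~6.1 inches diagonal

MICRO_W, MICRO_H, MICRO_DIAG_IN = 3840, 2160, 1.3
PHONE_W, PHONE_H, PHONE_DIAG_IN = 2600, 1200, 6.1

# "More than double the number of dots compared to smartphone OLED"
pixel_ratio = (MICRO_W * MICRO_H) / (PHONE_W * PHONE_H)
print(f"pixel count ratio: {pixel_ratio:.1f}x")          # ~2.7x

# "Reduces the panel size by almost 20 times"
# (area scales roughly with the square of the diagonal)
area_ratio = (PHONE_DIAG_IN / MICRO_DIAG_IN) ** 2
print(f"panel area ratio: ~{area_ratio:.0f}x smaller")   # ~22x

# "Ultra-high 4K per inch resolution" implies a very fine pixel pitch:
pitch_um = 25.4 * 1000 / MICRO_W                         # 25.4 mm per inch
print(f"pixel pitch for 4K across one inch: {pitch_um:.1f} µm")  # ~6.6 µm
```

Under those assumptions the arithmetic lines up with Sony’s claims – and the resulting pixel pitch of under 7 µm is far finer than the roughly 55 µm pitch of a typical ~460 ppi phone screen.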

But latency is key too – with a clever approach

The company explained that ultra-low latency is just as important as resolution. The illusion of a tenfold reduction is achieved by noting where the user is looking and filling in that part of the image first.

Normally, processing at this resolution takes about 0.1 seconds. To avoid users feeling dizzy, the processing time must be shorter than 0.01 seconds. Perceived delay is reduced by combining data from multiple sensors with latency compensation technology: the image is adjusted according to the latest position and direction of the user’s head just before it is displayed.
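That last step is what VR engineers generally call late reprojection (or “timewarp”): the frame is warped to match the freshest head pose just before it hits the display, so the perceived delay is much shorter than the full render time. Here’s a deliberately minimal Python sketch of the idea – a generic illustration, not Sony’s or Apple’s actual pipeline, and the field-of-view figures are just assumptions.

```python
import numpy as np

# Minimal sketch of late reprojection: an already-rendered frame is shifted to
# match the latest head orientation just before display, so perceived latency
# is closer to the sensor-to-display time than to the full render time.
# Generic illustration only; the field-of-view values are assumptions.

FOV_H_DEG = 90.0  # assumed horizontal field of view
FOV_V_DEG = 90.0  # assumed vertical field of view

def late_reproject(frame: np.ndarray,
                   render_yaw: float, render_pitch: float,
                   latest_yaw: float, latest_pitch: float) -> np.ndarray:
    """Shift a rendered frame to account for head motion since render time.

    A real system would do a per-pixel warp using depth and a full 6-DoF pose;
    here small rotations are approximated as a simple 2D translation.
    """
    h, w = frame.shape[:2]
    # Convert the rotation since render time into a pixel offset.
    dx = int(round((latest_yaw - render_yaw) / FOV_H_DEG * w))
    dy = int(round((latest_pitch - render_pitch) / FOV_V_DEG * h))
    # np.roll wraps at the edges; a real pipeline renders slightly oversized
    # frames so there is margin to shift into instead.
    return np.roll(np.roll(frame, -dx, axis=1), dy, axis=0)

# Usage: the frame was rendered with the head at yaw 0°, but by scan-out time
# the head has turned 2° to the right, so the image is nudged left to match.
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
corrected = late_reproject(frame, render_yaw=0.0, render_pitch=0.0,
                           latest_yaw=2.0, latest_pitch=0.0)
```

The gaze-first rendering mentioned above would sit earlier in the pipeline – deciding which part of the image to fill in at full quality first – whereas this correction runs last, immediately before the frame is sent to the panels.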

Check out the video demo below.



