
FaceTime eye contact correction in iOS 13 uses ARKit

We reported yesterday on a fun new feature in iOS 13 beta 3: FaceTime eye contact correction.

Currently, when you look at the screen to see the face of the person you’re talking to, they see your eyes looking down, because the camera sits above the screen. But the latest iOS 13 beta offers a feature called FaceTime Attention Correction…

The feature makes real-time adjustments to the image of your eyes so that you appear to be looking at the camera rather than at the screen. The result is that you appear to be making eye contact.

People have been tweeting examples, like the one from Will Sigmon, and it really does appear to make an incredible difference to the experience.

Mike Rundle, who was on the other end of the call, was impressed.

https://twitter.com/flyosity/status/1146145234801307650

Predicted as a far-off feature

Rundle says he predicted back in 2017 that this feature would exist ‘in years to come,’ but is astonished that it’s already here. Incidentally, his other predictions back then were these:

  • Making the iPhone aware of its surroundings with always-on cameras continuously scanning, mapping and tracking objects in 3D space that are near the iPhone
  • Eye-tracking that allows for software anticipation, letting facets of a software interface be guided entirely by gaze (Apple acquired SensoMotoric Instruments, a world leader in eye-tracking technology, earlier in 2017.)
  • Biometric and health information derived from camera data of a user’s face (what’s my pulse, etc.)
  • Advanced image-manipulation algorithms that make sure FaceTime calls always show your eyes looking at the other person
  • Machine learning advances allowing for instant counting of objects near the iPhone (how many people are in this classroom, how many cars are between me and the stop light, how many pencils are on the table, how many shirts are folded in my closet, etc.)
  • Instant measuring of object and space dimensions without the need for gimmicky AR rulers (how long is that wall, how wide is that opening, how tall is that lamp, etc.)

FaceTime eye contact correction uses ARKit

Dave Shukin says that FaceTime eye contact correction uses ARKit, and posted a video demo.

The warping can be seen in action as he passes the arm of a pair of glasses across his face: the normally straight arm visibly bends where the feature adjusts his eyes.
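For context, here’s a minimal Swift sketch of the kind of per-frame eye data ARKit’s face tracking exposes, the sort of input a warping step could use. Apple hasn’t published how FaceTime applies the correction, so the warp itself is assumed here, and the EyeTracker class name is purely illustrative.

```swift
import ARKit

// Minimal sketch: read per-frame face and eye geometry from ARKit's
// face tracking. The actual FaceTime pipeline is not public; the
// correction/warp step is only described in the comments below.
final class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Transforms of each eye relative to the face anchor,
            // plus the point the eyes converge on.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gazeTarget = face.lookAtPoint

            // A correction pass (not shown) would warp the pixels around
            // the eye regions so the gaze appears centered on the camera.
            _ = (leftEye, rightEye, gazeTarget)
        }
    }
}
```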

I can only echo the words of iOS developer Gualtiero Frigerio:

Tech nowadays is so cool and advanced we can describe the implementation of a feature like this starting with the word “simply.”

Developer Aaron Brager notes that the feature appears to rely on one of the ARKit 3 APIs, hence being limited to the iPhone XS, XS Max, and XR, and not working on the iPhone X.
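The report doesn’t name the specific API, but as a hedged illustration, ARKit 3 added capability flags that only return true on A12-class devices, which lines up with the XS/XS Max/XR cutoff. A sketch of such a check:

```swift
import ARKit

// Hedged sketch: probes two ARKit 3 capability flags that are gated on
// newer (A12-era) hardware. Which API FaceTime actually checks is an
// assumption; this only illustrates the device cutoff described above.
func supportsA12OnlyARKit3Features() -> Bool {
    if #available(iOS 13.0, *) {
        // Simultaneous face + world tracking is an ARKit 3 feature
        // unavailable on pre-A12 devices such as the iPhone X.
        return ARFaceTrackingConfiguration.supportsWorldTracking
            || ARWorldTrackingConfiguration.supportsUserFaceTracking
    }
    return false
}
```

On an iPhone X these flags should return false, while the XS-generation devices return true, which would match the behavior Brager describes.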


