Yesterday we reported on a funky new feature in iOS 13 beta 3: FaceTime eye contact correction.

Currently, when you look at the screen to see the face of the person you’re talking to, they see your eyes looking down, because the camera sits above the screen. But the latest iOS 13 beta offers a feature called FaceTime Attention Correction…


What this does is make real-time adjustments to the image of your eyes so that you appear to be looking at the camera rather than the screen. This means you’ll appear to be making eye contact.

People have been tweeting examples, like the one above from Will Sigmon, and it really does appear to make an incredible difference to the experience.

Mike Rundle, who was on the other end of the call, was impressed.

Predicted as a far-off feature

Rundle says he predicted back in 2017 that this feature would exist ‘in years to come,’ but is astonished that it’s already here. Incidentally, his other predictions back then were these:

  • Making the iPhone aware of its surroundings with always-on cameras continuously scanning, mapping and tracking objects in 3D space that are near the iPhone
  • Eye-tracking that allows for software anticipation, making facets of a software interface be guided completely by gaze (Apple acquired SensoMotoric Instruments earlier in 2017, a world-leader in eye-tracking technologies.)
  • Biometric and health information derived from camera data of a user’s face (what’s my pulse, etc.)
  • Advanced image-manipulation algorithms that make sure FaceTime calls always show your eyes looking at the other person
  • Machine learning advances allowing for instant counting of objects near the iPhone (how many people are in this classroom, how many cars are between me and the stop light, how many pencils are on the table, how many shirts are folded in my closet, etc.)
  • Instant measuring of object and space dimensions without the need for gimmicky AR rulers (how long is that wall, how wide is that opening, how tall is that lamp, etc.)

FaceTime eye contact correction uses ARKit

Dave Shukin says that FaceTime eye-contact correction uses ARKit, and posted a video demo.

The warping can be seen in action as he passes the arm of a pair of glasses across his face: the straight arm visibly bends as it crosses his eyes, revealing where the feature is adjusting the image.


I can only echo the words of iOS developer Gualtiero Frigerio:

Tech nowadays is so cool and advanced we can describe the implementation of a feature like this starting with the word “simply.”

Developer Aaron Brager notes that the feature appears to rely on one of the ARKit 3 APIs, hence being limited to the iPhone XS, XS Max, and XR, and not working on the iPhone X.
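Brager’s observation can be illustrated with a minimal sketch. This is an assumption, not Apple’s confirmed internal check: ARKit 3’s frame-semantics APIs require A12-class hardware, which happens to match the device list above, so a capability probe along these lines would come back false on an iPhone X.

```swift
import ARKit

// Hypothetical capability check (illustrative only — not Apple's actual
// implementation). ARKit 3 frame semantics such as person segmentation
// need the A12 Neural Engine, so this returns true on the iPhone XS,
// XS Max, and XR, and false on the iPhone X.
func supportsAttentionCorrectionClassHardware() -> Bool {
    guard #available(iOS 13.0, *) else { return false }
    return ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation)
}
```

Since `supportsFrameSemantics(_:)` is a class method, an app can run this probe before configuring a session and fall back gracefully on older devices. Note that this snippet requires an iOS device to run; it is a sketch of the kind of check Brager describes, not a claim about FaceTime’s internals.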
