We reported yesterday on a funky new feature in iOS 13 beta 3: FaceTime eye contact correction.
Currently, when you look at the screen to see the face of the person you’re talking to, they see your eyes looking down, because the camera sits above the screen. But the latest iOS 13 beta offers a feature called FaceTime Attention Correction…
The feature makes real-time adjustments to the image of your eyes so that you appear to be looking at the camera rather than the screen. In other words, you seem to be making eye contact.
People have been tweeting examples, like the one above from Will Sigmon, and it really does appear to make an incredible difference to the experience.
Mike Rundle, who was on the other end of the call, was impressed.
https://twitter.com/flyosity/status/1146145234801307650
Predicted as a far-off feature
Rundle says he predicted back in 2017 that this feature would exist ‘in years to come,’ but is astonished that it’s already here. Incidentally, his other predictions back then were these:
- Making the iPhone aware of its surroundings with always-on cameras continuously scanning, mapping and tracking objects in 3D space that are near the iPhone
- Eye-tracking that allows for software anticipation, making facets of a software interface be guided completely by gaze (Apple acquired SensoMotoric Instruments earlier in 2017, a world-leader in eye-tracking technologies.)
- Biometric and health information derived from camera data of a user’s face (what’s my pulse, etc.)
- Advanced image-manipulation algorithms that make sure FaceTime calls always show your eyes looking at the other person
- Machine learning advances allowing for instant counting of objects near the iPhone (how many people are in this classroom, how many cars are between me and the stop light, how many pencils are on the table, how many shirts are folded in my closet, etc.)
- Instant measuring of object and space dimensions without the need for gimmicky AR rulers (how long is that wall, how wide is that opening, how tall is that lamp, etc.)
FaceTime eye contact correction uses ARKit
Dave Schukin says that FaceTime eye-contact correction uses ARKit, and posted a video demo.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
The warping can be seen in action as he passes the arm of a pair of glasses across his face: the straight arm of the glasses visibly bends where the feature adjusts his eyes.
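Apple hasn’t documented how the adjustment is done, but the ingredients Schukin points to are standard ARKit face tracking. As a rough, purely illustrative sketch of that general approach (not Apple’s actual pipeline), here is how an app could grab the position of the face and eyes in real time; the actual warping and re-rendering of the eye region is left out:

```swift
import ARKit

// Illustrative sketch only: use ARKit face tracking to obtain the face
// geometry and eye positions that a warping step could then adjust.
// Apple's real implementation is not public.
final class FaceDepthTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Transforms of the left and right eyes relative to the face anchor;
        // a warping step could use these to know which part of the captured
        // frame to adjust.
        let leftEye = face.leftEyeTransform
        let rightEye = face.rightEyeTransform

        // lookAtPoint estimates where the user is looking, in face-anchor space.
        let gaze = face.lookAtPoint
        print("Left eye: \(leftEye.columns.3), right eye: \(rightEye.columns.3), gaze: \(gaze)")
    }
}
```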
I can only echo the words of iOS developer Gualtiero Frigerio:
Tech nowadays is so cool and advanced we can describe the implementation of a feature like this starting with the word “simply.”
Developer Aaron Brager notes that the feature appears to rely on one of the ARKit 3 APIs, which would explain why it is limited to the iPhone XS, XS Max, and XR, and doesn’t work on the iPhone X.
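Brager doesn’t spell out which API it is, but one plausible gate (an assumption on my part, not a confirmed description of Apple’s check) is an ARKit 3 capability such as ARFaceTrackingConfiguration.supportsWorldTracking, which is only true on A12-class devices like the iPhone XS, XS Max, and XR:

```swift
import ARKit

// Hypothetical availability check, purely as an assumption: the feature may
// hinge on an ARKit 3 capability that requires the A12 Bionic or later.
func attentionCorrectionLikelyAvailable() -> Bool {
    // Needs a TrueDepth camera for face tracking at all.
    guard ARFaceTrackingConfiguration.isSupported else { return false }
    if #available(iOS 13.0, *) {
        // ARKit 3 API; returns true only on A12-class devices and newer.
        return ARFaceTrackingConfiguration.supportsWorldTracking
    }
    return false
}
```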