A tech demo video by VFX artist Elisha Hung shows how data from the iPhone X’s TrueDepth camera system could be used to animate 3D characters and objects in a CGI movie.

Apple gives developers access to the same face mesh that Animoji uses to animate pigs, rabbits and piles of poo. The data streaming from the iOS API can be transformed into a format that traditional 3D editing software can interpret, as shown by Hung in this demo:
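The transformation the article mentions can be sketched in a few lines. ARKit's `ARFaceGeometry` exposes the face mesh as an array of vertices plus a flat list of triangle indices; assuming those arrays have been dumped out of the app (a hypothetical export step, not shown in Hung's demo), converting them to Wavefront OBJ text that Blender, Maya, or Houdini can import looks roughly like this:

```python
def mesh_to_obj(vertices, triangle_indices):
    """Serialize a face mesh to Wavefront OBJ text.

    vertices: list of (x, y, z) tuples, e.g. dumped from
    ARFaceGeometry.vertices on the device.
    triangle_indices: flat list of 0-based vertex indices,
    three per triangle, as in ARFaceGeometry.triangleIndices.
    """
    lines = ["v {:.6f} {:.6f} {:.6f}".format(*v) for v in vertices]
    for i in range(0, len(triangle_indices), 3):
        a, b, c = triangle_indices[i:i + 3]
        # OBJ face indices are 1-based
        lines.append(f"f {a + 1} {b + 1} {c + 1}")
    return "\n".join(lines) + "\n"

# Tiny example: a single triangle
print(mesh_to_obj([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [0, 1, 2]))
```

The real TrueDepth mesh has on the order of a thousand vertices, but the serialization is the same; OBJ is just one convenient interchange format among several.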

Rather than using expensive motion capture equipment, Hung coded an ARKit app to record his live-updating face mesh as he made various expressions. He then used the depth map to animate a 2D texture of his face.

The end result is surprisingly realistic and mimics the sort of animated face reproduction seen in AAA game titles.

The fidelity of TrueDepth may not be enough to be useful for Hollywood-budget movies, but it could open a new avenue for amateur and prosumer video makers and animators.

It also shows how a third-party iPhone app could recreate the Animoji experience, but with human (celebrity?) faces rather than emoji creatures.



About the Author

Benjamin Mayo

Benjamin develops iOS apps professionally and covers Apple news and rumors for 9to5Mac.