Adobe has just announced a major update to its Character Animator desktop app, which lets designers combine layers from Photoshop and Illustrator to create animated puppets. Features such as Speech-Aware Animation and Lip Sync are now available as a beta preview.

Adobe Character Animator is part of the Creative Cloud suite for macOS and Windows, offering dedicated tools that simplify the work of artists who create animated characters. As Adobe pointed out, the new features arrive at a time when producing live-action content has become more difficult, which reinforces the need for better technologies to create animations.

Animation is having a major moment. At a time when live action content is challenging to produce, animation allows us to create without restraints and with nothing more than our imagination, no matter what is going on outside. More and more artists and studios are turning to Emmy-Award-winning Adobe Character Animator to accelerate traditional animation workflows, capture performances in real time, and even livestream animation.

With its latest update, which is being rolled out as a public beta, Adobe Character Animator now includes the following new features:

  • Speech-Aware Animation uses the power of Adobe Sensei to automatically generate animation from recorded speech and includes head and eyebrow movements corresponding to a voice recording.
  • Limb IK (Inverse Kinematics) gives puppets responsive, natural leg motion for activities like running, jumping, tug-of-war, and dancing across a scene. Limb IK controls the bend directions and stretching of legs, as well as arms.
  • Timeline organization tools include the ability to filter the Timeline to focus on individual puppets, scenes, audio, or keyframes. Takes can be color-coded, hidden, or isolated, making it faster and easier to work with any part of your scene. Toggle the “Shy button” to hide or show individual rows in the Timeline.
  • Lip Sync, powered by Adobe Sensei, uses an improved algorithm and machine learning to deliver more accurate mouth movement for speaking parts.
  • Merge Takes allows users to combine multiple Lip Sync or Trigger takes into a single row, which helps to consolidate takes and save vertical space on the Timeline.
  • Pin Feet adds a new Pin Feet When Standing option, which keeps a character’s feet grounded when not walking.
  • Set Rest Pose now animates smoothly back to the default position when you click to recalibrate, so you can use it during a live performance without causing your character to jump abruptly.

Adobe offered the beta version of Character Animator to Nickelodeon, which used the software to remotely produce a half-hour special of The Loud House & The Casagrandes. Character Animator’s Lip Sync tools and other features helped the animators save time.

Users can get the latest beta version of Character Animator through the Creative Cloud Desktop app. A free trial, which includes a puppet library and multiple tutorials, is available for new users.




You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day.
