Detailed Lip Sync Animation With Maya & Unreal Engine 5

Beatrice Schmidt continues to experiment with mocap-based 3D animations.

Technical Artist, Animator, and Cinematographer Beatrice Schmidt continues to experiment with motion-capture-based 3D animations, showcasing the results of her latest experiment in a series of videos shared on ArtStation.

This time, the artist focused on studying custom character facial animation, paying particular attention to lip sync. To animate the digital model's face and lips, Beatrice used Maya together with motion capture data processed through Faceware Analyzer and Retargeter. She also employed a facial rig created by Maxim Andreev, with Unreal Engine 5 handling the rendering.
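For readers curious about the nuts and bolts, here is a minimal, hypothetical sketch of the Maya-to-Unreal handoff in a pipeline like this: once Faceware Retargeter has transferred the tracked performance onto the rig's facial controls, the keys can be baked and exported as an FBX for Unreal Engine 5 to import. This is not Beatrice's actual script; the control naming and file path are placeholder assumptions.

```python
# Hypothetical Maya Python sketch of the export step, not the artist's script.
# Assumes Faceware Retargeter has already keyed the rig's facial controls;
# the "zua_face_ctrl_*" naming and the export path are placeholders.
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)  # make sure Maya's FBX exporter is available

face_controls = cmds.ls("zua_face_ctrl_*")  # gather the retargeted facial controls
start = cmds.playbackOptions(query=True, min=True)
end = cmds.playbackOptions(query=True, max=True)

# Bake the retargeted animation onto the controls so constraints and
# expressions survive the trip into Unreal Engine 5
cmds.bakeResults(face_controls, time=(start, end), simulation=True)

cmds.select(face_controls, replace=True)
mel.eval('FBXExport -f "C:/exports/zua_face_anim.fbx" -s')  # -s exports the selection only
```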

Back in December, the creator also showcased a short work-in-progress sequence featuring incredibly realistic and fluid hand and face movements, set up using the Xsens suit and Faceware's facial motion capture system to capture the basic movements.

And last week, the artist officially introduced Zua, a custom digital character brought to life inside Unreal Engine 5, whom you can see in all of the demos above. You can learn more about Zua and how she was made by checking out a detailed breakdown shared by her creators on Behance.

We highly encourage you to check out the artist's ArtStation page to see more of Beatrice's animation experiments. Also, don't forget to join our 80 Level Talent platform and our Telegram channel, and follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.


Comments (1)

  • Anonymous user · 3 months ago

    I’d like to see a demo of this with a professional actor, someone who has actual command of their facial expressions and is able to produce subtle but understandable expressions that convey meaning. Put it on Ian McKellen, Helen Mirren, Tom Hanks, Christian Bale, Daniel Day-Lewis, or really any good character actor.

    Epic really dropped the ball on their Maya demo by using a young staffer who had no acting training and was incapable of making a realistic-looking facial expression. Her “angry” looked like an infant filling a diaper. As such, it failed to capture any of the nuance that animators are currently doing “by hand”. This technology only has application when it can take things to the next level. Even though I suspect the technical ability is there, we need a human with facial expression skill to “drive” this workflow, and that isn’t the technical folks… that isn’t in their skillset.

    I think when we do get this on the head of a trained actor, we could see whether we can get character models that can, for the first time, emotionally move us: move us to cry, to smile, to feel love or hate, to feel empathy, to care. Then the characters, and thus the story, will come to life, and video games can truly become art. Or movies and shows using UE5, like The Mandalorian, can utilize this to create different looks, like alien or fantasy creatures, without wooden CGI masks, and with empathetic, meaningful expression.



