Realistic Hand & Face Animations Achieved With Mocap & UE5

The movements were captured using the Xsens suit and Faceware's facial motion capture system.

Beatrice Schmidt, a Technical Artist, Animator, and Cinematographer specializing in motion capture animations, has shared a sneak peek at a short work-in-progress sequence featuring incredibly realistic and fluid hand and face movements.

Powered by Unreal Engine 5, the animation was brought to life with motion capture: an Xsens suit and Faceware's facial capture system were used to record the sequence's base movements. The captured data, the artist noted, was then retargeted to the character's skeleton, with additional adjustments made to smooth the finger movements.
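The retarget-and-smooth step can be sketched in a few lines. This is a minimal illustration, not the artist's actual pipeline: the bone names, the name-mapping approach to retargeting, and the moving-average smoothing window are all assumptions made for the example.

```python
# Hypothetical sketch of retargeting mocap data to a character skeleton,
# then smoothing a jittery finger curve. Bone names are illustrative.

def retarget(frames, bone_map):
    """Copy per-frame rotation values from source bone names to target bone names."""
    return [{bone_map[b]: rot for b, rot in frame.items() if b in bone_map}
            for frame in frames]

def smooth_curve(values, window=3):
    """Simple moving average to reduce jitter in a rotation curve."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Example: noisy index-finger curl values (degrees) over a few frames.
mocap_frames = [{"xsens_index_01": v} for v in [10.0, 30.0, 12.0, 32.0, 14.0]]
bone_map = {"xsens_index_01": "index_01_r"}  # source bone -> character bone

retargeted = retarget(mocap_frames, bone_map)
curve = [f["index_01_r"] for f in retargeted]
smoothed = smooth_curve(curve)
print(smoothed)
```

In a real UE5 setup this mapping and cleanup would happen through the engine's retargeting and curve-editing tools rather than hand-rolled code; the sketch only shows the idea of renaming channels and averaging out high-frequency noise.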

"It's interesting to note that the head and eye movements in the animation don't always match those in the video," commented Beatrice. "I used a backwards solve control rig to make an additive edit on top of the mocap animation data, both for the face and the body, to better match the vision of the project's creative director. If you want to know a bit more about this method, it's explained in more detail in my ArtStation."
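The additive edit Beatrice describes layers hand-keyed offsets on top of the baked mocap values rather than replacing them. A minimal sketch of that layering, with control names and angles invented purely for illustration:

```python
# Hedged sketch of an additive animation layer, as in the backward-solve
# control-rig edit described above. Control names and degree values are
# illustrative assumptions, not taken from the project.

def apply_additive(base_frames, additive_frames):
    """Add per-control offsets on top of baked mocap values, frame by frame."""
    out = []
    for base, delta in zip(base_frames, additive_frames):
        frame = dict(base)
        for ctrl, offset in delta.items():
            frame[ctrl] = frame.get(ctrl, 0.0) + offset
        out.append(frame)
    return out

# Mocap head-yaw values (degrees), plus an additive layer that turns the
# head further than the reference video to match the desired framing.
mocap = [{"head_yaw": 5.0}, {"head_yaw": 6.0}, {"head_yaw": 7.0}]
additive = [{"head_yaw": 0.0}, {"head_yaw": 10.0}, {"head_yaw": 20.0}]

edited = apply_additive(mocap, additive)
print([f["head_yaw"] for f in edited])
```

Because the edit is additive, the mocap performance underneath is preserved: zeroing the additive layer returns the original capture, which is why this approach suits directorial adjustments like the head and eye changes mentioned in the quote.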

As mentioned above, the project is still WIP, with the creator planning to demonstrate the final results and reveal more information in the coming days. We highly encourage you to follow Beatrice on LinkedIn so as not to miss the reveal. In the meantime, you can also check out the artist's ArtStation page to see more of Beatrice's earlier works.

Also, don't forget to join our 80 Level Talent platform and our Telegram channel, follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.
