
Tracking Facial Expressions in UE5 With the Live Link Face App

This amazing project was made by Charlie Kim to explore the capabilities of UE5 and Live Link.

Weta FX Facial Modeler Charlie Kim presented a neat setup that tracks facial expressions and animates a digital avatar in real time with the captured data. Made with Unreal Engine 5, the setup utilizes Epic Games' Live Link Face, the company's app for ARKit-capable iOS devices, which lets artists drive complex facial animations on 3D characters inside Unreal Engine, recording the performance live on the phone and in the engine. The digital monkey itself was created using Maya, Substance 3D Painter, and XGen. According to Kim, the goal of the project was to explore the capabilities of UE5 and Live Link.
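Live Link Face streams ARKit's facial blendshape values into the engine, so a character's morph targets generally need names that line up with ARKit's conventions. Below is a minimal sketch, assuming a Maya blendShape node, of how one might check a rig against a few of those names before export; the function name and the shortened shape list are illustrative, not part of Kim's setup.

```python
# Hypothetical helper: report which ARKit-named targets are missing from a
# Maya blendShape node, so Live Link Face data can map to them in Unreal.
import maya.cmds as cmds

# A handful of ARKit's facial blendshape names (not the full set of 52).
ARKIT_SHAPES = [
    "eyeBlinkLeft", "eyeBlinkRight", "jawOpen",
    "mouthSmileLeft", "mouthSmileRight", "browDownLeft",
]

def report_missing_arkit_targets(blendshape_node):
    """Print any ARKit-named targets the blendShape node does not expose."""
    existing = cmds.listAttr(blendshape_node + ".weight", multi=True) or []
    missing = [name for name in ARKIT_SHAPES if name not in existing]
    for name in missing:
        print("Missing ARKit target: " + name)
    return missing

# Example: report_missing_arkit_targets("face_blendShape")
```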

"I built a Python script to automate the process for splitting shapes and connecting corrective shapes and in between shapes," comments the artist. "In Unreal Engine, I made an Animation blueprint to drive the corrective shapes and in between shapes."

You can find the artist's original post here.
