Facial Animation and Deformation in Real-Time
9 February, 2018
Animation
Character Art
Interview

Izabela Zelmańska shared some thoughts about her recent real-time character study.

Project 

I was inspired by the very original and quirky characters from Tim Burton’s animated movies, plus the trailer for the upcoming “Alita: Battle Angel” reminded me how awesome enormous eyes can be. I decided to try my own spin on a cartoon/realistic mix.

After gathering enough reference I started sculpting the neutral face and expressions. I started out with my own head scan, but sadly I don’t look much like the cartoon character, so it took some work to get the desired look. Later I used my good old, battle-tested base mesh with nice topology and started matching it to the expressions to create blendshapes. This part is tricky – you can quickly conform one shape to the other, but it’s all about the proper movement of skin/muscles, and it can take a while before you have realistic movement instead of artificial morphing.
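Under the hood, a blendshape is just per-vertex linear interpolation between the neutral mesh and sculpted targets. A minimal sketch of that evaluation – the vertex data and shape names here are invented for illustration, not taken from the project:

```python
# Minimal blendshape evaluation: each target stores the same vertex
# count as the neutral mesh, and a weight in [0, 1] controls how much
# of its per-vertex delta is applied. All data below is made up.

def blend(neutral, targets, weights):
    """neutral: list of (x, y, z); targets: dict name -> same-length list;
    weights: dict name -> float. Returns the deformed vertex positions."""
    result = [list(v) for v in neutral]
    for name, target in targets.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue
        for i, (nv, tv) in enumerate(zip(neutral, target)):
            for axis in range(3):
                result[i][axis] += w * (tv[axis] - nv[axis])
    return [tuple(v) for v in result]

neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile   = [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0)]   # lifts both vertices by 0.5
print(blend(neutral, {"smile": smile}, {"smile": 0.5}))
# → [(0.0, 0.25, 0.0), (1.0, 0.25, 0.0)]  (half-weight smile)
```

This linearity is exactly why a single whole-face shape can look like “artificial morphing” – every vertex travels in a straight line at the same rate.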

Hair

The hair model was a pure ZBrush hand-sculpt, instead of using FiberMesh as usual – I needed thick cartoonish strands instead of a zillion thin splines. This actually simplified the engine work – I didn’t have to deal with multiple hair cards, transparency, depth maps, sorting issues, etc. – I just made a simple SSS shader for the thick hair and voilà.

Eyes

Part of Unreal Engine’s appeal is access to tons of example content. My fiancé (much more experienced with real-time) helped me base my own eye shader on the Photorealistic Character template supplied by Epic. We used our own textures and made some shader modifications, especially to make the eye reflection really pop. We had to mix actual scene reflections with a bit of a fake reflection map and tweak the refraction to get that dreamy wet look. As for the transformation bit, it was actually possible thanks to Epic’s shader bits, though I’ve been using them NOT as intended – scaling UVs in a rather crazy way.
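The “mix real and fake reflections” idea boils down to a lerp between the scene’s reflection contribution and a static reflection-map sample, controlled by a tweakable blend factor. A rough CPU-side sketch of that shader logic – the function name and sample values are illustrative, not Epic’s actual material graph:

```python
# Lerp each RGB channel between the real scene reflection and a baked
# "fake" reflection-map sample. fake_amount in [0, 1]: 0 = fully real,
# 1 = fully faked. Values below are invented for illustration.

def mix_reflections(scene_refl, fake_refl, fake_amount):
    return tuple(s + fake_amount * (f - s)
                 for s, f in zip(scene_refl, fake_refl))

# Mostly real reflections, with a 25% boost from the fake map's bright
# highlight to get the "dreamy wet look" to pop:
scene = (0.10, 0.12, 0.15)
fake  = (0.90, 0.90, 1.00)
print(mix_reflections(scene, fake, 0.25))
```

In a real material this lerp happens per-pixel in the shader; the point is just that the fake map guarantees a highlight even when the scene capture has nothing interesting to reflect.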

Facial animation

No bones were harmed (or used at all) during the making of this animation – just blendshapes. Everything (apart from shaders and lighting) was assembled in 3ds Max, and I used a nice speed boost to get everything inside Unreal – the Alembic format. Instead of exporting geometry, skeleton, animation, etc. separately and re-assembling them in Unreal, I let the Alembic export/import handle everything. This may not be the most ‘lightweight’ solution, as the resulting files are big, but you can see your Max scene in Unreal in no time, even without knowing much about meshes, morphs, skeleton setups, collision geometry and animation assets.

Materials  

Typically I would morph the expressions’ normal maps as well as the geometry, but this time my mesh was dense enough to handle detailed skin movement/wrinkles without blending normal maps. I built my skin shader around the one from the Photorealistic Character template and just focused on getting my diffuse/specular/SSS/roughness right. It was a bit confusing at first, but my fiancé knows his stuff and I keep learning from him. He also helped me tweak the lighting and scene reflections to get the most out of my shaders.

Lessons

Mainly I learned that I am unable to produce genuinely cute and innocent characters – they always end up at least a little bit creepy. Seriously though, apart from refreshing my knowledge of Unreal, I was really happy to see a huge productivity boost compared to offline rendering. You can iterate so much more on shaders and lighting when you don’t need to wait for a sequence to render out. Also, the latest tool in Unreal’s arsenal – Sequencer – turned out to be a robust, powerful animation tool which makes additional animation like the eye transformation a breeze. I will definitely keep using it!

Izabela Zelmańska, Character Artist

Interview conducted by Kirill Tokarev 
Comments


6 Comments on "Facial Animation and Deformation in Real-Time"

myjma@live.com

Great lighting, skin shader and expressions!

Izabela Zelmańska

@Greg
Script w/ vertex shader? Can you elaborate? Always happy to expand my knowledge and toolset 🙂


Greg

@Sean Rove
Thanks for the tip! I also thought the linear blending looks a tad weird.

For my part: you can also easily export morph targets to UE4 from 3ds Max using the Morpher modifier, or with their script w/ vertex shader. You don’t have to set up bones or anything.

Thanks for sharing the knowledge!

Izabela Zelmańska

You are absolutely right. Naturally, when I do work for big projects with actual animation teams, I do a lot more individual blendshapes for more complex muscle tracking. Keep in mind this animation is just my quick solo preview to show off the face sculpting and texturing in a fun way 🙂

Sean Rove
The thing about using blendshapes for facial animation is that to really capture a believable character, the sculpts should be small muscle movements, not entire face sculpts. If you have an expression that is sculpted to perfection and you animate from the neutral face to the expression face, it will still move very linearly. If that expression were made up of smaller, segmented blendshapes and each could animate at a different rate to reach the target expression, it would look and behave more realistically. Otherwise, when just one expression is sculpted for the entire face, it looks great static, but when animating…
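Sean’s point can be shown numerically: a single whole-face shape gives every vertex the same linear ramp, while splitting the expression into regions with their own easing curves lets different parts of the face arrive at different times. A toy sketch – the region names, offsets and rates are invented for illustration:

```python
# Drive one expression with per-region weight curves instead of a
# single linear 0->1 ramp. Each region gets its own timing, so the
# face reads as muscle movement rather than a uniform morph.

def ease_in_out(t):
    """Smoothstep easing: slow start and end, fast middle."""
    return t * t * (3.0 - 2.0 * t)

def region_weights(t):
    """Per-region blendshape weights at normalized time t in [0, 1].
    The brow leads and the mouth lags; offsets/rates are made up."""
    return {
        "brow":  ease_in_out(min(1.0, t * 1.5)),         # arrives early
        "eyes":  ease_in_out(t),                          # on the main beat
        "mouth": ease_in_out(max(0.0, (t - 0.3) / 0.7)),  # starts late
    }

# Halfway through, the regions are visibly out of phase:
print(region_weights(0.5))
```

Each returned weight would then feed the corresponding regional blendshape, so no two parts of the face move in lockstep.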