Animation Details in Character Movements

Jonathan Colin discussed how he crafts animation for game characters and monsters. Jonathan worked on Horizon Zero Dawn, Styx: Master of Shadows, and the CryEngine game Wolcen.

Introduction

I currently work at Guerrilla Games as an animator. Before Horizon Zero Dawn, I worked on the indie hack-and-slash Wolcen: Lords of Mayhem and the stealth game Styx: Master of Shadows.

I’ve always been passionate about drawing, films and games. I studied at the game school Supinfogame in France, where I learned about game design, storytelling, and 2D and 3D art, mainly through practice. We developed a lot of prototypes there with fellow students, which taught us that teamwork and self-learning are some of the most important skills in a constantly growing industry.

Approaches to Animation

With the amount of assets AAA games like Horizon Zero Dawn require nowadays, you can’t really avoid using mocap.

So we record as much as possible for humanoids, mainly on-ground actions such as run cycles, idles… And for more acrobatic stuff like climbing, jumping or takedowns we have very talented animators here at Guerrilla keyframing most of it. And obviously all the robots have to be hand-animated as well.

A friend once asked me if using mocap wasn’t reducing the animator’s creativity. On such a big production I think it actually helped us be more creative! Simply because using mocap sped up production, meaning we had more time to focus on actual creation and design rather than execution.

What is important to understand is that when you build a game, you have to see the big picture. You should think in terms of animation SETS instead of single takes.

 

To get a fluid result, the question is not just “how am I gonna make this one animation look awesome?” It is more “how many transitions do I need for my whole system? How many variants? How can all my animations work together to cover anything the player can do at any time?”

Imagine you work on Aloy’s starting set, from idle to move. The player can push the joystick in any direction, so Aloy can start forward, left, right, backward. She can start in walk, jog or sprint, in stand or crouch, with the left foot or the right foot forward…

The number of transitions and variants can grow exponentially! So the more content you can tackle, the more creative you can get. And motion capture is just one of the many tools that help us do that.
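As a loose illustration of that combinatorial growth, here is a minimal sketch that enumerates one such hypothetical starting set; the clip names and categories are made up for the example and are not actual Horizon Zero Dawn data:

```cpp
#include <cstdio>
#include <string>
#include <vector>

int main() {
    // Hypothetical axes of variation for a "start moving" set.
    const std::vector<std::string> stances    = {"Stand", "Crouch"};
    const std::vector<std::string> gaits      = {"Walk", "Jog", "Sprint"};
    const std::vector<std::string> directions = {"Forward", "Left", "Right", "Backward"};
    const std::vector<std::string> leadFeet   = {"LeftFoot", "RightFoot"};

    int count = 0;
    for (const auto& s : stances)
        for (const auto& g : gaits)
            for (const auto& d : directions)
                for (const auto& f : leadFeet) {
                    std::printf("Start_%s_%s_%s_%s\n",
                                s.c_str(), g.c_str(), d.c_str(), f.c_str());
                    ++count;
                }

    // 2 stances * 3 gaits * 4 directions * 2 lead feet = 48 transitions,
    // before adding any variants.
    std::printf("Total start transitions: %d\n", count);
    return 0;
}
```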

Artificiality

Using motion capture isn’t just about realism. For in-game animation especially, we don’t limit ourselves to cleaning up the data (animating the props and fingers that aren’t recorded, removing pops and jitter…). In fact, we really have to push and tweak the motion to get a character as fast and responsive as Aloy. If you only use the timing and metrics of the raw mocap, you get a very sluggish and slow character that won’t be fun to play as (in our type of games anyway).

So the difference between mocap and hand-keyed animation is not as black-and-white as it seems. Here, animators are simply free to choose whatever technique best suits what they’re trying to achieve.

And I believe this is true not only for realistic games but also for stylized art directions. I see more and more cartoony games where you can feel the mocap underneath but where the timing and the poses are exaggerated to the extreme.

Personally, I love using mocap at least to get that bit of noise and weight that gives realism and life to your animation and is pretty hard to hand-animate. So sometimes I start with some data that has nothing to do with the intended result, just to get a nice hip sway or a little shoulder shrug to start from, and I hand-key the rest to end up with, for instance, Aloy drawing her bow.

 

Making Characters Feel Alive

We have a collection of idle animations we call the Contextual Emotions for whenever Aloy is reacting to her environment (rainy, cold or warm weather for example). She also has a more relaxed pose if the player doesn’t do anything for a while, or an ‘out of breath’ idle if you’ve been sprinting for a long time.

The emotions are triggered in code based on a series of parameters. They also have a priority: ‘out of breath’ is more important than rainy, and rainy is more important than relaxed.
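A minimal sketch of that kind of prioritized check might look like the following; the parameter names, thresholds and clip names are assumptions for illustration, not the actual Decima implementation:

```cpp
#include <cstdio>
#include <string>

// Hypothetical context parameters; not Guerrilla's actual data model.
struct GameContext {
    float secondsSprinting = 0.0f;  // how long the player has been sprinting
    float secondsIdle      = 0.0f;  // time since the last player input
    bool  isRaining        = false;
};

// Higher-priority emotions are checked first: 'out of breath' beats 'rainy',
// and 'rainy' beats 'relaxed', matching the priority described above.
// Thresholds are placeholders.
std::string pickContextualIdle(const GameContext& ctx) {
    if (ctx.secondsSprinting > 8.0f)  return "Idle_OutOfBreath";
    if (ctx.isRaining)                return "Idle_Rainy";
    if (ctx.secondsIdle > 20.0f)      return "Idle_Relaxed";
    return "Idle_Default";
}

int main() {
    GameContext ctx;
    ctx.isRaining   = true;
    ctx.secondsIdle = 30.0f;
    std::printf("%s\n", pickContextualIdle(ctx).c_str());  // prints Idle_Rainy
    return 0;
}
```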

Informing Players

Although a lot of feedback goes through UI, VFX, and sound, we try to communicate as much as possible through body language as well. But because players tend to run all the time and don’t really stop in games, we use animation layers, which allow us to play a ‘substate’ animation on the upper body while the lower body plays a walking or running animation (this is pretty common in games and is not new to HZD).

For example, in HZD we make Aloy raise her shoulders and look around when there are enemies nearby. She also grabs her weapon automatically in that case. This gives players a hint to be careful without stopping the character.
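A very rough sketch of that layering idea, with made-up bone names and a simplified pose format (an illustration of the general technique, not Guerrilla’s code):

```cpp
#include <map>
#include <string>

// Simplified stand-ins for a real pose representation.
struct Rotation { float x = 0, y = 0, z = 0, w = 1; };
using Pose = std::map<std::string, Rotation>;  // bone name -> local rotation

// A real engine would slerp quaternions; a plain lerp stands in for brevity.
Rotation blend(const Rotation& a, const Rotation& b, float t) {
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t, a.w + (b.w - a.w) * t };
}

// Blends the 'substate' pose onto a masked set of upper-body bones only,
// so the legs keep playing the locomotion cycle while the shoulders and
// head react to the situation. Bone names are illustrative.
Pose applyUpperBodyLayer(const Pose& locomotion, const Pose& substate, float weight) {
    static const char* mask[] = { "spine2", "spine3", "neck", "head",
                                  "clavicle_l", "clavicle_r", "arm_l", "arm_r" };
    Pose result = locomotion;
    for (const char* bone : mask) {
        auto src = substate.find(bone);
        auto dst = result.find(bone);
        if (src != substate.end() && dst != result.end())
            dst->second = blend(dst->second, src->second, weight);
    }
    return result;
}
```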

She also has an injured animation when your health is low, even though the border of the screen is already red and the pad is vibrating. It’s always good to convey the same information multiple times in action games since there’s a lot going on at the same time.

Weapons

We don’t record any props or weapons in our mocap studio, so we hand-animate them. However, once we edit the mocap in our 3D program, we can use the motion of the hand that’s holding the weapon as a starting point.

To me, the key to animating props efficiently (especially two-handed weapons like spears and rifles) is to make sure your character rig allows you to easily switch spaces: basically, the weapon following the hand vs. the hand following the weapon.

 

For example, when Aloy is sprinting while carrying the bow, the bow follows the left hand, which inherits a bit of the arm swing naturally present in the sprint. But when Aloy is aiming, the bow has to be more stable, so we animate it in ‘free mode’ and the hands follow it.
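The runtime side of that space switch could be sketched roughly like this; the type and function names are hypothetical and the transforms are simplified to positions only:

```cpp
// Simplified transform: position only; a real rig would use full matrices
// or position + quaternion.
struct Transform { float px = 0, py = 0, pz = 0; };

enum class PropSpace { FollowHand, FollowWeapon };

struct WeaponAttachment {
    PropSpace space = PropSpace::FollowHand;
    Transform weaponLocal;  // offset keyed by the animator
};

Transform compose(const Transform& parent, const Transform& local) {
    return { parent.px + local.px, parent.py + local.py, parent.pz + local.pz };
}

// Returns the weapon's world transform for this frame. In 'FollowHand' the
// bow inherits the arm swing of the hand bone; in 'FollowWeapon' the weapon
// is keyed freely (e.g. while aiming) and the hands are IK'd onto it elsewhere.
Transform evaluateWeapon(const WeaponAttachment& att,
                         const Transform& leftHandWorld,
                         const Transform& freeWeaponWorld) {
    if (att.space == PropSpace::FollowHand)
        return compose(leftHandWorld, att.weaponLocal);
    return freeWeaponWorld;
}
```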

Another trick is to animate elements on the weapon itself, or at least simulate them with physics. There are always at least some feathers swaying or some mechanical parts moving on our weapons, which makes them feel more authentic and emphasizes their weight.

Like in any other game, we adjust the position of the feet to match the collision mesh of the terrain using IK (for that, we also annotate all our animations to let the code know when a foot is on the floor).
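A bare-bones sketch of that annotation-plus-IK idea, with made-up names and a stubbed terrain query (an illustration of the general approach, not the actual Decima code):

```cpp
#include <vector>

// One authored annotation range during which a foot is considered planted.
struct FootPlantEvent { float startTime = 0, endTime = 0; };

// Hypothetical terrain query, stubbed flat here; a real game would raycast
// against the collision mesh.
float sampleTerrainHeight(float /*x*/, float /*z*/) { return 0.0f; }

bool footIsPlanted(const std::vector<FootPlantEvent>& events, float animTime) {
    for (const auto& e : events)
        if (animTime >= e.startTime && animTime <= e.endTime) return true;
    return false;
}

// While the annotation says the foot is on the floor, pin its target height
// to the terrain; the IK solver then bends the leg to reach that target.
float footIkTargetHeight(const std::vector<FootPlantEvent>& events, float animTime,
                         float footX, float footY, float footZ) {
    if (footIsPlanted(events, animTime))
        return sampleTerrainHeight(footX, footZ);
    return footY;  // foot is in the air: keep the animated height
}
```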

In Horizon Zero Dawn we also have specific ‘slope cycles’ for walking/running up and down a hill. We recorded that on an actual slope to get more realistic weight and bouncing.

About the horse: there’s actually nothing fancy in terms of technology here, just a lot of really sweet animations our team spent quite some time on to make Aloy’s position feel natural. We blend and trigger these animations based on the camera angle so Aloy keeps aiming properly.
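As a guess at what “blending based on the camera angle” might look like in the simplest case, here is an illustrative weight computation; the names and the 90-degree range are assumptions, not engine values:

```cpp
#include <algorithm>

// Weights for two hypothetical mounted aim poses.
struct MountedAimBlend {
    float leftWeight  = 0.0f;
    float rightWeight = 0.0f;
};

// cameraYawDeg is the camera's angle relative to the horse's forward
// direction, in degrees. Poses are fully blended in at +/-90 degrees.
MountedAimBlend blendMountedAim(float cameraYawDeg) {
    const float t = std::clamp(cameraYawDeg / 90.0f, -1.0f, 1.0f);
    MountedAimBlend b;
    b.leftWeight  = std::max(0.0f, -t);
    b.rightWeight = std::max(0.0f,  t);
    return b;
}
```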

Jonathan Colin, Animator at Guerrilla Games

Interview conducted by Kirill Tokarev
