BioShock Infinite animation director Shawn Robertson (shawnrobertson.com) talked with us about the functions of animation in video games and the ways it can be applied.
Shawn is an amazing artist, a real titan of game art. He has worked on a number of titles and helped shape the incredible world of BioShock. He doesn't do just animation but plenty of other things as well, including 3D modeling (he made some models for Freedom Force vs. The Third Reich). He currently holds the position of Art Director at Ghost Story Games, a new company founded by former Irrational developers. And yes, he's working on an immersive story-driven game. We're absolutely sure it's going to be awesome.
Shawn has also given talks at GDC (a couple of years in a row). He's a great speaker and advisor who can help you better understand the way art works in games. We're incredibly proud to present this interview to you.
My name is Shawn Robertson. I've been a game developer for close to 20 years now. I got into game development in a bit of a roundabout way. I graduated from Ringling College of Art and Design in 1994 with a major in illustration and zero computer skills. I was lucky enough to get an internship at NYNEX Science and Technology's Media Lab. They wanted me to apply my painting style to an experimental UI that was being developed for cable TV applications. Applying my traditional art skills to designing a UI helped me ease into digital art.
After that internship I used my newfound digital skills to get a job at a children's 'edutainment' studio called Funnybone Interactive. There we churned out 8-bit point-and-click stories using Macromedia's Director. As silly as the end products were, it was a great environment to get used to the collaborative process that comes with making games.
I jumped from there to a couple of miscellaneous jobs before landing at Looking Glass Studios. At Looking Glass, we were working on a World War Two flight simulator, my dream job! Looking Glass was a pretty amazing place: down the hall there was a team working on Thief: The Dark Project, and near our pit the nascent Irrational Games was working on System Shock 2. I soaked up as much knowledge as I could before the studio closed in 2000 and we all went our separate ways.
Luckily, after my layoff I landed at Irrational Games, then based in South Boston. I started off as a generalist with a focus on animation. For my first project, The Lost, I was responsible for all animation and special FX, with a few odd modeling and texturing tasks thrown in to boot. For various reasons, The Lost never shipped, and we moved on to SWAT 4, where I continued to perform as a jack of all trades. After SWAT 4 shipped, we started working on BioShock 1. Sometime during the first year of that project I moved over to the animation lead position. Next up, my role on BioShock Infinite and the two Burial at Sea DLCs was animation director. Moving past Irrational and into our new studio and our new game, I have taken on the role of art director.
The Main Functions of Animation
I agree with the statement that good animation is generally ignored and bad animation is usually called out. Ideally, you want the players to be so immersed in the world that they aren’t really thinking about the craft, good or bad.
Animation serves many functions in games, narrative being the obvious one, but narrative can be expressed in multiple ways. For instance, if you're talking about an FPS, camera and hand/weapon animations serve as the unsung heroes for selling who the character is to the audience. Whether you are a weapons expert or a novice, fleet of foot or a total klutz, how you experience basic movement and interactions in the world is largely defined by the camera and hand animations. Camera animations are especially tricky: you are constantly riding the edge between making something feel grounded and physical vs. giving a good chunk of your audience motion sickness.
In a game like BioShock Infinite the animation, as well as the rest of the art, serves the purpose of pushing the narrative forward. Even combat animations fall into that context. Who you are fighting is a direct result of what the story says and the animations have to support that. Even when you are making animations for combat you are thinking about who these characters are and why the player is encountering them. There may be animations that are more design-centric, but they still have to obey what the narrative is doing.
I’m not sure if I agree with the statement that animation is one of the biggest parts of modern games. Everything from audio to models to textures to lighting and level design and FX has to come together to make a scene or an encounter work in a game. I find that successful animators work well with others and understand how their craft fits within the larger picture.
One of the best lessons I have learned doing game animation is that theatre is an often overlooked source of inspiration, especially when you are creating scenes in which you are allowing the player to move around. In film, the camera tells the audience exactly what they are looking at. In a game that allows for any camera movement, you have to direct the audience's attention in other ways, and theatre stagecraft is a good place to draw reference from. Posing and silhouette become much more important as an animator when you are competing for the attention of a player who is allowed to move the camera around. Lighting and sound are also important tools and should always be considered when you are thinking about how the animation is going to be presented. In the Andrew Ryan scene in BioShock 1, we stepped away from a realistic office in order to construct a more abstract stage, complete with formalistic lighting, in order to sell the 'Would you kindly?' scene. For the Sander Cohen scene in Burial at Sea part 1, we used a simple dance loop to cast enormous shadows on the wall as you enter the space to further draw you in. If I remember correctly, the actual dancers casting the shadows were hovering in mid-air, hidden, and scaled up to produce the effect, completely stepping away from any kind of realism.
Outside of more formalistic expressions, just presenting a character moving around in the world still offers challenges. Movies often fill the frame with the character you want the audience to pay attention to. In a FPS, outside of a cutscene, characters will most likely be fairly small in the frame, often competing for attention with interesting backgrounds and player attention spans. How often in a movie do you see a character’s full body (head to toe) in the frame, vs. in a game? Because game characters (outside of cutscenes) tend to be smaller on screen than their film counterparts, you have to not only think about the animation, but also think about how you are going to present that moment. Sometimes that means pulling back on the environment details to help push characters forward.
Realism and Stylization
This is a pretty subjective area. Realistic motion capture vs. 100% hand-keyed animation and everything in between all have a place within video games. Ultimately the animator has to decide if they are effectively communicating to the audience. Does the audience understand what is happening? Are you getting the ideas across to the player that you need to sell? There is no monopoly that one particular style of animation has over another in communicating ideas. Most of the time you see games sticking to one style of animation or another because it makes sense narratively. I can imagine though, if someone were to make a Roger Rabbit style game, you could have a situation where mocap and squash and stretch animation could live side by side, because it makes sense in the context of that world. I’m not sure that would work so much with a serious military shooter.
Motion capture and performance capture are great tools we have available to us. It really is a subjective choice when it comes down to whether you use them or not, and depends on what you are trying to accomplish with your movement. I would never claim something to be artistically unacceptable. If it works and looks good, go for it! We happened to mocap extensively on BioShock Infinite. Raw mocap was great for quickly blocking out scenes and getting things up and running. We would usually replace or heavily clean up much of the mocap as the scene was iterated on, but the mocap generally provided a solid base for us to start from. For most of the Elizabeth scenes we tried to keep as much of the actress's movement as possible, since she really nailed the character during the mocap shoots. We also used mocap on BioShock 1. We had a much smaller animation team, two full-time animators in addition to myself, so we generally leaned more towards keeping the mocap than keying over it. That was an economic decision rather than an artistic one, though. On our current project, we haven't used mocap at all.
Again, this is a subjective question. There is always a fine line between too much animation and not enough. For instance, an overly complicated reload animation could look awesome the first three times you see it, then quickly turn obnoxious as you watch it a dozen more times. A reload animation that happens so quickly that the player can't understand what's happening isn't great either. Usually with gameplay animations you are working closely with design. They have a vision, and we're here to help them realize that vision.
One example I can think of, where we used animation as a purely artistic choice to influence the player's feelings about a character without affecting gameplay, is the additive reactions we gave Elizabeth in BioShock Infinite. To ground her in the world more, we gave her additive flinch animations that she would play when people were shooting near her. These animations didn't stop her locomotion or prevent her from performing any actions her AI wanted her to do. They sat on top of what she was already doing and provided a little bit of awareness of the world around her.
We also gave Elizabeth an emotion system that would play additive animations on top of her base AI movement. From a gameplay point of view, these animations didn't affect her at all, but from an emotional standpoint, they really helped Elizabeth feel human. These animations were small and easy to create, but had a huge effect on informing the player how Elizabeth felt. Most of the time we were using facial expressions combined with arm and hand animations, with the occasional upper torso animation thrown in. The trick was that we couldn't hijack her movement, so everything had to sit on top. Not every piece of animation content needs to be over the top; sometimes subtle movement is all you need to sell the story. When we were working on the femme fatale version of Elizabeth for Burial at Sea part 1, we used the emotion system for her smoking animations.
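The layering Shawn describes can be sketched in code. Below is a minimal illustration in plain Python (not engine code; the joint names, angle values, and class names are all invented for the example): the base locomotion pose is never interrupted, and weighted additive deltas are simply summed on top per joint.

```python
from dataclasses import dataclass

@dataclass
class AdditiveLayer:
    # Per-joint rotation deltas, simplified here to single float angles.
    deltas: dict
    weight: float = 1.0

def compose_pose(base_pose: dict, layers: list) -> dict:
    """Return the final pose: the base pose plus weighted additive deltas."""
    final = dict(base_pose)
    for layer in layers:
        for joint, delta in layer.deltas.items():
            final[joint] = final.get(joint, 0.0) + delta * layer.weight
    return final

# The base walk pose keeps playing; a flinch layer sits on top of it,
# nudging the spine and left arm without touching anything else.
base = {"spine": 5.0, "arm_l": 10.0, "arm_r": -10.0}
flinch = AdditiveLayer(deltas={"spine": -8.0, "arm_l": 15.0}, weight=0.5)

print(compose_pose(base, [flinch]))  # {'spine': 1.0, 'arm_l': 17.5, 'arm_r': -10.0}
```

Because the layer only contributes offsets, fading its weight to zero returns the character smoothly to the unmodified base movement, which is what makes this approach safe to run on top of AI-driven locomotion.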
We always started from a place of: what do we need to tell the player right now? The answer to that didn't always involve creating animation, and when it did, not all animations are created equal. When it was a moment that required animation, what type of animation are we talking about? Full camera with no control? Limited control or full control? Full camera with no control was the easiest; those are basically movies. Every player will experience that scene in the exact same way. I felt that these could also be more frustrating for the player. After all, they are playing a game, not watching a movie. They want interaction! For scenes where we left full control to the player, we usually put them in a small room with Elizabeth and provided context for that room, i.e. you're in an elevator, let's use this moment to get some narrative across. Having a small space lets the player still have control, but makes it easier for us to give the player something to focus on. It also eases the burden on level design, since when you are in an elevator we know you aren't going to accidentally walk into an ambush.
The elevator trick gets old quickly though. It's the updated version of the 'put the character behind glass so you can't interrupt them' trick. Now we're working on systems that will globally tell other AIs to stand down if the player needs a quiet moment for narrative.
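The "stand down" idea amounts to a global flag that every AI consults before engaging. Here is a hypothetical Python sketch, with all names (`NarrativeDirector`, `EnemyAI`, and so on) invented for illustration; it is not meant to reflect how any shipped game implements this.

```python
class NarrativeDirector:
    """A global authority that can suspend combat for story beats."""

    def __init__(self):
        self._quiet_moment = False

    def begin_quiet_moment(self):
        self._quiet_moment = True

    def end_quiet_moment(self):
        self._quiet_moment = False

    @property
    def combat_allowed(self) -> bool:
        return not self._quiet_moment

class EnemyAI:
    def __init__(self, director: NarrativeDirector):
        self.director = director

    def update(self, sees_player: bool) -> str:
        # Even if the player is visible, defer to the narrative director
        # so a story moment can't be interrupted by an ambush.
        if sees_player and self.director.combat_allowed:
            return "attack"
        return "idle"

director = NarrativeDirector()
grunt = EnemyAI(director)

director.begin_quiet_moment()
print(grunt.update(sees_player=True))   # "idle": narrative takes priority

director.end_quiet_moment()
print(grunt.update(sees_player=True))   # "attack": combat resumes
```

The appeal over the elevator trick is that the check lives in the AI's decision loop rather than in level geometry, so any space can host a quiet moment.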
How can you enrich the scene with animation?
Nothing draws the eye like movement! There are animation techniques, like vertex animation, that usually fall on the shoulders of environment artists and help add some punch to a scene. Flames, sparks, birds, etc. usually fall within the scope of an FX artist. It's all animation, and with the tools available to us, the person with the title of animator isn't the only one who can contribute to movement.
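Vertex animation, as mentioned, moves geometry per-vertex (typically in a vertex shader) with no skeleton or animator involved. A rough Python sketch of the classic sine-based sway often used for foliage; the formula and constants here are illustrative, not taken from any particular engine.

```python
import math

def sway_vertex(x, y, z, time, amplitude=0.1, frequency=2.0):
    """Offset a vertex horizontally over time; higher vertices sway more."""
    phase = x * 0.5 + time * frequency   # vary phase by position so a field
                                         # of plants doesn't move in lockstep
    offset = math.sin(phase) * amplitude * y  # scale sway by height
    return (x + offset, y, z)

# A vertex near the top of a plant sways; one at the root (y = 0) stays put.
print(sway_vertex(1.0, 2.0, 0.0, time=0.0))
print(sway_vertex(1.0, 0.0, 0.0, time=0.0))  # (1.0, 0.0, 0.0)
```

Because the motion is a pure function of position and time, it costs almost nothing at runtime and needs no animation data, which is why it tends to live with environment artists rather than the animation team.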
As far as more traditional animation helping sell environmental mise en scene, it really comes down to the story you are trying to tell and what you’re willing to let the player get away with. If you want an animated character in the scene, can the player interrupt them? If they can, what is the player expectation of what would happen, and can we support that? Looping FX like smoke, fire, and swarms of flies don’t really have the same expectation of interaction that a fully animated character does.
Tools of the Trade
The Max vs. Maya argument is well known within the development community, but I think that each studio and each animator has different needs. We worked in Max on BioShock 1 because I was pulling double duty as the tech animator and, at the time, BIPED was a huge time saver. Moving on to BioShock Infinite, we stayed in Max and switched over to the CAT system. This time around we had actual tech animators who could provide real support, so we could customize our rigs as needed. On the BioShock Infinite DLCs we were already moving over to Maya with fully customized rigs and tools, because we had more animators who were comfortable in Maya. This was possible because we had dedicated tech animators. There are things in Maya, like the function curves, that are miles ahead of Max, but also things, like the animation layers, that are way less useful than the ones in the CAT system in Max. There is a give and take. For me, I have to listen to the guys who are animating every day. If they are asking for custom rigs and working in Maya, then that is the right direction to move in.