VFX Production For AAA Video Games

DOOM VFX artist Wirginia Romanowska talked about the way visual effects are produced and used in video games.

Introduction

I’m originally from Katowice, a city in Poland. I didn’t have much growing up, but art and creativity have always been a part of my life. I would take scraps of clothing from my mom and sew small dresses for dolls, or spend hours painting horses in oil pastels. Photography has been a passion as well – it started when I was only a few years old, with a vintage Zenit 35mm film camera.

Easily, the biggest decision was at 17, when I spent my life savings on my first computer. My mom was so mad! My parents were upset that I was always “messing with the computer.” They wanted me to have a good education in a solid profession. Looking back, it makes sense for them to want that, but I knew the computer would be my gateway to the world. I learned Photoshop and 3ds max at night while majoring in metallurgy at university during the day.

Eventually, I got paying gigs – basic stuff, from creating websites and user interfaces to art for indie games. I finished my master’s thesis on visualizing the distribution of heat under different burn conditions. My study of materials science helped nudge me towards vfx.

VFX in Games

I’m fascinated by natural phenomena. If I wasn’t working in video games, I’d be off chasing tornadoes or demolishing old buildings. VFX is that piece of magic that takes a scene from static to dynamic, whether it’s a massive tidal wave swooping down on a city or a Death Star explosion in space.

Working on a 60 Hz game like DOOM had its challenges. We were tweaking performance right up to shipping. It was fun learning how to work with lit particles authored using a physically based pipeline. The future is bright for vfx in games: from lit, interacting participating media to real-time destruction of buildings to more accurate rendering of fluids and gases. It’s a good time to be a vfx artist!

While VFX play similar roles in movies and games – from heightening the drama in a scene to communicating information to the viewer or player – there are a few key differences.

The first is that a single CG frame can take hours to render on a render farm, whereas a single frame of a game is computed in 16 milliseconds. We typically render the detail of the simulated motion to a texture and use it as a material on a camera-facing quad that we can animate using proprietary tools. We can also bake a motion vector field and use it to replay particles in-game, as we cannot afford to simulate in real time (yet).

Another difference is that vfx in games typically have to look good from all angles. It is one thing to design a good-looking explosion seen from one specific shot, and quite another to have the same explosion hold up from a kilometer away down to a meter, from any camera angle and in any lighting condition.

A further difficulty is the evolving nature of the game engine: its capabilities change over the development cycle. If you are too aggressive with the look, the frame rate can tank; but if you are too conservative, the game doesn’t look as good as the competition. It can be quite the challenge to plan a game in advance when you don’t have final hardware.
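
To make the flipbook idea concrete, here is a minimal, engine-agnostic sketch (a hypothetical Python helper, not id's proprietary tooling) of how playback of a pre-rendered simulation baked into a texture atlas is typically driven: pick the two atlas frames to cross-fade and the UV rectangle of each, then let a shader sample both cells on the camera-facing quad and blend them.

```python
# Hypothetical helper: drive playback of a simulation baked into a flipbook atlas.
# Returns the UV offsets of the two frames to sample plus a blend weight, which a
# shader would use to cross-fade the cells on the camera-facing quad.

def flipbook_frame(time_sec, fps, frames_per_row, total_frames):
    """Return (uv_offset_a, uv_offset_b, blend) for cross-fading two atlas frames."""
    frame_f = (time_sec * fps) % total_frames
    frame_a = int(frame_f)
    frame_b = (frame_a + 1) % total_frames
    blend = frame_f - frame_a                      # 0..1 weight of the next frame

    cell = 1.0 / frames_per_row                    # size of one atlas cell in UV space
    def uv(frame):
        return ((frame % frames_per_row) * cell,   # u offset into the atlas
                (frame // frames_per_row) * cell)  # v offset into the atlas
    return uv(frame_a), uv(frame_b), blend
```

A baked motion-vector texture can then be used in the same shader to warp each cell towards the next frame before blending, which is one common way to make a low-frame-count flipbook read as smooth motion without simulating anything at runtime.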

Effects

With thousands of vfx for weapons, the player, AIs, attacks, levels and events, it’s easy to get lost in the details – part of my job is to keep an eye on the big picture. I take special care to prioritize the overall loudness of the vfx. Some effects, like muzzle flashes or bullet impacts, are more important to communicate, so they need to be brighter and bigger, whereas other vfx, like embers or mist, complement the scene, so they take a back seat as soon as combat kicks off. In DOOM we also considered the use of realistic vs. stylized effects. We reserved fx that don’t behave “real” for representing the strangest and most dangerous forms of hell energy.
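
One simple way to picture that kind of prioritization is a per-effect "loudness" tier that ducks secondary effects while combat is active. The sketch below is purely illustrative – the tiers and the ducking factor are invented for this example, not DOOM's actual values.

```python
# Illustrative only: per-effect priority tiers, with ambient vfx "ducked"
# while combat is active so gameplay-critical effects stay readable.
PRIORITY = {
    "muzzle_flash": 1.0,   # must always read clearly
    "bullet_impact": 0.9,
    "embers": 0.3,         # scene dressing
    "mist": 0.2,
}

def effect_intensity(name, in_combat):
    base = PRIORITY.get(name, 0.5)
    if in_combat and base < 0.5:   # secondary effects step back during combat
        base *= 0.5
    return base
```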

With every new effect I begin from either concept art or reference. With all gameplay vfx, like AI attacks, weapons or gore, I start simple and iterate in passes, adding more and more detail and polishing at the end. It was hard to work this way earlier in my career – I wanted to finish an effect completely and move on to the next one. However, working in passes is ultimately a more efficient way of working when you are re-tweaking things over and over as the project gets closer to shipping. The majority of effects you see in DOOM are particles, but we use other techniques too: animated geometry, meshes with scrolling textures, ribbons, lens flares and post effects.

DOOM Environments: With a wide variety of effects and multiple fx artists working on the DOOM team, being harmonious wasn’t a given. We worked hard to maintain a consistent look throughout the game.

The majority of work in vfx comes late in game production, only after level design, environment modeling, lighting, characters, weapons and narrative are close to finished. So the effects schedule is highly dependent on the work of other departments. This is one of the reasons why it’s good to prepare a library of generic effects, like fire, smoke or sparks, ahead of time. They can be placed anywhere – and with some tweaks they can be transformed into unique effects for specific uses.

Before shipping, everyone on the dev team playtests the game – that’s the best way to see if our work looks and performs in-game as intended, to see contributions from others, and to make sure the game is fun to play. We polish vfx until the last weeks before shipping, and there is always room for improvement, but we have to be very careful with any changes, as in the last days before the deadline stability and performance are the priority.

Tools

Vfx artists employ a variety of commercial software: 3ds max, Maya, Photoshop, After Effects, RealFlow, FumeFX, Houdini and others. We also spend a lot of time in proprietary software.

Any effect can be broken up into a few simpler parts; for example, an explosion particle system consists of multiple emitters that represent fire, smoke, debris, sparks and a shockwave. Each emitter is usually a single element. The look and animation of the particles are derived from a texture or material and from simulation. Real-time particles are in many ways similar to simulations you can get in any other particle simulation software, like Particle Flow in 3ds Max, and you still use common particle properties like speed, gravity or color. Proprietary software is designed with performance and fast workflow in mind, so depending on the engine’s purpose, the properties, workflow and limitations vary from project to project.
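
As a rough illustration of that "one effect = several simple emitters" idea, here is an engine-agnostic Python sketch. The emitter names, counts and values are invented for the example; a real proprietary tool exposes similar per-emitter knobs.

```python
# Engine-agnostic toy: an "explosion" built from several simple emitters.
# All names and numbers here are invented for illustration.
import random

class Emitter:
    def __init__(self, name, count, speed, gravity, lifetime, color):
        self.name, self.count = name, count
        self.speed, self.gravity = speed, gravity
        self.lifetime, self.color = lifetime, color

    def spawn(self):
        # Each particle gets a random velocity scaled by the emitter's speed.
        return [{"vel": [random.uniform(-1.0, 1.0) * self.speed for _ in range(3)],
                 "age": 0.0, "life": self.lifetime, "color": self.color}
                for _ in range(self.count)]

explosion = [
    Emitter("fire",      count=40, speed=3.0, gravity=0.0,  lifetime=0.6, color="orange"),
    Emitter("smoke",     count=20, speed=1.0, gravity=-0.5, lifetime=2.5, color="grey"),
    Emitter("debris",    count=15, speed=6.0, gravity=9.8,  lifetime=1.5, color="brown"),
    Emitter("sparks",    count=60, speed=8.0, gravity=4.0,  lifetime=0.4, color="yellow"),
    Emitter("shockwave", count=1,  speed=0.0, gravity=0.0,  lifetime=0.3, color="white"),
]
particles = [p for emitter in explosion for p in emitter.spawn()]
```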

Half of the success in vfx is always a good texture or material. Depending on the platform and engine there are different limitations, and sometimes the workflow is focused more on custom, shader-driven, procedural materials, and sometimes on animated sequences – flipbooks.

Animated textures for elements like fire or smoke can be generated either from real footage or from a rendered simulation. It’s true that footage can be the fastest way to drop an explosion into a game, but there are hidden costs: the footage can be impossible to tweak, and it will judder if played back in slow motion. There are even subtle issues, such as noise in low-dynamic-range areas, that can be difficult to remove. Simulations initially take longer to develop, but then there is much more flexibility around art direction and quality. Simulation software like Houdini also makes it easier to loop and/or tile animated textures.
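
One common way to get a seamlessly looping flipbook from a rendered sequence is to cross-fade the tail of the sequence back into its head. Houdini or a compositing package does this more elegantly; the NumPy sketch below just shows the idea and assumes the frames are already loaded as an array.

```python
# Sketch only: make a rendered frame sequence loop by cross-fading its last
# `overlap` frames back into its first `overlap` frames.
import numpy as np

def make_loop(frames, overlap):
    """frames: array (num_frames, H, W, C); returns a shorter, looping sequence."""
    out = frames[: len(frames) - overlap].astype(np.float32)
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)                            # weight of the head frame
        tail = frames[len(frames) - overlap + i].astype(np.float32)
        out[i] = tail * (1.0 - w) + out[i] * w                 # blend tail into head
    return out
```

The first output frame ends up being mostly the frame that would have followed the last one, so the jump back to the start of the loop reads as continuous motion.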

Houdini can be used for much more – to dress levels in ivy, tentacles or vegetation. Many projects utilize it for pre-visualization and animatics. It is quicker to import the animation into Houdini to see if a pillar-breaking effect is going to work than to waste time hooking it up in the game. More and more game studios use Houdini, and there are some exciting new ways of utilizing its power. The most recent is a talk given by Luiz Kruel at GDC 2017 covering baking out simulations to vertex animation. Very cool stuff and worth learning the package!
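
The core of that vertex-animation idea is simple: the simulation is baked so that one axis of a texture is the frame number and the other is the vertex index, and at runtime (normally in a vertex shader) each vertex looks up its offset for the current frame. A hedged CPU-side sketch, with the baked data represented as a NumPy array:

```python
# Sketch of the vertex-animation-texture idea: baked per-vertex offsets,
# indexed by frame, applied on top of the rest pose. In a game this lookup
# happens in the vertex shader from an actual texture.
import numpy as np

def apply_vat(base_positions, vat_offsets, time_sec, fps):
    """base_positions: (V, 3) rest pose; vat_offsets: (frames, V, 3) baked offsets."""
    frame = int(time_sec * fps) % vat_offsets.shape[0]
    return base_positions + vat_offsets[frame]
```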

Functions

Vfx play two, often contradictory, roles in video games. The first, and most important, is that of communicator. Vfx form a non-verbal language that hints to the player whether something is dangerous or friendly, deadly or life-giving. We employ embedded cultural expectations, such as white or green for healing and red for damage. It’s a kind of shortcut handed down from the days around the campfire, when the red fire would burn you if you got too close.

The second is in selling a believable world to the player. From the fiery sparks tumbling down a foundry crucible to the tufts of smoke from a pillar of ice, it sells the idea of a dynamic, evolving world full of danger and mystery. Without effects, if the player were to suddenly stop walking, the screen would seem to freeze. Even if the player isn’t aware of it, the vfx convince them that the world is a living one.

The difficulty arises when you want to make vfx for a weapon and you are limited in your palette. It would be strange for a gauss gun that fired blue projectiles to suddenly fire red ones. Or if demons suddenly glowed green, unless they were friendly. It’s easy to make particles and ribbons and things cover the screen. It is far more difficult to clearly communicate to the player what is about to happen through a visual language – and that’s what vfx is at the end of the day.

DOOM Characters: Clear communication with the player, supported by vfx, makes DOOM combat extremely satisfying.

Costs

Vfx artists typically have the hardest time controlling their budgets. While modelers and world-builders have polygon counts and animators have bone counts, vfx cost is extremely view-dependent. Look at a particle system on-axis and the overdraw can be murder. Or walk backwards spawning grenades, and the full-screen puffs of smoke pile up, driving your frame rate down to single digits. These problems are especially pernicious in that they may only last for a few frames. Players may not notice the stutter, but they will be left feeling that their character is less responsive in a heavy firefight.

We have tools in our belt to fight these problems, from rate-limiting and fading particles as they get too close or too far away, to spawning particles on the GPU. You can take things further by drawing to a smaller off-screen buffer or decoupling the lighting from the drawing of the particles.
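
As a simple illustration of the first of those tools, a distance-based multiplier on spawn rate and opacity might look something like the sketch below. The thresholds are invented for the example; the point is only that an effect's cost gets scaled down in exactly the situations where overdraw explodes.

```python
# Illustrative only: scale spawn rate / opacity by camera distance so an
# effect fades out when the camera is inside it (worst-case overdraw) and
# stops spawning entirely when it is too far away to matter.
def spawn_scale(distance_to_camera, near_fade=2.0, far_cull=80.0):
    if distance_to_camera < near_fade:
        return max(distance_to_camera / near_fade, 0.1)  # camera inside the effect
    if distance_to_camera > far_cull:
        return 0.0                                       # too far away; cull
    return 1.0
```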

One overlooked aspect is organization. If you simply copy a particle system into various maps and then mutate the copies, it will be very difficult to hunt them all down for optimization as you get closer to shipping. Keeping particle systems in their own layers and directories can help enormously. Instead of having a thousand different kinds of smoke, make three sizes of smoke and try to reuse them as much as possible. Organization isn’t a sexy topic, but it will save you tons of time on the back end.

Advice

To learn vfx, first cover the basics: learn a bit about modeling, lighting, UV mapping, etc. Personal projects are a great way to learn. Model a book and break it. Then make a bookshelf with constraints and break that. Or melt a goblin. Then bury a hundred goblins in sand. Make a “particle zoo”, a collection of basic effects, in Unreal and share it online. Replicating tutorials is important, but push them further. And frame-step through video captures of commercial games. There’s a lot more going on in that laser effect than you notice at first glance!

As you progress, start to specialize: vfx, even in games, is not really a single discipline anymore. Some artists are more tech-oriented – they write shaders and create unique materials, focusing on HLSL, Substance Designer, materials, etc. Others, and I’m in this category, prefer working with particles, rigid body simulations, dynamics and breakables. Those artists should definitely download a copy of Houdini Apprentice and use it. It doesn’t hurt to try everything when you are starting out, but eventually you’ll want to go deep on a few topics.

I learned a lot from books like Pete Draper’s Deconstructing the Elements with 3ds max. Later on I went through all of Allan McKay’s FumeFX, AfterBurn, Particle Flow and Thinking Particles tutorials. SideFX has some great tutorials on Houdini on their website, as do Steven Knipping and Entagma.

And remember that vfx artists never stop learning. Researching and adapting new technologies is part of our job. Regardless of what you do, keep in touch with the artist in you – paint, sculpt, do art in whatever form you like most, even if it’s not CG. Just a year ago, after a break of over 20 years, I went back to painting in oil pastels, and it’s enormously satisfying – it keeps me sane and keeps my artistic intuition at its best!

I want to thank and credit the vfx team on DOOM, especially Derek Best, Marc Edwards and Todd Boyce, for all their hard work and huge contribution to the outstanding visuals and gameplay of DOOM.

Wirginia Romanowska, VFX Artist

Interview conducted by Kirill Tokarev
