VFX artist Thomas Harle talked about the way he approaches VFX and shaders for video games.
Hi there! My name is Thomas Harle, and I’ve been a visual effects artist in the video game industry for the past six years. In that time, I’ve worked on games such as Sniper Elite 3, Champions of Anteria, and Gears of War: Ultimate Edition, as well as a few mobile and unreleased projects. Most recently, I’ve left full-time development to focus on freelance work, making content for the Unreal Marketplace, and teaching games VFX with Escape Studios in London.
VFX in video games is one of the broadest disciplines in game development and varies immensely from studio to studio and project to project. During my time as a visual effects artist, I’ve worked on everything from magical spell effects to dynamic weather systems and everything in between!
Personally, I come from quite a technical background, having worked as a structural engineer before becoming a visual effects artist. But video game FX as a whole encompasses a huge set of skills, ranging from the technical to the artistic. I’ve worked with artists whose strengths lay in traditional art skills and with others who were fluent in multiple programming languages—artists truly bring their own talents to the discipline. And don’t stress if you’re thinking about a career in VFX but don’t have coding experience: it’s a very nice skill to have but definitely not a requirement!
Generally, we can break down video game VFX into three main categories. The first category is gameplay-driven VFX. Character-based magic effects in games like League of Legends and DotA, or muzzle flashes and bullet impacts in games such as Overwatch and Counter-Strike, are prime examples. These effects must be carefully executed to provide the player with a rewarding experience and to immediately convey information back to the player, ensuring that gameplay is quick, fluid, and understandable.
Gameplay FX in LoL and Overwatch
The second main category is environmental effects: background movement that makes the game world feel alive. This can be anything from wind blowing through trees, to leaves, rain, and snow falling on the ground, to much, much more. These background effects are equally important in creating a realistic and believable world. However, they are generally a little more forgiving than the other two categories because artists know exactly when and where the player will encounter them, which makes matching lighting and other conditions relatively easier.
Environmental FX in Uncharted and The Last of Us
The last category of FX is cutscenes. Cutscenes aren’t present in every game, but when they are, they tend to be the most complex aspect of visual effects. They often involve large amounts of animated meshes, choreographed timing, and interdepartmental work with character and environment artists. As game consoles become more powerful and graphics come closer to reality, visual effects artists are borrowing more and more film techniques to layer complexity into cutscenes. These additional levels of intricacy also mean that cutscenes take considerably longer to produce.
A Simulated Cutscene from Ryse: Son of Rome
Techniques and Workflow
As with all artistic disciplines, it’s always best to start with good references. Specifically with VFX, artists want to use video references, as the motion and timing of an effect are two of the most fundamental qualities for its in-game appearance and credibility. I’ve managed to amass a large collection of domestic-scale reference videos of things like water, small fires, and smoke, but YouTube is the best place for more outlandish reference material like explosions, gunfire, etc.! Another great resource is the FX References Facebook group, where people share reference videos. Clearly, it’s more difficult to find real-world comparisons for magic-oriented FX; in that case, I’d recommend exploring other games, films, and concept art as valuable resources.
Heroes of the Storm FX Concepts
Once you’ve collected your references, you need to break down the effect into its constituent parts. For example, when referencing a fire to create a realistic fire effect, you might break it down into:
- Large looping flames
- Smaller flames that burn out
- Glowing coals or wood
- Small embers
Campfire with FX components
As you can see, things get pretty complicated quite quickly! For each section, you then want to break down the color, timing, and motion of the effect and begin replicating them in the game engine. When you’re working on an effect, you usually want to start with the largest and most important section first—in our fire example, the large flames—and build things up from there. The texture → material → particle workflow is very iterative, and you’ll spend a lot of time going back to tweak the original textures and materials once you start to see the whole effect take shape.
In addition to particle effects, there is plenty of work to be done on animated materials, or shaders. Objects like water and lava are good examples because each contains complex motion that is challenging to recreate in a game engine without producing very apparent artifacts like tiling. Moreover, shaders have started to be used to drive animation by baking vertex animation down into textures that are read back at runtime—it’s very powerful stuff!
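To make the vertex-animation-texture idea concrete, here is a minimal sketch in Python (not shader code) of how a baked texture might be decoded at runtime. The layout and names are assumptions for illustration: one texture row per animation frame, one column per vertex, with RGB in [0, 1] remapped back to positions using bounds recorded at bake time.

```python
# Hypothetical sketch of decoding a baked vertex-animation texture (VAT).
# Each texture row is one animation frame; each column is one vertex.
# Normalised RGB samples are remapped to object-space positions using
# the bounds that were recorded when the animation was baked.

def decode_vat_position(texture, frame, vertex, bounds_min, bounds_max):
    """Return the object-space position of `vertex` at `frame`."""
    r, g, b = texture[frame][vertex]      # normalised RGB sample in [0, 1]
    return tuple(
        lo + channel * (hi - lo)          # remap [0, 1] -> [lo, hi]
        for channel, lo, hi in zip((r, g, b), bounds_min, bounds_max)
    )

# Tiny 2-frame, 1-vertex "texture": the vertex moves upward over time.
texture = [
    [(0.5, 0.0, 0.5)],   # frame 0
    [(0.5, 1.0, 0.5)],   # frame 1
]
print(decode_vat_position(texture, 1, 0, (-1.0, 0.0, -1.0), (1.0, 2.0, 1.0)))
# -> (0.0, 2.0, 0.0)
```

In a real shader this lookup runs per vertex in the vertex shader, with the frame index driven by game time.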
If you’re interested in seeing more of what shaders can achieve, check out this website.
Recently, a new forum launched as a platform for discussing and sharing shaders. The website kicked things off with a competition to recreate a realistic snowball—I made my entry (below) with Unreal Engine 4 as my renderer.
Final Snowball Render
It’s not the most complex shader I’ve ever made, but I focused much more on the finished render than on performance. Consequently, I was able to use a few techniques that normally wouldn’t pass in a game environment: high tessellation and displacement, as well as parameters tailored specifically to the lighting conditions in the provided scene.
Using a tiling noise Normal Map and a clouds mask, I started trying to recreate the surface texture of a snowball. Here, I’m blending between four different layers—three copies of the Normal Map with different tiling, intensity, and rotation, and a perfectly flat Normal (the 0,0,1 node above the Lerps). This recreates the different densities of snow, ranging from areas packed perfectly flat to rougher, more uneven patches. Often in shader development you’re trying to hide the tiling repeats in your material while also keeping the number of texture lookups down, so this technique of combining the same texture with itself at different UV scales is very useful.
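The Lerp chain above can be sketched as plain math. This is an illustrative Python version, not Unreal node code; the function and mask names are my own, and the inputs stand in for normal-map samples taken at different UV scales and rotations.

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two vectors, like Unreal's Lerp node."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def blend_snow_normal(n_coarse, n_medium, n_fine, mask_a, mask_b, mask_flat):
    """Blend three tiled normal samples, then fade toward flat (0, 0, 1)."""
    n = lerp(n_coarse, n_medium, mask_a)      # large vs medium detail
    n = lerp(n, n_fine, mask_b)               # layer in fine detail
    n = lerp(n, (0.0, 0.0, 1.0), mask_flat)   # packed-flat snow areas
    return normalize(n)
```

Where `mask_flat` is 1.0, the surface reads as perfectly smooth; where it is 0.0, the full blended detail shows through.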
Here are the masks that control the Normal blends—notice again that they use the same noise texture, just at different scales and rotations. With some simple subtraction, multiplication, and clamping, I’m able to get very high-contrast masks out of my grayscale noise. Also notice the named Param (Parameter) nodes—these are very powerful tools in Unreal that let you build your shader logic with one set of values, then adjust them and instantly see the results in real time, which is very helpful for quick iteration. They can also be instanced to create a very different-looking snow from the same shader with different parameters.
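The subtract–multiply–clamp trick is simple enough to write out. Here is a small Python sketch of the idea (the threshold and contrast values are illustrative, not the ones from my graph):

```python
def saturate(x):
    """Clamp to [0, 1], like the HLSL saturate / Unreal Clamp node."""
    return max(0.0, min(1.0, x))

def contrast_mask(noise_value, threshold, contrast):
    """Subtract a threshold, scale the result up, then clamp to [0, 1]."""
    return saturate((noise_value - threshold) * contrast)

# At high contrast, soft grayscale noise resolves sharply to 0 or 1:
print(contrast_mask(0.40, 0.5, 10.0))  # -> 0.0
print(contrast_mask(0.75, 0.5, 10.0))  # -> 1.0
```

Raising the contrast value tightens the transition band around the threshold, which is exactly what turns smooth noise into crisp blend masks.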
These same noise masks are also combined in a slightly different way to create a pseudo-random displacement, where one mask controls the displacement amount, another controls the contrast, and a third controls the overall intensity.
Here, the Light Vector parameter reads the direction of the main directional light in the scene, which I’ve written to a Material Parameter Collection from a separate blueprint. This allows me to do a Dot Product with the Vertex Normal—basically checking whether each face points in the same direction as the light, i.e. whether it is lit or in shadow.
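The dot-product test is worth spelling out. This Python sketch assumes both vectors are unit length and that the light vector points from the surface toward the light; the clamped dot product then becomes the lit/shadow mask:

```python
def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def lighting_mask(vertex_normal, light_dir):
    """1.0 when the face points at the light, 0.0 when it faces away.

    Both inputs are assumed to be unit vectors, with `light_dir`
    pointing from the surface toward the light.
    """
    return max(0.0, min(1.0, dot(vertex_normal, light_dir)))

light_to_surface = (0.0, 1.0, 0.0)  # light directly overhead
print(lighting_mask((0.0, 1.0, 0.0), light_to_surface))   # -> 1.0 (fully lit)
print(lighting_mask((0.0, -1.0, 0.0), light_to_surface))  # -> 0.0 (in shadow)
```

Faces at grazing angles land somewhere between 0 and 1, giving a soft falloff rather than a hard lit/shadow split.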
This lighting mask allows me to do an essential step in making this snow look realistic: changing the color of the subsurface scattering. Based on the illumination of each face, I use a subtle blue for the shadows and a warmer yellow/orange for the lit areas. Similarly, this lighting mask lets me add extra dirt darkening on the shaded side and use a different roughness for the lit and shaded sides.
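The subsurface tint is just a lerp between two colors driven by the lighting mask. A minimal sketch, with the actual color values assumed for illustration:

```python
def lerp_color(a, b, t):
    """Blend two RGB colors; t=0 gives a, t=1 gives b."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

SHADOW_SSS = (0.55, 0.65, 0.90)  # subtle blue (assumed values)
LIT_SSS    = (1.00, 0.90, 0.70)  # warm yellow/orange (assumed values)

def subsurface_color(light_mask):
    """Pick the subsurface tint based on the lit/shadow mask."""
    return lerp_color(SHADOW_SSS, LIT_SSS, light_mask)
```

The same mask value can drive the dirt darkening and roughness blends in parallel, so all three effects stay in sync across the lit/shadow boundary.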
The finishing touch for my entry was faking the tiny sparkle highlights of a snowball. To do this, I simply used a dots texture and combined it with itself in various scales. Again, this gets combined with the lighting mask to ensure only the lit side of the snowball has the sparkles.
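As a rough Python sketch of that sparkle trick: sample the same dots pattern at two different scales, multiply them so only the overlaps survive (which reads as random), then gate the result by the lighting mask. The procedural `dots` function here is a stand-in for the actual dots texture.

```python
import math

def dots(u, v, scale):
    """Procedural stand-in for a dots texture: bright near cell centres."""
    fu, fv = (u * scale) % 1.0, (v * scale) % 1.0
    d = math.sqrt((fu - 0.5) ** 2 + (fv - 0.5) ** 2)
    return 1.0 if d < 0.1 else 0.0

def sparkle(u, v, light_mask):
    """Combine the pattern with itself at two scales, then mask by lighting."""
    # Non-integer scale ratio keeps the overlaps from lining up visibly.
    return dots(u, v, 7.0) * dots(u, v, 13.0) * light_mask
```

Because the result is multiplied by the lighting mask, the shadowed side of the snowball stays sparkle-free, which matches how real snow only glints where light hits it.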
Hopefully, that provides you with a little insight into the processes and types of work visual effects artists do in games. It’s definitely one of the most challenging and rewarding parts of game development (at least I think so!), as it combines both technical and artistic skills while touching on most parts of the game engine and design.
If you want to know more, there are some great tutorial series on Cascade and FX in Unreal by Epic Games. And be sure to come on over to RealTimeFX where we’re always sharing references and ideas as well as hosting regular competitions.
Have fun making VFX!