Raouf Bejaoui and Matthijs Verkuijlen talked about the Climate Aware Material and Temperature Aware Material systems they created for Gears Tactics that allowed the artists to control the level of wetness, water accumulation, burning, and other effects in the game.
Raouf: My name is Raouf Bejaoui, and I am a Technical Art Director here at Splash Damage. I have been working on Gears Tactics for the past 3 and a half years.
Matt: My name is Matthijs (Matt) Verkuijlen. I’m a Senior Technical Artist at Rare Ltd but worked on Gears Tactics for over a year at Splash Damage as a Technical Artist.
This article offers an in-depth look at two of our most ambitious systems which helped us build the world of the game: the Climate Aware Material system and the Temperature Aware Material system built for Gears Tactics (abbreviated to CAM and TAM respectively).
Gears Tactics is a turn-based strategy game from the acclaimed video game franchise - Gears of War. Whilst players were experiencing the game from a new perspective, we aimed to make it recognizable from the aerial camera and authentic to the bold and gritty look the franchise is known for.
The vast majority of Gears Tactics’ levels are generated at run-time from a catalogue of tiles which we called “Parcels”. This gave our environments more variety: we could combine different parcels in each biome and add local lights and fire elements in key spaces, all under different lighting and weather conditions. Weather played a key role in building our visual language “Matrix”, combining all the aforementioned elements to achieve our quality and variety goals. However, with most of our levels being generated at run-time, we knew that a typical offline solution would not scale well to the huge amount of content we had to deal with.
We started prototyping the CAM and TAM systems with a small team of VFX and Technical Artists a couple of weeks after Pre-Production. The goal was to build a scalable system with a mixture of proceduralism for both the shading and effect components, whilst maintaining a fully parametric system that would allow our Artists to tweak the look and behaviour of the weather and incendiary effects.
A Material World 2.0
Introducing the Gears of War universe to a new genre brought a new set of challenges. We had to adjust a lot of our content to ensure that the action was readable from the aerial camera perspective. Our ground composition in particular became one of the most critical visual features in the game as we had to create a rich world, depicting the destroyed beauty of Sera on one hand whilst maintaining a readable space for tactical clarity on the other.
Gears Tactics shipped with 3 acts that each have a very distinctive environment style, called a “Biome”. Every one of these acts has its own library of “Atmospheres”, each containing a distinctive lighting, post-processing and, when applicable, VFX set-up.
Despite the richness and exaggeration already established in the franchise’s iconic imagery, the Technical Art team worked very hard with all Art disciplines to make the lighting and material definition in environments and on characters as physically accurate as possible. With missions generated at run-time with a randomly selected lighting scenario, we had to cover a complex matrix of materials (some bespoke to each Act in the game) and atmospheres whilst maintaining consistency across the board. We developed a PBR view mode with a colour chart that allowed our Artists to preview how physically accurate their materials were in the Unreal Editor’s viewport. Alongside this, we replicated the PBR validation as a material function in Unreal and as a Substance filter in Painter to allow our Artists to quickly correct Albedo and Metallic values with a few sliders in case they were outside of the recommended range.
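As a rough illustration of what such a validation could look like, here is a minimal Python sketch of an albedo range check. The function names and the recommended ranges below are our own placeholders, not the values used on Gears Tactics:

```python
# Hypothetical sketch of a PBR albedo range check, similar in spirit to the
# validation view mode described above. Range values are illustrative.
def srgb_to_linear(c):
    """Convert a [0, 1] sRGB channel to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb_linear):
    """Rec. 709 relative luminance of a linear RGB triple."""
    r, g, b = rgb_linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def validate_albedo(rgb_srgb, metallic):
    """Return True if the 8-bit sRGB albedo sits in a plausible PBR range.
    The 30-240 (dielectric) and 180-255 (metal) bounds are placeholders."""
    lin = [srgb_to_linear(c / 255.0) for c in rgb_srgb]
    y = luminance(lin)
    if metallic > 0.5:
        lo, hi = srgb_to_linear(180 / 255.0), 1.0   # metals: bright reflectance
    else:
        lo, hi = srgb_to_linear(30 / 255.0), srgb_to_linear(240 / 255.0)
    return lo <= y <= hi
```

A view mode like the one described would simply colour-code pixels by the result of such a check.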
For lighting, we implemented an HDR validation view mode which allowed us to maintain consistency across the intensities of all light sources - from LED lights on character armours, to street lamps, all the way up to the primary light source (Sun/Moon light).
Singing in the Rain
The addition of wetness in some of our Atmospheres added an extra layer of complexity to our material authoring pipeline. We built the CAM system to handle rain and wetness in our environments and on characters at run-time regardless of the biome and lighting conditions used. The rain and wetness intensities are controlled globally by Artists for each Atmosphere or on a per-cinematic basis when applicable.
In the same way that we kept our material response and lighting PBR compliant, we wanted the wetness response of our materials to be as physically accurate and as Artist-friendly as we could possibly make it. Having spent so much time iterating upon our PBR-compliant material authoring pipeline, we didn’t want to approximate wetness with a simple roughness multiplier, which would make everything shiny on screen in wet conditions. The Gears of War art style relies heavily on material variation, with break-ups in tone and specular response giving visual cues to the players, such as the “golden path” on ground materials.
The system takes care of both water absorption and accumulation (overflow). For the absorption component, we studied wetness based on previous research such as Sébastien Lagarde’s impressive writings on his findings for Remember Me. We developed a function that takes the Albedo, Roughness, and Metallic components of the underlying material and computes a “wet” version at run-time based on the physical properties of the material and the wetness amount. Inspired by the research done by Hobbs, the University of Oslo, and Lekner/Dorf, our function treats the albedo component of dielectric materials (non-metals) with a darkening effect due to water absorption.
With a typical physically based BRDF (bidirectional reflectance distribution function), we expect the light reflected off a surface to be no greater than the incoming light. In the case of rough materials, we derived the porosity of the affected material from its surface roughness multiplied by a predefined coefficient.
When a surface is covered by a thin layer of water, we simulate the amount of light that gets “trapped” between the water medium and the affected surface. Because more of the light gets scattered inside the medium instead of reflected back to our eyes, the darkening effect simulates the amount of light “lost” during this process, which never contributes to the overall diffuse term. Porous materials in particular are more sensitive to this phenomenon: each pore on the surface gradually fills with water, which replaces the surface’s weaker diffuse reflection with a stronger specular reflection, whereas a smooth material absorbs less water as it “glides” around its surface. Deriving material porosity from the roughness component allowed us to retain all the nuances of the underlying material definition’s diffuse and specular reflections and gave us more credible results. In practice, rough materials such as sand, rock, or fabric reflect less of their diffuse contribution and more specular in wet scenarios, whilst smooth materials such as polished metal or marble on vertical surfaces keep their diffuse reflectance almost untouched.
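The darkening described above can be sketched roughly as follows. The porosity coefficient and the darkening/smoothing factors here are illustrative guesses, not the shipped values:

```python
def wet_surface(albedo, roughness, wetness, porosity_coeff=0.4):
    """Darken the diffuse term and smooth the surface as a porous material
    gets wet. Porosity is approximated as roughness * porosity_coeff, as
    described in the article; all other constants are placeholders."""
    porosity = min(roughness * porosity_coeff, 1.0)
    absorption = wetness * porosity                      # diffuse light "lost"
    wet_albedo = [c * (1.0 - 0.6 * absorption) for c in albedo]  # darkening
    wet_roughness = roughness * (1.0 - 0.8 * wetness)    # thin water film smooths
    return wet_albedo, wet_roughness
```

Note how, under these assumptions, a rough (porous) material darkens noticeably while a smooth one barely changes, preserving the break-ups in the underlying material definition.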
Taking the metallic component as an input, the function also modified the specular reflectance of raw metals (beyond a certain threshold). In a physically based BRDF, a metal’s specular reflection colour is derived from its albedo component, as opposed to dielectrics, which remain neutral and are only tinted by the light colours. Metals underneath a layer of water lose some of their specular reflectance and therefore some of that coloured specular reflection. We implemented an additional Fresnel effect that interpolates between a facing (90°) angle, at which the metal’s specular reflection is slightly reduced based on the water level, and a grazing angle, at which the underlying metal’s reflection is completely gone, substituted by the water layer’s reflections.
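A rough sketch of that Fresnel blend, assuming a Schlick-style grazing term and an illustrative water F0 of 0.02; all coefficients are our own placeholders, not the game's values:

```python
def wet_metal_specular(f0_metal, water_level, cos_view):
    """Blend a metal's coloured specular toward the neutral water-film
    reflection: slightly reduced at facing angles, fully replaced at
    grazing angles. Coefficients are illustrative."""
    fresnel = (1.0 - cos_view) ** 5          # Schlick-style grazing term
    water_f0 = 0.02                          # ~IOR 1.33 water
    water_reflect = water_f0 + (1.0 - water_f0) * fresnel  # water-film Fresnel
    # slight loss of coloured specular at facing angles, scaled by water level
    reduced = [c * (1.0 - 0.2 * water_level) for c in f0_metal]
    blend = fresnel * water_level            # metal reflection vanishes at grazing
    return [r * (1.0 - blend) + water_reflect * blend for r in reduced]
```

At cos_view = 1 (looking straight down) the gold-like input keeps its tint; at cos_view = 0 (grazing) the result is a neutral, water-dominated reflection.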
For the water accumulation, we created two separate effects for vertical and horizontal surfaces, whilst blending everything in-between for slopes. On characters, this is represented by drops of water on horizontal surfaces and drips on vertical ones, portraying the water accumulating on the surface and traveling down under the effect of gravity on smooth surfaces (we did not allow this on more porous materials such as fabric which only got the water drops effect even on vertical surfaces).
For environments, the accumulation was not only driven by the material definition, but also by the height of the object affected. One of the most critical features of the CAM system was the addition of puddles on the ground - because we were generating levels at run-time, the puddles had to work across different parcels of different sizes that could be randomly rotated 45° at run-time. Manually painting puddles with vertex colour on a per-tile basis was not an option, so we decided to leverage procedural texturing at both macro and micro scales.
At a macro level, we generated a set of procedural noises in Substance Designer that we projected vertically at different scales; it allowed us to isolate wetter areas (puddles) across multiple tiles without ever having to worry about texture seams between multiple parcels.
At a micro level, we combined the procedural noises with the underlying material’s heightmap which allowed us to break up the tiling pattern of the procedural noises and better localize the puddles.
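A toy version of that macro/micro combination might look like the following. Here `value_noise` is a stand-in for the Substance Designer noises, and the threshold maths is a guess at the general shape of the effect:

```python
import math

def puddle_mask(world_x, world_y, height_map_value, wetness_level):
    """World-space puddle mask: macro noise projected vertically (seam-free
    across parcels) broken up by the material heightmap at micro scale.
    All constants are illustrative."""
    def value_noise(x, y, scale):
        # cheap deterministic pseudo-noise in [0, 1], purely illustrative
        return 0.5 + 0.5 * math.sin(x * scale * 12.9898 + y * scale * 78.233)

    # macro: two world-space noise scales isolate the wetter areas
    macro = value_noise(world_x, world_y, 0.05) * value_noise(world_x, world_y, 0.013)
    # micro: water settles in the low points of the surface
    micro = 1.0 - height_map_value
    mask = macro * micro
    # a higher wetness level lowers the threshold, growing the puddles
    return max(0.0, min(1.0, (mask - (1.0 - wetness_level)) * 4.0))
```

Because the noise is sampled in world space rather than per tile, two adjacent parcels evaluate the same function and the puddles line up across the seam for free.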
We built an interface that allowed our Artists to tweak the rain and wetness for each atmosphere. The wetness was primarily driven by 2 parameters: Wetness Intensity (generally higher in rainy conditions) and Wetness Level. The Wetness Level controlled the amount of water accumulated on top of each surface, which allowed us to depict almost flood-like sections of our levels - because the water level function was driven by height, more water accumulation can be seen on lower grounds in environments with different levels of elevation.
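The height-driven accumulation could be sketched like this; the `falloff` parameter and its default are hypothetical:

```python
def water_accumulation(world_height, water_level, falloff=2.0):
    """More water on lower ground: accumulation is full at or below the
    global water level and fades out 'falloff' metres above it.
    Parameter names and units are hypothetical."""
    return max(0.0, min(1.0, (water_level - world_height) / falloff + 1.0))
```

Raising the global Wetness Level then floods the lowest parts of a parcel first, which matches the flood-like sections described above.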
The Building Is on Fire
Gears Tactics is a character-driven story introducing new heroes in a story set 12 years before the events of the original Gears of War. We started development on the TAM system (Temperature Aware Material) a couple of weeks into production with the goal of supporting our game’s narrative with the addition of large-scale fire areas in our environments. [Spoiler alert] Early on in the game, the Coalition of Ordered Governments, led by Chairman Prescott, decides to launch a full planetary strike using the Hammer of Dawn weapon arsenal. [/Spoiler alert] From an Art Direction standpoint, we wanted to portray vast environments scorched by raging fires at the beginning of the adventure, and gradually reduce the intensity and density of the fire as our heroes flee the epicentre.
At its core, the system handled 4 components: Fire effects (either particles or cards), lighting, scorched/burnt materials, and audio. Each TAM actor could be placed in the map by our VFX team and contained a light associated with the effect to give it an incandescent look in dark spots or during night time but also an Audio actor for ambient sound - this allowed us to significantly reduce the back and forth between multiple disciplines as all of these components were controlled by a single actor.
We aimed at creating a versatile system allowing our Artists to control the intensity of the fires and their effect in the surrounding environments - we exposed some global parameters that allowed us to take all the TAM actors in a scene at run-time and globally control the intensity, ranging from burnt-out fireplaces with smoke and scorch marks to raging fires with incandescent lights, sparks and so on.
The fire effects catalogue was authored by our talented VFX Artists Iryna Tkach and Mustafa Alobaidi. Each TAM actor had a Fire and Soot intensity exposed to the user which was adjustable on a per fire cluster basis alongside a scaling/temperature range that would support any range of values from nothing to inferno intensities. The soot applied to the surrounding environment was projected vertically with a set of deferred decals to save on performance, depicting a thin layer of smoke particles as hot gases travel upwards because of their lower density.
One of the most ambitious applications of this system was the way we handled material interactions with the fire sources. We used Unreal’s Physical Materials to define the visual effect and audio response when interacting with physical objects in the world, such as footsteps on the ground or bullet impacts on walls and other objects. The TAM system contained a collection of material functions that would take an input position (surrounding fire(s) position) and a radius (surrounding fire(s) area of effect) and output a burnt-looking version at run-time for each physical material. We relied on physical materials for this effect as the underlying material definition was not enough to distinguish one type of dielectric from another. The Technical Art team spent a lot of time analyzing burnt materials from real-world references to give each material type a distinct look:
- Alpha cards for vegetation and fabric leave glowy, ember-like effects around them with a reduced alpha coverage depending on the fire’s proximity;
- At the hottest point, wood turns into grey ashes with glowing embers surrounding them, gradually fading out into a darker charred wood;
- Concrete gets its albedo discoloured at the burning point and accumulates soot at higher elevations, whilst the surface normal and microstructure get a higher-frequency noise to make the surface look like it is cracking under the heat;
- For metals, we implemented an approximation of a real-world steel temperature chart with a range between 200°C and 1200°C for the tempered metals effect, which affects the albedo component until the 600°C mark, then gradually affects the emissive component until the melting point is reached at around 930°C. At the melting point, the metal surface gets bent/deformed by a vertex shader function based on the mesh scale, thickness, and proximity to a fire source.
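The tempered-metal bands above could be sketched as follows. The 200°C/600°C/930°C breakpoints come from the description, but the linear ramps are our own simplification:

```python
def tempered_metal_response(temp_c, melt_c=930.0):
    """Split the 200-1200°C range into an albedo-tint band (200-600°C)
    and an emissive band (600°C up to the melting point). The linear
    ramps are an illustrative simplification of a steel temper chart."""
    tint = max(0.0, min(1.0, (temp_c - 200.0) / (600.0 - 200.0)))
    emissive = max(0.0, min(1.0, (temp_c - 600.0) / (melt_c - 600.0)))
    melted = temp_c >= melt_c   # vertex deformation would kick in here
    return tint, emissive, melted
```

In a shader this would feed a colour ramp for the tint and an emissive multiplier, with the `melted` state gating the vertex deformation.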
Because we were controlling the burnt effect on a per-material basis via a material function, we had to cap the number of fires that could contribute to the same mesh at any given time for performance reasons. We built an offline process that parses all fires in a parcel and picks the 4 closest fire sources to affect each mesh. The material function then retrieves the position, intensity, and radius of each TAM actor and uses a sphere mask combined with a procedural noise and the underlying material’s heightmap to give the effect a more natural look.
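The two steps above - the offline fire selection and the run-time sphere-mask break-up - could be sketched like this; the data layout and blend weights are hypothetical:

```python
import math

def closest_fires(mesh_pos, fires, max_fires=4):
    """Offline step: pick the N closest fire sources for a given mesh."""
    return sorted(fires, key=lambda f: math.dist(mesh_pos, f["position"]))[:max_fires]

def burn_amount(pixel_pos, fire, height_map_value, noise_value):
    """Run-time step: sphere mask around one fire, broken up by procedural
    noise and the material heightmap for a more natural edge. The blend
    weights are illustrative."""
    d = math.dist(pixel_pos, fire["position"])
    sphere = max(0.0, 1.0 - d / fire["radius"])          # 1 at centre, 0 at radius
    breakup = 0.5 * noise_value + 0.5 * height_map_value  # roughens the edge
    return max(0.0, min(1.0, sphere * fire["intensity"] - 0.3 * breakup))
```

The material function would evaluate `burn_amount` for each of the (at most) 4 selected fires and take the maximum contribution per pixel.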
The TAM and CAM systems supplied us with a lot of variation, with little additional authoring needed. Overall, we were happy with the quality of the material functions; the puddles and the fire and soot effects all seemed to interact well enough without manual tweaking. Iteration times were also drastically reduced; for instance, our VFX and Lighting team requested a secondary light source on fire actors at a fairly late stage in production. Because we took a systemic approach for fires, this change took a few minutes to implement in the system and propagated instantly across the entire game.
Changing weather conditions over the course of a mission was something we experimented with early on and would have been interesting to explore further. The systems achieved their overall goals, hitting both our quality and variation targets whilst greatly expanding our visual matrix.