Ludovico Antonicelli showed how he created the amazing master material for his UE4 project, which opened up a lot of easy and fast customization options for textures.
The design of the sci-fi environments is inspired by the works of Tor Frick. In particular, the design of one model in this scene was completely taken from Tor’s portfolio. So, be aware of this. If you’re interested in learning more about Tor’s work, make sure to follow him on Artstation and check out his website. It’s got tons of hard-surface goodness. And please, try to do your own designs. It always pays off.
Hello there! My name is Ludovico Antonicelli, I live in Milan, Italy and I just graduated from a three-year CGI program at IED Milano (Istituto Europeo di Design). Inspired by series like Dead Space and Mass Effect, I focused on environment design during my studies, and since then have learned UE4. To cap off my time at school, I worked on a 10-person thesis project called Echoes from Cryo, a game demo built with Unreal Engine 4. Over the course of this project, I enjoyed managing materials and defining the workflow for our team — hence why I was the environment/technical artist on this project! Currently, I’m expanding my portfolio but will soon be on the job hunt in the video game industry.
Echoes from Cryo
In Echoes from Cryo, I was responsible for the creation of a snowy, natural environment that acts as the setting for an intro cinematic before the game begins; hopefully, I can go into greater detail about this environment another time on 80lv! In terms of gameplay, I defined the pipeline for bringing assets from the modeling software into the engine. The responsibilities of a technical artist may be less well-defined than those of classic roles, such as the modeler or animator, but a technical artist, especially on larger projects, is convenient to have and an important member of any team. Much of what I did over the course of the project involved mediating between artists and programmers to help solve performance issues.
Our production workforce consisted of only one modeler and one texture artist, and we didn’t have much time to finish our indoor environment. The only way to complete it was to create the material in such a way that Massimiliano Italiano, our art director and lighter, could achieve his desired look without spending a ton of time asking Jacopo Conte to constantly go back into our texturing software to make simple changes, like modifying a color or a texture’s tiling. Utilizing material functions turned out to be the optimal way to maintain consistent quality throughout the environment, and it also allowed us to discard unique PBR textures for every mesh. For a basic explanation of material functions, check out Clinton Crumpler’s article about the subject.
Defining the desired surface properties with our art director was the very first step of this process. For that, we referenced the amazing work of Paul Pepera, a great artist who unfortunately passed away this year.
To build our functions library, I used Quixel’s DDO Painter to create all the tileable textures. I didn’t spend much time making them because the materials needed to look neutral, meaning they’d have an output that didn’t differ much from their material presets.
Afterward, I imported the textures into Unreal and started creating the functions. I ended up with quite a few textures and tried to discard some of them by recreating them with simple math inside the editor.
With this type of trick, every material function uses one or two textures at most but still maintains its correct PBR properties. In some cases, I reused the same texture for multiple, similar functions to save storage space. Keep in mind that this doesn’t reduce the texture count in the final shader, because a texture counts once for every usage in every function. In the end, I reduced the texture count from 24 to 7.
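The trick of recreating textures with simple math can be sketched as follows: instead of authoring a separate grayscale map per function, derive variants from one shared texture with the same multiply/add/clamp operations UE4's material expressions provide. This is a minimal illustrative sketch, not the project's actual node graphs; the function names and constants are hypothetical.

```python
import numpy as np

def saturate(x):
    # Equivalent of UE4's Saturate node: clamp to [0, 1]
    return np.clip(x, 0.0, 1.0)

# One shared tileable grayscale (stand-in for a real texture sample)
rng = np.random.default_rng(0)
shared_gray = rng.random((4, 4)).astype(np.float32)

# Derive per-function roughness variants with simple math instead of
# shipping a unique texture for each material function.
rough_paint  = saturate(shared_gray * 0.4 + 0.35)  # narrow mid-range
rough_metal  = saturate(shared_gray * 0.8 + 0.05)  # wider, darker range
rough_rubber = saturate(1.0 - shared_gray * 0.3)   # inverted, mostly rough
```

Each variant costs a few shader instructions instead of an extra texture sample, which is how the count can drop from 24 textures to 7.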
Next, I created the master material and threw in the functions, which were blended together using a MatLayerBlend_Standard node (which is also a function) that takes a grayscale texture as its alpha; this mask defines where the top function appears over the base. Switch Parameter nodes are essential for keeping shader complexity low, as they allow us to discard part of the material if needed. In this instance, there is a switch node for every function, allowing us to decide which ones to use depending on the asset.
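Conceptually, blending a top layer over a base with a grayscale mask amounts to a per-pixel linear interpolation. A minimal numpy sketch of that idea (not the engine's actual implementation of MatLayerBlend_Standard):

```python
import numpy as np

def layer_blend(base, top, mask):
    """Blend two material attributes (e.g. albedo) by a grayscale mask.

    mask == 0 keeps the base, mask == 1 shows the top layer, and values
    in between cross-fade, like the alpha input of a layer-blend node.
    """
    mask = mask[..., None]  # broadcast the grayscale over RGB channels
    return base * (1.0 - mask) + top * mask

base = np.zeros((2, 2, 3))  # black base material
top = np.ones((2, 2, 3))    # white top material
mask = np.array([[0.0, 1.0],
                 [0.5, 0.25]])

result = layer_blend(base, top, mask)
# Each pixel's brightness equals its mask value
```

The same interpolation is applied to every attribute the layers carry (albedo, roughness, normal, and so on), which is why one grayscale mask per function is enough.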
The first one (Paint) worked as a base—following the function chain, I added an emissive component that blends via a mask as well. Also, a normal map is blended right before the final output.
Setting all switch nodes to ‘true’ results in the usage of all of the functions. This does not break the material, however, because it uses fewer than 16 textures, which is the maximum Unreal allows. Though this would be expensive in terms of performance, it was rare for us to use more than five functions at a time for one asset. Eight grayscale textures were needed for one asset that used all the functions; those can be packed into two RGBA masks, where each channel represents a specific function mask.
Converting a texture sample to a parameter allows one to plug a custom texture into the material instance.
But how can all of this be used to get functional assets?
At first, I provided Jacopo with the textures of the functions, which he then imported into Substance Painter (his software of choice) to recreate the materials.
Once Jacopo received an unwrapped 3D model from Mirko, he could begin painting the ID masks where each material should appear. He used fill layers, smart materials, and smart masks to accomplish this.
Normal maps were painted directly in Substance to avoid texture-baking times from a high poly model, and sometimes we used a mixed approach too.
All of the masks then had to be exported and packed into an RGBA texture in a precise order. We used a Photoshop script to automate the process, and Jacopo also added some padding to avoid artifacts.
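The packing step that the Photoshop script automated can be sketched in Python. This is an illustrative stand-in, not the actual script; the function name and the dummy masks are hypothetical, and the real script also handled padding.

```python
import numpy as np

def pack_masks_rgba(masks):
    """Pack up to four grayscale masks into one RGBA array.

    Channel order matters: the master material reads R as the first
    function's mask, G as the second, and so on, so the packing order
    must match the shader exactly.
    """
    assert 1 <= len(masks) <= 4, "one RGBA texture holds at most 4 masks"
    h, w = masks[0].shape
    packed = np.zeros((h, w, 4), dtype=np.float32)
    for channel, mask in enumerate(masks):
        assert mask.shape == (h, w), "all masks must share one resolution"
        packed[..., channel] = mask
    return packed

# Four dummy 4x4 masks; a real pipeline would load painted masks from disk
masks = [np.full((4, 4), v, dtype=np.float32) for v in (0.0, 0.25, 0.5, 1.0)]
rgba = pack_masks_rgba(masks)
```

With eight function masks, two such RGBA textures cover an asset that uses every function.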
With the textures ready, Jacopo brought them into Unreal and plugged them into an instance of the master material. A raw material instance without any modification would look like this:
Every function has some adjustment facility regarding color, roughness value, normal intensity and texture tiling. I did this by creating an input node inside the material function node.
The input type indicates what can be plugged into the function input. In this example, a vector3 is needed to control the albedo color.
Now, in the master material, the function’s nodes have a new input pin where we can connect some parameters and control the material function. I could have created those parameters inside the function with no difference, but I like to have all of my parameters inside the master material and leave functions as neutral as possible.
By repeating this process, we can parameterize anything in the functions and change the variables in the material instance.
Every time I create a master material, I try to present its instance in the most understandable way for its purpose. My goal here was to give the art director the ability to change the look of the scene in seconds, focusing exclusively on mood, without having to wait for changes from our texture artist.
There are plenty of ways to set up an efficient material instance so anyone can play with it.
– Organize in groups: I used groups to tackle every material function by itself, and I also assigned a number as each group’s name so they would show up in an easy-to-use order.
– Clamp sliders: In some cases, it’s helpful to drag a slider left and right without constantly checking whether you’ve reached some crazy value. If you need to interpolate linearly between two values, this is useful.
– Set sort priority: By default, function inputs are sorted alphabetically, but sometimes it’s useful to order them differently. Do this inside the function with Sort Priority; from engine version 4.15 onward, you can do this with parameters as well.
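The clamped-slider tip boils down to remapping a 0–1 parameter through a linear interpolation, so the artist can never leave the safe range no matter how far they drag. A small sketch of the pattern (the function name and the roughness range are illustrative):

```python
def clamped_lerp(a, b, t):
    # Clamp the slider value to [0, 1], then interpolate linearly,
    # so the instance parameter can never leave the [a, b] range.
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

# A roughness slider that can only ever produce values between 0.2 and 0.8
mid = clamped_lerp(0.2, 0.8, 0.5)   # halfway between the two bounds
over = clamped_lerp(0.2, 0.8, 5.0)  # over-dragging is clamped to 0.8
```

In the material this is just a Lerp node fed by a Saturate-d scalar parameter.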
When considering performance, it’s usually convenient to separate materials at the geometric level using Material Elements. But we needed per-pixel control, so we ended up using only one material instance, which drastically reduced draw calls. During the length of gameplay (about 20 minutes), all the material function textures are constantly loaded (seven textures at 2048×2048 resolution, for a total size of 35 megabytes). Individual assets additionally have one to three textures streamed in and out (two ID masks + one normal map). A benefit of this is that the ID masks don’t need to be at high resolutions, especially if the mesh UVs are straight and parallel to the axes.
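As a back-of-the-envelope check of that 35-megabyte figure: assuming a block-compressed format at roughly 1 byte per pixel (e.g. DXT5; the article doesn't state the format, so this is an assumption) plus a full mip chain, seven 2048×2048 textures land in the same ballpark.

```python
def texture_memory_mb(width, height, bytes_per_pixel, mips=True):
    """Estimate GPU memory for one compressed texture.

    A full mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... -> 4/3).
    """
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / (1024 * 1024)

# Seven 2048x2048 function textures at ~1 byte per pixel (DXT5 assumed)
per_texture = texture_memory_mb(2048, 2048, 1.0)  # ~5.3 MB each
total = 7 * per_texture                            # ~37 MB overall
```

The exact number depends on which channels use DXT1 versus DXT5, but the estimate agrees with the quoted total.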
To understand this better, check out two materials blended with a mask below:
On the left, the material is applied to a mesh with straight UVs. Clearly, the quality of the cut is decent even with the mask downsampled to 512px. On the right, the material is applied to a mesh with skewed UVs. A large mask at 2048px maintains its shape, but we begin to see pixel art when we downsample the mask to a reasonable game resolution.
We imported all of the masks at high resolution. Then, they were downsampled directly in the texture editor of Unreal.
The size one should downsample textures to depends on one’s target texel density and mesh size; in general, it’s good to shrink them until you start seeing a loss of quality. Most of the masks were downsampled to 512px, but sometimes we could reach even smaller sizes, like 128px, for simple models with very good UV.
In the images above, check out a pavement section and a full corridor with masks at 2k vs 256px resolution.
It took more than a week to organize everything for the production. Understanding each team member’s skill set was incredibly valuable for building a successful pipeline—it also was the most time-consuming part of the production! After that, setting up the material is quick once you identify your goal. I also think it paid off to doodle in the editor and study PBR.
I wouldn’t recommend taking the same approach for a smaller project. But in this case, the art director, modeler, texture artist and shader artist were different people, and time optimization was crucial. Moreover, using functions as we did only saves texture memory if many assets share them. On the other hand, unique per-asset textures will tax your computer’s RAM if you don’t have an efficient level-streaming system. On smaller projects, artists have complete control over their assets, so it’s easier to bake out exactly what you need from a texturing tool.
Anyway, I’d highly recommend using a small master material to tweak your shader in preparation for any light condition. Here’s an example:
Thanks to Kirill Tokarev and 80lv for interviewing me! I hope what I’ve discussed today helps you all in your future UE4 projects!