Emiel Boven talked about the way he experiments with sky and ice shaders in Unity.
My name is Emiel Boven and I’m a third-year game art student from the Netherlands. In my work, I focus on the more technical side of game art: shaders, lighting, and VFX. I worked on the upcoming co-op game Rite of Ilk during my internship at Turtleneck Studios here in the Netherlands, and I recently teamed up with a small group of artists, developers, and designers from my school to start our own company called Rift Shard Studios. We’ve just started working on a souls-like action-adventure title which we will be sharing more information on later.
Shaders in Unity
I actually started my studies to become a concept artist, but I quickly fell in love with the things I had previously thought of as the “boring and technical” parts of game development, such as 3D modeling and programming. Since then I haven’t done much 2D art; instead, I’ve been experimenting with shaders and building my own small game projects in C#. The thing I love about shaders is that they can take a set of variables and data from a mesh, like normal direction or vertex position, and manipulate them to create interesting and visually pleasing effects that you can’t achieve with just modeling and texturing.
The basics of shaders come down to a simple idea: you have a set of 3D coordinates and attributes that get passed through a set of small programs called shaders before they can be written to the screen as a 2D frame. These shaders each have a very specific function, and some of them, for example the vertex and fragment/pixel shaders, can be swapped out for custom shaders to change how the final frame looks. The vertex shader can be used, for example, to offset the vertex positions of the model using a displacement map, without having to change the actual imported model.
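To make that idea concrete, here is a minimal sketch in a Unity surface shader (the texture and property names are illustrative, not from Emiel’s project): a custom vertex function samples a displacement map and pushes each vertex out along its normal, leaving the imported model untouched.

```hlsl
Shader "Custom/DisplacementSketch"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _DispMap ("Displacement Map", 2D) = "black" {}
        _Amount ("Displacement Amount", Range(0, 1)) = 0.1
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // addshadow regenerates the shadow pass so shadows match the displaced mesh
        #pragma surface surf Lambert vertex:vert addshadow
        sampler2D _MainTex;
        sampler2D _DispMap;
        float _Amount;

        struct Input { float2 uv_MainTex; };

        void vert (inout appdata_full v)
        {
            // Sample the displacement map in the vertex stage
            // (vertex shaders need an explicit mip level, hence tex2Dlod)
            float disp = tex2Dlod(_DispMap, float4(v.texcoord.xy, 0, 0)).r;
            // Offset the vertex along its normal; the source asset is untouched
            v.vertex.xyz += v.normal * disp * _Amount;
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
}
```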
Writing shaders that interact correctly with lighting is quite complex, so for most purposes I use Surface Shaders. Unity’s surface shaders generate most of the lighting calculations for you, so you can focus on the effects you want to create without first having to write all those functions to light your model correctly. That being said, I still use vertex and fragment shaders for effects like deferred decals and post-processing shaders, since they don’t need those lighting calculations.
Amplify Shader Editor
Amplify Shader Editor is a node-based shader editor for Unity, which means it generates shader code from a system of interconnected nodes. This is a great way to learn about shaders without having to write a single line of HLSL. In the second year of my studies, I learned about shaders for the first time in a series of lessons on shaders and the render pipeline. In those lessons we created shaders using another node-based shader editor, Shader Forge, and I learned about all the cool things I could do with shaders. After these “artist-friendly” shader lessons I was very interested in shaders and decided to spend some time learning how to make them without node-based editors. That really deepened my understanding of the shader pipeline, and it now helps me build more complex shaders in node-based systems since I know more about what goes on under the hood.
For those ice and snow shaders I essentially created two different materials in my shader: a normal material and a material that had snow on top and icicles hanging from the bottom.
The second material creates a mask from the Y axis of the world normal so that snow and ice appear only on surfaces that face up or down. This way you can rotate the mesh, but the icicles will always point down in the game world. The snow and icicles both use displacement to offset the vertices upward or downward, giving the illusion that snow is actually piling up on top of the material. The icicles differ a bit from the snow: they’re created using a noise map that lets some parts of the bottom offset like the snow on top while leaving other parts in place, creating that jagged icicle effect.
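One possible way to build those masks and offsets in a surface shader’s vertex function might look like this (a sketch with assumed property names such as `_SnowHeight`, `_NoiseTex`, and `_IcicleThreshold`; not Emiel’s exact node graph):

```hlsl
// Illustrative vertex function: mask by the world-space normal's Y so
// snow goes on upward faces and icicles on downward faces, whatever
// the mesh's rotation.
void vert (inout appdata_full v)
{
    float3 worldNormal = UnityObjectToWorldNormal(v.normal);

    float snowMask   = saturate( worldNormal.y); // 1 on faces pointing up
    float icicleMask = saturate(-worldNormal.y); // 1 on faces pointing down

    // World up expressed in object space, so the offset stays vertical
    // in the world even when the object is rotated.
    float3 upObj = mul((float3x3)unity_WorldToObject, float3(0, 1, 0));

    // Snow piles upward on top...
    v.vertex.xyz += upObj * snowMask * _SnowHeight;

    // ...while a noise map decides which bottom vertices get pulled
    // down, producing the jagged icicle silhouette.
    float noise  = tex2Dlod(_NoiseTex, float4(v.texcoord.xy, 0, 0)).r;
    float jagged = step(_IcicleThreshold, noise); // hard on/off per region
    v.vertex.xyz -= upObj * icicleMask * jagged * _IcicleLength;
}
```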
Then I used the vertex colors on the model, which you can paint in your preferred 3D software package or in the engine with a vertex painter tool (as I did), as a mask to blend the two materials together on the mesh.
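In the surface function, that blend can be as simple as a `lerp` driven by one vertex color channel (again a sketch with made-up texture names; the `Input` struct just needs a `float4 color : COLOR;` member to receive the painted vertex colors):

```hlsl
struct Input
{
    float2 uv_BaseTex;
    float4 color : COLOR; // painted vertex colors, used as a blend mask
};

void surf (Input IN, inout SurfaceOutput o)
{
    fixed3 baseCol = tex2D(_BaseTex, IN.uv_BaseTex).rgb;
    fixed3 snowCol = tex2D(_SnowTex, IN.uv_BaseTex).rgb;

    // The red channel of the vertex color decides where the snowy
    // material shows through the normal one.
    o.Albedo = lerp(baseCol, snowCol, IN.color.r);
}
```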
Shaders for Stylized Environments
The biggest thing, I think, is shapes and color. A stylized environment uses models with a strong shape language: stones are bigger, for example, and certain features are exaggerated. When creating an effect, the outcome should use the same kind of shape language and color schemes. If, for example, I had used realistic, thin, transparent icicles instead of the bigger unrealistic blue ones I made, they wouldn’t have fit the overall stylized aesthetic of the surrounding environment.
The skybox shader I made was an experiment to see what I could create using the node-based editor. The shader maps the skybox with a cloud texture I created. The file containing the cloud texture consists of three black-and-white maps stored in the separate RGB channels of the image. Those maps are loaded into the shader and used for the three main parts that make up the clouds: the cloud shape, the rim lights, and the clouds’ inner shadows. In the shader I multiply these maps with colors so that each part can be colored separately when needed. The rim lights are a bit more special than the other two maps because they are multiplied with a falloff gradient around the sun, so the rim lights only show up when the sun is near the clouds.
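Composing the clouds from the packed channels might be sketched like this (the channel assignments, color properties, and sun-direction uniform are assumptions for illustration):

```hlsl
// Unpack the three grayscale masks from one RGB texture.
fixed4 packed     = tex2D(_CloudTex, uv);
float cloudShape  = packed.r;
float rimMask     = packed.g;
float innerShadow = packed.b;

// Tint each part separately so artists can recolor them independently.
fixed3 clouds = cloudShape  * _CloudColor.rgb
              + innerShadow * _ShadowColor.rgb;

// Rim lights fade in only near the sun: compare this pixel's sky
// direction against the sun direction and shape the falloff.
float sunProximity = saturate(dot(normalize(skyDir), normalize(_SunDirection.xyz)));
clouds += rimMask * _RimColor.rgb * pow(sunProximity, _RimFalloff);
```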
As mentioned earlier, the colors of the cloud parts can be controlled separately, and the same is true for the sky and the horizon. This way the artist using the material has full artistic control over the skybox.
Since clouds move but I use a still image, I manipulate the UVs with a noise texture to make the clouds slowly scroll across the sky and change shape as they go.
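That UV manipulation could be sketched as follows (the scroll speed, noise scale, and distortion amount are made-up parameter names):

```hlsl
// Scroll the cloud UVs over time, and bend them with a noise texture
// so the still image reads as slowly drifting, shape-shifting clouds.
float2 scrolledUV = uv + _Time.y * _ScrollSpeed.xy;
float2 distortion = (tex2D(_NoiseTex, uv * _NoiseScale + _Time.y * _NoiseSpeed.xy).rg - 0.5)
                    * _DistortAmount;
fixed  cloudShape = tex2D(_CloudTex, scrolledUV + distortion).r;
```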
Implementation of Shaders
Shaders in Unity are fairly simple to implement. A shader, whether written manually or generated with a node-based editor, can be assigned to a material in Unity, which in turn can be assigned to an object in the game. The uses of shaders are as diverse as you can imagine: they can create interesting mesh-based effects, but they can also be used for particle effects and post-processing effects. I actually discovered a great talk this morning on how tech artist Simon Trümpler created all those stylized effects in the game RIME; it’s a great inspiration if you need ideas for using shaders to create cool effects.
The Beginner’s Guide
I would recommend using a program like Amplify Shader Editor or Shader Forge, especially if you have no prior experience with shaders. These programs let you start experimenting without having to know HLSL, though in return they are sometimes less suited to highly specialized shaders. I would certainly challenge those familiar with node-based editors to try to create a shader without them, to really dig deeper into how shaders work. But even if you’re already a star at coding shaders, I would still recommend a node-based system, since it cuts away the struggle of syntax errors and remembering the exact names of functions. That speeds up your development time and leaves you more time to actually make the rest of the game.