Tilmann Milde is a member of the YAGER team in Berlin. He's been working on lighting and shaders for a while now, building amazing visuals for video games. He's also been part of the UE4 project created by some of the YAGER artists, an initiative we talked about last year. In this post, Tilmann discusses the production of his striking Throne Room environment and explains how he managed to build and light this complex scene.
Hi, my name is Tilmann Milde and I am a technical lighting and shader artist at YAGER in Berlin. However, while writing this, I am also preparing my family and myself for a move to Stockholm at the end of February, where I will be working as a lighting artist at DICE.
I have been interested in video games, realtime rendering and the creative processes involved ever since they started appearing back in my youth. However, I always thought there would be no way to do this for a living, so I ended up studying graphic design instead. It was when I least expected it that I learned of the Games Academy in Berlin, where one could "learn" how to make video games. They offered a two-year Game Art & Animation course, which I decided to take. It seemed like one of the best opportunities to combine my love for visual things with video games, and I'd had enough of working an ordinary job. I totally loved it and gave it everything, which led to me getting an internship at YAGER right after the end of the course. I worked for 6 months as an environment art intern on the then-unannounced Dreadnought until I got hired permanently to join the Dead Island 2 team.
Here is some work from the old Games Academy days and one of the student projects we worked on:
It was then that I asked to try out lighting instead of environment art, since I was never the fastest modeler and was more interested in the overall look, feel and quality of a game's image. The company allowed me to do that after I did a small test scene they could evaluate, and since then I have tried my best to make things look as good as I can with any given tools and constraints. Since the team was not extremely huge, I had the opportunity to work on many more things than just lighting, and it was a terrific learning experience. When we switched from Unreal 3 to Unreal 4, I immediately recognized the advantages and power of the new PBR system, and since lighting heavily depends on how light interacts with matter, I started studying the hell out of all these new magical things. During my time on the project, I worked heavily on most of the master shaders that we used (a surface-based system), the skydome shader, landscape shading, character skin tweaking, lighting setups and post processing, as well as evaluating lighting middleware.
Lighting Up a Game Environment
We fought quite a tough fight, since DI2 was too huge for static lighting, and dynamic lighting with equal quality for indoor and outdoor areas was basically nonexistent in Unreal 4. However, this led to some very interesting and insightful workarounds and hacks that did the job. I would say that when I wasn't working on the material side of things, lighting was mostly about keeping things working instead of being super creative. When you have one lighting condition in a huge world that just stays the same, you tweak it until you are happy and leave it (until some major things change). But stuff breaks… always. And you need to take care of a lot of detail work to make lighting issues less visible.
We ended up implementing a custom lighting solution for Dead Island 2 which was a lot more tailored towards our specific needs. I have not yet worked on a more linear game, so my experience there is limited. But in settings like that, creative lighting and guidance can play a bigger role and leave more room to design the lighting custom-tailored to the situation. As an example of this, I would like to talk about a scene I built together with my friend Robert Larrson for Polycount's Throne Room Challenge.
The Throne Room
Our initial idea was to make a Kingdom of the Rock: the throne room of their king, carved in sharp black stone. Sharp angles and a kind of forgotten mood were to be our key pillars. We also knew that we were both pretty busy in general, so we decided to make a rather dark setting to allow us to hide minor details or unfinished assets a bit better. However, there would be one strong key light leading the eyes to the throne. I came up with a super weird and rough blockout of some shapes that I felt had potential, and we started from there with iterations and overpaints.
Here you can see the overall progress of the blockout:
In the end, Robert drew this concept, which became our final piece of reference. We actually started on the throne itself pretty late, since we were quite busy with the things around it, and luckily enough, design-wise, one thing just led to another and in the end we had our throne.
Final Concept we used:
Everything came together nicely and we liked the contrast of the black rock and some gold ore veins and elements we added on the throne.
Here are some concepts for the throne:
And the final one:
I actually started pretty early on with importing things into Unreal Engine 4 and slapping some random tileable materials on it to get a better impression of where things were going.
I really like playing around early on to find things I like, and Unreal Engine 4 is a great tool for providing exactly that possibility. The material editor is an absolute beast to work with, and the rendering keeps getting better and better. I sadly haven't worked with particles or Blueprints a lot, so this is something I want to invest more in in the future.
Building Assets and Materials
This is a reference board we used to get a general idea of materials we wanted:
For the actual modeling of assets, we split things up. I suggested that we try to do a lot with tessellation to add detail that doesn't really need to be modeled, like proper rocks. So we actually used only 3 different rocks in the whole scene. Everything else is cave geometry with vertex-blended tessellation materials to make things quicker. We split most of the modeling work between the two of us; however, in the end we didn't finish all the assets we planned, and the final scene still uses a lot of the initial blockout geometry.
Here is some of the rock sculpting from Robert and some early scanning tests from the Zoo in Berlin:
I made a couple of tileable material functions for different surface types like rock, flat rock, sharp rock, gold ore etc. We then used those in our shaders, layered together via masks. Most of the materials use tessellation, and almost every master shader has vertex paint options with tessellated height blends. So the only uniquely baked maps are normal maps and AO maps; everything else was done via tiling material functions.
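The height-blend part of this setup can be sketched outside the engine. Below is a minimal Python illustration of the height-lerp math behind this kind of vertex-paint blending; the function and parameter names are my own, not UE4's actual node names.

```python
def height_lerp(height_a: float, height_b: float,
                paint: float, contrast: float = 0.2) -> float:
    """Return the blend weight for layer B at one vertex/pixel.

    paint    -- vertex-paint value in [0, 1] (0 = layer A, 1 = layer B)
    contrast -- width of the transition zone; smaller = crisper edge
    """
    # Layer B wins where its heightmap plus the painted weight
    # rises above layer A's heightmap plus the inverse weight.
    delta = (height_b + paint) - (height_a + (1.0 - paint))
    # Remap the difference into [0, 1] across the transition zone.
    t = (delta / (2.0 * contrast)) + 0.5
    return min(1.0, max(0.0, t))

print(height_lerp(0.5, 0.5, 0.5))  # equal heights, half paint -> 0.5
print(height_lerp(0.5, 0.5, 1.0))  # full paint forces layer B -> 1.0
```

The point of blending by height rather than by paint alone is that the transition follows the surface detail (mortar lines, rock crevices) instead of producing a soft airbrushed gradient.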
Here is the main shader used on almost all larger assets:
The actual materials are honestly not the best I have made so far. Most of the textures were slapped together from random images and have tons of problems, and in general I didn't optimise the shaders as much as I could have. A lot of them also tile pretty badly, but since the scene was so dark and everything was blended together from different materials via vertex painting, I didn't care that much about this here, and I would argue that you don't notice it in the end anyway.
Here is a short breakdown of how I generally approached things for materials:
- google some super large rock images
- just drop them into Bitmap2Material
- make them tile there
- remove a bit of the lighting there (but just very subtly… I don't really like the auto tools for these kinds of things)
- make a good relief for the large shapes (slope-based detection is king!)
- copy/paste the tiling diffuse, normal, AO and height over to Photoshop
- level the diffuse and do some frequency separation to remove lighting information from the texture
- create a roughness map based on the diffuse
- if needed, make a detail normal out of the diffuse with NDO to further refine the one I got from Bitmap2Material
- export all that stuff over to Unreal
- make a material function for it so I can call it in a material and tweak the roughness in the shader until it looks spot on the way I want it
- add micro grain to the roughness for that slight noise effect you get on basalt or cold lava; in general, micro noise makes a lot of materials more believable!
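The last two texture steps above (a roughness map derived from the diffuse, plus micro grain) can be sketched in a few lines. This is a hypothetical per-pixel Python version just to show the idea; the real work happened in Photoshop and the UE4 shader, and every name and default value here is my own illustration, not part of the actual pipeline.

```python
import random

def roughness_from_diffuse(diffuse_lum, base=0.35, influence=0.5):
    """Map diffuse luminance (0-1) to a roughness value.

    Whether brighter diffuse means rougher or smoother is an artistic
    choice tuned later in the shader; this sketch just picks one direction.
    """
    r = base + influence * diffuse_lum
    return min(1.0, max(0.0, r))

def add_micro_grain(roughness, amount=0.05, rng=None):
    """Add slight per-pixel noise to the roughness, like the micro
    grain on basalt or cold lava, then clamp back to [0, 1]."""
    rng = rng or random.Random()
    noisy = roughness + rng.uniform(-amount, amount)
    return min(1.0, max(0.0, noisy))

# One "pixel": derive roughness from a mid-grey diffuse, then grain it.
base_roughness = roughness_from_diffuse(0.4)
grained = add_micro_grain(base_roughness, amount=0.05)
```

The grain amount stays small on purpose: it should only break up the flat specular response, not read as visible noise.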
Here is a quick breakdown of one of the material surfaces for a rock:
I did not use RGB channel packing for most of the textures, since I was iterating on them till the very last moment, so a lot of optimisation could have been done here. However, in this case I didn't run into any issues and preferred the working speed. My other contributions were the throne shell and two rocks that I scanned. The throne shell really needed to stand out as the main eye catcher while framing the throne to guide the eye and let the light hit it. Since I knew the light would hit only from inside and everything around would receive only bounce, I actually didn't spend a lot of time sculpting something crazy, but rather just did some broad and generic strokes that I combined with some material magic in Unreal further down the road. It's pretty quick and dirty, but I am still happy that it did its job so well.
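For reference, the RGB channel packing skipped here just means storing three grayscale masks in the channels of one texture instead of three separate files. A tiny, purely illustrative Python sketch of the idea (no engine involved, names my own):

```python
def pack_rgb(roughness, ao, height):
    """Pack three equal-length grayscale channels into (r, g, b) pixels,
    i.e. one texture instead of three."""
    assert len(roughness) == len(ao) == len(height)
    return list(zip(roughness, ao, height))

def unpack_channel(packed, channel):
    """channel: 0 = R (roughness), 1 = G (AO), 2 = B (height)."""
    return [pixel[channel] for pixel in packed]

# Two pixels' worth of masks packed into one RGB "texture".
packed = pack_rgb([0.5, 0.6], [1.0, 0.9], [0.2, 0.3])
```

In a real pipeline the shader samples the one packed texture and reads each mask out of a channel, which is where the memory and texture-sampler savings come from; the downside, as noted above, is that iterating on one mask means re-exporting the whole packed texture.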
The rock scanning was actually a super fun thing, and the whole process up to the finished asset was unexpectedly painless. I was picking my son up from daycare, and while walking to the station I noticed a nice-looking rock right next to me in a small park. I used my crappy cellphone camera to take like 50 pictures or so that I later used for the scanning process. I exported a super high-res OBJ and my diffuse texture and went over to ZBrush. The texture needs to be flipped so ZBrush gets it right, and then I was ready to do a quick cleanup pass, as well as build the lowpoly and do texture projections.
I baked down the vertex colour as my diffuse map, plus normals and ambient occlusion. The rest of what I needed would be generated later on. The diffuse gets de-lighted in Photoshop to have a clean base. With that set up, I took the baked normals and loaded them as a base into Quixel NDO. I then added my diffuse texture (in its not-yet-de-lighted state) as micro details and started blending them nicely together. NDO also helped me with cavity maps and additional AO details.
Here are highpoly, lowpoly wireframe and lowpoly with textures in engine:
Lighting the Environment
For the lighting setup, I knew we would end up with only one light source, since the place was abandoned: no torches or fires etc. I have two key concepts when I approach a new lighting setup, both of which are equally essential. Less is more! It's important not to overdo things; just slightly add more, and play with values and settings before adding more things that start to crowd the scene and become hard to manage. And always look at and learn from reality! It's a cave? You still see something? Why? Learn why sometimes one light source can illuminate a huge area if the conditions are right, and how you can best use your given tools to add as little as possible but to the greatest effect. Because that's mostly what happens in real life as well when it comes to light.
When I approach my scenes like this, I can make sure that I end up with a manageable setup that stays mostly true to what you would see in real life. Sometimes, however, you either want to push things more towards something stylized or surreal, or technical limitations start to appear. That's the moment I start introducing helpers of all kinds. These can be additional fill lights or hidden geo that helps with light bouncing. However, I really try to fake as little as possible and only use what would normally be there anyway.
So for the throne room, all of this meant that we would have one strong key light coming through a hole in the ceiling, and it had to hit the throne to put focus and some kind of halo of grace on it. It also needed to bounce around so much that it was enough as the only light source in the scene. Also, in a dark environment like this, it's important to have good values and not a lot of textures that are too dark; the definition here should come from the lighting.
These are unlit and lighting only shots:
I went on for quite a long time like this, but eventually decided to add 3 more light sources: 2 super subtle point lights on each side of the throne shell to fake some more bounce, and the lava below the walkway. I just needed a bit more definition and variation, and I strongly believe it helped to show the scene in the right way.
I also chose quite high exposure settings to make sure the "eyes" can see in the dark and I get nice bloom from the little bright spots. There is another thing I did to compensate for a limitation which is not the default way of doing it, but I think it helped quite a bit with the environment.
Since the cave was basically closed, no skylight could enter. In a case like this, you would normally work without a skylight and use the ambient color in Lightmass instead. However, this is an even color, and while it needs to be combined with reflection captures for indirect specular, no localized IBL is happening. Instead, I tried using the skylight inside the cave and reducing its capture distance so that it captures the inside of the cave. This way I got an IBL that provided indirect lighting and specular from inside the cave, and the effect looked more defined and overall better than using just captures.
In the end, I was pretty satisfied with the results I got. After that, it was mostly polishing settings like post, DOF and some material parameters. In regards to color grading, I mostly use the in-engine tools, but sometimes I do LUTs as well. However, I don't like the process of making them that much, and I prefer having the control in engine. When the new filmic tonemapping got introduced with the Kite demo, these in-engine options got even better.
All the effects I used are from other Unreal 4 examples like the Elemental demo and such. We decided to just stick with them, since we both had no experience with particles and did not want to lose time on them. So yeah, that's about it. I think we were both quite happy with it in the end, even though we both knew there was still a lot left to do. But it's also important to stop with things at some point and leave them be. That's why I really like working on challenges and contests: you are just kind of forced into a specific time frame, and that pushes you. Also, working with Unreal Engine 4 and its tools helped us a lot to get the results we wanted in a quick way. It's a lot of fun and you instantly get feedback on the things you do… plus beautiful images to look at.
I hope you guys find this article interesting and have a great time!
Here are a couple more images…
the final shots:
some more throne stuff: