Freelance artist and former Crytek developer Berker Siino, who worked on Ryse, Crysis 2 and Crysis 3, gave a detailed breakdown of his new environment for the Rick Future adventure game. It is an incredible example of a great PBR environment created with very limited resources. In this breakdown Berker talks about producing materials with Quixel SUITE, describes how he used Unity 5 and shares some of his signature tricks.
My name is Berker Siino and I am currently working as a freelancer. I previously worked at Crytek for about five years. My career started at the Games Academy in Berlin, where I studied 3D art and made my first connections with many of the local game developers, Crytek among them. Over the course of my professional career I collected credits on Ryse, Crysis 2, Crysis 3 and numerous smaller projects. I also supported pretty much every licensee using CryENGINE during their game development efforts.
One of my personal highlights was being one of the people who exported the first Star Citizen ships and showcased a demo scene to Chris Roberts and Cloud Imperium Games while they were evaluating CryENGINE. Working at Crytek advanced my skills in all kinds of directions and I quickly became an 'all-rounder', proficient in many areas. Being able to take on technical art roles while keeping my passion for 3D modeling and texturing made me a quite useful member of my team at Crytek.
The Rick Future Project
After leaving Crytek and working as a freelancer for a few years I started working with MetalPop Games. MetalPop, also made up of ex-Crytek devs, asked me to help them out with all the art and technical art tasks for the adventure game they were working on.
The project was first developed in CryENGINE but was switched over to Unity with the release of Unity 5. When I joined, one of the first things I did was develop a new art pipeline to utilize the full strength of Unity's new rendering features. For a very small team with limited capacity, Unity was simply the better choice to get results quickly.
I got the chance to do the art direction and develop a new style for the game, and getting to do this meant quite a lot to me as an artist. As usual with indie development, with no publisher picking up the tab, time and money are constant worries. As an artist I still wanted to get as close as possible to AAA quality with our game. Only a strong pipeline could make that happen, which is one of the reasons I used Quixel Suite to automate texture creation and produce PBR textures fast. I took on the responsibility to design, model, texture and art direct all game scenes, all in quite a short time. Our first result was the benchmark level, which we call the 'Engine Room.'
Preparation and pipeline
As the only artist on the team, my main goal when creating levels and scenarios was to keep things simple while not losing the fidelity and detail of a AAA game. In a production with more artists you can keep one person busy with a single asset to guarantee the desired quality level; in indie development this is of course not always possible.

My main approach to achieving the look I was going for was to make use of lights and reflections. Those only come into play when working with normal maps and roughness/gloss maps in PBR. The beauty of PBR, of course, is that you can change lights without worrying about objects looking wrong under different lighting conditions, provided you get the material values right from the beginning. Quixel Suite is a great tool for that. You can start by adding a base material, tweak it to your needs and try to break the procedural look of the mask generation. Thankfully Quixel now has a painting tool, which allows more artistic freedom while creating your assets. I did not have much time to model out every detail on an asset, so I made sure that the base chamfers of an asset carried the high-poly information. To add further details I used NDO, which also gave me a fast way of masking out elements for a color ID map.
The normal maps were baked out in XNormal, using cages that you can generate in its 3D viewer.
The most important maps to get out of the tool were the normal map with the green channel up, the curvature map (which could also be baked in DDO) and an ambient occlusion map. The color ID map can either be baked from a high-poly model with different colored materials assigned to it, or created during the detail pass in NDO. It is also important to get XNormal's tangent space synced with the engine's tangent basis. I was using a plugin for that during the Unity 5 beta and had to smooth the angles of objects in the engine manually. I am not 100% sure, but I think that has been fixed by now and you can use the standard MikkTSpace.
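Whether green points "up" or "down" in a tangent-space normal map is just a sign convention (the OpenGL-style vs DirectX-style difference), so a bake that comes out with the wrong one can be fixed by inverting the green channel. A minimal Python sketch, purely illustrative (the function name is mine, not part of XNormal or Quixel):

```python
def flip_normal_green(pixel):
    """Flip the green (Y) channel of an 8-bit tangent-space normal map pixel.

    Converts between the two common conventions ("green up" vs "green down")
    by inverting G across the 0-255 range; R and B are left untouched.
    """
    r, g, b = pixel
    return (r, 255 - g, b)
```

In practice a texture tool or a one-line shader tweak does the same thing across the whole map; the point is only that the two conventions differ by this single inversion.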
Once you have properly working normal maps you have a foundation to build up detail in NDO. It is all about planning the asset. For example, if an asset is only ever seen from the side, adding tons of detail to a flat surface defeats the purpose. You do need to add geometry to keep a working silhouette, that is without question, but only in areas where it makes sense.
From this point on you generate all your masks and smart materials in DDO until you get your final texture. I highly recommend not just applying a standard smart material and calling it a day; you have to make sure to get rid of the generic look of procedural textures. Quixel is pattern based, so you can scale the patterns and that is pretty much it, but now we can use the paint tool to add variation as well, which is great. The power of generating presets and smart materials was the reason I chose Quixel Suite as part of my pipeline. After creating the input maps, like the normal map and ambient occlusion, I was able to automate textures based on smart materials that already worked perfectly in my scene.
My library of presets and smart materials grew over the course of creating the scene and I was able to use it on other levels as well, to get the production speed up.
Unity 5 Needs Plugins
The problem I had while creating the pipeline was that I didn't have access to certain tools I had gotten used to while working with CryENGINE. Unity 5 doesn't have a solution for decals, for instance, which I really don't understand. I was hoping to use POM and certain other next-gen DX11 features on decals, but couldn't. We stuck with the standard shaders, however, and didn't change them; I was mainly focused on getting the big picture right, not the craziest shader setups. Rick Future is a point-and-click adventure game, so you have certain camera shots that track and pan slightly along the movement of the protagonist. You can't walk right up to a wall like in an FPS. The game will probably move to a more advanced shader system in the future, but that is not the case for now. I did use certain plugins, though, such as SE Natural Bloom & Dirt Lens from Sonic Ether, Light Shafts and Easy Decals.
Light and reflections
We are using deferred rendering in our project with the specular PBR workflow, so we can use many lights without killing the game's performance. I usually start my light setup from a dark scene; what is better than adding controlled information to get a certain look instead of starting from a default-lit scene? A scene typically uses three lights: a key light as the main source of light, a rim light to highlight silhouettes and a fill light to lift shadow areas that are too dark. At this point I set all my non-dynamic assets to static to let the system know which assets receive bounced light during light generation. That is also the way to get access to glow, and we did need glow for certain science-fiction elements of the game. I used dynamic lights mainly for the key lights to get shadows and light variation in the game. The lights can be baked at any time, though (make sure to generate a second UV set with non-overlapping UV shells for lightmapping).
You can use objects that are almost invisible to cast glow as some sort of fake light source. The tunnel area behind the main engine, for example, gets its bounce light from a transparent object.
This effect could also be achieved by baking the bounce and then hiding the object after the bake.
You usually start by adding key lights with a default intensity of 1, placed in a logical way. I have two major light sources in my scene: the top light from the ceiling neon lamps and the engine glow light with the light shaft, both of them casting shadows. At this point the GI is also generated and you can tweak it as you like, for instance by adding more bounce.
The next step is to get a bit more light into the scene in general, either by tweaking the main light sources or simply by adding a few more lights to the room.
One more tip:
Try to avoid overlapping light radii/ranges as much as you can; the overdraw it causes hurts performance, and the more lights you have, the more important it becomes to avoid.
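Since a point light's influence is a sphere defined by its range, the overlap this tip warns about reduces to a simple sphere-intersection test: two lights overlap when the distance between their positions is less than the sum of their ranges. A quick Python sketch of the check (the function and its names are mine, not a Unity API):

```python
import math

def lights_overlap(pos_a, range_a, pos_b, range_b):
    """Return True if the influence spheres of two point lights intersect.

    pos_a / pos_b are (x, y, z) positions; range_a / range_b are the light
    ranges. The spheres overlap when the center-to-center distance is less
    than the sum of the two ranges.
    """
    return math.dist(pos_a, pos_b) < range_a + range_b
```

A quick pass like this over all light pairs in a scene could flag the overdraw hotspots before profiling ever becomes necessary.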
Once you have the base lights done, you set up your reflection probe and capture all the surrounding environment into one reflection map. You can set it to real-time to see all changes update live. Finish the light setup by placing indirect lights from monitors and other smaller light sources that should cast light. You can change the appearance of a light by using what Unity calls 'light cookies'. A cookie is basically a black-and-white mask used to project the light in different shapes. Light is one of the best tools to model out shapes and forms; you can define them and guide the viewer through your scene with it. For example, I added spot lights to some areas, such as the big generator close to the camera at the bottom left of the screen, mainly because of the gameplay mechanics: there is a puzzle element behind the hatch which needs to be solved. A lot of games use such lights to guide the player. A door, for example, usually has some sort of light source attached, maybe a torch or a lantern, to show the way.
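A light cookie is just a grayscale texture, so it can even be generated procedurally instead of painted. The sketch below builds a simple radial spot mask as a 2D list of brightness values: fully lit in the center, fading to black at the edge. The function and its parameters are my own illustration of the idea, not something Unity provides:

```python
import math

def make_spot_cookie(size, falloff=0.8):
    """Generate a radial black-and-white cookie mask as a size x size 2D list.

    Values are 1.0 (full light) inside the inner disc and fade linearly to
    0.0 at the edge; 'falloff' is the fraction of the radius where the fade
    begins.
    """
    center = (size - 1) / 2.0
    radius = size / 2.0
    mask = []
    for y in range(size):
        row = []
        for x in range(size):
            d = math.hypot(x - center, y - center) / radius  # normalized distance
            if d <= falloff:
                v = 1.0
            elif d >= 1.0:
                v = 0.0
            else:
                v = 1.0 - (d - falloff) / (1.0 - falloff)   # linear fade band
            row.append(v)
        mask.append(row)
    return mask
```

Written out to an image file, a mask like this could be assigned as a spot light's cookie to get a soft-edged circular projection.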
Post process, color grading and tone mapping
It is also important to add screen space ambient occlusion to merge the objects' contact shadows into the environment. I added a few more glow-related scripts to get the SE Natural Bloom effects like lens dirt and bloom. The most important tools are color grading and tone mapping, even though I did not use tone mapping in my final scene setup; I got my results with color grading alone. There is a lot of documentation about what color grading is and how to do it, but in a nutshell it is the idea of taking a screenshot of your current level, pasting it into Photoshop, manipulating the colors and then bringing the result back into the engine.
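That screenshot-through-Photoshop round trip works because the color edits can be captured as lookup tables the engine then applies to every frame. A hypothetical sketch of the idea in Python, using simple per-channel 256-entry LUTs (a real pipeline typically bakes the edits into a 3D LUT texture, and these function names are mine):

```python
def build_lut(gain=1.0, lift=0.0):
    """Build a 256-entry lookup table: out = lift + in * gain, clamped to 8 bits."""
    return [min(255, max(0, round(lift * 255 + i * gain))) for i in range(256)]

def grade(pixels, lut_r, lut_g, lut_b):
    """Apply per-channel LUTs to a list of (r, g, b) pixels."""
    return [(lut_r[r], lut_g[g], lut_b[b]) for r, g, b in pixels]

# A simple "warm" grade: boost red slightly, dampen blue.
warm = grade([(100, 100, 100)], build_lut(1.1), build_lut(1.0), build_lut(0.9))
```

Whatever curves you drag around in Photoshop are, in the end, just such a mapping from input value to output value per channel.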
Tone mapping, on the other hand, lets you use an S-shaped tone curve to enhance contrast in the mid-range and compress highlights and shadows. I would recommend keeping it simple here, because with all of these tools it is easy to get confused. Get the basics right, then add things on top; if it becomes too much, take it away and re-add it with a different approach if needed.