Kimmo Kaunela talked about how he produced Medieval Village and how he used new Unreal Engine features in the process.
Typically, I begin personal projects but don’t finish them as I often run out of time; this project, which I started years ago, is one of them. Originally, it was meant to run in Unity 5, but after I learned more about Unreal Engine, it became clear to me that UE was the right tool for the job. Having already modeled some houses, I used them as placeholder assets and found it rather quick to prototype in UE.
My initial goal was to create a small medieval environment with a few wooden houses and a small backdrop. After spending many hours in Witcher 3, however, I became interested in studying big environments, which influenced my work and grew this project into something much larger.
Baked lighting wasn’t going to scale well for the size of my project. Thus, I devised a fully dynamic system that could support day and night cycles. Further, when working with a large landscape, it’s less than ideal to place trees and rocks by hand so I made a point of learning how to do that procedurally.
In addition, I planned to create all of the assets within the environment from the ground up — a departure from the trend of utilizing Megascans nowadays. At the end of the day, this project became my playground where I tested my wildest ideas and I see it serving the same function in the near future, too.
Combining 3ds Max and ZBrush
I usually start my scene in 3ds Max by blocking things out and making sure everything is appropriately scaled. After working on a few VR projects, I learned that it’s critical to get the scale correct. Blocking out is very quick, so making mistakes and performing tests is easy, and important, at this stage of production. If something feels off, simply drop the idea and focus your energy and resources elsewhere. At this stage, I also think about what needs to be modeled and how I’ll model it.
To save time and avoid repetitive tasks, I plan how to make assets as modular as possible. I was building the wooden houses out of logs, so I started by modeling a few logs and piling them together. Once I was happy with how they looked, I created a tiling texture for them.
I usually do tiling materials entirely with Substance Designer. Yet, in this instance, I sculpted a base in ZBrush, baked a height map from it and then textured the rest with Substance Designer. My plan was to develop a system that allowed me to make wooden buildings of different shapes and sizes.
The same goes for the thatched roofs made of dry straw. I ended up modeling every piece of straw in 3ds Max and then baked and textured them in Substance Designer. With these cards, I’m able to build realistic roofs and easily control any roof’s density, helping me avoid too much overdraw if necessary.
First, I modeled props in 3ds Max and then continued in ZBrush to add more details. Afterward, I made low-poly versions by hand or, depending on the object, ran them through ZBrush Decimation Master or 3ds Max ProOptimizer. Next, I made UVs, baked the maps in Substance Designer, and textured the assets using my material library. I also added support for different paint layers because some props have painted details. After that, I make different patterns in Photoshop and bring them into Designer; the graph does the rest and adds some damage and scratches.
Moving forward, I plan to convert this system to Substance Painter where I can paint patterns directly.
For rocks I usually start by modeling rough boxes to achieve certain feelings and forms. Then, I do the rest with ZBrush and DynaMesh.
The last step is to decimate and make UVs. Here, I sculpt only big forms because smaller details usually come from tileable textures. This ensures that not too much visual noise is present, which is fairly typical when blending detailed normal maps together.
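The visual noise mentioned above comes from how detail normal maps are combined. One common way to combine two tangent-space normals with less distortion is the so-called "whiteout" blend; here is a minimal Python sketch of the math (illustrative only — the article doesn't say which blend method the author actually uses):

```python
import math

def normalize(v):
    # Scale a 3D vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def whiteout_blend(n1, n2):
    # "Whiteout" blending of two tangent-space normals (components
    # in [-1, 1]): sum the XY slopes, multiply the Z terms.
    return normalize((n1[0] + n2[0], n1[1] + n2[1], n1[2] * n2[2]))

# Blending with a flat normal leaves the detail normal unchanged.
flat = (0.0, 0.0, 1.0)
detail = normalize((0.2, -0.1, 0.97))
blended = whiteout_blend(flat, detail)
```

The same expression can be built out of a few math nodes in a UE material, which is why keeping the sculpted forms big and letting tileable textures carry the small detail helps control the noise.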
I have models that work like lego blocks — they’re my favorite assets! I use them to build all sorts of things and structures, and I like to build spline blueprints out of these. For example, I’ll make stone walls or lay planks on top of the ground to form platforms that are easier for horses to walk on. I’m able to generate these easily with splines, and I keep these models pretty light in terms of polycount so I can use a lot of them.
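A spline-based placement system like this boils down to dropping instances at fixed intervals along a curve. Here is a rough Python sketch of the idea, using a simple 2D polyline in place of a real UE spline component (the function and parameter names are made up for illustration):

```python
import math

def place_along_spline(points, spacing):
    """Drop instance positions at fixed arc-length intervals along a
    polyline -- a rough stand-in for a spline blueprint scattering
    wall or plank meshes along its length."""
    placements = []
    carried = 0.0  # distance already travelled since the last placement
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg_len = math.hypot(x1 - x0, y1 - y0)
        d = spacing - carried
        while d <= seg_len:
            t = d / seg_len
            placements.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
            d += spacing
        carried = (carried + seg_len) % spacing
    return placements

# A straight 10-unit spline with 2-unit spacing yields 5 instances.
fence_posts = place_along_spline([(0.0, 0.0), (10.0, 0.0)], 2.0)
```

A real spline blueprint would also orient each instance along the spline tangent, but the spacing logic is the core of it.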
All of the texturing was done with Substance Designer. I really wanted to challenge myself by creating a huge library of different materials that I could blend together. I ended up with a system where I have a few different base materials, like old wood, forged iron and so on. These form the base for everything. Smaller details like moss, sand and rust are then layered on these to produce an aged look that blends the assets better with the environment. All of these materials are in their own graphs and are used as subgraphs when texturing assets. Blending is done with material IDs and, depending on the situation, different masks.
There are some rules of thumb for different materials. Objects outdoors are exposed to weather conditions like rain, sunlight and wind, and this should be reflected in their design: add moss or rust to an object depending on its base material. And feel free to imagine realistic weather patterns for your scene. Perhaps the climate is arid, and the wind carries sand that ends up on your objects!
I try to do as much of this material blending as possible in the textures themselves because it’s the cheapest method in terms of runtime performance. However, there may be dynamic objects, or models that need to be rotated, where this baked-in solution looks wrong. That’s why it’s important to do some of the blending in engine.
There are plenty of ways to blend materials in engine, and some are more static than others. Vertex colors are great because they’re cheap, but they depend on the object’s vertex density to produce accurate results. They’re useful for buildings and static objects, and they give pretty good results when combined with heightmap-based blending. Mask textures can blend between different materials but usually only work well on specific objects and require extra textures to load and store.
Personally, my favorite blending methods are dynamic and don’t require too much of my attention.
One way involves blending materials based on certain directions. If we know that all of our props need to have a thick layer of moss on top of them, we then can implement a method that tells masks to always face the proper direction. Unreal Engine has a node called WorldAlignedBlend that allows us to rotate objects and ensure there will always be a moss layer on top of them.
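The math behind a world-aligned blend is essentially a dot product between the surface normal and the world up axis, remapped into a 0-to-1 mask. A small Python sketch (the parameter names here are illustrative, not UE's actual pin names):

```python
def world_aligned_mask(world_normal, blend_start=0.6, blend_sharpness=5.0):
    """Mask that is 1 where a surface faces world-up and 0 elsewhere.

    world_normal is a unit vector; its Z component equals the dot
    product with the world up axis (0, 0, 1). blend_start and
    blend_sharpness are illustrative stand-ins for the node's inputs.
    """
    up_dot = world_normal[2]
    # Remap around the threshold and clamp to [0, 1].
    return max(0.0, min(1.0, (up_dot - blend_start) * blend_sharpness))

# An upward-facing surface gets full moss; a vertical wall gets none.
top_mask = world_aligned_mask((0.0, 0.0, 1.0))
wall_mask = world_aligned_mask((1.0, 0.0, 0.0))
```

Because the mask is computed from the world-space normal every frame, rotating the object keeps the moss on top, which is exactly why this method needs no manual attention.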
Generally, the best results are those that combine these methods while keeping things as simple as possible. Blending too many materials together will result in heavy shaders and will tax a system’s memory with tons of textures.
For the rocks in this project, I created a system wherein I bake maps from the high poly down to the low poly in Substance Designer, generating all of the masks needed to blend two tileable rock materials in UE4. I store this information in different RGB channels; in this case, R is an edge mask, G is the mask for the second rock material, and B is baked ambient occlusion. The moss material blends on top of everything using WorldAlignedBlend, making rocks fast to create and texture.
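Reading that channel-packed mask back out might look like the following Python sketch. The channel assignments match the description above, but the exact blend operations (how strongly the edge mask lightens, the blend order) are assumptions for illustration, not taken from the author's material graph:

```python
def lerp(a, b, t):
    # Component-wise linear interpolation between two RGB colors.
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def shade_rock(rock_a, rock_b, moss, mask_rgb, moss_mask):
    """Combine two tileable rock colors and a moss color using a
    channel-packed mask: R = edge wear, G = second-rock mask,
    B = baked ambient occlusion. moss_mask would come from a
    world-aligned (up-facing) blend."""
    r, g, b = mask_rgb
    color = lerp(rock_a, rock_b, g)                 # choose between the two rocks
    color = lerp(color, (1.0, 1.0, 1.0), r * 0.2)   # lighten worn edges slightly
    color = tuple(c * b for c in color)             # apply baked AO
    return lerp(color, moss, moss_mask)             # moss goes on top of everything

dark_rock = (0.5, 0.5, 0.5)
light_rock = (0.2, 0.2, 0.2)
moss = (0.1, 0.4, 0.1)
# Mask says: no edge wear, fully second rock, full AO, no moss.
shaded = shade_rock(dark_rock, light_rock, moss, (0.0, 1.0, 1.0), 0.0)
```

Packing three grayscale masks into one RGB texture is what keeps this cheap: one texture fetch drives the whole blend.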
Trees are a huge component of this environment, and I really wanted to discover the best way to make different trees quickly while maintaining variation. Previously, I modeled everything by hand, from tree trunks to leaves, but this time I wanted to test SpeedTree.
I had made some high-poly tree assets for older projects that I now used as a base. I baked them down and retextured them in Substance Designer to fit my current workflow. Then I cut them into different plates and continued in SpeedTree.
Some of the tree trunk textures were entirely done procedurally and some of them were my photogrammetry tests that I later tweaked with Substance Designer.
In SpeedTree, I made the actual trees using leaf cards and trunk textures. I also set up wind, baked ambient occlusion for vertex colors and generated different LODs, all of which were very easy to bring into Unreal Engine.
In UE, I tweaked the LOD distances and made sure that the last LOD was just a billboard that would activate when the distance was long enough. Wind in SpeedTree is easy to work with; it’s basically just one node plugged into World Position Offset and a wind actor that controls its intensity. When all of the foliage within an environment uses SpeedTree’s wind, it becomes possible to create different wind conditions simply by changing one value.
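Conceptually, a wind offset like the one plugged into World Position Offset is a sine wave whose phase varies with world position (so neighbouring trees don't sway in lockstep) and whose amplitude is one global intensity value, matching the "change one value, change the weather" behavior described above. A toy Python version of that idea (a sketch, not SpeedTree's actual wind model):

```python
import math

def wind_offset(world_pos, time, direction=(1.0, 0.0),
                intensity=0.5, frequency=1.2):
    """Per-vertex world position offset for simple foliage sway.

    world_pos: (x, y, z) of the vertex in world space.
    direction: horizontal wind direction; intensity plays the role
    of the global wind actor's strength.
    """
    # Offset the phase by world position so instances desynchronize.
    phase = world_pos[0] * 0.01 + world_pos[1] * 0.01
    sway = math.sin(time * frequency + phase) * intensity
    return (direction[0] * sway, direction[1] * sway, 0.0)

# With the global intensity at zero, nothing moves at all.
calm = wind_offset((120.0, 40.0, 0.0), time=1.0, intensity=0.0)
```

A real implementation would also scale the sway by vertex height or a painted mask so trunks stay put while branch tips move the most.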
At the end of the day, I could have just gone with Quixel Megascans — but I didn’t! I think creating objects and making mistakes along the way really helped me understand what occurs behind the scenes, which makes perfect sense for a personal project where the end goal is to learn as much as possible.
I made the grass and other foliage by hand using a basic workflow where I first modeled hi-poly models and baked them down into a plane. I used that plane to cut different parts out that I then twisted and bent to get final models. Texturing was done with Substance Designer.
UE materials are rather basic and use the Two-Sided Foliage shading model to achieve that subsurface scattering effect. I also generate an SSS mask in Substance Designer that controls where the subsurface scattering happens. The color comes from an albedo texture that is multiplied with a Vector3 so I can tint it based on my needs. I also use different nodes to add color variations to grass and trees.
When using instanced meshes, it’s possible to add color variation with the PerInstanceRandom node, which gives each instance a random value that can drive different material parameters. I have one master material for all of the foliage types that need the SSS effect, hence the many static switches that I can turn on and off when working with material instances.
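The effect of PerInstanceRandom can be imitated outside the engine with a stable per-instance random value that interpolates between two tints. A Python sketch of the idea (the color values and function name are arbitrary, for illustration only):

```python
import random

def instance_tint(instance_index,
                  base=(0.18, 0.35, 0.08),      # darker green
                  variant=(0.30, 0.40, 0.10)):  # drier, yellower green
    """Deterministic per-instance color variation, mimicking what
    PerInstanceRandom enables in a material: each instance gets a
    stable random value in [0, 1] that lerps between two tints."""
    t = random.Random(instance_index).random()  # same index -> same value
    return tuple(a + (b - a) * t for a, b in zip(base, variant))
```

The key property is determinism: an instance keeps its tint from frame to frame, so the variation reads as natural rather than flickering.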
The lighting took most of my time and still needs minor tweaking before I can truly call the project finished. I wanted lighting that would work with any situation and would be as dynamic as possible because I wanted to create a fully dynamic day and night cycle.
First, I started by creating nice daytime lighting, which amounted to two light sources in the scene: a directional light that acts as the sun, and a skylight that is responsible for the ambient light and the fake bounced lighting. My project also uses distance field AO and shadows, so I get ray-traced shadows for distant trees and good AO on objects. With distance field AO, it’s also possible to get large, smooth shadowing on objects by tweaking values. This tweaking is tricky because it’s easy to end up with contrast that is either too strong or too weak.
One last part of this lighting is Light Propagation Volume, which creates a fake global illumination effect. LPV exhausts performance so it’s more like an optional feature that I use when I can afford it. I also like to completely disable Auto Exposure because it doesn’t work how I want it to most of the time, and it makes getting the right lighting difficult. Lastly, I avoid correcting the lighting with post process effects in favor of getting the lighting as good as possible from the beginning. Then, if needed, I’ll make minor, final adjustments with post effects.
I spent a lot of time gathering reference images of different lighting scenarios that I hoped to achieve. Other games can also be a great source of inspiration. Masterpieces, like Uncharted 4, strike the perfect balance with lighting and style. From my understanding, Naughty Dog used baked lighting for most of their in-game environments to obtain good indirect lighting and to create smooth transitions from interior to exterior spaces.
Based on that, I began figuring out what parts of the lighting system needed to move and change over the course of a simulated day.
I planned for two sublevels with my first iteration — one for night and one for day. I would change between these because every sublevel had its own lights and post process volumes. It was easy to set up and fast to find good values but it wasn’t quite dynamic enough.
I ended up creating a timeline in the level blueprint that is responsible for changing values as time passes in the environment. The sun, for which I made three curves, is the most important part of this system. One curve rotates the sun on the Y axis to simulate different times of day. The second controls intensity, so the sunlight is strong during the day and slowly weakens toward night, ensuring that the sun doesn’t shine after sunset. The third controls the color of the sunlight to simulate morning and evening moods.
The skylight has two curves — one for intensity and one for color. Skylight becomes the most important feature at night when sunlight is totally absent from the scene. It’s responsible for a very small amount of ambient lighting that prevents complete darkness.
Exponential Height Fog also has one curve in the timeline to control inscattering color. The fog color is warm at day and cools to blue at night. Volumetric fog also works perfectly with this system because it changes based on the sun’s direction. Volumetric fog is easy to control and I usually leave it as default. Tweaking happens in light sources with the “Volumetric Scattering Intensity” setting.
I also controlled the sky’s intensity, color and star brightness. It took a lot of work to find the right values for all of these curves and to test and tweak them to work well together. There’s even a small time window, when the sunlight has almost faded, when everything turns purple!
Because everything works with just one timeline I can use a few floats to control how fast time goes, how often scene actors are updated and the time of the day. It’s also incredibly easy to create interesting timelapses this way.
A few things to keep in mind in terms of optimization. This system changes various light, fog and sky material values, so avoid doing that every frame. In my case, the day/night cycle runs over a long period, so instead of updating every frame I use a custom event that fires periodically, updating the entire system every three seconds.
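That "update on an interval instead of every frame" pattern can be sketched as an accumulator that is ticked each frame but only does real work every few seconds, similar to a looping timer or custom event in Blueprint (the class name and interval here are illustrative):

```python
class DayNightTicker:
    """Update the day/night cycle on a fixed interval instead of
    every frame; the interval mirrors the three-second custom event
    described in the article."""

    def __init__(self, interval=3.0):
        self.interval = interval
        self.accumulator = 0.0
        self.updates = 0

    def tick(self, delta_seconds):
        # Called every frame; only does real work once per interval.
        self.accumulator += delta_seconds
        if self.accumulator >= self.interval:
            self.accumulator -= self.interval
            self.updates += 1  # stand-in for pushing new light/fog/sky values

ticker = DayNightTicker(interval=3.0)
for _ in range(600):          # ten seconds of 60 fps frames
    ticker.tick(1.0 / 60.0)
```

Because the cycle spans many minutes of real time, a three-second step is visually indistinguishable from per-frame updates while cutting the blueprint work by two orders of magnitude.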
This system works well with materials and foliage — I can play the level and watch how the sun sets behind procedurally scattered trees that glisten under the last rays of sunlight for the day. Then, I’m able to see stars twinkle in the brisk air of night. UE4’s blueprint system is very powerful for this aspect of 3D art despite my lack of coding knowledge — it’s incredible how I can get this up and running, which is a major reason why I think it was a good call switching from Unity to UE.
Setting up in UE4
There are many different ways to add foliage and assets in UE4. It’s important to use instanced meshes because it’s the optimal way to render a huge amount of assets with as few draw calls as possible.
The most common way is to use the foliage paint tool where you assign a mesh and some basic controls for density, scale and rotation, culling distance and so on. This is perfect for finer details or smaller scenes, but it’s not optimal for large environments.
Another way to add foliage is to use landscape material layers. This way the engine will add assets for specific areas on the landscape and you will have control over the density, random scale and so on. This is optimal for small grass and debris to cover large areas but it lacks finer control.
Then there is the experimental and powerful procedural foliage system that is ideal for large environments. It works with volumes that the user can add to levels. The user can assign different foliage types and tweak all sorts of aspects separately. There are settings for the seed amounts, density, growing curves, min/max scales, growth in shade, distance of seed travel, what assets overlap with each other and much more.
This provides a generous level of freedom but it requires a lot of testing and iterating. What one basically does is set rules for scattering meshes, which can be anything from small grass to large trees, stones or even buildings. The editor will end up with the same foliage types that you would paint manually with a foliage paint tool but instead of doing the heavy lifting yourself, a procedural foliage system takes care of these time consuming tasks. This also means that after the procedural simulation you can go and manually tweak things as much as you like.
The optimal approach is to use all of these systems together. Simulating millions of meshes takes a ton of time and processing power, so it’s best to handle ground cover with the landscape grass system and perform final tweaking with the foliage paint tool afterward. To keep things optimized, it’s important to take the time to figure out the right culling distances for each foliage type. Remember, not every foliage type needs to cast shadows! Think about what the player needs to see at a distance and what only needs to render near the player, and cull everything else when the player is far away.
Another tip is to keep the volume size small at first. This makes testing faster because you won’t have to simulate a large number of instances. Once you’ve found the right values, you can scale up to your desired size.
Also, remember to separate certain things into their own procedural spawners to make changes faster as you only have to simulate one spawner that has certain types in it. I used one for trees, another one for small foliage and one for rocks. I ended up having around 15,000 tree instances. Imagine how long it would take to spawn small grass models this way instead of using a landscape grass system!
It was fascinating setting up this system and then playing my level to discover the beauty of nature — as procedurally scattered by a computer! It would be interesting to study different procedural scattering tools and techniques like Houdini. I also picked up a thing or two for my project by reading how procedural technology was used in Ghost Recon: Wildlands.
I’m slowly becoming happy with the results of my project but there are still plenty of things I have to do better. My goal here was to establish a strong foundation and to learn and push my optimization skills further. Two things I still need to cross off my mental to-do list are finalizing the day and night cycle and adding a system that can create different weather situations. I would like to add more content, like lakes and caves, through procedural techniques. It would also be interesting to add some gameplay there with blueprints. The level and assets already have colliders and rigid bodies working so it would be fun to add some NPC characters into the mix as well.
This project helped me approach dynamic lighting as a system that consists of different components that need to work well together. Changing values over time can end up looking good enough and can be easy to set up. I also figured out what lighting features I should use and learned to keep things as simple as possible.
Having previously worked on mobile games and now stepping into VR territory, I’m starting to understand how much things can cost and how to fake things as much (and as well!) as possible. I recommend that everyone give procedural tools a try; UE’s built-in system is a pretty awesome starting point!
I already use similar systems in our two-man virtual reality project Planetrism, where we have even bigger environments and a much larger library of assets to scatter. I also now know how to optimize dynamic lighting so that it remains a workable solution for large-scale virtual reality experiences.