Dmitriy Baidachnyi and Luke Stafford talked about the technical side of game production, covering the advantages of Houdini and Substance Designer and procedural workflows. The two also gave a talk at the recent CG Event conference.
Introduction: Dmitriy Baidachnyi
Hello there! My name is Dmitriy Baidachnyi, I am a Technical Artist at Dragon’s Lake Entertainment. I have a radio-engineering background but I was always interested in CG art and this interest led me to 3D software such as 3ds Max and After Effects.
I was lucky enough to jump into the game industry almost without any relevant experience but with a ton of enthusiasm. For a couple of years I was focused on creating real-time and pre-baked visual effects for mobile and PC games but eventually, I found myself paying more attention to creating tools and simplifying asset creation processes, learning to script, digging into shaders, figuring out how all this magic is made in real-time on a lower level.
I’m still very interested in pre-baked visual effects and simulations, which helps me bring efficient techniques from the film industry and adapt them to game dev pipelines.
Introduction: Luke Stafford
Hi, my name is Luke Stafford, I’m the Executive Producer at Dragon’s Lake Entertainment and I’ve been building games for as long as I can remember.
I started exploring games back in 1993 with Doom, although I had already tinkered with games before then, copying BASIC code from magazines. That’s a different story, however.
So Doom was the first time I really understood that you could make game content, it was my first real taste of the level design, and from that point on, I was well and truly hooked.
Over the years that followed, I made hundreds of maps for Doom, Duke Nukem, Quake and anything else I could get my hands on. I learned new tools, programs, and techniques as fast as possible, and just like Dim (Dmitriy), my enthusiasm for games, and specifically for the creative freedom they brought, was infectious. A career in game development was almost inevitable.
I was pretty lucky when I entered the business professionally and got my first gig as a level designer at Codemasters, getting to work on games like Operation Flashpoint 2 – Dragon Rising, and their racing franchises Colin McRae’s DiRT and GRiD. I also had experience working within large teams on simultaneous, multi-platform, game releases.
By the time I finished my tour of duty at Codies, I was their Chief Level Designer with a lot of new experience in game production. Then I moved to Crytek… but more on that later.
Procedural Approach in Gamedev
80.lv: The current landscape of tools has changed dramatically with the introduction of procedural workflows. One of the biggest influences on development was probably texturing software like SD and SP. From your high-level perspective, could you explain to a general audience how SD revolutionized your workflow?
You’re absolutely right! My personal feeling is that the industry is shifting fast toward proceduralism. Nowadays, big studios invest time not in creating single-use assets but in developing something more complex that pays off later in time savings.
Substance Designer is a good example of this. It’s far cheaper to train (or hire) one specialist capable of creating a procedural material than to have a dozen junior texture artists churning out variations of old-school textures for different purposes. Having a library of substances, rather than adapting your image texture base to daily needs, results in huge time savings for the company.
But there are far more possibilities if you combine procedural materials with the power of shaders, which process incoming textures using information taken from 3D geometry data. Imagine the power you have when you can change almost any parameter of your assets in the game.
This can be appearance, shape, animation, behavior, etc. The most valuable thing about this is art direction flexibility. If you spend enough time setting everything up, you end up with a system that lets you focus on pure artistic polishing. This always results in better visual quality without performance problems, and thus a better player experience.
If there is one thing I took away from my time at Crytek, it’s this quality. Working on the first Crysis game as Lead Level Designer for ‘Crysis Wars’ was something very special, an honor and a privilege. Handcrafting every individual asset, every level, every texture, prop and material, and every crack in a rock took a lot of time and effort. I got to work with some incredibly talented artists, developers, and designers. I learned a lot about the business, about building quality games and the Art of Production.
Today, things are different, the tools have come a long way, engines and rendering have evolved but it’s always good to remember where it all started. My team at Dragon’s Lake all share similar experiences and have their own great stories to share. We try to apply our knowledge of traditional Art production pipelines whilst exploring new ways to achieve high-end, Triple-A content. By utilizing contemporary pipelines and tools like Houdini, Substance Designer/Painter and leveraging Unreal Engine, we stand a far better chance.
Houdini’s Position in Gamedev
80.lv: Another tool that is coming more aggressively into games today is Houdini. SideFX is developing new game development tools, introducing more and more workflows. From your perspective, what are the best things that Houdini brings to the table?
I must confess that Houdini was love at first sight for me. It is a perfect tool for a Technical Artist and can drastically empower any game dev pipeline.
The only disadvantage I see in this software is that it’s pretty hard for entry-level specialists. It differs a lot from any other DCC package, so you have to change your mindset and approach to digital content. But once you do, you feel like Neo at the end of the first Matrix: you see the data flow everywhere and can manipulate it easily with handy tools. More than that, you can create your own custom tools for yourself and the team. Houdini digital assets can be used in major DCC software as well as in game engines like Unity and Unreal with the help of the Houdini Engine plugin.
Speaking of the GameDev Toolset, you cannot underestimate the contribution that Luiz Kruel, Paul Ambrosiussen, and Mike Lyndon are making to the industry by creating new tools inside Houdini. I personally use tools from this set daily, and they save a ton of time. Pretty complicated things like creating flow maps, flipbook textures for effects with normals, emission data, etc. are now available in a couple of clicks thanks to these tools. More than that, you can integrate them into your pipeline and regenerate all of the above procedurally in response to any artistic changes you might face.
Houdini & Procedural Level Production
80.lv: The procedural generation of levels in Houdini is something everyone is talking about, but it feels like there’s little trust in it. Some feel that the architectural meshes or whole levels either lack individuality or feel very simple. Could you talk a bit more about how you think Houdini actually works for generating complex parts like levels?
We’re very aware that complete levels created in a procedural way often result in content that feels generic. It’s a big part of why we approached the generation of content with the understanding that Houdini and other tools can help us create content but do not completely replace the handcrafted work. We needed to find harmony, a balance between the modern and traditional art production techniques.
In our current product, we are actually using the tools and pipeline to help us generate buildings, landscapes, and props, to make the first high-level pass and take care of a lot of the grunt work; then we work hand-in-glove with our traditional artists and level designers to ensure that individuality is well represented.
This also means we are creating additional tools for Unreal Engine so that level designers can quickly customize layouts, add extra details (like telegraph poles), and tune prop placement. We find that this approach lets us automate a lot of repetitive work whilst still delivering a crafted experience to our players.
Procedural generation is indeed a great subject of discussion. It can be a powerful addition to the classic approach but cannot replace it completely. So I would say that to really benefit from it, you have to combine manual work with procedurally generated parts wisely.
It’s always good to have some additional tools. These tools may be simple, like creating roads or fences with splines, or they may generate whole levels. This depends a lot on the team and its skills, as well as on the scale and type of the project.
You might not want to invest a lot of time in creating a procedural tree generation system with LODs and auto-baked textures if you are making a small indie game. It’s often easier and cheaper to create these kinds of assets by hand. On the other hand, you might save time in the future by creating a procedurally generated level and bringing in some individuality, manually adding details to make it look prettier. In the end, it is always up to the artist whether to use it or not, depending on the circumstances.
Houdini’s Useful Features
80.lv: What are some of the cool new features that Houdini introduces? Maybe the new photogrammetry workflow or the new ways to work with liquids simulations, animated textures, and VFX?
I can’t thank the SideFX guys enough for the GameDev Toolset. Tools for vector fields, looping fluid simulations, creating flipbooks, etc. are definitely worth checking out for VFX artists, as are the Unreal Niagara tools, which I believe will become industry standards. I haven’t used the new fire presets and gamedev RBD tools yet, but that’s what I’m looking forward to embedding into my pipeline.
Photogrammetry workflow tools and Open Street Maps data import tools are super interesting too, but I’m afraid I didn’t have any opportunity to use them yet. Still, I can imagine all the possibilities they open.
Texture Production in Houdini
80.lv: Could you talk a little bit about the way you can actually work with textures in Houdini? What are the ways SD and Houdini can work together?
The nice thing about Houdini is that you can create almost anything there, including textures. With a bit of knowledge of VEX and Python, you can recreate tools similar to Substance Designer’s. Of course, you most likely won’t, because this stuff is much more convenient in SD.
In my recent experience, I used COPs together with Heightfield 2D volumes. I wanted to create mountains for a mobile game. I generated the mesh using the Heightfields workflow and imported all the volume data it had (heights, water, flow, masks, debris, etc.) into COPs, where I composited all the useful data into a texture’s RGBA channels. After that, I created a custom shader inside the game engine that blended tiled textures using the unpacked data. Needless to say, there were dozens of iterations, but the procedural approach made the experience easy and smooth.
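To make the channel-packing idea concrete, here is a minimal pure-Python sketch, not the actual production shader: four heightfield layers are packed into one RGBA texel, and the "shader side" unpacks them into normalized blend weights for tiled detail textures. All layer and texture names are hypothetical.

```python
# Hypothetical sketch: pack heightfield layers (height, flow, debris, water)
# into the R, G, B, A channels of one texel, then unpack them as blend
# weights for four tiled textures. No Houdini or engine code involved.

def pack_rgba(height, flow, debris, water):
    """Pack four normalized (0..1) layer values into one RGBA texel."""
    return (height, flow, debris, water)

def blend_weights(texel):
    """Shader-side unpack: turn the packed texel into normalized blend
    weights for four assumed tiled textures (rock, riverbed, scree, snow)."""
    r, g, b, a = texel
    raw = [r, g, b, a]
    total = sum(raw) or 1.0  # avoid division by zero on empty texels
    return [w / total for w in raw]

texel = pack_rgba(height=0.8, flow=0.1, debris=0.05, water=0.05)
weights = blend_weights(texel)
print(weights)  # normalized weights that sum to 1
```

A real engine shader would do the same unpack per pixel; packing the data once in COPs means every art iteration only regenerates one texture.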
In most cases, I use Houdini for mesh generation and SD for textures and procedural materials, and they meet inside the game engine, glued together with custom shaders. Houdini has a Substance SHOP node which lets you import .sbsar files and use them inside the material system for renders. There are cases when you need a quick way to render out a game asset, for example if your game has 2D elements or your studio’s marketing department asks you to provide content for promotion. To get the same visual result as in the engine, you can recreate any real-time shader with SHOPs or the new material network in Houdini. So you can stay procedural even in this case.
Houdini & Game Engines
80.lv: We’re really interested in how Houdini works with modern game development engines. There have been some very neat releases of bridges between Houdini and Unity/Unreal. How do they help the development?
I see two approaches to creating this kind of bridge nowadays:
- First is to set up the output paths to save content directly to Engine’s project folders and do all work inside Houdini or DCC tools using Houdini Engine.
- The second is to focus mostly on creating the tools and .hdas (Houdini digital assets) for use inside Unity or Unreal with all the advantages of Houdini Engine there.
The option you choose mostly depends on the studio’s pipeline. To be honest, I prefer the first approach because Houdini Engine in Unreal and Unity is still raw and unstable in some cases. Though I love the fact that you can set up a lot of additional mesh info on the fly: collision meshes, LODs, vertex colors, additional UV sets, material binds, sockets, etc.
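The first approach above can be sketched as a small path-routing helper: the tool writes exports straight into the engine project so the engine picks them up without a manual import step. Everything here, the project path, folder layout, and `SM_` naming prefix, is an assumption for illustration, not a real studio convention.

```python
# Hypothetical sketch of the "output paths" approach: route Houdini/DCC
# exports directly into an engine project's content folders.
import os

PROJECT_CONTENT = "/projects/MyGame/Content"  # assumed Unreal project root

def export_path(category, asset_name, ext="fbx"):
    """Build the output path for an asset so the engine picks it up
    directly, e.g. meshes under Content/Meshes, textures under
    Content/Textures. The SM_ prefix is an assumed naming rule."""
    folder = os.path.join(PROJECT_CONTENT, category)
    return os.path.join(folder, f"SM_{asset_name}.{ext}")

print(export_path("Meshes", "Rock_01"))
```

A centralized helper like this is what makes the first approach pipeline-friendly: changing the project layout means changing one function, not every tool.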
A very interesting new feature for VFX that connects Houdini with Unreal is the Niagara toolset. It’s in beta at the moment, but I expect a lot from it.
Renovating the Pipelines
80.lv: Finally, guys, how do you teach the artists to work with such complex node-based tools? How do you actually make them overcome these fears and appreciate all the advantages they bring?
Our goal is simple: free up our artists to do what they are best at and let the pipeline and tools do the heavy lifting.
As a new studio, created by industry veterans, we already knew that we needed to find Artists who worked with these tools, and like Dim, have a technical approach to Art production, so we didn’t have much issue with overcoming our fears.
While most of our Art team still approach their work with traditional methods, creating assets in 3D tools like Max and Maya, they also routinely work with Substance Painter to produce their final asset materials. Checking the results live in our test environments in Unreal Engine, they’re able to quickly review their work and make changes.
One of the most important things about Substances for us is that we are able to work with dynamic real-time materials, allowing our Engineers and Artists to work together to get the most out of our pipeline.
Aside from the obvious art benefits of using Substances, there is also a very important production benefit, something we covered in our talk at CG Event: a much smaller distribution package.
Over the last decade, the disk size of games has grown dramatically, largely due to the considerably larger amount of data needed to achieve much higher fidelity artwork. Working with Substances, we are able to target a smaller package size: instead of pre-packing our textures, we can distribute .sbsar files and unpack them client-side. This also gives our PC audience the added advantage of being able to customize their setup and use materials that their rig can handle.
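The client-side idea could look something like the following sketch: the game ships .sbsar archives and picks a texture output resolution from the player's quality setting, capped by a crude VRAM check. The quality tiers, sizes, and VRAM thresholds here are illustrative assumptions, not values from the studio's actual setup.

```python
# Hypothetical sketch: choose the resolution at which shipped .sbsar
# materials are rendered on the player's machine.

QUALITY_TO_SIZE = {"low": 512, "medium": 1024, "high": 2048, "ultra": 4096}

def target_texture_size(quality, vram_mb):
    """Pick an output resolution from the user's quality setting,
    capped by an assumed VRAM heuristic so low-end rigs stay safe."""
    size = QUALITY_TO_SIZE.get(quality, 1024)  # default to medium
    if vram_mb < 2048:          # assumed threshold: under 2 GB, cap at 1024
        size = min(size, 1024)
    return size

print(target_texture_size("ultra", vram_mb=8192))  # high-end rig: 4096
print(target_texture_size("ultra", vram_mb=1024))  # low-VRAM rig: capped to 1024
```

The win is that one small archive replaces several pre-baked texture sets, and the same material scales from mobile-class to enthusiast hardware.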