Check out this amazing talk from Greg Zdunek about the way he creates outstanding, realistic vegetation with the help of Houdini.
My name is Greg Zdunek and I’m a CG generalist and software developer based in Melbourne, Australia. I’ve worked across a variety of games and VFX projects as both an artist and technical director. I’m also the founder of Vertex Library and actively develop software and educational resources for the CG industry.
I started my journey in 3D and computer graphics about 13 years ago in my early teen years, with ambitions of creating games. Since then I’ve been interested in all aspects of game development. I spent many of those early years learning programming, 3D art and music production from video tutorials, online resources, and personal projects.
I ended up studying computer science and game development at Monash University in Australia. By that time I had already built a small game engine (nothing like Unity or UE4 existed back then) and developed a number of simple 3D games, so my formal education was often a revision of things I already knew. That gave me the freedom to explore new interests like photography and filmmaking, which, in an unlikely twist, eventually pushed me into the VFX industry. Despite studying game development, my final student project was a small animated short film called Glimmer.
I most recently worked for Luma Pictures as a pipeline and lighting TD, developing tools and supporting artists on Marvel films like Ant-Man, Deadpool, Doctor Strange, etc. As both a programmer and artist, I always look for ways to leverage new technology to simplify and enhance the creative process. So in 2017 I decided to leave the VFX world and founded Vertex Library as a platform to research new ideas and build better content creation tools.
I’ve always been captivated by nature and particularly the concept of plant life reclaiming the land and growing over abandoned buildings in post-apocalyptic art. For a long time I’ve wanted to create a range of animated short films set in such a world, but rendering huge natural environments on a low budget has numerous technical challenges that have made it almost impossible.
It wasn’t until testing Redshift in 2014 that I realized the speed of GPU rendering finally made it practical to create highly realistic 3D environments full of plant life, without needing a huge render farm. The forest render below took only 5 minutes on a single consumer GPU, which convinced me to totally switch to Redshift as a primary renderer.
Since then I’ve worked to improve every aspect of creating natural environments, from finding new efficient ways of making plants to more interactive scatter tools and workflows. A lot of my early work was done in Maya, which was fine for smaller natural environments, but eventually it became clear that I needed to take a more intelligent and efficient approach to create larger and more detailed scenes.
After completing my Swamp Town project in Maya, I turned to Houdini and Substance, and started to develop a modern pipeline for creating games and films with procedural content. The first major project I started researching was a modular plant library, which marked the start of HyperGrass and HyperTrees.
I should note that HyperTrees is very much a work in progress. The ultimate goal is a standalone tool that lets you very easily customize your own trees from a huge set of modular pre-made trunks, branches, leaves and bark textures. Almost like a Lego set for making 3D trees. The idea is to have a perfect blend of simplicity and supreme quality that is ready to render in a few clicks.
With that ambitious goal in mind, I had to look at using both photogrammetry and procedural texturing to achieve the quality it needed. The photogrammetry capture workflow is fairly straightforward. I mentioned a number of useful tips in my recent guide to outdoor scanning.
The HyperTrees bark pipeline relies on photogrammetry and is built around Houdini and Substance Designer. Houdini handles the retopology, UVs and baking, while Substance takes the tiled texture bakes and creates a full set of textures with a PBR conversion template. Everything is wrapped up into a Python-based system that’s almost fully automated. The diagram below illustrates the whole process.
For the tiling process, I developed my own tools and workflow. I wanted control over which parts of a texture are tiled and have the ability to patch out unwanted patterns natively in 16K. So my first step is to hand paint a special RGB mask in Photoshop containing the areas I want to keep. In the example mask below the red channel stores the vertical tile mask, green is horizontal and blue is used to patch out repeating or unwanted areas.
This mask and the baked height map are then run through a variety of edge detection and height blending functions using Python image-processing libraries to generate a more accurate mask for creating seamless tiling textures.
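To make the channel convention concrete, here is a minimal sketch of how that hand-painted RGB mask could be split and combined into a single keep/discard mask. This is a plausible first pass written for illustration, not the actual edge-detection and height-blending functions of the pipeline, and the `split_tile_masks` helper and its channel math are my own assumptions:

```python
import numpy as np

def split_tile_masks(rgb_mask: np.ndarray):
    """Split a hand-painted RGB control mask into its components.

    Channel convention (as described above): red = vertical tile mask,
    green = horizontal tile mask, blue = areas to patch out.
    `rgb_mask` is an (H, W, 3) uint8 array loaded from the painted mask.
    """
    mask = rgb_mask.astype(np.float32) / 255.0
    vertical, horizontal, patch = mask[..., 0], mask[..., 1], mask[..., 2]
    # Keep pixels selected by either tile mask, minus the patched-out areas.
    keep = np.clip(np.maximum(vertical, horizontal) - patch, 0.0, 1.0)
    return vertical, horizontal, patch, keep

# Tiny synthetic example: a vertical tile band with one patched-out pixel.
demo = np.zeros((4, 4, 3), dtype=np.uint8)
demo[:, 1, 0] = 255          # red channel: vertical tile column
demo[2, 1, 2] = 255          # blue channel: patch out this pixel
v, h, p, keep = split_tile_masks(demo)
```

In a real pipeline the `keep` mask would then be refined with the edge detection and height blending mentioned above before driving the tiling.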
Processing the textures natively at 16K/32K before down-scaling to the more usable 4K and 8K was key to retaining all the detail from the raw scans. It’s also much easier to hide tile seams and nicely patch out repeating patterns when you have more pixels to work with. If you use the 16K textures directly the results can be fantastic in close-up shots.
However, building a pipeline that does everything in 16K or higher isn’t as simple as doubling the resolution. RAM is often the major bottleneck, so I had to build a system to split the raw 16K maps into more manageable 4K chunks. These chunks are distributed and processed across multiple threads and PCs in minutes, and finally stitched back together to 16K. The same process is used for 32K maps, but with more 4K chunks.
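The split/process/stitch idea can be sketched in a few lines. This toy version only shows the thread-level part of the distribution (the cross-PC part would sit on top of it), and it uses a tiny 16×16 array with 4×4 chunks as stand-ins for a 16K map with 4096-pixel tiles; the function names and the placeholder `process` step are illustrative assumptions, not the actual tool:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

CHUNK = 4  # stand-in for 4096; real maps would use 4096-pixel tiles

def split_chunks(img):
    """Split an (H, W) map into CHUNK x CHUNK tiles with grid coordinates."""
    h, w = img.shape
    return [((y, x), img[y:y + CHUNK, x:x + CHUNK])
            for y in range(0, h, CHUNK)
            for x in range(0, w, CHUNK)]

def stitch(chunks, shape):
    """Reassemble processed tiles back into a full-resolution map."""
    out = np.empty(shape, dtype=np.float32)
    for (y, x), tile in chunks:
        out[y:y + CHUNK, x:x + CHUNK] = tile
    return out

def process(tile):
    # Placeholder for the real per-chunk image processing.
    return tile * 0.5

big = np.ones((16, 16), dtype=np.float32)  # stands in for a 16K height map
pieces = split_chunks(big)
with ThreadPoolExecutor() as pool:
    done = [(coord, res) for (coord, _), res in
            zip(pieces, pool.map(process, (t for _, t in pieces)))]
result = stitch(done, big.shape)
```

Because each tile is independent, the same pattern scales from threads on one machine to jobs spread across several PCs.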
Achieving realistic results
PBR is currently a world standard and the most practical approach to realistic rendering. So the first and most important step is respecting PBR conventions. For example, ensuring that the albedo and reflectance maps are in the correct ranges and balanced correctly. This will ensure consistency and compatibility with other PBR textures.
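A simple automated sanity check can catch albedo values that drift outside PBR conventions. The sketch below uses the commonly cited guideline that dielectric sRGB albedo should sit roughly between 30 and 240; the exact thresholds and the `albedo_out_of_range` helper are illustrative assumptions, not part of the pipeline described here:

```python
import numpy as np

# Commonly cited sRGB albedo guideline for dielectrics: roughly 30-240.
# Real-world diffuse surfaces rarely fall outside this range.
ALBEDO_MIN, ALBEDO_MAX = 30, 240

def albedo_out_of_range(albedo: np.ndarray) -> float:
    """Return the fraction of pixels whose sRGB albedo falls outside
    the guideline range. `albedo` is an (H, W, 3) uint8 array."""
    too_dark = (albedo < ALBEDO_MIN).any(axis=-1)
    too_bright = (albedo > ALBEDO_MAX).any(axis=-1)
    return float((too_dark | too_bright).mean())

demo = np.full((2, 2, 3), 100, dtype=np.uint8)
demo[0, 0] = (5, 5, 5)  # a near-black pixel that would break PBR balance
fraction = albedo_out_of_range(demo)
```

A check like this won't judge whether a texture *looks* right, but it flags maps that will behave inconsistently next to other PBR textures.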
If it’s a tiling texture, you should pay special attention to how your texture might be tiled and how obvious the repeating patterns may be. As humans, we are incredibly good at spotting patterns and recognizing if something isn’t real, so you may need to identify and patch out or blend together standout shapes that catch the eye.
Finding the right balance and getting a realistic texture to function well is all about context. Will it be used and blended with other textures? Will it be visible on a large close-up asset or used in the background? What lighting conditions will the texture be rendered in? What degree of realism is needed: perfect physical accuracy or exaggerated hyper-realism? Understanding the context of a texture can help greatly in tuning the balance of the PBR components and making it function well for its intended purpose.
My approach to creating plants and natural environments always starts with a good reference and a fundamental understanding of plant biology. A majority of plants live through seasonal cycles of growth, death, and rebirth. In this cycle, there is a constant battle and search for energy, be it from sunlight, water or nutrients in the soil.
From a visual perspective, there is an entire hierarchy of plant life at different stages of growth, all competing to survive. Translating this into a practical workflow for 3D art has two important components – layering and variation.
First, you need to model your plants at different stages of growth and depth and layer them together when scattering to replicate the full life cycle of the plant.
Secondly, you need enough variations of shape and color so that you can scatter the plant in a way that looks natural and not repetitive.
This was one of the key considerations for HyperGrass. The workflow for using HyperGrass is indeed quite simple. The main difference from traditional workflows is the increased density. The actual grass models are single sprouts, not the usual large round patches you see everywhere. This allows you to scatter millions of them if you want, and have precise control over the transform of each sprout. When you combine this with the growth layering approach and use noise to scatter the grass, you can get all sorts of organic formations, clumping, and natural shapes that are otherwise impossible to achieve with traditional grass patches.
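The noise-driven, per-sprout idea can be sketched outside of Houdini too. In the toy example below, a cheap stand-in noise function assigns each scattered sprout a growth stage, so young, mature and dying sprouts clump spatially rather than being uniformly mixed; the stage thresholds, per-stage scales and `fake_noise` function are all illustrative assumptions, not HyperGrass internals:

```python
import numpy as np

rng = np.random.default_rng(7)

def fake_noise(x, y):
    """Cheap stand-in for a proper noise function (Houdini would offer
    much better ones); it just needs smooth spatial variation in [0, 1]."""
    return 0.5 + 0.5 * np.sin(x * 0.8) * np.cos(y * 0.8)

# Scatter individual sprout positions over a 10x10 m patch.
n = 1000
xy = rng.uniform(0.0, 10.0, size=(n, 2))
density = fake_noise(xy[:, 0], xy[:, 1])

# Growth layering: the noise value picks which growth stage each sprout
# uses, so the stages form organic clumps instead of uniform noise.
stages = np.digitize(density, bins=[0.33, 0.66])  # 0=young, 1=mature, 2=old
scales = np.array([0.4, 1.0, 0.8])[stages]        # per-stage sprout scale
yaw = rng.uniform(0.0, 2 * np.pi, size=n)          # random rotation per sprout

sprouts = np.column_stack([xy, scales, yaw])       # x, y, scale, rotation
```

Each row is a full per-sprout transform, which is exactly the kind of precise control single-sprout scattering gives you over traditional grass patches.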
Everything I’ve described is already within a procedural mindset, so Houdini is easily the best tool for the job and offers incredible control and flexibility. You can procedurally design a plant in its fully grown state once, and then control its parameters to create as many variations in growth, shape, and color as you need. Having this ability to create hundreds of variations at once also opens up creative freedom, as you can easily pick and choose variations that are appealing and fit well together, and discard the others without any real effort wasted.
The procedural future
The shift to procedural content creation is already happening throughout the industry. One example is texturing, which has been completely transformed with the adoption of Substance Designer.
A few high-end open-world games have already been using Houdini to populate portions of the playable world without needing hundreds of artists to manually place every asset. This is also making its way through VFX studios and there seems to be a common goal across the industry to make procedural content more accessible and easy to use for all artists, big or small, regardless of technical knowledge.
I think the logical next step and most interesting thing on the horizon is AI and machine learning. I have no doubt that over time our tools will leverage AI technology and change our industry much like the procedural workflows of Houdini and Substance have done already.