Marina Alexandra Bade gave a talk on Shanty Town, her procedural environment tool for Unity with Houdini Engine.
My name is Marina Alexandra Bade and I’m currently based in London, UK. However, I’m originally from the north of Germany, and I’m remotely wrapping up my bachelor’s thesis in Animation & Game at a university near Frankfurt. I started off in 2016 just considering myself a 3D person. After wanting to pursue a career as a Texture Artist, and later as an Environment Artist, I followed the rabbit hole of procedural tech art at the beginning of 2018.
The projects I worked on were mainly digital reconstructions of real-world and historical environments for German TV and museums. I also created a VR experience in which the player is abducted by a cult-like alien regime craving Coca Cola.
Procedural Favela: Start of the Project & Goals
This project started in March 2018 with one month of full-time Houdini learning. In April I was ready to start my 6th-semester university project together with three teammates. The idea was to create a parkour game with bunny hopping and strafe jumping mechanics. The story behind the game was set in a world of natural catastrophes and floods where mankind has retreated to the last inhabitable places. Humans were building their new homes on the ruins of washed-away apartment buildings with anything they could find.
My main goal here was to enable the game & level designer to quickly fill big levels with game-ready environments. Even though the project wasn’t finished due to time restrictions, I was able to bring the tool to a point where it proved its worth.
The core element of this tool is Houdini Engine for Unity. This plugin made it possible to create digital assets in Houdini and use them inside of Unity. The asset’s parameters are revealed in Unity for custom manipulation. The values of the parameters are then calculated in Houdini and sent back to Unity where you see the result in real-time. It can be baked and stored as a game object or kept for re-manipulation at any time.
The buildings are generated based on a “stack boxes” algorithm. A box is created within certain scale boundaries, and with every iteration of the tool another box is added. To control the overall dimensions of a house, width and height parameters are exposed. The current house iteration is also exposed as a parameter to control the house’s complexity, and a seed serves as a comprehensive randomizer for every parameter.
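A minimal Python sketch of the “stack boxes” idea described above (the actual tool is built from Houdini nodes and VEX; the function, its minimum box size of 1 unit, and the 2D side view are illustrative assumptions):

```python
import random

def stack_boxes(iterations, max_width, max_height, seed=0):
    """Stack randomly sized boxes until the iteration count or height limit is hit."""
    rng = random.Random(seed)
    boxes = []        # each box: (x, y, w, h) in a 2D side view
    top = 0.0         # current stack height
    while len(boxes) < iterations and max_height - top >= 1.0:
        w = rng.uniform(1.0, max_width)            # width within scale boundaries
        h = rng.uniform(1.0, max_height - top)     # height within the remaining room
        x = rng.uniform(0.0, max_width - w)        # horizontal offset of this box
        boxes.append((x, top, w, h))
        top += h                                   # the next box stacks on this one
    return boxes

house = stack_boxes(iterations=5, max_width=8.0, max_height=12.0, seed=42)
```

Because everything is drawn from one seeded generator, the same seed always reproduces the same house, which is what makes the seed a comprehensive randomizer.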
The procedural tool features two distinct building modes: a custom mode and a random mode. After heavily developing the random mode, I learned that the game & level designer needed much more control over the environments he wanted to create. So I added a custom building mode that makes it easier to type in actual box sizes and pre-define the house’s look: whether it should stand on a plateau, on stilts, or on no base at all. The wall material can be set on a per-house level. The custom mode is still a work in progress, as I am planning to add handles for easy box placement.
The random mode, however, randomizes all of those parameters inside of certain boundaries to spark the users’ creativity and allow for fast environment creation.
The tool also has a low poly and a high poly mode. After realizing that the polycount of a more complex house could explode to unreasonable amounts, a low poly mode became necessary. I baked down all of my panels onto planes like a trim sheet. Those planes were then processed in exactly the same way as the high poly panels. This enabled me to precisely predict the way a high poly wall would look, which made quick iterations much easier and saved tons of polygons while looking nearly as good as before. In the image below you can see what a high poly wall mesh looks like:
Buildings feature as many panels as their skeleton has polygons. However, the panels are trimmed to fit the polygon’s size so each of the polygons only uses a fraction of a whole panel. The image below illustrates this process:
The materials are distributed as attributes on the surface of the house. After realizing that “one wall, one material” looked very artificial, I found that organically growing the materials from random starting points (like growing a selection) looks much better. The image below shows how this is done:
Image #04: Material attribute distribution on the walls:
Each of those material clusters is taken separately, the shared edges are offset to achieve a less uniform look, and one material is chosen. Possible options are metal plates, wood, or corrugated metal (compare with image #03).
For each face, one variation in the form of a material plate (high poly mode)/ trim sheet (low poly mode) is chosen. The material type stays the same throughout one cluster.
Each face serves as a kind of stamp, stamping out a random piece of this material plate/ trim sheet. This again increases the wall’s visual variety.
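The cluster-growing step above can be sketched in Python as breadth-first region growing over a face adjacency graph. This is a hedged illustration, not the author’s VEX; the adjacency dict, material names, and function signature are assumptions:

```python
import random

def grow_material_clusters(adjacency, n_clusters, materials, seed=0):
    """Grow n_clusters regions from random seed faces, then assign one material each.

    adjacency: dict mapping each face id to a list of neighboring face ids.
    Returns a dict mapping every face id to its cluster's material.
    """
    rng = random.Random(seed)
    seeds = rng.sample(list(adjacency), n_clusters)   # random starting faces
    label = {face: i for i, face in enumerate(seeds)}
    frontier = list(seeds)
    while frontier:                                   # breadth-first growth
        face = frontier.pop(0)
        for neighbor in adjacency[face]:
            if neighbor not in label:
                label[neighbor] = label[face]         # neighbor joins the cluster
                frontier.append(neighbor)
    # one material per cluster, as described above
    cluster_material = {i: rng.choice(materials) for i in range(n_clusters)}
    return {face: cluster_material[c] for face, c in label.items()}
```

Growing from random seeds like this is what produces the organic, non-uniform borders between materials instead of hard per-wall splits.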
The nodes I used are all pretty simple but involved a lot of VEX scripting (Houdini’s own programming language). What makes it so powerful is combining many simple nodes and code snippets nodally into a complex network. I relied heavily on algorithms for checking intersections, using bounding boxes, attribute transfer, booleans, point neighbors, copying/stamping, sorting points, and calculating directions. For more detail on the process, I will keep updating my Polycount thread to explain and document what I did.
Windows, Doors & Props Production
A technique I use a lot is scattering points on surfaces such as walls or roofs for props or window placement. The doors were the hardest ones to place as they needed to follow a ruleset for proper doors.
There are two kinds of doors: ordinary doors that have ground in front of them, and so-called balcony doors with no attachment to any ground. To figure out which positions were suitable, I had to take into consideration whether the wall or face currently being checked was wide and tall enough to hold a door mesh. Also, there mustn’t be another house box directly in front of it that would obstruct the door and make it practically unusable. After finding suitable faces, I check whether they have ground underneath them with the help of a raycast. If the ray intersects with another piece of geometry of type floor (roofs or base plateaus), the face is eligible to hold an ordinary door; if not, it may hold a balcony door. After isolating the bottom center points of those faces as origins, I added a parameter to the tool defining how many doors should be present. That number determines how many points are kept, and a variety of door meshes are spawned on them. In addition, the base shape of the door is stamped and extruded out of the house box to form a doorstep.
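The door ruleset boils down to three checks per face. A compact Python sketch (the real tool uses VEX and Houdini raycasts; the minimum door dimensions and the two boolean flags standing in for the intersection test and the downward raycast are hypothetical):

```python
DOOR_W, DOOR_H = 1.0, 2.1   # assumed minimum width/height a door mesh needs

def classify_door_face(face, obstructed, has_floor_below):
    """face: (width, height) of the wall face being checked.

    obstructed:      True if another house box sits directly in front of the face.
    has_floor_below: result of the downward raycast against 'floor'-type
                     geometry (roofs or base plateaus).
    """
    w, h = face
    if w < DOOR_W or h < DOOR_H:
        return None                  # too small to hold a door mesh
    if obstructed:
        return None                  # doorway would be practically unusable
    return "door" if has_floor_below else "balcony_door"
```

Only faces that pass all three checks survive into the pool from which the door-count parameter then picks its final positions.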
Windows and props are much easier to place. After figuring out that windows randomly scattered across the walls look incredibly odd and unnatural, I decided on spawning one window per face. First, I eliminate all faces that aren’t quads. After that, I check the face’s dimensions and determine which kind of window mesh it will be able to hold. The window mustn’t exceed the face’s boundaries, as this would place it outside the house wall. Based on the parameters of the asset, a corresponding window that fits into the space of the face is then chosen.
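The one-window-per-face selection can be sketched as picking the largest window type that still fits inside the face. The window sizes, margin, and ordering are illustrative assumptions, not values from the tool:

```python
# window types ordered large -> small: (name, width, height)
WINDOW_TYPES = [("large", 1.6, 1.2), ("medium", 1.0, 1.0), ("small", 0.6, 0.6)]

def pick_window(face_w, face_h, is_quad, margin=0.1):
    """Return the biggest window type that fits the face, or None."""
    if not is_quad:
        return None                       # non-quad faces are eliminated first
    for name, w, h in WINDOW_TYPES:
        if w + margin <= face_w and h + margin <= face_h:
            return name                   # first (largest) type that fits wins
    return None                           # face too small for any window
```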
To place props on the roofs and plateaus, I isolate the top faces where they will spawn and inset them by the maximum width of the largest prop to prevent any floating objects or intersections. After that, I scatter the number of points dictated by the input parameter across the remaining surface area. The points serve as spawning positions for randomly chosen props with a specified ratio.
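For a rectangular roof face, the inset-then-scatter step looks roughly like this in Python. It is a sketch under assumptions: rectangular faces, a `(name, width)` prop list, and an even selection ratio instead of the tool’s configurable one:

```python
import random

def scatter_props(face_w, face_h, props, n_points, seed=0):
    """Inset a rectangular face by the widest prop, then scatter spawn points.

    props: list of (name, width) pairs. Returns a list of (x, y, prop_name).
    """
    rng = random.Random(seed)
    inset = max(width for _, width in props)   # inset by the largest prop's width
    usable_w = face_w - 2 * inset
    usable_h = face_h - 2 * inset
    if usable_w <= 0 or usable_h <= 0:
        return []                              # face too small to hold any prop
    names = [name for name, _ in props]
    return [(inset + rng.uniform(0, usable_w),
             inset + rng.uniform(0, usable_h),
             rng.choice(names)) for _ in range(n_points)]
```

The specified spawn ratio from the article could be added by swapping `rng.choice` for `rng.choices` with per-prop weights.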
In the video below you can see how all of it comes together inside of Houdini:
The Modules: Creating Meshes
I created the meshes and the whole houses based on concept art by Jonas Hassibi.
With his help, I was able to get the vibe, shapes, and materials right. The meshes were created in Blender as mid & low poly meshes. I used the HardOps plugin by masterxeon1001 to quickly export high poly meshes and bake edges and details onto my low/mid poly meshes. In total, over 50 modules were built, including wall panels, windows, doors, all kinds of props, plateaus, sunshades, balconies, pillars, and fences. The meshes themselves are never stretched or modified other than in the ways detailed above. They can be combined and exchanged as they are all created at the same scale and placed at the origin facing the positive Z axis.
Art by Jonas:
The restrictions I had were mostly polycount-related. I knew that a lot of the meshes would be used multiple times throughout one building. As I didn’t have any experience with this workflow, I didn’t know what my hard limit would be. I was developing the tool while also building the modules, so I had to wait until the end of the mesh implementation phase to see whether the polycount was low enough. The pieces including fabrics were especially problematic, as I didn’t have the time to do proper retopology and ended up retaining a lot of polys.
One of my main focuses in the project was complete flexibility: being able to exchange the meshes the tool is currently using. The tool reads the mesh files it uses from a fixed folder structure, so these meshes can be easily exchanged, either in the folder or as drag & drop inputs inside of Unity. This can completely change the results of the tool, turning the output into an oriental city in place of the shantytown in a few clicks, given the correct modules. As the tool inside of Unity has the exact same parameters as in Houdini, there are no differences in flexibility. All parameters are accessible and editable inside of Unity. However, whenever I want to change something in Houdini, I need to re-export the .hda and update it in Unity.
The materials are set up traditionally. This means that every mesh has a material that I created inside of Substance Painter. During the texturing process, I built smart materials based on color ID maps, distributing different kinds of metal or wood variations. These were always the same throughout the project. If I wanted to change one of those base materials, I could easily re-export the maps. However, if this project had included more modules, or if I had had more time, I would have set up this whole process in Substance Designer with the Substance Automation Toolkit. In a bigger production, this would definitely be the way to do it.
The texturing itself was a very creative and free part of the project. I could recharge for the tougher technical tasks, so I enjoyed this bit of manual work a lot. I tried to make the modules feel as if they are actually used and people are actually living in these houses. In image #05 you can see that the sunshades are made out of plastic tablecloth or old curtain fabrics, as this would have been something people had at hand. The walls are spray-painted with slogans, and sometimes it seems as if somebody started painting over them but stopped as it wasn’t worth the effort. Some hand marks and footprints are visible as well.
Practical Use & Optimization
I created the tool with production in mind, and my team was a great support there. They would always tell me right away if something I had thought of was impractical or if they needed some other functionality instead. I tried to implement all the fundamentals a tool would need if it went through its first iterations in production. That’s why it also features light baking, different levels of detail, as well as controlled and random building modes. Houdini Engine already offers a lot of support there, as it is aimed towards using assets in production.
Optimization, on the other hand, is a process that’s never really done. I have no idea whether my level of optimization is enough. I’m still iterating through these kinds of issues, trying to understand the engine better and how I can benefit from its features. This tool is no fixed pack of nodes; it’s just as flexible as I am. If better solutions to problems present themselves in the future, I will work on improving it as much as I can.
The next step for me is testing the tool with different sets of meshes. I want to really understand where its weak spots are and how I can further improve its features for production. I also want to start working on another tool that will create cities with the current house tool as an input. This may then manipulate the house tool’s parameters based on distance to water, the terrain’s height, or other landscape-specific attributes.
I don’t think more complex buildings are necessary. This tool is a proof of concept that something like this is doable and working well in terms of visual and technical requirements. It also shows that you don’t need a big team of developers to create these kinds of tools. I would prefer to specifically tailor a tool for every project that wants to use procedural assets. A one-size-fits-all tool is neither possible nor worthwhile.
If the community is interested in using my tool and experimenting with it I will release it on Gumroad and try to work on an Unreal version as well. Still, this may take some time as the Houdini Engine for Unity and the one for Unreal are quite different.
Advice for Learners
Tutorials that helped me a lot were those from Anastasia Opara, Rohan Dalvi, and Entagma. Also, if you’re interested in coding inside of Houdini, Joy Of Vex provides some great little exercises for learning the fundamentals. The Houdini Forum, the Houdini Discord, and SideFX Support were also incredibly helpful and encouraging.
I would advise everybody who really wants to get into Houdini to intensively study the documentation and all of the sources named above. It can be very frustrating in the beginning, and very hard if you’re trying to learn it on the side. That being said, it will definitely pay off, as Houdini is the greatest and most powerful sandbox I have ever had the pleasure to play in.