Andreas Glad explained some of the complex bits from his tutorial on paint splat creation with Houdini and UE4.
What a year it’s been! Since we last spoke I’ve worked as a freelancer on a whole bunch of projects, like Apex Construct, Battlefront 2 and Crow: The Legend.
It’s been fun taking on some smaller projects, and a lot of VR as well, but I’m done with freelancing for now. I missed having colleagues and I wanted to get back to working on really big projects. So when I heard that Ubisoft was opening a studio in Stockholm I couldn’t resist applying. I’m happily working there as a Lead VFX artist now.
Houdini Game Dev Toolset
Oh man, Houdini is getting so good for games. The rate they are pushing out new tools is borderline insane. I think the record was when I was working on the Flowmap tutorial and came across a bug. I pinged a guy at SideFX and explained it. The bug fix was published within the hour.
The best thing about the game dev tools is that they are essentially shortcuts. Most of them just take known workflows and make them super simple. Take the Vertex Animation Textures, for example. That was doable before by writing scripts, building the materials and so on. Now it’s one node. They are also making it easier to create custom content. As an example, it’s been easy to draw generic flowmaps in different painting software for a while. What Houdini did was make it easy to paint them on the actual objects they’ll be used on. It’s like comparing Photoshop and Substance Painter.
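To make the flowmap idea concrete, here is a minimal sketch (not from the tutorial) of the encoding a flowmap texture uses: a 2D flow direction in the [-1, 1] range gets packed into the red and green channels of a color in [0, 1], and the shader reverses the mapping when it samples the texture. The function names are my own for illustration.

```python
def encode_flow(vx, vy):
    """Pack a flow direction in [-1, 1] into color channels in [0, 1]."""
    return (vx * 0.5 + 0.5, vy * 0.5 + 0.5)

def decode_flow(r, g):
    """Recover the direction a shader would read back from the texture."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0)

# A full-strength rightward flow encodes to pure red with mid green:
print(encode_flow(1.0, 0.0))  # (1.0, 0.5)
print(decode_flow(1.0, 0.5))  # (1.0, 0.0)
```

Whether you paint this in a 2D package or directly on the mesh in Houdini, the texture ends up storing the same thing; the difference is only how comfortable the painting is.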
Most of them are also HDAs which means you can open them up and see how they work. Last time we spoke I mentioned that in my Pluralsight course I show how to make looping smokeballs. That’s now obsolete as there is a MakeLoop node that does that for you. However, if you open it up, you’ll see that it’s the same technique as the one I showed.
Staying on top of the new game tools is challenging. They are putting them out faster than I can learn them. Here’s an overview of the current toolset found in Houdini.
As you can see in the overview, there are a ton of great tools in there, and more coming all the time. It’s a very exciting time to be making games with Houdini, that’s for sure. Hell, I bet it’s an exciting time at SideFX as they are getting a foothold in games. Gamedev is big and, depending on how you measure it, it’s as profitable as films or close to it, and growing fast!
You can think of velocity fields as roadmaps in 3D. The field is a box divided into voxels, which are small boxes themselves. Each of these voxels holds a velocity value. When a particle enters a voxel, that velocity gets added to it, pushing it in the direction the velocity is pointing. It’s a bit more complex than that, though, as the particles are also affected by other forces like surface tension and drag. Once all of those are added together, the particle has a direction it will move in. Jump to the next frame and the particle has travelled some distance in that direction. Maybe it has entered another voxel and is now receiving a new velocity. It keeps doing that every frame.
Now, by controlling what direction each voxel points in, we can guide our particles where we want them to go.
Nature is noisy, so adding noise to the velocity field creates a natural, random-looking result in the fluid simulation.
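The voxel lookup and per-frame update described above can be sketched in a few lines of plain Python. This is a toy illustration, not Houdini’s solver: a tiny 2D grid where every voxel points in +x, a crude drag factor, and an optional noise amount; all the names and constants are mine.

```python
import random

# A tiny 4x4 2D velocity field: each voxel stores a (vx, vy) direction.
# Here every voxel points right, so particles should drift along +x.
GRID = [[(1.0, 0.0) for _ in range(4)] for _ in range(4)]
VOXEL_SIZE = 1.0
DT = 0.1    # time step per frame
DRAG = 0.9  # crude drag: bleed off some velocity each frame

def sample_field(pos):
    """Look up the velocity of the voxel the particle is currently in."""
    x = min(max(int(pos[0] / VOXEL_SIZE), 0), 3)
    y = min(max(int(pos[1] / VOXEL_SIZE), 0), 3)
    return GRID[y][x]

def step(pos, vel, noise=0.0):
    """Advance one frame: add the voxel's velocity, jitter, drag, move."""
    fx, fy = sample_field(pos)
    jx = random.uniform(-noise, noise)
    jy = random.uniform(-noise, noise)
    vx = (vel[0] + fx + jx) * DRAG
    vy = (vel[1] + fy + jy) * DRAG
    return (pos[0] + vx * DT, pos[1] + vy * DT), (vx, vy)

pos, vel = (0.5, 0.5), (0.0, 0.0)
for _ in range(30):
    pos, vel = step(pos, vel)
print(pos)  # the particle has drifted along +x, guided by the field
```

Raise `noise` above zero and each particle takes a slightly different path, which is exactly the natural, random look the field noise gives a real sim.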
There are bazillions of nodes in Houdini and I don’t think anyone outside the SideFX office knows all of them. Everyone finds their own set of go-to nodes that they use over and over again. I know the nodes needed to generate the simple meshes and shapes I need to make VFX for games. However, I would be lost if someone asked me to do character animation stuff. It’s just not part of my realm. Whenever you need to build something outside of what you know, you’ll start picking up a couple more nodes. The shelf tools and gamedev HDAs are great for this. I use shelf tools all the time so I can learn how things should be connected and what nodes have what functionality. Please don’t tell the hardcore Houdinists though, they want the shelf tools hidden and think you should only use wrangle nodes. I say use whatever works and try to learn from the process.
PolyReduce was recently updated so it’s a lot faster. It calculates the mesh once, and then you can use the slider to choose what polycount you want to keep. However, that means changing frames forces a recalculation, so it will be slower; stay on one frame if possible. It’s not ideal to use PolyReduce on a mesh sequence with a varying polycount, as a FLIP sim would have. PolyReduce recalculates on every frame, giving different results, and that causes shimmering or flickering in the mesh. Right now there’s no great way of getting around it without adding a ton of triangles. Keep your animation fast and snappy and nobody will notice anyway.
As for content preparation, that’s probably what I do daily. Semi-procedural stuff. There’s a misconception that you must do everything completely procedurally in Houdini. That’s not the case. One thing I do a lot is model processing. It can be something as simple as generating mesh pieces I can use as particles when an object is destroyed.
To do that, I’d bring in the object I want to destroy, manually select the faces I want to keep, then pipe that into a network I reuse all the time. It then does some operations like filling holes, UV mapping, and moving the piece to the origin, ready for export. You could make this as complex as you want. In the video below I show a simplified version of this that generates gibs to be used as mesh particles in game.
You could make this super complex so it automatically sets materials up, simulates some fleshy parts that fold over the edges and so on. The major decision you have to make as an artist is what part of the mesh you want to use.
Now, this could have been wrapped into a neat HDA with exposed parameters and so on, but I’ve found that when I make a set of these meshes I often need to change the network slightly between sections. Like for the arm, I would need a different bone than the leg and for some thin pieces, I might need to add an extra fuse node. With this setup, I have all of that flexibility but by starting out from this “preset” I can get to those tweaks so much faster.
I don’t want to give up my artistic choices. I just want the tedious stuff automated, so I can produce better content faster.
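One of those tedious steps, moving a selected piece to the origin before export, is simple enough to show outside Houdini. Here is a hedged sketch in plain Python of what that part of a reusable cleanup network effectively does; the function name and the sample data are my own.

```python
def recenter_to_origin(points):
    """Translate a mesh so its centroid sits at the origin -- the kind of
    'ready for export' cleanup step a reusable network automates."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    return [(p[0] - cx, p[1] - cy, p[2] - cz) for p in points]

# A gib that was chopped off somewhere out in world space:
gib = [(10.0, 2.0, 3.0), (12.0, 2.0, 3.0), (11.0, 4.0, 3.0)]
print(recenter_to_origin(gib))
```

Exporting the piece centered means the particle system, not the mesh, decides where the gib spawns, which is what you want for mesh particles.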
Being able to copy the material from Houdini to Unreal blew my mind. It is however only a demo material so by default it will just sit there and loop through its animation to show that it works. To make it a bit more useful you can add in some features. Outside of making it work with particles like in the tutorial, I like to add in a bit of math to the frame selection part of the network. It allows me to store multiple animations in one texture and then I can select individual clips from it. This is useful if you have several animations stored in there just for variation like in this video.
It’s even cooler if you have a more complex setup. For example, if your vertex animation is a shambling zombie, you can store 5 different death animations in the texture. Then you can randomly select one from the particle system when the particle gets the event that the zombie is dead.
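The clip-selection math mentioned above is just an offset into the stacked frames. Here is a minimal sketch under the assumption that the clips are packed end to end with a fixed frame count each; the function and parameter names are hypothetical, not the shader’s actual node names.

```python
def select_clip_frame(clip_index, frames_per_clip, t):
    """Map a clip index and a normalized time t in [0, 1) to an absolute
    frame in a texture that stacks several clips end to end."""
    local_frame = int(t * frames_per_clip)
    return clip_index * frames_per_clip + local_frame

# 5 death animations, 32 frames each, packed into one texture.
# Halfway through death animation #3 (zero-based):
print(select_clip_frame(3, 32, 0.5))  # frame 112
```

In the material this is the same arithmetic, with `clip_index` fed in as a particle parameter so each zombie can pick its death animation at the moment it dies.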
Once you’ve updated the shader to read parameters from the particle system there are so many things you can do, and it’s all up to how you made the animation in Houdini. You could make a completely custom effect that’s matched to a cutscene and simply play it back over time. Or you could export an animation at a high framerate so you can alter the speed on demand without getting any stuttering. The more data you cram into the Vertex Animation Texture, the more freedom you get in the engine. I tend to vary the animation speed over particle life. In the case of blood and paint splats, I speed through the initial burst really fast, then slow down for the erosion and settling of the effect, keeping the blob shape for a few extra frames. That gives it a lot more punch than just playing it straight through, especially if you include a couple of directional squirts to lead the effect. I think this is a problem a lot of VFX artists have in general. Just playing through an explosion flipbook, for example, often looks very sluggish and boring. It’s fine for movies to have long, slow explosions since they can spend time on them. Games usually need to be a lot snappier. Speed it up in the beginning and let it linger longer on the fully expanded parts.
Now, the decal isn’t covered in a very thorough way in the tutorial because of time constraints. To finish the decal off, I should have created an erosion map for it, either using the same technique as the splat or using Photoshop actions. The erosion would then be driven by the decal lifetime node. It’s quite straightforward, but if you want to see it all set up, I have a VFX pack with this setup that I give away to my patrons over here.
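The lifetime-driven erosion is essentially a moving threshold on the erosion map. Here is a minimal per-pixel sketch of that idea, assuming lifetime runs from 1 (fresh) down to 0 (expired); the function name is mine, not a node from the pack.

```python
def erode_alpha(erosion_value, life):
    """Dissolve a decal: a pixel survives while its erosion-map value is
    above a threshold that rises as 'life' runs from 1 down to 0."""
    return 1.0 if erosion_value > (1.0 - life) else 0.0

# A mid-grey erosion pixel (0.5) is visible early on and gone near the end:
print(erode_alpha(0.5, 0.9))  # 1.0 (threshold 0.1, pixel survives)
print(erode_alpha(0.5, 0.2))  # 0.0 (threshold 0.8, pixel eroded)
```

Because darker pixels erode first, the pattern painted into the erosion map fully controls the shape of the dissolve, which is why it’s worth generating it with the same technique as the splat.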
Now, if you want to create a painting game like Splatoon, this type of decal is not the way to go. You should instead look into render targets.
If you are interested in learning more about using Houdini for games, check out my Patreon above, or my tutorials over here.