Tim van Helsdingen did a breakdown of his sequence made with Houdini, Redshift, and Megascans, and talked about rigging a vehicle, simulating water, rain, mud, rendering heavy scenes, and more.
My name is Tim van Helsdingen. I am a freelance Houdini FX artist/generalist based in the Netherlands. I started doing 3D around 2006, but I never really knew what to focus on. For the first couple of years I literally tried everything, from 2D animation to making games, from stop-motion to 3D/FX. 3D, however, was always the thing I liked the most, though it took me a while to find my speciality in it.
After I finished school I rolled into doing a lot of motion design, mainly in After Effects and Cinema 4D. But I never really enjoyed the type of work it brought me: it was a lot of branded content and usually very short projects (a few days of work, or sometimes even only a few hours). I really wanted to focus on working fully in 3D and FX to do more high-end stuff, and especially on FX, because that was what I enjoyed the most.
So I quit my job at the time to start freelancing, and began learning Houdini around the same time because I had heard a lot of good things about it. I slowly moved from doing most things in After Effects and Cinema 4D to now doing most FX in Houdini. I freelance for a whole bunch of different studios in the Netherlands, mainly in advertising, and I really like the diversity that brings. Freelancing also gives me the freedom to work on personal projects in between; I don't think I'd do as much personal work if I were in an office 40-50 hours a week. Personal projects are really something I like to do to push my skills: I'm not bound by the time constraints you usually have in production, so I just fiddle around until I'm happy.
Jeep Driving Through Mud
My personal projects always start out a bit randomly. I was working on a commercial car project for Jaguar at Ambassadors studio when I came up with the idea for this. I thought it would be cool to try some car animation myself afterwards, to see how I could handle the entire pipeline by myself, from CAD model to FX all the way through shading/lighting (for work, I mostly do only the FX part). I also wanted to see how far I could take Redshift with this, since it's quite a heavy scene.
Since I really like doing fluids, I figured it might be cool to try making a jeep driving through some mud. I had a pretty clear picture of what I wanted to create, so once the Jaguar project was finished and I had some free time, I spent a few days building the basics. Then I just kept building on that over the course of about two months, whenever I had some free time.
I really like trying to recreate reality as much as I can in personal projects (I also did a boiling water sim last year with the same idea: tracked camera, photorealistic shading, etc. You can check it below), so my goal from the get-go was to create something that looked like someone had just shot the scene on a smartphone.
I looked at a ton of reference footage. Apparently, there are a lot of communities around driving jeeps through mud, so there are plenty of videos on YouTube. I also used a lot of sounds from actual clips for the sound design in the final video. Having realistic sound helps sell the shot.
The image below was my main reference which I didn’t try to replicate exactly but drew inspiration from. I also got the track for the camera movement from there.
For the rig, I used Cinema 4D. There was no deep reason not to choose Maya; it's just a tool. I used Cinema 4D before I got into Houdini, so I'm familiar with it. I spend 90% of my time doing Houdini work now, but for the rig it was simply easier to work in Cinema 4D. Rigging is something I'm actually not too familiar with, and I came across an awesome car rig for Cinema 4D, which I adapted to fit my jeep.
I made some extra enhancements to the suspension and other parts, but the changes mostly happen underneath the car, so you don't see much of them in the final video. I had fun doing it nonetheless.
I only use Maya when I’m working in a studio where they use a Maya pipeline. Ideally, I’d just stay in Houdini 100% of the time but sometimes it’s easier to partly work in another package.
To get the whole thing to work I had to optimize the CAD model (in its current form, it still consists of several million polygons, but hey, it renders). Cinema 4D handled that task quite well.
Simulations in Houdini
Houdini is the most flexible software package for FX; it allows you to grab any piece of data and manipulate it in any way. Other programs give you a couple of buttons to press. In Houdini, you make the button, and you make it exactly how you need it.
It's really easy to make custom velocities, collision meshes, etc. to drive simulations exactly how you want them to behave. For example, in the beginning water was flowing over the hood of the car, which looked weird, so I made a velocity force that followed the hood and pushed the water down. Same with the water in the front: it flowed into the grass, which I didn't want, so I put some velocity fields there too.
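The kind of custom velocity field described here can be sketched in a few lines. This is a hypothetical Python stand-in (in Houdini it would typically be VEX in a wrangle); the box bounds and push strength are made up for illustration:

```python
# Hypothetical sketch: inside a box hovering over the hood, blend in a
# downward push so water sheets off instead of pooling on the car.

def in_box(pos, bmin, bmax):
    """True if a point lies inside an axis-aligned bounding box."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, bmin, bmax))

def hood_push(pos,
              hood_min=(-1.0, 0.8, -0.5),   # assumed region over the hood
              hood_max=(1.0, 1.4, 0.5),
              push=(0.0, -2.0, 0.0)):        # assumed downward force
    """Extra velocity to add to a FLIP particle at this position."""
    return push if in_box(pos, hood_min, hood_max) else (0.0, 0.0, 0.0)
```

The same idea works for the grass: define a region the water should not enter and return a velocity pointing back out of it.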
Also in Houdini, it is easy to manipulate the sim afterward to fix any mistakes that might still be in there. Sims never turn out exactly as you want, so changing them afterward is usually the easier way to go. I’m pretty sure I pushed some stuff into place after the sim was done.
I used Houdini to add animation to my SpeedTree and Megascans assets. Since I don't have the Cinema 4D version of SpeedTree, I was limited to using the Unreal version and then exporting the trees from Unreal as FBX. That doesn't let you use the animations (since those happen inside Unreal itself), so I had to come up with a custom solution. I basically animated all the foliage by displacing the point positions based on noises, with the noises animated by time. I also added some bending controls, driven by noises that randomly move them over time. I did this for all the assets (both Megascans and SpeedTree), and I think it really added to the realism of the shot, because static geometry just looks... static.
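The noise-driven displacement idea can be illustrated with a minimal Python sketch. The noise function here is a cheap sine-based stand-in for Houdini's VEX noise functions, and the amplitude/frequency values are invented:

```python
import math

def fake_noise(x):
    """Cheap smooth 'noise' stand-in (a real setup would use VEX noise)."""
    return 0.5 * math.sin(1.7 * x) + 0.3 * math.sin(3.1 * x + 2.0)

def animate_foliage(points, time, amplitude=0.05, freq=2.0):
    """Offset each point by a time-animated noise, like the setup described:
    the noise is sampled per point and shifted by time, so static geometry
    gets a subtle continuous wobble."""
    moved = []
    for x, y, z in points:
        n = fake_noise(freq * (x + y + z) + time)
        moved.append((x + amplitude * n, y + amplitude * n, z + amplitude * n))
    return moved
```

The key property is that the offset depends on both position (so neighbouring leaves move differently) and time (so the motion is continuous frame to frame).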
These assets were randomly instanced in Houdini by randomizing an instance attribute based on some rule sets.
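Rule-based random instancing of that kind can be sketched as follows. The asset names and the height rule are invented for illustration; the important part is seeding the random pick by point number so the scattering is stable between cooks:

```python
import random

# Hypothetical asset list and rule set: ferns only in low areas.
ASSETS = ["grass_A", "grass_B", "fern_01", "rock_small"]

def pick_instance(ptnum, height):
    """Deterministically pick an asset name per scatter point."""
    candidates = [a for a in ASSETS
                  if not (a.startswith("fern") and height > 0.2)]
    return random.Random(ptnum).choice(candidates)
```

In Houdini this would end up as a string (or index) instance attribute consumed by an instancing node.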
The rain is just a pretty basic particle simulation: the particles fall down, and when they hit the ground they make a little splash. I created trails from those particle sims. The rain was rendered separately from the main scene, but I did refract the actual environment inside the droplets. That integrated way better than just refracting an HDRI.
The little drips on the ground were done in compositing (Fusion). I made a little animation of circles forming, rendered it as an animated texture, put it on my ground surface, and used it to displace some of the passes to get raindrops. The effect is very minimal and mostly gets lost in compression, but it works quite well.
Water simulation is something I’ve always really enjoyed and what actually got me into learning Houdini in the first place. For water, I used Houdini FLIP fluids. The setup itself was fairly straightforward: I mainly had to make sure that water didn’t go where I didn’t want it to go, and that’s what the custom velocity fields were for.
Apart from that, it was just dialing in the fluid attributes so that they looked nice. I did have an issue with meshing which caused weird renders in Redshift: when my mesh had too many small details (ripples), I would get flickering in my water. I'm still not sure exactly what was causing this issue, but I solved it by using a more uniform mesh and reducing the IOR for the water to a level that's actually not realistic. It worked well anyway, because the sheet of water is quite thin and when the car moves through it the light obscures it a bit. But this was definitely something that gave me a big headache.
This gave a lot of flicker:
This gave less flicker for some reason:
Example of flickering:
Whitewater was a pretty standard whitewater sim. I just added particle scale and opacity controls with a ramp so I could fade it. The whitewater worked best with very low opacity in this case, because I didn't want it to feel like ocean water, more just like bubbles inside the water.
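A ramp-driven fade like this boils down to a piecewise-linear lookup over normalized particle age. A minimal Python sketch of a Houdini-style ramp; the key positions and values are illustrative, not the actual ones used:

```python
def ramp(t, keys):
    """Piecewise-linear ramp lookup, like a Houdini ramp parameter.
    keys: sorted list of (position, value) pairs; t clamps to the ends."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (p0, v0), (p1, v1) in zip(keys, keys[1:]):
        if t <= p1:
            f = (t - p0) / (p1 - p0)
            return v0 + f * (v1 - v0)
    return keys[-1][1]

# Hold full opacity for most of the particle's life, then fade out:
fade = [(0.0, 1.0), (0.7, 1.0), (1.0, 0.0)]
```

Mapping normalized age through `fade` (and scaling by a low base opacity) gives the soft, bubbly look described above.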
I also generated a wetmap wherever the water hit the car bumper and color-graded the diffuse and specular AOVs to make it look like the bumper got wet after it hit the water. It’s very subtle, but it’s something you do notice if it’s not there.
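A wetmap of this kind amounts to: raise wetness on surface points near water particles, and let it slowly decay ("dry") everywhere else. A minimal sketch, with assumed radius and decay values:

```python
def update_wetmap(wetness, surface_pts, water_pts, radius=0.1, decay=0.98):
    """One sim step of a wetmap.
    wetness: one float per surface point; returns the updated list."""
    r2 = radius * radius
    out = []
    for w, (sx, sy, sz) in zip(wetness, surface_pts):
        near = any((sx - wx) ** 2 + (sy - wy) ** 2 + (sz - wz) ** 2 <= r2
                   for wx, wy, wz in water_pts)
        out.append(1.0 if near else w * decay)
    return out
```

The resulting per-point attribute can then be exported and used to darken the diffuse and sharpen the specular, which matches the grading of the AOVs described above.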
Mud was a part of the actual fluid sim. I liked how it formed those trails and how the trails filled up again once the car had passed. Normally, this would be something you would just add in the materials, but I really liked how it looked in the sim. Again, you don’t really notice that too much but if you pay attention to the back of the shots you see the mud slowly filling the trails back up. Also some of the mud gets dragged up in the fluid and floats through the water, which looks pretty cool. It does have some displacement errors in there but I don’t think anybody has noticed that!
GIF of the mud trails in sim:
The mud was just a setup with a higher viscosity than the water, plus a much lower point density (because more wasn't needed). I initially also had UVs that distorted with the sim, but I couldn't get the mud texture as sharp as I wanted (it was slightly deformed from the very start and looked a bit weird), so I went with a simple UV projection after the fact. Since the mud doesn't move too much in the front of the frame, that works quite well.
Materials: Combining Megascans, Redshift, and Houdini
Megascans is great, and I hope there will be some type of Python integration in the future where you can automatically load models and link up the textures to your shaders, but for now I use my own Redshift Megascans loader. There I can point the shader to my Megascans repository and it links up all of the textures; I just go in manually to change materials if needed.
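A loader like this mostly amounts to matching texture filename suffixes to shader inputs. A hypothetical sketch (the slot names and the naming convention are assumptions, not his actual tool):

```python
import re

# Assumed mapping from Megascans filename suffixes to shader inputs;
# the real slot names depend on your Redshift material setup.
SLOTS = {"Albedo": "diffuse_color", "Roughness": "refl_roughness",
         "Normal": "bump_input", "Displacement": "displacement"}

def link_textures(filenames):
    """Return {shader_input: texture_path} by matching filename suffixes."""
    links = {}
    for f in filenames:
        m = re.search(r"_([A-Za-z]+)\.(?:jpg|png|exr)$", f)
        if m and m.group(1) in SLOTS:
            links[SLOTS[m.group(1)]] = f
    return links
```

Pointing such a function at one asset folder at a time is enough to wire a full surface shader, with manual overrides only for special cases.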
I do have to link the models manually but I made a Houdini gallery for that. Those models also have LOD controls and the animation controls I talked about earlier. Presets in the gallery also hold other render settings like displacement and more.
I blended materials together in the shader by painting point colors and using them to blend shaders inside Redshift with RScolorsplitter. I would love to have some kind of painting tool in the future where you can blend shaders (similar to how it works in Unreal Engine, for example).
Lighting, Rendering & Challenges
Lighting itself was pretty standard: I used a free HDRI plus the car's headlights. Since I just wanted to imitate a real-life scenario and wasn't going for anything particularly cinematic, this worked fine for me.
I was very surprised by how well Redshift handled this quite heavy scene. There are about 1.5 GB of assets per frame (the water + whitewater sim alone is over 1 GB per frame, plus the car itself is quite heavy). I learned a lot about using Redshift in heavy cases.
For example, I was first trying to shade my car by unpacking the alembics and assigning shaders afterward. This would cause Redshift to crash instantly once I rendered the entire car. But if I just had the shader assignments based on the same alembic paths while keeping the alembics packed it would work perfectly fine. I guess having Redshift do the unpacking process itself is more efficient.
I actually made all the materials for the car in Redshift as well: I just blended together a lot of triplanar mapping, raindrop bump maps, dirt added with curvature, etc.
Here is the shader graph for the red car paint:
Usually when I render with Redshift, I render one frame per GPU via Deadline. That didn't work for the final output of this sequence because it was too heavy, so I had to put all three of my GPUs (2× 1080 Ti + 1× 1080) to work on a single frame. It took between 10 and 14 minutes per frame, but a portion of that was just loading the assets.
For compositing, I used Blackmagic Fusion. It was my first time using Cryptomattes, and I must say I really liked them! They're an easy way of getting mattes from your objects. I did run into an issue where Fusion would just randomly make different selections in my Cryptomattes at render time, so I had to do pre-renders for each of my mattes.