Alla Chernova shared a tutorial on techniques and tips she used to create a full CG open environment in Houdini.
In this tutorial, I want to talk about some techniques and tips that I used while creating a full CG open environment in Houdini.
I used a lot of procedural modeling, scattering, and texturing. Houdini is an amazing program that is capable of handling a lot of geometry, and it has an array of useful tools that are perfect for creating big environments. In addition, I wanted to experiment with the Redshift renderer, because the speed and capacity of GPU rendering are very impressive.
Working as a CG Generalist at Ingenuity Studios, I spend about 80% of my time working in Houdini. I assemble assets for projects, work on shading and lighting, deal with animation and FX caches, and render out images. I feel it is crucial nowadays for CG Generalists to explore Houdini and add it to our skill set, because a lot of companies find Houdini beneficial and are moving a big part of their pipeline there.
I am a CG Artist, born in Russia but currently based in LA, with a strong background in Traditional Arts. Prior to doing 3D, I was working as an Art Restorer, and I practiced many different analog techniques as an Artist, including oil painting, watercolor, drawing, mosaic, and fresco.
At the Gnomon School of Visual Effects, Games & Animation, I got my second major as a CG Generalist. As part of my journey at Gnomon, I won the Grand Prize in the Best of Term contest with my animated short Lost Fish.
Since I started working in the industry, I have participated in some of the most well-known productions in the nation, such as Black-ish, Counterpart, Super Store, The Walking Dead, and Arrested Development; programs that are not only hailed by the general public, but critically acclaimed as some of the most outstanding in the industry, as demonstrated by the Primetime Emmy, Golden Globe, Writers Guild of America and other notable awards these shows have received. I have also worked on feature films by many renowned and lauded directors and producers, including The Best of Enemies, directed and executive produced by Robin Bissell (The Hunger Games, Seabiscuit) and starring Academy Award nominee Taraji P. Henson and Academy Award winner Sam Rockwell, and Happy Death Day 2U, directed by Christopher Landon. In addition, I have lent my abilities to some of the largest video game projects ever released, the most notable of which is Destiny 2: Forsaken and its cinematic trailer, Last Stand.
My work always starts with gathering a reference library. For this project, I wanted a specific composition: a wide shot presenting an island environment seen from the ocean. The ideal island landscape for me was Bora Bora:
I created multiple renders with different compositions to find the camera angle I enjoyed the most:
Image E had the most interesting composition to me. From this blockout, I started to develop the final scene.
The next step in my workflow on the environment is to work on lighting. Lighting not only creates highlights and shadows in an image, but it also sets the mood, supports compositional choices and helps to tell the story.
I wanted a daylight scene. However, the sun should never be directly overhead, because shadows are usually the most appealing part of an image. I feel that a good balance between light and shadow is at least 40/60. My favorite lighting setup is when the sun is shining from behind, from screen left (SL) or screen right (SR).
I chose version A as my primary lighting scenario for this environment. I liked how it created subtle shadows on the water, emphasized highlights on the mountains, and centered the composition on the foreground hut SR.
Building Terrains with HeightField Tools
To create the terrain system for this environment, I used the HeightField tools in Houdini. My first step was to create an overall silhouette of the desired terrain by applying different types and sizes of noise to break up the surface of a HeightField plane:
I wanted a mountain peak located on a more or less flat surface. I created a painted mask for the peak area and blended the peak and the flat areas using the mask:
The next step was to Compute Erode Range for the HeightField terrain, which would later allow me to create different sorts of masks for texturing and vegetation scattering down the line. When the terrain finally looked the way I wanted, I converted the HeightField terrain to a polygon mesh and cached it out. Caching my network helped make the scene lighter and easier to use.
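The mask-based blend described above can be sketched in plain code. This is an illustrative simplification, not Houdini's actual implementation: heights are stored in tiny 2D lists instead of HeightField volumes, and the grid values are invented for the example.

```python
# Hedged sketch: blending a "peak" heightfield into a flat base using a
# painted mask, similar in spirit to a masked HeightField Blend.
# mask = 1 keeps the peak height, mask = 0 keeps the flat base.

def blend_heights(flat, peak, mask):
    """Linearly blend two height grids per cell using a 0..1 mask."""
    return [
        [f * (1.0 - m) + p * m for f, p, m in zip(frow, prow, mrow)]
        for frow, prow, mrow in zip(flat, peak, mask)
    ]

# Tiny 3x3 example grids (values are illustrative)
flat = [[0.0] * 3 for _ in range(3)]
peak = [[0.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 0.0]]
mask = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]  # painted peak area

blended = blend_heights(flat, peak, mask)
print(blended[1][1])  # center cell keeps the full peak height: 10.0
```

Because the blend is per cell, painting soft mask edges in the 0..1 range gives a smooth transition between the peak and the flat ground.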
For this project, I created two different HeightField terrains and used them to create the environment that I wanted by rotating and scaling them.
For this specific environment, I had to create a whole mountain chain. In order to get the silhouette I wanted for the back mountains, I needed multiple instances of the HeightField terrains I had created. Even though these were instances of the geometry, because of the scale and the number of polygons, it was still very heavy for the scene. As a solution, I decided to bring all of the background mountain chain geometry into Maya and merge it together, deleting all intersecting and unnecessary polygons. I ended up with one geometry for the whole background mountain chain and one geometry for the areas I wanted to texture as sand. Procedural terrain generation is an amazing tool, but it is hard to do only procedural modeling when there is something very specific I want to achieve.
The solution I came up with, merging it all into one geometry, is a handy way of dealing with a big mountain chain. It works when I have a very specific silhouette in mind that I can't get from one generated terrain, when I want the terrain to also cover the bottom of the ocean, or when I need very specific UVs.
For the foreground terrains, I kept the instances because I needed to use the masks that I baked out from the original HeightField terrains.
For vegetation, I used SpeedTree assets. I started with one type of tree – a palm. Trees needed to go on top of the terrain but shouldn't cover 100% of it, so I needed a mask indicating where the trees should and should not grow. I created multiple variations of masks using the HeightField MaskByFeature node and baked them out from the HeightField Output node:
I could use those masks right away, or bring them into Photoshop and create new masks by blending and layering maps together. After all the masks were ready, I brought the ones I needed into Houdini through the AttributeFromMask node and created a Null node with clear naming, so I could tell which mask to source in the scattering process. Here are the masks for the SR terrain as an example:
For the vegetation system, I created a GEO node for each type of plant. For grass, I made pre-made patches in a separate scene, so I wouldn't have a million scatter points in the main scene. Each plant received its own instance node where I set up the scattering network. Once I had set it up for the first tree, I copied the same network over to all the other plants' instances, changing the masks I was sourcing, the number of points, and the scale limits for each.
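The idea of driving a scatter with a mask can be sketched as simple rejection sampling. This is a hedged, standalone illustration, not Houdini's Scatter node: the terrain is a unit square, and `tree_mask` is a hypothetical mask function standing in for a baked HeightField mask.

```python
import random

# Hedged sketch: mask-driven scattering. A candidate point is kept with
# probability equal to the mask value at its position, so point density
# follows the mask, as when a mask attribute drives a Scatter node.

def scatter(mask_fn, n_candidates, seed=0):
    """Rejection-sample scatter points on the unit square using mask_fn in [0, 1]."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_candidates):
        x, y = rng.random(), rng.random()
        if rng.random() < mask_fn(x, y):  # keep with probability = mask value
            points.append((x, y))
    return points

# Illustrative mask: trees may only grow on the left half of the terrain
tree_mask = lambda x, y: 1.0 if x < 0.5 else 0.0
pts = scatter(tree_mask, 1000)
print(len(pts), "points, all on the left half")
```

Reusing the same network per plant type then amounts to calling the same scatter with a different mask, point count, and scale range.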
There are multiple ways of approaching texturing for the environment. I could texture it in Houdini, bringing in some rock and sand textures and blend them by using masks. I could export geometry to Substance Painter and texture it in there. I could pre-make tileable textures in Substance Painter and assign them in Houdini, or I could do texture projection from the camera in Houdini.
I started with the method of creating complex tileable texture maps in Substance Painter that included more than one material, and assigning them in Houdini. It is a perfect method for procedural modeling and texturing. However, in my case, I wanted certain textures in very specific places, such as roads, dry sand on the beach, wet sand near the water, underwater plants, grass around huts, and so on. The solution was a camera projection texture that I could hand-paint in Photoshop.
All the other objects in the scene were first textured and look-deved in a separate file, and then the shaders were copied to the main file and assigned to instances.
For the ocean simulation, I used Houdini's Small Ocean setup. It comes as a network in which I only adjusted the Preview Grid and OceanSpectrum parameters. In the Preview Grid node, I could control the size and density of the grid; in the OceanSpectrum node, I could change all sorts of parameters for the waves: tilting, height, speed, direction, and so on. There is no specific recipe for creating the perfect ocean; I just played with the parameters in the red boxes below until I was happy with the result:
The environment is big, so the ocean surface is very big too. Without any additional work, it looked too tiled and uniform. In the references I gathered, I could see that the water pattern changes as it gets closer to the bay. To break up the repetition, I masked the areas closer to the bay and lowered the intensity of the waves there:
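Damping the waves near the bay can be sketched as a distance-based falloff on wave amplitude. This is a hedged illustration of the masking idea, not the OceanSpectrum implementation; `bay`, the positions, and the falloff radius are all invented example values.

```python
import math

# Hedged sketch: scale wave amplitude down smoothly as a point approaches
# the bay, similar to masking the spectrum intensity near the shore.

def wave_amplitude(base_amp, pos, bay, falloff_radius):
    """Return the damped amplitude at pos; full amplitude beyond falloff_radius."""
    d = math.dist(pos, bay)
    t = min(d / falloff_radius, 1.0)       # 0 at the bay, 1 in open water
    return base_amp * t * t * (3 - 2 * t)  # smoothstep for a soft transition

open_water = wave_amplitude(1.0, (100.0, 0.0), (0.0, 0.0), 50.0)
near_bay = wave_amplitude(1.0, (5.0, 0.0), (0.0, 0.0), 50.0)
print(open_water, near_bay)  # full amplitude offshore, strongly damped near the bay
```

The smoothstep keeps the transition invisible; a hard cutoff would read as a seam in the water surface.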
At the end, I cached the network and converted the ocean cache to a Redshift proxy, so Houdini doesn't calculate the whole network every time I change a frame or toggle the visibility of the ocean.
Small Detail in the Environment for Scale Reference
For such a huge environment, it is very important to keep everything proportional. Before scattering in the main scene, all of the geometry was scaled relative to each other in a separate scene:
I brought in a human model for reference and scaled everything in proportion to it.
However, even when I had all of the objects scaled correctly in the main scene, it still did not feel proportional and as a result, it did not feel real. The solution for this issue was to add lots of even smaller details to the huts and island, such as birds, boats, carpets, flags, chairs, paper, palm branches on the hut roof, glasses, and so on. Those details added the final sense of scale, a bit of imperfection, and a feeling of a human presence:
Most of those small props had some sort of animation. The carpets and flags had a noise-offset animation that made them look like they were moving in the wind. The boats were bobbing slightly in the water as well. This animation isn't very visible while watching the sequence, but it adds life to it; without it, the sequence would look stiff.
Rendering with Redshift
For this project, I used Redshift. In the beginning, I had all of the objects as geometry or instances, except for the ocean. The ocean was already cached and converted to a proxy, because it was too heavy to work with in the scene. When I started rendering the sequence, I faced a problem: the renders stopped after 30-40 frames. It seemed that the computer was exceeding its memory limit and wasn't able to proceed. The solution was to convert all of the geometry in the scene to Redshift proxies, except for the mountains and the vegetation instances:
Although all of the props are small models, it is still a lot of geometry scattered around. After everything was converted to proxies, the renders went smoothly to completion, with a very reasonable amount of time per frame. At HD resolution, the final render time averaged 30 minutes per frame.
For my renders to be usable in compositing, I always output 32-bit EXR images. To have more control over the renders in Nuke, I render all of the AOVs available in Redshift.
- Procedural tools are exceptional; they speed up my work tremendously and make it easier. However, in almost every project I face an issue that needs a creative solution, for instance doing the texture projection from the camera or merging the geometry for the background terrains.
- Being organized in my work is always very important, especially in a studio environment, where I am not the only person working on a file. Even on my personal projects, I try my best to have some system in place, especially when working on a huge environment. I use a color code for certain nodes in Houdini; it makes it easier to find my way around the file: transform and layout nodes – green, renderable nodes – purple, lights – yellow, camera – white, material network – blue, null as an OUT node – black:
- I considered every step along the way from one perspective: what is the fastest way I can achieve the best quality? It is important to deliver high-quality work, but it is also very important to hit deadlines, meaning the work needs to be fast as well.
Thank you for reading. Hope you found it useful and enjoyed it!
Alla Chernova, 3D Generalist