Lincoln Hughes showed some amazing stuff he’s been doing with Megascans, optimizing scanned content for in-game use. You can also download the free displacement blend materials on Gumroad.
This whole thing honestly started out as me forcing myself to do something productive. “Get into Unreal and do something cool”, my brain said. At that point, all I really wanted to do was play another round of Fortnite, but I eventually gave in and subscribed to Megascans just to see what all the fuss was about. Boy, was I impressed.
What started out as just signing up to the site turned into much more. Once I’d downloaded a few meshes and materials and seen just how amazing their quality was, it was time to take the next step and see how much mileage I could get out of them. I took my Parallax Occlusion master materials and tested them out in Unreal, and bit by bit the scene slowly started to come together.
The creation of the scene wasn’t really focused on content creation, because I wanted to see how fast I could actually build one using (mostly) assets downloaded directly from Megascans (the trees are from the Unreal Kite demo). They’re all made in such a way that they can be easily scaled down in polycount and texture resolution without too much loss in quality.
Another goal was to see how far displacement tessellation could be pushed in Unreal 4. So I took all of my Parallax Occlusion blend materials and converted them into Displacement Blends, and the results speak for themselves. With them, it’s easy to just plug in your textures and go, blending between two texture sets or adding puddles. You can change the contrast of the blend to create a muddy shoreline, alter the color and height of the puddle, or use world-space UVs.
What had started out as a lazy afternoon had slowly evolved into a full-on project that anybody can download and use to test out their textures, complete with lighting and asset examples.
I chose mostly nature assets because the majority of the assets on the site fit that theme. They also require the least amount of tweaking to get looking good. I don’t need to clean up UVs for LODs like I would with an architectural asset, and I can procedurally create LODs for each mesh with the click of a button in Unreal 4.
The assets are also very high quality. For example, say you buy a scanned rock on the site. It will most likely come with 4-5 LODs, the high-poly asset, and a 4-8K texture set (roughness, albedo, normal, height, etc.). The poly densities and texture resolutions are easily adaptable from film to game quality with a few custom settings in your static meshes and texture objects. More on this in the next question.
In terms of cost, a single nature texture set is anywhere from one to four credits depending on the complexity of the material, and every month you get 30 credits to spend, so for me it was a no-brainer. If you want to quickly build out your scenes and minimize the number of interruptions from content creation, subscribing makes a lot of sense.
Polycounts aren’t nearly as much of an issue as they once were, and the tools that we have now to optimize high-poly assets are better than they’ve ever been. Like I said before, Megascans sometimes gives you 5 LODs, and if those are still too high-poly for you, it’s incredibly easy to procedurally generate new LODs directly inside of Unreal:
When importing a mesh from Megascans, if the poly density of the LOD 0 object is too high, it might be necessary to take their LOD 5, import that as your LOD 0, and then use Unreal’s procedural poly-reduction tool to create all-new LODs that fit the polygon-density parameters of your specific game or scene.
To do this, double-click on the mesh you’d like to create LODs for, scroll down to the tab “LOD Settings,” and choose one of the presets within the “LOD Group” drop-down menu. After choosing one of those presets, Unreal will procedurally create new LODs for you based on the number that you choose in the “Number of LODs” setting. It will automatically down-res the poly density of each sequential LOD, as well as create appropriate LOD distance-swap parameters for each of them, based upon the size of the mesh and the settings that you choose. It can be extremely handy.
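As a rough illustration of what that chained reduction does to poly counts, here’s a tiny Python sketch. The per-LOD halving factor is my assumption for illustration only; Unreal’s actual reduction percentages depend on the LOD Group preset you pick.

```python
# Hypothetical sketch: target triangle counts for a chain of procedural LODs.
# The 50% reduction per step is an illustrative assumption, not Unreal's number.

def lod_triangle_counts(base_tris, num_lods, reduction_per_lod=0.5):
    """Return one target triangle count per LOD level, never dropping below 1."""
    counts = []
    tris = float(base_tris)
    for _ in range(num_lods):
        counts.append(max(int(tris), 1))
        tris *= reduction_per_lod
    return counts

# A 100k-triangle scan reduced across 4 LODs:
print(lod_triangle_counts(100_000, 4))  # [100000, 50000, 25000, 12500]
```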
You can optimize your textures by going into your texture settings and changing the LOD Bias. With this, you can procedurally down-size your images to something more appropriate directly in the editor, without the headache of using Photoshop. In the image below, see how the texture was imported at 8K resolution, but when we change the LOD Bias to a value of 2, it gets down-sized to 2K? Much more game-friendly!
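The math behind LOD Bias is simple: each step of bias skips one mip level, halving the resolution. A quick Python sketch of that halving (the function name is mine, not an Unreal API):

```python
# Each step of LOD Bias drops the top mip, halving the loaded resolution.
# 8K (8192 px) with a bias of 2 loads at 2048 px, i.e. 2K.

def effective_resolution(base_px, lod_bias):
    """Texture resolution actually loaded after applying a mip LOD bias."""
    return max(base_px >> lod_bias, 1)

print(effective_resolution(8192, 2))  # 2048
```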
That was actually a big part of the reason why I started making this project in the first place: I wanted the average person to be able to just throw their assets into Unreal 4 without worrying about all of the headaches that are bound to show up once you import them. The lighting is already set up, there are examples of settings for each mesh and texture, and each material caters to the needs of anybody wanting to just GO and get on with creating their scene.
That’s one thing about working on personal stuff that you just can’t beat while working at a game studio: the ability to just do art. There are so many little technical hurdles that you constantly have to iron out: the lighting is broken, the grass isn’t moving, whatever it may be. In a game studio, there are departments entirely dedicated to making it so that all you have to do is go. No hoops, no stress. Just art. Sometimes…
That was the purpose of this project: to remove the stress of dealing with these things. Anything that would make it easier for somebody with limited knowledge and a passion for creating something cool to drop their assets in as seamlessly as possible was key.
For instance, making grass light properly is actually a really complicated task with about 10 different solutions depending on the unique circumstances of your scene. Your lighting settings have to work for it, as well as your mesh and material settings, and without the technical knowledge that such a task requires, you’re kind of left in the dust.
Without getting into too many specifics (because that could literally be a full tutorial on its own), I think that being surrounded by people who are willing to help you solve these problems is absolutely essential, so be sure to join the Unreal 4 Developers Community Facebook group, or any other 3D group that’s relevant to what you’re working on.
Creative freedom behind scanned assets
Using the foliage painting tool is pretty key for this. It will randomize the scale, rotation, and translation to whatever you want, so I’d definitely take some time to play around with that.
It’s quite easy to use: Just click on the Foliage Brush tab (the one with the leaf), drag and drop the static meshes that you’d like to experiment with into the bottom area, and customize the settings for each depending on what you want:
Select the meshes that you’d like to paint by checking or unchecking them directly in the Foliage Brush tool, like this:
After you’ve set up all of the meshes that you’d like to use in your scene, you can paint them onto any static mesh or landscape in the scene by holding CTRL to add, and SHIFT to remove:
A more complex, content-driven strategy for adding more variation to the scene could be to create all-new clusters of assets from the original Megascans objects. If you have some knowledge of Substance Painter and ZBrush, you can import a group of high-poly photogrammetry meshes into ZBrush (say 5 different rocks), create a new arrangement for them, export each cluster of props as one new mesh, and bring it into Substance Painter to retexture it. I didn’t actually do this for the project, but I’ve been meaning to try it and see how simple it would be. I think this process is quite well documented online, and finding a tutorial on Google should be simple.
It’s a set of displacement blend materials designed so that the average artist can just input their textures, tweak a couple of sliders in the material instance, and have something that instantly looks good enough to show off.
Here are a few of the features of the materials included in the scene:
- Vert-blend between height maps and puddles
- Two-layer displacement texture blending
- Easy puddle customization
- Foliage master shader for grass and leaves
- World-space UVs
- Distance-based tessellation
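That last feature, distance-based tessellation, boils down to fading the tessellation multiplier out as the camera gets further away, so you only pay for displacement detail up close. Here’s a loose Python sketch of that falloff; the parameter names are mine, not the material’s, and the real version lives in the shader network.

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def tessellation_multiplier(distance, near, far, max_tess, min_tess=1.0):
    """Fade the tessellation multiplier from max_tess at `near` to min_tess at `far`.

    Distances are in whatever world units the scene uses (illustrative only).
    """
    t = clamp((distance - near) / (far - near), 0.0, 1.0)
    return max_tess + (min_tess - max_tess) * t

print(tessellation_multiplier(100.0, 100.0, 1000.0, 8.0))   # 8.0 (full detail up close)
print(tessellation_multiplier(1000.0, 100.0, 1000.0, 8.0))  # 1.0 (flat at a distance)
```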
For the actual blend itself, I just used a Vertex-Color node paired with a Height-Lerp function inside of the shader network, all controlled by a contrast parameter that can be tweaked per instance by the user. The contrast parameter is then plugged into every height lerp function on every rendering pass (Albedo, Height, Normal, Roughness, etc.) and can be used to easily tweak how hard the blend will be between the puddle and the texture set. It sounds really technical and overly complicated, but once you’re familiar with this stuff, this is actually quite basic compared to some of the crazy stuff that a technical artist would do. If you want more specifics regarding how to actually do this stuff, download the files and crack open the master materials. They’re all there for anybody that wants to cross-examine them.
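To give a feel for what a height lerp actually does, here’s a stripped-down Python sketch of the idea: the height map biases where the transition happens, and the contrast parameter sharpens or softens the edge. This is an illustration of the concept, not Unreal’s exact HeightLerp function, and all names are mine.

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def height_lerp(a, b, height, alpha, contrast):
    """Blend two layer values (a, b) using a height map and a blend mask.

    High areas of layer A resist the incoming layer B (think water pooling
    in the cracks first), and `contrast` controls how hard the edge is.
    """
    t = clamp((alpha - height) * contrast + 0.5, 0.0, 1.0)
    return a + (b - a) * t

# With a hard contrast, a tall point keeps layer A while a low point
# floods with layer B:
print(height_lerp(10.0, 20.0, 1.0, 0.5, 4.0))  # 10.0
print(height_lerp(10.0, 20.0, 0.0, 0.5, 4.0))  # 20.0
```

In the actual materials the same mask drives every pass (Albedo, Height, Normal, Roughness) so the whole texture set transitions together.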