Creating a Victorian Library in Maya, Substance 3D Painter & Unreal

Daniel Faber talked about the Forlorn project workflow, shared the story behind it, and showed how the cloth simulation was made in Blender.

Introduction

My name is Daniel Faber, and I am a self-taught 3D Environment Art student from Sydney, Australia. After completing a degree in Audio Engineering at the University of Technology, Sydney, I realized that music and sound design were not a career path I was entirely passionate about. Thanks to the COVID-19 outbreak here in Australia, I had a lot of free time to broaden my creative horizons, which ultimately led to me falling in love with 3D art and, more specifically, real-time environment creation for games. I love the challenge of having to produce great art while also setting up a scene and organizing assets in smart and efficient ways to maximize performance.

The Forlorn Project

Before starting Forlorn, I had a massive interest in the Victorian era. There is a certain elegance in the aesthetics of the interiors from that era, and I knew that it would be a perfect theme to try and tackle in 3D. Upon searching through ArtStation for inspiration, I found a matte painting by Andrii Shafetov called Forgotten Library, which embodies the type of Victorian-Gothic gentleman’s study I was looking for. 

This piece provided ample opportunity for my own adaptation and a solid framework for visual storytelling. If you are relatively new to environment art, I would always recommend working from a concept that inspires you, simply because replicating great art can help you understand why it is successful.

Modularity

After choosing the concept, I got to work planning how I would break the scene into its various parts, deciding where I could repeat assets and work modularly to save time and memory. I then compiled a list of unique assets and a list of modular assets that would need to be modeled in Maya and textured in Substance 3D Painter.

Because the bookshelves in the scene were quite large, using tiling trim textures to texture these elements was the most economical way to achieve adequate texel density while keeping memory usage to a minimum. One 2k map covering assets whose surfaces take up around 30 square meters of the scene is much more desirable than building those assets out of smaller meshes with unique texture sets in the 0 to 1 UV space. In this stage, I also planned out which elements of the bookshelves needed to be reflected in the trim texture later.
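
As a rough back-of-envelope sketch of that trade-off (the texel density target below is an arbitrary assumption for illustration, not a figure from the project):

```python
# Back-of-envelope comparison; the 512 px/m target is an arbitrary
# illustration value, not a number from this project.
target_density = 512                 # pixels per metre of surface
shelf_area = 30.0                    # square metres of bookshelf surface

# Unique 0-to-1 unwraps: every square metre needs its own texels.
unique_pixels = shelf_area * target_density ** 2     # ~7.9 million pixels

# A tiling 2k trim sheet covers the same area through repetition.
trim_pixels = 2048 * 2048                            # ~4.2 million pixels

print(f"unique vs trim: {unique_pixels / trim_pixels:.1f}x the texel budget")
```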

I got to work creating the unique assets that have texture sets in the 0 to 1 UV space using a standard high-to-low-poly workflow. Although the vast majority of the props were made by me, I did end up using a globe, a chair, and a pen from Aiko Shinohara's Library, as well as a Megascans cardboard box. Don't be ashamed to use other people's props in your scene!

Cloth Simulation

I did all my modeling in Maya, except for the carpet and curtains, which were done using Blender's cloth simulation. Although it is perhaps not as robust as Maya's nCloth, I find it much more intuitive, and it worked perfectly for what I needed. To create the curtains, I created a plane, subdivided it generously, and hooked the top row of vertices to a hook object. I then added these hooked vertices to a vertex group and assigned this group as the pin group in the cloth settings, which allows you to scale down the hook object while the simulation is running to draw the curtain open. I then animated a collision plane underneath to create a more natural scrunched-up look where the curtains meet the floor. I also used a torus with collisions that could be scaled down along the curtain's X and Y axes to pinch it closed.
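
For anyone wanting to script that pinning setup, here is a minimal sketch using Blender's Python API; the object names, subdivision count, and vertex threshold are placeholders, not values from the project:

```python
import bpy

# Create a generously subdivided plane to act as the curtain.
bpy.ops.mesh.primitive_plane_add(size=2)
curtain = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=60)
bpy.ops.object.mode_set(mode='OBJECT')

# Put the top row of vertices into a vertex group.
pin_group = curtain.vertex_groups.new(name="Pinned")
top_row = [v.index for v in curtain.data.vertices if v.co.y > 0.99]
pin_group.add(top_row, 1.0, 'REPLACE')

# Hook the pinned vertices to an empty, which can then be scaled
# while the simulation runs to draw the curtain open.
bpy.ops.object.empty_add(type='PLAIN_AXES', location=(0, 1, 0))
hook_target = bpy.context.active_object
hook = curtain.modifiers.new(name="Hook", type='HOOK')
hook.object = hook_target
hook.vertex_group = "Pinned"

# Add the cloth simulation and assign the same group as the pin group.
cloth = curtain.modifiers.new(name="Cloth", type='CLOTH')
cloth.settings.vertex_group_mass = "Pinned"
```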

I wanted the carpet to look as though someone had been frantically running around the room on top of it, moving furniture out of place. I could achieve this look by crumpling parts of it up and flipping corners over. I tried using force field objects as well as manually moving the mesh around while the simulation was running; however, these techniques ultimately failed. I achieved the final result by importing the low-poly table mesh, adding collision to it, and moving the floor plane the carpet sits on upwards along the Z-axis. This forces the carpet to collide with the tables and fall organically back onto the plane below. I think the resulting scrunched-up pattern really adds to the composition of the final piece, as it leads the eye around the room and adds detail to a place that wouldn't have much visual interest otherwise. Be patient with your simulations; they often take a lot of fiddling and tweaking to get the result you are looking for.
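
A hedged bpy sketch of that collision trick (the object names and keyframe values are made up for illustration):

```python
import bpy

# Assumes the table and floor meshes are already in the scene;
# the names and frame numbers here are placeholders.
table = bpy.data.objects["Table_LP"]
floor = bpy.data.objects["FloorPlane"]

# Let the simulated carpet collide with the table.
table.modifiers.new(name="Collision", type='COLLISION')

# Keyframe the floor plane rising along Z so the carpet is pushed
# into the table, then drop it back so the carpet falls and settles.
floor.location.z = 0.0
floor.keyframe_insert(data_path="location", index=2, frame=1)
floor.location.z = 0.4
floor.keyframe_insert(data_path="location", index=2, frame=40)
floor.location.z = 0.0
floor.keyframe_insert(data_path="location", index=2, frame=80)
```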

I then added a Solidify modifier to both the curtains and the carpet to provide some thickness, which would allow me to create lightmaps that contain UVs for both sides of the mesh. If I were to keep these meshes as one-sided planes, I wouldn't be able to assign a different material to the underside of the carpet or represent the back-facing side in the lightmap that would later hold the light bake. Although this doubles the polycount, it was a necessary cost in this case. Finally, I used a Decimate modifier on both meshes to bring the polycount down drastically while keeping the overall shape intact. It does this by removing vertex density in areas of the mesh that don't need it while preserving it in areas that do.
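
In bpy terms, that cleanup pass is just two modifiers; the thickness and ratio values below are guesses for illustration, not my final settings:

```python
import bpy

carpet = bpy.data.objects["Carpet"]  # placeholder name

# Give the one-sided plane some thickness so both sides can
# receive lightmap UVs and separate materials.
solidify = carpet.modifiers.new(name="Solidify", type='SOLIDIFY')
solidify.thickness = 0.01

# Collapse-decimate to cut the polycount while keeping the shape;
# dense, flat areas lose vertices first.
decimate = carpet.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.25
```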

Don't forget to UV your meshes before simulation! It will be a colossal amount of work to achieve adequate UVs if you wait until after the simulation.

Books

I knew that, in order to have enough variation in the book textures, I would need to use a texture atlas built from a range of different 19th-century book scans. This meant creating a set of “high” poly books with unique unwraps assigned to the scans I found online (I use quotation marks because there wasn't any difference in polycount). I then duplicated these “high” poly books to create the “low” poly mesh and combined their UVs into a single set so that they could be used as the “low” poly mesh in the baking process. I then imported the “high” and “low” poly meshes into Blender and baked the diffuse color from the “high” poly mesh onto the “low” poly mesh with the shared UV set. This gave me a great 2k texture atlas for the books and a starting point to create my Normal and Roughness Maps in Photoshop.
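
The bake itself can be reproduced in Blender roughly like this, assuming Cycles, the “high” books selected, the “low” atlas mesh active, and an image texture node selected in its material as the bake target:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'        # baking requires Cycles

bake = scene.render.bake
bake.use_selected_to_active = True    # "high" books bake onto the active "low" mesh

# Bake pure base color only, with no lighting contribution.
bake.use_pass_direct = False
bake.use_pass_indirect = False

bpy.ops.object.bake(type='DIFFUSE')
```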

Once in Photoshop with my baked Diffuse Map, I needed to make some adjustments to ensure that the image was as flat as possible. These scans often had a shadow on one side of the spine due to the lighting conditions when they were taken. We don't want these shadows baked into our Base Color Maps, and we should do everything we can to minimize them. For this, I used the Camera Raw filter to lift the shadows and squash the highlights. It is also important to make sure that the overall exposure, or RGB value, of each book in the atlas stays in the same range: nothing too bright and nothing too dark. Mismatched values will ruin the illusion of variation in UE4, as some books will be noticeably brighter than others.
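
I did this in Camera Raw, but the same kind of flattening can be scripted; here is a rough Pillow equivalent (the gamma and clamp values are arbitrary illustration numbers, not my actual settings):

```python
from PIL import Image

# A rough scripted stand-in for the Camera Raw adjustment: lift the
# shadows and squash the highlights with a per-channel tone curve.
def tone_curve(v):
    x = v / 255.0
    x = x ** 0.8            # gamma < 1 lifts the shadows
    x = min(x, 0.96)        # clip the very brightest highlights
    return int(x * 255)

lut = [tone_curve(v) for v in range(256)] * 3   # same curve for R, G, B
scan = Image.open("book_scan.png").convert("RGB")
scan.point(lut).save("book_scan_flat.png")
```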

To create the Normal Map, I used the Generate Normal Map filter in Photoshop. I wouldn't recommend this method for most other texturing pipelines, but in the case of these books, it was enough to create some nice bump detail to break up the surface. The Roughness Map was created by taking the diffuse color and bringing the saturation right down to convert it to greyscale; I then used a Levels adjustment to make sure nothing was too rough (white) or too shiny (black). This was a very quick way of making Roughness Maps and didn't necessarily reflect realistic areas of shininess, but it helped to break up the specular reflections on the surface of the books.
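
The desaturate-and-levels trick is simple enough to script as well; a Pillow sketch, with the clamp range picked arbitrarily for illustration:

```python
from PIL import Image

# Convert the diffuse to greyscale, then remap levels so nothing is
# fully rough (255) or fully glossy (0). The 70/210 range is arbitrary.
diffuse = Image.open("book_atlas_diffuse.png").convert("L")
lo, hi = 70, 210
roughness = diffuse.point(lambda v: lo + v * (hi - lo) // 255)
roughness.save("book_atlas_roughness.png")
```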

This technique was used on all the book meshes as well as the letters and pieces of paper. This allowed me to duplicate 16 different variations of the same mesh around the scene while needing minimal texture sets. 

Trim Textures

To create the trim textures, I started with a 4x4 plane in Maya and used loop cuts to divide it into intervals matching the size of the mesh parts the texture would be applied to. I then modeled the details of the Victorian moulding, as well as the bevels needed to help break up the sharp corners of the mesh, and it was ready to import into Substance 3D Painter to be baked onto a square 4x4 plane.

I used a combination of Wood Smart Materials from Dmitry Dovgal, removing scratches and wear where they were unlikely to appear, to reduce the procedural feel that smart materials can often give you. I would highly recommend investing in some high-quality smart materials, as they give you a great starting point to work from in your texturing pipeline. I also added some extra grunge layers affecting only the Roughness channel to break up the specular reflections, and the trim textures were complete. This step was done on basically every asset with a lower Roughness value, as a perfectly clean surface will almost always catch the viewer's eye and break the realism of your asset.

Materials

My material setups for this project were nothing complex, with the possible exception of the velvet shader I used for the curtains.

This velvet shader was created with the help of Ben Cloward’s Cloth Shading tutorial. The shader essentially uses the camera vector to brighten or dim those parts of the mesh where the mesh normals are facing the camera. This then allows you to invert the function with a 1-x node and brighten or dim the areas of the mesh that are not facing the camera (the rims). Combining these functions allows the flexibility to fake the way light interacts with different types of cloth. As you can control the strength and brightness of each effect, I tweaked the parameters until I was happy with the result. The curtains were looking very dull and plastic-like until this shader was applied to them.
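
In code terms, the core of that node graph boils down to something like this (a scalar sketch of the idea, not the exact material; the parameter names and defaults are mine):

```python
# Scalar sketch of the velvet falloff; n_dot_v is the dot product of
# the surface normal and the camera vector, both assumed normalized.
def velvet_brightness(n_dot_v, facing_strength=0.4, rim_strength=1.2,
                      rim_power=2.0):
    facing = max(n_dot_v, 0.0)           # bright where the mesh faces the camera
    rim = (1.0 - facing) ** rim_power    # the 1-x node inverts it for the rims
    return facing_strength * facing + rim_strength * rim
```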

Another technique I used when texturing many of the assets in this scene was blending an additional distortion normal into the master material so that small amounts of distortion could be added if needed. I found this technique while digging through Aiko Shinohara's Library project, which I got from Gumroad. I can't express enough the value of buying UE4 projects from artists you admire and seeing how they set up their master materials, because more often than not you will learn something you can bring to your own art. This distortion effect is subtle but really helps to break up dead-straight lines that can often feel clinical and inorganic. I used it for all the books, the tabletop, the floorboards, and the bookshelves.
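
For context, blending an extra distortion normal usually amounts to nudging the base normal's X and Y and renormalizing; a NumPy sketch of that idea (not the exact setup from Aiko's project):

```python
import numpy as np

# base and detail are float arrays of unit normals in [-1, 1], shape (H, W, 3).
# The blend strength is an arbitrary illustration value.
def blend_distortion_normal(base, detail, strength=0.15):
    blended = base.astype(np.float32).copy()
    blended[..., :2] += detail[..., :2] * strength   # offset X/Y only
    blended /= np.linalg.norm(blended, axis=-1, keepdims=True)
    return blended
```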

Final Composition 

I wanted the story of this piece to suggest that this library had recently been left in a hurry and that whoever left it was desperate to find what they were looking for. I wanted to match the reference where I could, so I used the totally outdated and cumbersome method of hand-placing the books around the scene, duplicating piles where I could. I tried to carry on the visual storytelling by placing items on the table in a way that suggests this person was in a state of stress and needed to find something very urgently. I think placing open books in various configurations with loose papers in between them helps to communicate this narrative.

Lighting

Lighting is undoubtedly the most fun part of the project for me; it's the part where you get to see everything come together, but it is also the time when you might realize that your shaders need a lot of tweaking.

For this project, I used a combination of static and stationary lights, as getting great-looking global illumination in an interior scene with dynamic lights is a huge challenge. A Skylight provides the ambient light coming in from the window and filling most of the room. Instead of using real-time capture with my HDRI sky sphere, I used an ambient cubemap that matches the sky sphere outside. I also used a stationary Directional Light coming in from the window to draw the eye to the focal point of the scene, the old gothic bookshelf. I quickly realized that with just these two lights, I couldn't quite get the room to illuminate the way I needed it to, and there were little to no specular reflections on any surface.

I then realized that static lighting contributes almost no specularity to your shaders and that further stationary or dynamic lights needed to be placed to bring them to life. There is no shame in placing "fake" lights that don't reflect where light would naturally be coming from. If they improve the look of your scene, use them; be careful, however, of using too many shadow-casting lights, and set them to non-shadow-casting if you can. I used four additional Area Lights, two Point Lights, and a Spotlight to illuminate the books behind the glass as well as add some needed specularity to the books on the table. I should also add that this scene uses ray-traced reflections and shadows, which makes your job exactly 10 times easier.

Finally, in the post-process volume, I used a very small amount of chromatic aberration, film grain, and bloom. These effects are subtle but really help bring your scene to life. I also used an Exponential Height Fog to create a slight dusty haze in the room, as well as some dust particle effects using Niagara.

Conclusion

This project took me around two months to complete, working three days a week, largely because most of that time was spent learning the craft and understanding the methods of other great artists in the field. This project was definitely daunting at the start, but it is astonishing to see how much you can learn when you push yourself beyond what you thought you were capable of. If you are a student in this field, it is incredibly important to take on projects that are slightly out of your comfort zone, because it is only then that you can force yourself to learn, grow, and become a better artist. This project was an incredible learning experience for me, and I cannot wait to move on to the next one!

Daniel Faber, Environment Artist

Interview conducted by Theodore Nikitin
