Experienced digital artist Daniel McKay talked about his first experience with Unreal Engine 4.
Hi all. First up, my name is Daniel McKay, and I am currently employed as a Motion Graphics Designer and 3D Generalist at a post production company named TOYBOX. I have a background in design, having studied for 4 years and gained an Honours degree in Computer Graphic Design. Back then (early 2000s), 3D was still quite a difficult medium to explore, with all sorts of hardware limitations, as well as super slow internet and a lack of information/tutorials online. My first experiences with 3D were learning Electric Image Universe and a tiny bit of Maya. On graduating, I did a mix of jobs in the television industry, but nothing of great mention. I decided to make the move to London around 2008, and the opportunities that arose on various freelance projects (as well as working alongside some amazing talent!) really pushed me to further myself and my skills. Some of the highlights were: a brief stint with Territory Studio working on Killzone 3 cinematics, working with the talented folk at SPOV, and an extended ‘permalance’ gig with the fine folk at Flock. Throughout all these jobs, I dabbled in anything 3D, using Maya and ZBrush, as well as motion graphics bits and pieces, predominantly in After Effects. Since 2013, I have been based back in New Zealand (Auckland) and am currently interested in learning Unreal Engine and furthering my knowledge of real-time technology. If anyone’s interested, most of my latest work ends up here, as well as more frequent updates on stuff/experiments here.
Learning Unreal Engine 4
Unreal is one of those packages that I always knew about, but always figured was out of my reach. Being a generalist, you always try to be a ‘jack of all trades’, but you can definitely feel overwhelmed at times with all the new software coming out, and you have to be really selective about what you want to invest your time in learning. I decided to start looking into Unreal when I started watching the odd speed-level-design time-lapse on YouTube. It was amazing to me that people were building these incredible ‘living’ scenes in mere hours, with a full arsenal of powerful tools at their disposal. I was always a big Mental Ray user, and was in the habit of ‘add a light… wait for the render… rotate a little bit… wait for the render… tweak the intensity… wait for the render… etc, etc’, so seeing people just using these things on the fly blew me away. Not to mention the real-time feedback on usually time-heavy features, like glossy reflections, volumetric lighting, dynamic forces, etc. Also, for some time I had been working with Quixel for my texturing, and it was great to see those PBR textures really coming to life in a game engine.
I’m also a fan of following trends in the industry, and with the sharp uptake of VR/AR/360 videos, and the like, I wanted to find a means to ‘join the party’. I believe there will always be the need for 2D content, but I think there is so much potential with the immersive experiences Unreal can provide, I felt it was a good time to get on board.
I’ve only started scratching the surface of what Unreal is even capable of, but so far (about 3 months in) I’ve found it to be very intuitive and user-friendly. Of course, I’ve run into a million and one problems along the way, but to date, I’ve found the support base online to be nothing short of amazing. There seems to be an answer for everything, and everyone seems really happy to help out. A real community spirit, which is great!
The gaming side of it is still something very new to me, and something that I intend to delve into deeper as I keep learning more and more. Admittedly, I’ve always been more the front-end user, and anything under the hood usually flies straight over my head. Again, I’ve found the initial gaming blueprints (third person, first person, etc.) a great place to start, and more than enough to meet my needs for the time being.
My background has always involved modeling assets from scratch, retopology, UVing, and texturing, so these requirements were nothing new to me. I’m more cautious now with just how low poly I can make things, however, and creating a second UV set for lightmaps was a new one for me too. Texture-wise, ‘packing’ textures was again, something new to me and following tutorial videos online from the likes of Tor Frick was a huge help for me.
So, for this scene, I was looking online for some inspiration for an environment I could model and I came across the works of Masashi Wakui. This shot in particular caught my eye, and although my scene didn’t end up looking the same, you can see where a lot of the shapes and layout came from. I find reference images a huge help when I’m creating an environment, and love that the initial idea can evolve into something completely new by the time you hit that final render button.
The final feel of the scene was to be something quite moody and eerie. Again, the ‘post-process volume’ feature of Unreal is just incredible. A few tweaks of settings here and there can really change the whole feel of a piece in an instant. Coming from a Maya/AE background, I was used to putting out 10 or so render layers, mattes, etc, and then compositing in post and grading to get something I was happy with. That workflow is great in that you have control over minute details of your sequence, as opposed to Unreal, which is more ‘what you see is what you get’. But again, Unreal is a completely different beast for a completely different purpose. I love that I can tweak settings with realtime feedback, shift the camera and boom! It’s all there with no waiting around.
This station scene is actually the first ever scene I have built in Unreal. I shelved it at one point to work on the sci-fi corridor but decided to revisit it and finish it off.
All up, I probably spent about 2 weeks on the station scene in my evenings, and it was a really insightful little lesson in learning how NOT to do things! Or at least, how to do things better. It was here that I mostly learned the techniques of packing textures, creating tileable UVs that could also have a lightmap assigned to them, and how to use the foliage tool! The Unreal Marketplace is another amazing feature, and the likes of the free Kite demo (or even just the starter content, for that matter) have been invaluable in learning how to make the most of the program.
Modeling Low Poly Assets
As described above, the use of reference imagery is always a helpful way to keep yourself on track and avoid wasting time wondering what to do next. I try to be as smart as possible in terms of working out early on just which objects will need to be modeled, and how I could group them to save on texture space (for example, grouping all the modular piping together and UVing it in one tile).
Maya has always been my go-to 3D program, so building my assets here was a no-brainer. I generally start with a blank scene and make sure I have my units set to cm (so they import at the correct scale into Unreal). For a project like this, I start with a 250x250cm plane that acts as my base tileable surface. Any variations of geometry (such as door frames, gutters, smaller walls, etc) I try to keep within these dimensions, or an easy division (such as 10cm or 50cm increments). This works well once the assets are imported into Unreal, as I can turn on snapping to 10/50cm increments respectively, and can drag and drop any combination of the assets and know that they’re not going to intersect.
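The modular-grid idea above can be sketched in a few lines. This is a hypothetical illustration (plain Python, not a Maya or Unreal API): assets are authored in cm against a 250cm base tile, and any variation is kept to a clean 10cm or 50cm increment so grid snapping can place pieces without gaps or intersections.

```python
# Sketch of the modular-grid convention: a 250 cm base tile, with all
# asset dimensions kept to clean snap increments (10 cm or 50 cm).

TILE = 250  # base tileable surface, in cm


def snaps_to_grid(size_cm, increment=10):
    """True if a dimension lands cleanly on the chosen snap increment."""
    return size_cm % increment == 0


def tiles_needed(length_cm, tile=TILE):
    """How many base tiles cover a run (e.g. a wall) of the given length."""
    return -(-length_cm // tile)  # ceiling division


# The base tile and a 130 cm door frame both snap at 10 cm increments,
# so they can be mixed freely; a 10 m platform run needs 4 base tiles.
print(snaps_to_grid(250), snaps_to_grid(130), tiles_needed(1000))
```

The payoff is in the editor: with every dimension on the same increment, Unreal's grid snapping alone guarantees adjacent modules butt together exactly.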
When I model assets, I try to pay attention to which pieces will be the ‘hero’ elements (seen up close or front and centre) and make sure that their geometry holds up accordingly. Of course, performance is key here, so the lower the poly count, the better, but really bad faceting can make or break a scene! I also get rid of any unseen faces that would only add to the poly count, or waste UV space. Again though, I’m only new to all this gaming stuff, so maybe my techniques aren’t as great as I think!? If you have any words of wisdom, please… let me know!
Building the materials has become one of my favorite things to do in Unreal. I admit, when I saw some of the complex node networks people were building just for a simple surface, I freaked out and thought I wouldn’t cope. But thankfully, after watching some great tutorials on YouTube, I quickly learnt that Quixel works in a way that really helps streamline that whole process. You have the beauty of Unreal presets when using DDO (Quixel’s texturing program), Unreal lighting set-ups in 3DO (Quixel’s realtime renderer), and you can add/remove whichever maps you require. Next, the ‘RGB packing’ technique has been an invaluable bit of information that has really streamlined the way I do things. For any given object, I generally end up with 3 textures: Albedo, my RGB-packed masks image, and a normal map. I have a bit of a system now where I have Roughness in the R channel, Metalness in G, and any particular sort of mask I require in B (AO, paint layer, rusty patches, etc). * See above tutorial pics for general workflow.
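The channel layout described above can be sketched as follows. This is an illustrative toy in plain Python (no image library, and not how Quixel or Unreal actually store textures internally): three greyscale masks become the R, G and B channels of one packed image, and the material graph later samples a single channel to recover each mask.

```python
# Toy sketch of RGB packing: three greyscale masks in one RGB texture.
# Channel convention from the article: R = roughness, G = metalness,
# B = spare mask (AO, paint layer, rust, etc). Pixels are 0-255 values.


def pack_masks(roughness, metalness, spare):
    """Zip three greyscale pixel lists into one list of RGB tuples."""
    return [(r, g, b) for r, g, b in zip(roughness, metalness, spare)]


def unpack_channel(packed, channel):
    """Pull one mask back out, as a material samples a single channel."""
    index = {"R": 0, "G": 1, "B": 2}[channel]
    return [px[index] for px in packed]


# A tiny 2x2 "image", one value per pixel:
rough = [200, 200, 64, 64]
metal = [0, 255, 0, 255]
ao = [255, 128, 128, 255]

packed = pack_masks(rough, metal, ao)
assert unpack_channel(packed, "G") == metal  # round-trips cleanly
```

The saving is real even at this scale: one packed texture replaces three separate greyscale maps, which means fewer texture samplers in the material and less memory on the GPU.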
One of the things I love in Unreal is the Emissive element of a material. I’ve had a lot of fun experimenting with this feature, and when combined with a post-process bloom effect, I find it a great way of adding a bit of life and interest to an otherwise flat texture. Being a station platform, the emissive feature worked great on making those little details pop just that little bit more.
Example material set-up for the sign elements.
I also made use of decals during my final pass on the look of the environment. This included such elements as the manholes and some graffiti patches here and there. I find these an easy way to break up that ’tiled’ look; they add more life to the scene and have only a small impact on performance.
One final tip that might help someone out there: when it comes to rendering out a Sequencer animation, turn off ‘Texture Streaming’ in your project settings. I ran into some headaches where textures would render all blurry, but this little trick stopped that happening.
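For anyone who prefers config files, the same switch the tip above refers to (the ‘Texture Streaming’ checkbox in Project Settings > Rendering) is backed by a console variable that can be set in the project’s `Config/DefaultEngine.ini`:

```ini
; Equivalent of unticking "Texture Streaming" in Project Settings > Rendering.
; Restart the editor after changing this for it to take effect.
[/Script/Engine.RendererSettings]
r.TextureStreaming=False
```

With streaming off, every texture loads at full resolution, so Sequencer frames never render with half-loaded blurry mips; just remember memory use goes up accordingly.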
Keep this guy ticked off for Sequencer renders.
Importing Assets into UE4
The import process into Unreal has become very streamlined and pain-free these days, but it started out pretty dismal for me, to be honest. There was a lot of back and forth: trying to get the UVs right, normals that weren’t softened enough in some areas (another new concept to me, as I was used to everything being smoothed at render time in Maya), and some rotation headaches to sort out. Again, the online community was very accommodating for a noob like me, and before long I was able to see where I was going wrong. Generating my own second UV set for lightmaps was one of the key things I learned, as I rely a lot on tileably UV’ed geometry. I bring assets in as FBX, and if I have generated my own lightmap UVs, I make sure to turn off the ‘Generate Lightmap UVs’ tickbox. Otherwise, if I have a clean UV tile with no overlapping islands, I can just leave this switched on. Before exporting geometry from Maya, I add simple Lambert materials onto each section that will require a new material in Unreal. This allows Unreal to detect the different material slots, and then you can add textures accordingly.
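The two rules a hand-authored lightmap UV set has to satisfy (everything inside the 0–1 square, no overlapping islands) can be sketched as a sanity check. This is a hypothetical helper, not an Unreal or Maya API; islands are simplified to bounding boxes `(u_min, v_min, u_max, v_max)`, whereas real tools check per-face, but the rule being tested is the same.

```python
# Hypothetical sanity check for a second (lightmap) UV set:
# 1) every island sits inside the 0-1 UV square, and
# 2) no two islands overlap (overlaps cause baked-lighting artifacts).
# Islands are simplified to bounding boxes: (u_min, v_min, u_max, v_max).


def in_unit_square(islands):
    return all(0.0 <= lo and hi <= 1.0
               for (u0, v0, u1, v1) in islands
               for lo, hi in ((u0, u1), (v0, v1)))


def boxes_overlap(a, b):
    au0, av0, au1, av1 = a
    bu0, bv0, bu1, bv1 = b
    return au0 < bu1 and bu0 < au1 and av0 < bv1 and bv0 < av1


def valid_lightmap_uvs(islands):
    if not in_unit_square(islands):
        return False
    return not any(boxes_overlap(a, b)
                   for i, a in enumerate(islands)
                   for b in islands[i + 1:])


# Two islands with a padding gap between them: valid.
assert valid_lightmap_uvs([(0.0, 0.0, 0.45, 1.0), (0.55, 0.0, 1.0, 1.0)])
# The same two islands pushed into each other: invalid.
assert not valid_lightmap_uvs([(0.0, 0.0, 0.6, 1.0), (0.4, 0.0, 1.0, 1.0)])
```

If a mesh passes a check like this, the ‘Generate Lightmap UVs’ import option can safely stay off, as described above.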
The assets that I built for the scene were low poly enough that I was able to run it on my old (since upgraded) machine at work, running a GTX 760, smoothly and with no lag. It was only when I dotted in a bunch of foliage assets (as well as the downloaded train) that I felt like I was favouring the look over performance. I had made the initial decision that this was a learning exercise, and was not ultimately concerned with realtime frame rates, so I was comfortable going a bit overboard with ‘extras’. Having said that, when it came down to it, it was still running super smooth (just under 100 fps) on the ol’ 760.
Funnily enough, the lighting is as simple as you could get. I used my main directional light as the sun (with light shafts and bloom switched on). I also used a SkyLight to capture a little bit of extra ambient light. I had some spotlights running down either side of the platform to add a bit of interest on the main pillars, and some atmospheric fog to give it that moody sort of look I was aiming to achieve. And that is truly all the lighting that was used.
My ‘Lighting’ folder showing all the lights used for the Station scene.
The real star here though was the post-process volume, where I tinkered with tints and blooms to make the scene come alive. The bloom helps those emissive materials pop, as well as a few of the old tricks I learned from grading in After Effects over a number of years… vignettes, grain, etc.
I should also point out that I used the dust particles and the steam from the starter content as well, to help add a bit of drama to the scene. These little elements are quite subtle, but they really round out the overall look.
This scene would undoubtedly be fine in a game, albeit a small-scale environment. As previously mentioned, the initial build was done with performance in mind, so the basic geometry would hold up no problem. If I were to go ahead with a full immersion scene, however, I would revisit some of the foliage assets and make sure I was placing them strategically, as opposed to just filling up the space. For example, some areas you don’t even see from the platform, but I still went about mapping out some grass assets there anyway (the beauty/danger of the foliage editor…). Also, I’d rethink that train I have sitting in there. On reflection, the style doesn’t feel right, and I’ve always felt a bit uncomfortable using assets that I didn’t create myself.
I have yet to get an HTC Vive, but I look forward to getting inside my environments when I finally get one. For now, it’s been PanoCam renders and watching through a cheap headset on my mobile phone!
Again, if anyone’s interested, I post my completed projects here.
I post regular updates and tests to my instagram account.
And, if anyone has any specific questions, or feedback on how I can do my stuff better, let me know on my email.