Developing Environments for VR Games

Dan Sonley talked about the way he created spaces for the VR game Dexed, developed by the wonderful people from Ninja Theory.

Introduction 

Hey! My name is Dan Sonley, and I'm an Environment Artist at Ninja Theory in Cambridge.

I studied Games Art at Teesside University, and after I graduated I joined Ninja as an intern through one of the Rookies internship schemes three years ago.

I joined Ninja during the development of Disney Infinity 3.0, and since then I have worked on multiple projects, including a large role in the development of Dexed (a VR rail-based shooter).

About the project

So Dexed is a VR rail shooter made by Ninja Theory. It was a fun little project made by a handful of people here. It kind of just happened one day out of the blue. It started out as a game jam, then suddenly grew into something much bigger. During the game jam there were two artists: me and a shared FX artist. So my role was to drive the visuals from the ground up. I chose a simplistic art style as I knew it would be a push to get it all finished in time.

Game jam level – almost everything in this was ZBrush, DynaMesh, Decimate, then flat shaders.

It was a pretty big surprise when the directors at Ninja told us they wanted us to keep going with it and develop it into a full, shippable game. At the time it just seemed like a fun little project and nothing more, and because of that it felt like a lot of work to bring it up to a full project. So naturally we scaled up the team and increased the scope and time frame.

To me, this game was really special and always will be. It was the first project where I felt properly in control. Beforehand, I spent most of my time working as part of a much bigger team on much larger projects.

The first major task was to work out how to turn my hastily put together game jam level into a full, shippable product.

With new artists on board and a new scope, we all agreed to keep it stylised but to bring in a lot more detail and realism. Having previously worked on stylised titles such as Disney Infinity we all felt fairly competent in achieving this.

We unanimously felt it best to just start development afresh and not worry about redoing stuff. This of course meant more work, but allowed us to plan out proper levels with distinct themes. It also gave us an opportunity to design much more enjoyable levels, building on the initial experience we had created during the game jam.

The main goal for the art was to create an interesting and captivating experience whilst reinforcing the intended player interactions, keeping the gameplay smooth and readable.

The switch to working with VR

The norm for Ninja is to develop narrative based 3rd person action/combat style projects, Hellblade being a great example of this, so jumping to a first person VR rail shooter was a big change from what the team were all used to.

For me, the biggest change to adapt to was not knowing where the player would be looking. VR gives the player total camera freedom, making it very hard to know exactly where someone will be focusing.

In some of the initial tests we found people would spin around and shoot things behind them then miss everything going on in front of them. I never felt we really found a reliable solution for this; you just have to do all the usual tricks and hope the player is paying attention.

In Dexed, the player was encouraged to look in the forward direction by our use of subtle VFX, but mostly by the constant forward movement. We were quite lucky that this just happened to work out for our game.

For a lot of developers, hitting the high frame rates necessary for VR seems quite daunting; nailing a solid 60 is hard enough in most cases, and 90fps is an even harder target to reach. However in my experience, if you plan and work to the strengths of VR & your engine, it isn’t as hard as you would initially expect.

Dexed works using a floating camera locked to a spline, which meant we could make some intelligent choices with the environments that wouldn't suit other gameplay schemes. We don't have any collision, for example, and if you freely fly the camera around you'll find there are lots of huge holes in the environment.

Avoiding unnecessary movable objects is a good win as well, and dynamic shadows are incredibly expensive in most cases, so building a scene that wouldn't benefit from them saved us massively. Using static lighting was a huge saving but came at a visual cost; mainly that things can become very still and lifeless, or look unlit and flat. So this was something we had to counter in a number of different ways, mostly by covering stationary objects in moving materials and particles, but I'll be going into that later.

Hitting 90 FPS in VR is a tall order, but if you plan development with that in mind and work smart, it’s a very achievable goal.

Production

At Ninja, we tend to work closely in our teams and there’s a lot of bouncing back and forth. We try and avoid ambiguous greybox prototypes which don’t translate into actual art/games. There’s nothing worse than trying to work out what a bunch of cubes are supposed to actually be. Bouncing back and forth at the very early stages means that the greybox can be very representative, leading to better art and often better, more immersive gameplay. 

This project was a little different, however. The levels can be compared to an old school scrolling backdrop, and as such they have little impact on the gameplay. There was little need for level flow or structure; we just drew an interesting-looking spline and went with it. This meant that during the early stages of production, our artists got straight into making levels and designers went straight into prototyping mechanics, wave formations and timings.

With the inferno level, I started by gathering plenty of reference, trying to decide what I actually wanted to make. I knew it had to be in a bowl-shaped volcano; I just had to work out how to make it more visually interesting than a bunch of lava and rocks. I felt the way to go was with geothermal rock formations, with my main inspiration being the Giant's Causeway in Northern Ireland.

I quickly boshed out a first-pass greybox in Maya, then once I was vaguely happy I remade it in ZBrush. I chose ZBrush for a greybox as it meant I could get the more organic rock forms prototyped much sooner. This looked blobby and messy, but as long as it got the layout and idea across it was enough to allow me to progress.

The initial level was much longer and winding, but it made you feel sick doing such tight loops so I ended up taking the first half and doubling the size.

This ZBrush greybox was a few shapes mashed together and then decimated. Not only did I use this as the greybox, but it's still in the final game as a base layer. It's pretty horrid looking, but it was enough to keep me on track with the shapes, and mainly it meant there were no holes in between the rocks.

The majority of this level is made up of a "background" rock. As you can see, it is nothing fancy, just a very simple, nondescript rock. I purposely chose a dull and basic rock as it was dirt cheap and wouldn't draw any attention to itself. Using this one rock I could make the majority of the level without people spotting much repetition. In the image below you can see just how much is made up of that one rock, like 80% of the screen in this shot.

After my initial placing of rocks I felt it time to bring in the centrepiece assets, the basalt rock stacks.

The basalt rocks were sculpted in ZBrush and painted in Substance Painter. Whilst in Painter, I was able to quickly generate some masks I could use in the Unreal material for the emissives. In keeping with the Dexed style, the shapes and details are simplified, with an overly simplified specular response.

One thing I’ve noticed with VR is that the fine details are often lost due to the low resolution displays, whereas the overall silhouette and form is extremely prominent. So to make the most of our assets, I used high res geometry with aggressive LODing.

It's easy to forget that most current VR headsets aren't that different in resolution from a standard monitor. With a VR headset though, the image covers your entire field of view – this makes the effective DPI/PPI much lower.

Although 2k tris alone isn't a particularly complex mesh, considering how many of these I used, the total tri count quickly racks up, not to mention the quad overdraw. So I used much more aggressive LODs than I would usually have gone for.

These are also a good example of the types of simple animated materials we would use in Dexed to bring life to our fairly static environments. Being such a stylised and unrealistic setting gives a lot of artistic freedom. In the inferno level it meant I could use lots of glowing hot emissives, which in most cases didn't really make any sense but were just there to look badass.

The emissive level is on a gradient, so I can push how high the heat glow climbs up the asset with a scalarParam. On top of that, there is a noise mask that pans over it to add a gradual and subtle flicker. Relatively simple stuff, but effective for our visual style.
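To make the idea concrete, here's a rough C++ sketch of what that emissive setup computes per pixel. In the game this lives in a material graph rather than code, and names like HeatHeight and NoiseAtPannedUV are my own placeholders, not Dexed's actual parameter names.

```cpp
#include "Math/UnrealMathUtility.h"

// Rough per-pixel sketch of the basalt stack emissive (in reality a material graph).
// MaskedHeight01:  vertical gradient mask generated in Painter, 0 at the base, 1 at the top.
// HeatHeight:      the scalar parameter that pushes the glow further up the asset.
// NoiseAtPannedUV: a noise texture sampled at slowly panning UVs.
float BasaltEmissive(float MaskedHeight01, float HeatHeight, float NoiseAtPannedUV)
{
    // The glow covers everything below HeatHeight and fades out above it.
    const float Glow = FMath::Clamp(HeatHeight - MaskedHeight01, 0.0f, 1.0f);

    // The panning noise modulates the glow for a gradual, subtle flicker.
    return Glow * FMath::Lerp(0.8f, 1.0f, NoiseAtPannedUV);
}
```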

These stacks were also contained within a blueprint; again nothing fancy, just a construction script that picked a random stack material variation and rotation. Working with blueprints in this way saves so much time; it took only 5 minutes to make and saved hours of individually rotating and alternating rocks by hand.
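For anyone more comfortable in code, the C++ equivalent of that construction script would look something like the sketch below. The blueprint itself wasn't written in code, and the class and property names here are assumptions for illustration only.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"
#include "BasaltStack.generated.h"

// Hypothetical C++ version of the stack blueprint: on construction, pick a random
// material variation and a random yaw so repeated stacks don't read as copies.
UCLASS()
class ABasaltStack : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* StackMesh = nullptr;

    UPROPERTY(EditAnywhere)
    TArray<UMaterialInterface*> MaterialVariations;

    virtual void OnConstruction(const FTransform& Transform) override
    {
        Super::OnConstruction(Transform);

        if (StackMesh && MaterialVariations.Num() > 0)
        {
            // Random material variation per placed stack.
            const int32 Pick = FMath::RandRange(0, MaterialVariations.Num() - 1);
            StackMesh->SetMaterial(0, MaterialVariations[Pick]);

            // Random yaw; note this rerolls whenever the construction script reruns,
            // so a stored seed would be needed if the result had to stay stable.
            StackMesh->SetRelativeRotation(FRotator(0.f, FMath::FRandRange(0.f, 360.f), 0.f));
        }
    }
};
```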

This is simple stuff but it’s enough to save me a lot of time. I managed to make the whole level with only 2 rock meshes in less than a week.

The rest of the level followed a very similar format. Once I was fairly happy with the scene, I bulk exported it straight into ZBrush, where I added all the lava blobs.

Once all the geometry was at a point I was happy with, I could go through and give the map a final lighting and post pass. This part is where the environment really comes to life.

Through lighting and post I brought in lots of cool blues and purples with the occasional pink to juxtapose the harsh oranges and reds of the scene.

Bringing our levels to life

As mentioned, all the levels in Dexed were actually quite static, and as such there were very few moving parts to them. So it was important to focus heavily on combating this.

All of our levels were flooded with low cost, spammable particle effects and animated materials (like the basalt stacks). Sometimes simply a subtle, slow panning noise in the world position offset was enough to bring life to a still shot. Kris Doggett, the FX artist on Dexed, did an incredible job with this. Being such a stylised and bright game he was really able to go overboard, which was truly the perfect way to bring our levels to life.

One of the levels which I think felt especially alive and dynamic was the Arcade level. This level was distinctly different from the rest of the game and offered an alternate gameplay mode.

Even though it lacked the visual impact of the other levels, it was a refreshing change with a new gameplay style and a few new mechanics.

This was the least expensive level; it ran at a solid 90 from day one and never dropped below. This is mainly because it's all emissive and unlit. The only difference between lit and unlit is that you don't get post processing in unlit.

In this level, I set up every material in the scene to be controlled by blueprints, which in turn were controlled by level progression, music and gameplay. The level never ends, the difficulty increases, and the experience slowly becomes more and more psychedelic. I really enjoy the technical side of environment art, so I loved making this level; it's definitely the one I'm most proud of from the project!

Each spawn point is its own blueprint, which meant each one could be individually controlled.

Yes, each spawner was hand placed… Yes, it was a pain… No, I didn't make a script for it…

Long story short, I wanted to make a super duper blueprint system that built it all for me, but it gets to a point where it becomes so complex that it either needs to be done in code or you could have just made it more easily by hand. When I built it all in one blueprint it became complex to control individually, and I also found that if I changed stuff it would reorder the spawners and break all of the design work. So I ended up with a midway system: hand placed but blueprint script assisted. Each spawner faces 0,0,0 no matter what, so as I move it sideways along its local axis it rotates as it moves, which means it translates around a sphere. This made it quick to place while retaining individual control over each one.
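As a rough illustration of that placement assist (the real thing was a blueprint, and the names here are my own), the core of it boils down to something like:

```cpp
#include "GameFramework/Actor.h"
#include "Kismet/KismetMathLibrary.h"

// Sketch of the spawner placement helper: every spawner is rotated to face 0,0,0,
// so nudging it along its local axes effectively slides it around a sphere
// centred on the origin.
void OrientSpawnerTowardsOrigin(AActor* Spawner)
{
    if (!Spawner)
    {
        return;
    }

    const FRotator LookAtOrigin =
        UKismetMathLibrary::FindLookAtRotation(Spawner->GetActorLocation(), FVector::ZeroVector);
    Spawner->SetActorRotation(LookAtOrigin);
}
```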

When working with dynamic parameters, I find it good practice to try and keep things unified. I often make all the scalars function from 0 to 1. Even if the true values should be 0-3000, I just scale it in the shader to keep those unified controls. This makes it much easier to work with, as I don't have to remember and balance different values for every parameter.
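A tiny sketch of that convention, assuming a hypothetical parameter whose real range is 0-3000: the blueprint/MPC side only ever writes normalised values, and the remap to real units lives on the material side.

```cpp
#include "Math/UnrealMathUtility.h"

// Blueprint/MPC side: everything is authored and animated in a unified 0-1 range.
// Material side: the shader remaps that to whatever range the effect actually needs.
float RemapUnitParam(float Value01, float TrueMax /* e.g. 3000.f */)
{
    return FMath::Clamp(Value01, 0.f, 1.f) * TrueMax;
}
```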

The background EQ waves were completely dynamic and controllable within a complex material. Bit of a nuts material for it.

The breakdown is pretty simple though. It's a line offset by a sine wave. The tricky part was offsetting the sine wave multiple times by different values. To get it to work how I wanted, I had to offset the sine wave before putting it into the texture sampler, rather than offsetting the warped texture sample, hence why I have so many of the same things going on in this shader.

The core function of this graph is super simple. However, as I ended up combining 18 individual waves, I ended up with way too many instructions. I managed to cut it down significantly by packing the offset amounts into the RGB channels, then just splitting them out again later down the graph.
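Expressed as plain C++ rather than material nodes, one wave of that setup looks roughly like the sketch below; SampleLineTexture stands in for the line texture sample and all the parameter names are placeholders rather than the actual graph's. The important detail is that the sine offsets the UV before the sample rather than warping the sampled result.

```cpp
#include "Math/UnrealMathUtility.h"
#include "Math/Vector2D.h"
#include "Templates/Function.h"

// One EQ wave: a horizontal line texture whose UVs are offset by a sine wave.
// Offsetting the UV *before* the sample is what lets each of the 18 waves
// use its own amplitude/frequency/phase values.
float EQWave(FVector2D UV, float Time, float Amplitude, float Frequency, float Phase,
             TFunctionRef<float(FVector2D)> SampleLineTexture)
{
    const float SineOffset = Amplitude * FMath::Sin(UV.X * Frequency + Time + Phase);
    return SampleLineTexture(FVector2D(UV.X, UV.Y + SineOffset));
}
```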

In UE4 it's easy to make use of MPCs (Material Parameter Collections) in both blueprints and materials, so the majority of this level is done by driving the MPC values from the level blueprint. As an artist, most of my graph knowledge is material shader stuff, so Blueprints can often feel quite alien. Luckily, being part of a wider team, I can just get a designer or a programmer to check over them to make sure I've done nothing crazy. I found it good practice to get programmers to poke around in your graphs even if you know they work; you never know, they might have a way better way of doing it.
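In C++ terms, driving an MPC looks roughly like this; in Dexed it was done from the level blueprint rather than code, and the collection and parameter names below are made up for the example.

```cpp
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Sketch: push a couple of values into a Material Parameter Collection so that every
// material referencing the collection updates at once.
void UpdateArcadeCollection(UObject* WorldContext, UMaterialParameterCollection* ArcadeMPC,
                            float Intensity, const FLinearColor& GlobalColour)
{
    UKismetMaterialLibrary::SetScalarParameterValue(WorldContext, ArcadeMPC,
                                                    TEXT("Intensity"), Intensity);
    UKismetMaterialLibrary::SetVectorParameterValue(WorldContext, ArcadeMPC,
                                                    TEXT("GlobalColour"), GlobalColour);
}
```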

To make this easier, I usually try and make good use of material functions where possible. This means I only need to control one thing, rather than individually controlling many things.

Back to the EQ wave material, all that crazy sine stuff just went into the opacity.

So then all the rest was driven by my generic material functions. The material functions contain all the clever world control MPC stuff. Once I had it working in an MF I could just plug it into any material wherever I needed it. They're a horrible mess of scary stuff and would take a while to go through with you guys in depth, so I'll only focus on a few elements.

This bit controls the ice and fire bombs. The entire visual effect for the two abilities is driven just from this bit. For the fire bomb, I replaced the global colour with a yellowy orange and pumped up the world position offset.

Similar stuff with the ice: I changed the colour, but instead of making the WPO go wild, I controlled the transparency on some large icicle meshes.

The fire and ice bombs only lasted for a split second, so these were more than enough, even though technically not much actually happens; just a few materials changing slightly.

One of the most important parts of this level was the "beatEnvelope", which is the music beat. Again, this is driven in a blueprint. The music was made up of small chunks which are triggered one after another, so as each chunk gets triggered it also triggers a timeline which matches that chunk. Most of the materials in this scene pulse gently to the beat, with increasing intensity the more rounds you clear.
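As a rough sketch of that idea (not the actual Dexed blueprint; the curve and parameter names are assumptions), each chunk's timeline can be thought of as a float curve that gets sampled while the chunk plays and pushed into the MPC every frame:

```cpp
#include "Curves/CurveFloat.h"
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Called every frame while a music chunk is playing. ChunkBeatCurve is a float curve
// authored to match that chunk's beat; RoundIntensity grows as rounds are cleared.
void UpdateBeatEnvelope(UObject* WorldContext, UMaterialParameterCollection* ArcadeMPC,
                        UCurveFloat* ChunkBeatCurve, float TimeIntoChunk, float RoundIntensity)
{
    if (!ArcadeMPC || !ChunkBeatCurve)
    {
        return;
    }

    // Sample the beat curve and push it into the MPC so every material in the level
    // pulses to the music.
    const float Envelope = ChunkBeatCurve->GetFloatValue(TimeIntoChunk) * RoundIntensity;
    UKismetMaterialLibrary::SetScalarParameterValue(WorldContext, ArcadeMPC,
                                                    TEXT("BeatEnvelope"), Envelope);
}
```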

Some of the heavy lessons passed over from Hellblade have been to work smart and efficiently, and not to overcomplicate things. Mad respect for the guys and girls who were working on that in the other wing; they're a very talented bunch. This was something I took and tried to force into my own work. Building Dexed was all about little time savers: maybe a cool thing here and there, but nothing out of this world. Everything is small and manageable; no spaghetti junction blueprints, please. There were times when I forgot about this mantra and wasted a day trying to build some crazy system, only to then end up making it by hand when the system didn't work as I intended. The endless spawners were one of these times.

A bit off topic, but I got a lot of people asking how and why I used Substance Designer to design t-shirts. So keeping it brief, it's got some pretty good vector and bitmap tools. I used the vector creation tools along with the standard set of functions to do it; here, have a gif.

I didn't want to go too much into how things worked; not everyone is as enthralled by a good shader graph as I am, so I figured I would leave it quite light :p If you're interested though, feel free to drop me an email (dan.sonley@hotmail.co.uk) and ask!

Dan Sonley, Environment Artist at Ninja Theory
