Justin Mayle talked about the production of the amazing VR title Farpoint and shared how he created the environments for the project. Plus, Alex Brown showed how the skyboxes for the game were made.
Hello! My name is Justin Mayle, and I am from northern Michigan. I graduated from Ferris State University and have been in the industry since graduating in 2010.
I started my career at Yeti CGI, a studio in Grand Rapids, MI, working on different contracts, such as Farmville 2, several Disney games, and our own tower defense IP “Roaming Fortress”. From there I moved to Colorado to work at the studio now known as Deck Nine (previously Idol Minds), where I worked on the mobile game “Tales of Honor: The Secret Fleet”, a tower defense game “Qube Kingdom”, and a city builder “Minitropolis”. In 2015, I moved to the Bay Area to work at Impulse Gear as an environment artist, where we just shipped “Farpoint”, a PSVR FPS title.
My main task in Farpoint was creating a believable and immersive environment for players to explore. Throughout the project I created many assets for Farpoint (plenty of rocks), lots of world building, and some level design. Most of what you see in Farpoint was created by a small in-house development team at Impulse Gear. Having the chance to work on a large VR project like Farpoint, where I was able to work on highly immersive environments was like a dream. I’m very proud of what our small team and I were able to accomplish. We learned so much throughout the course of the project.
Coming into the project from a mainly mobile background, I knew I would have plenty to learn – but so did many others on the team. We were developing a game for new hardware, with a new controller, and for a new medium. There were so many unknowns, and many new challenges that current games haven’t dealt with. Helping solve some of those unknowns was a great experience, and was something I was excited about when joining the project.
Creating spaces in VR
Creating spaces in VR certainly is a change from current environment production practices, in some ways more so than others. Content is still generated in the same manner, and the artist still needs to consider color, form, lighting, and everything in between. With VR, however, the player is “in” the game more than ever, so everything can be scrutinized easily. The player can walk up to an object and stick their face right into the mesh, so creating content that looks believable from nearly every angle is important.
Making sure that asset scale feels relative to the world is very important. Many VR games that I have played have assets that are way too small, and it immediately breaks the illusion of immersion. Having too many small assets or details can also be an issue. Finding the right balance can be difficult, and the benchmark may change depending on the visual demands of the project.
Being an artist for a VR title comes with a host of new considerations. Primarily, it is so much more important to be looking at your work “on device”. So often we will look at something on our screens and think it looks great, then after verifying in VR we immediately realize that something was too small, too close, or too bright. Every game is different and will come with its own issues – but finding those issues and problem solving along the way is half the fun!
Rocks made up a major part of the environment art in Farpoint, so they needed to look great. The majority of the rocks used in Farpoint were hand sculpted with the intent that we would have a few sets of omni-directional shapes which could be used to create larger rock structures. As a small team we needed to be smart about how we built our levels, and this approach seemed like it would offer a high level of flexibility and re-use. The rock shader was created by Randy Nolta, co-founder and art director of Impulse Gear. It was designed to scale UVs based on the scale of the mesh to retain similar texture tiling between all rock assets. In addition, we had vertex paint controls to add texture variation and environmental effects. For our DLC, the Cryo Pack, Alex Brown created an icy variant of the rock shader to allow us to use the same rock geometry in an entirely new way!
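The UV-scaling idea behind the rock shader can be sketched outside of Unreal's material editor. This is not the actual Farpoint shader, just a minimal Python illustration of the concept: dividing out (or here, multiplying in) the mesh's uniform scale so that texel density stays consistent across instances of the same rock at different sizes. The function name and `texels_per_meter` parameter are illustrative, not from the project.

```python
def scaled_uv(uv, mesh_scale, texels_per_meter=1.0):
    """Scale UVs by the mesh's uniform scale so a rock stretched to 2x
    world size tiles its texture twice as often, keeping texel density
    consistent across differently scaled instances of the same asset."""
    u, v = uv
    s = mesh_scale * texels_per_meter
    return (u * s, v * s)

# A rock instance scaled up 2x tiles its texture 2x as densely:
print(scaled_uv((0.5, 0.25), 2.0))
```

In a real shader this would run per-pixel in the material, with the scale fed in as a per-instance parameter, but the math is the same.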
We put considerable time into making sure these assets looked great, as you can get right up to them in VR, and because our game consists of so many of them! In short, I would have to say that we “rocked it” in Farpoint. Seriously though… many rock puns were made in creating Farpoint.
Our top priority for Farpoint was ensuring that the majority of players could experience our environments comfortably. To do this we designed our levels so players could get through them without needing the camera rotation thumbstick. Additionally, keeping walkable surfaces smooth and providing navigable static mesh collision was imperative.
We did not want players to endlessly walk through canyons and hallways, so we created plenty of open spaces that would funnel players towards the next area. We did this by using rocks or other assets to “point” towards the next zone, and by using clear value separation with our terrain painting to indicate paths for the player to follow, when necessary.
In addition to using art to help lead the player, our engineers and designers had some tricks as well to keep our players progressing in the correct direction. Farpoint required a high level of collaboration to ensure that each level met the standards required for comfort, ease of navigation, and fun!
Actually, the man behind the skyboxes of Farpoint is our VFX artist Alex Brown, so I am going to let him take this question and give you some insight into how we got such amazing skyboxes in Farpoint!
Without further ado –
Hi, I am Alex Brown, the VFX artist for Farpoint! I created the skyboxes in the game by blending together three cloud textures rendered in Vue: one for the horizon, one for above your head, and a high cloud layer that sits above those. To fake the movement of the clouds, I use a flowmap to distort the images in a believable way. Beyond that are the “Space Layers”, which consist of the Sun, Stars, Moon, and Aurora. We get the sun location from the light direction and then brighten the cloud edges around it. The Aurora is a hand-painted image that is UV-distorted to add wavy motion. The Moon is just a texture that blends behind all of the clouds; it took quite a bit of work to blend the atmosphere with the moon in a realistic way. And finally, the stars are one large map that actually twinkles! All of this fits into a relatively low-cost opaque material that sits on a sphere, and a blueprint drives all of the parameters that affect the material. There are some real-time sky changes in our game, but no full time-of-day cycle, which made it easier to construct the sky with composition and color in mind. The hardest part was creating movement and compositing the layers in a believable yet changeable manner. Perhaps for my next skybox I will make a blueprint with more automation and fewer parameters.
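The flowmap trick Alex describes is a well-known real-time technique, and the UV math behind it is simple enough to sketch. This is a generic Python illustration of two-phase flowmap distortion, not the actual Farpoint material: each phase scrolls the UVs along a per-pixel flow vector and periodically resets, and a blend weight crossfades between the two out-of-phase samples so the reset is never visible. All names and parameters here are illustrative.

```python
import math

def flowmap_uv(uv, flow_vec, time, period=1.0, strength=0.1):
    """Return two distorted UV sets and a blend weight for flowmap animation.
    Each phase scrolls UVs along flow_vec and wraps every `period` seconds;
    blending the two half-period-offset samples hides the wrap-around pop."""
    phase0 = math.fmod(time, period) / period
    phase1 = math.fmod(time + 0.5 * period, period) / period
    u, v = uv
    fu, fv = flow_vec
    uv0 = (u + fu * phase0 * strength, v + fv * phase0 * strength)
    uv1 = (u + fu * phase1 * strength, v + fv * phase1 * strength)
    # Weight of the second sample: 1.0 exactly when phase0 resets,
    # 0.0 mid-cycle, so each sample is hidden at its own reset moment.
    blend = abs(phase0 * 2.0 - 1.0)
    return uv0, uv1, blend
```

In a shader, both UV sets would sample the cloud texture and the results would be lerped by the blend weight; here the function just returns the inputs a pixel shader would use.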
We spent a good deal of time optimizing Farpoint, and it took a great deal of effort from everyone to make the game run smoothly. On the environment side we created 2-3 LODs for each mesh and tried to keep the overall tri-count manageable. But by far our biggest optimization hurdle was keeping draw calls in check. We used multiple techniques to stay within our budget, including precomputed visibility, hierarchical LODs (HLODs), actor merging with material atlasing, and max distance culling. On top of asset merging and HLODs, we would typically add LODs, or heavily reduce the triangle count for the large merged meshes.
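Two of the techniques mentioned above, distance-based LOD selection and max distance culling, boil down to a simple per-object decision each frame. Here is a hedged Python sketch of that decision (not engine code, and the thresholds are made up): pick an LOD index from the camera distance, or skip the draw call entirely past the cull distance.

```python
def select_lod(distance, lod_distances, max_draw_distance):
    """Pick an LOD index by camera distance, or return None to cull.
    lod_distances: ascending thresholds, e.g. [10, 30, 60] meaning
    LOD0 under 10 m, LOD1 under 30 m, LOD2 under 60 m."""
    if distance > max_draw_distance:
        return None  # max distance culling: no draw call at all
    for lod, threshold in enumerate(lod_distances):
        if distance < threshold:
            return lod
    return len(lod_distances) - 1  # clamp to the coarsest LOD

# e.g. a rock 45 m away renders at LOD2; at 150 m it is culled entirely
print(select_lod(45, [10, 30, 60], 100), select_lod(150, [10, 30, 60], 100))
```

Engines like Unreal drive this from screen size rather than raw distance, but the payoff is the same: fewer triangles up close to nothing, and fewer draw calls past the cull distance.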
Overall, I think that visual storytelling to immerse the player grounds the entire experience. Having a crashed shuttle, debris scattered all around, and scorch marks on the ground helps tell the story. Adding large bone structures to a space makes the player wonder what kind of huge creatures could possibly live there. Letting the player look out across a landscape and see a volcano in one level, and then placing them inside that volcano in the next, ties the world together.
All those things help the environments feel believable and part of a whole world. Those small environmental details can immerse you so much more! That fine detail in the rocks, or having a slight sway to some cords to imitate wind, make the experience feel more realistic. It all comes together to create a grounded and believable experience.