Apex Construct: VFX, UI and Animation in VR
11 April, 2018
Animation
Environment Art
Environment Design
Opinion

Kim Aava continues the dev story of Apex Construct, talking about the technical challenges of VFX, animation, and UI in VR games.

This is part 2 in a two-part series covering the art style and techniques used to develop Apex Construct, an action-adventure VR game. Part 1 can be read here and covers the visualization, design, and style of the game. Part 2 covers the technical details and know-how to achieve the stylization in VFX, animation, and lighting for VR development.

During the development of Apex Construct, we constantly sought to improve our methods for dealing with the hardware limitations of the different VR platforms. While technology advances at a fast pace and many modern games approach realistic rendering resembling film industry standards, VR still has quite a few limitations that we as developers need to address. One example is the framerate. While a non-VR game often does not need to exceed 30 fps, a VR experience or game must run at 60 fps on PlayStation and 90 fps on PC, with no frame drops. Frame drops in VR are harmful because latency between head movement and the image the headset displays is one of the major causes of motion sickness. Causing motion sickness damages not only the experience but also the game’s reputation in general. During development, it’s important to set the minimum fps bar much higher to avoid ever dropping below the required threshold.
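The headroom idea above can be made concrete with a small frame-budget calculation. The target framerates (60 fps on PSVR, 90 fps on PC) come from the text; the 20% safety margin is an illustrative assumption, not the studio's actual figure.

```python
def frame_budget_ms(target_fps: float, safety_margin: float = 0.20) -> float:
    """Milliseconds available per frame, minus a headroom margin so that
    occasional spikes never dip below the hard platform requirement."""
    return (1000.0 / target_fps) * (1.0 - safety_margin)

# PSVR: budget ~13.33 ms instead of the raw 16.67 ms per frame.
print(round(frame_budget_ms(60), 2))
# PC VR: budget ~8.89 ms instead of the raw 11.11 ms per frame.
print(round(frame_budget_ms(90), 2))
```

Aiming for the reduced budget rather than the raw frame time is what "putting the minimum fps bar much higher" amounts to in practice.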

As we developed for various platforms, we realized it was necessary to make different builds, both to meet each headset’s requirements and to make full use of the high-end headsets without losing quality.

Another issue we encountered was the occlusion of objects. Since the player has a wider field of view in a VR headset than on a flat screen, we had to keep in mind that the game cannot cull everything the player is not currently looking at. In VR, objects in the periphery remain visible, so the game has to render more objects per frame than a regular flat screen game.
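A toy sketch of why the wider field of view hurts: the same scattered objects pass a simple horizontal-FOV visibility check more often at headset-like angles. The FOV values and object angles below are invented for illustration.

```python
def visible(obj_angle_deg: float, fov_deg: float) -> bool:
    """True if an object at the given horizontal angle from the view
    direction falls inside the camera's horizontal field of view."""
    return abs(obj_angle_deg) <= fov_deg / 2

# Objects scattered at various horizontal angles around the player (degrees).
angles = [0, 30, 50, 70, 100]

flat_screen_fov = 90   # a typical monitor FOV (assumption)
headset_fov = 110      # a typical VR headset FOV (assumption)

print(sum(visible(a, flat_screen_fov) for a in angles))  # 2 objects rendered
print(sum(visible(a, headset_fov) for a in angles))      # 3 objects rendered
```

Everything that survives the check has to be drawn, so widening the FOV directly raises the per-frame rendering load.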

VFX

The graphical style of the VFX in the game underwent a lot of back and forth in the process of finding the right direction. The effects were more realistic at the beginning of production, much like everything else during our early development, while we were still searching for our stylization (you can read more about it in Part 1 of the Apex Construct behind the scenes). The first prototype effects were made with stock photos, which later changed as the game became more stylized. The textures for the VFX were instead rendered from low-resolution simulations in Houdini to achieve the smooth and chunky stylization.

Example of a low-res effect simulated in Houdini, later rendered for use as sprites.

One of the most challenging parts of VFX for VR games is tackling the performance impact on various platforms, as the specs vary a lot. When working with multiple platforms, it’s necessary to decide whether it’s worth making different versions of the game to support their varying needs, or to take the middle way: one build tweaked to fit the lowest performing platform, which will also run smoothly on any higher-end headset. Being able to ramp up or down from that middle point is favorable and avoids issues such as reworking systems and graphics to fit each version.
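One way to picture "ramping up from a middle point" is a baseline effects config authored for the lowest-end platform, with selected parameters scaled up for stronger hardware. All names and numbers here are invented for illustration, not the studio's actual settings.

```python
# Baseline authored against the weakest target (assumption: PSVR).
BASELINE = {"max_particles": 200, "effect_lod": 1, "fake_bloom": True}

# Per-platform multipliers for the parameters worth scaling.
SCALE = {
    "psvr": 1.0,   # the baseline platform itself
    "rift": 1.5,
    "vive": 1.5,
}

def effects_config(platform: str) -> dict:
    """Return the baseline config with particle counts scaled for the platform."""
    cfg = dict(BASELINE)
    cfg["max_particles"] = int(cfg["max_particles"] * SCALE[platform])
    return cfg

print(effects_config("rift")["max_particles"])  # 300
print(effects_config("psvr")["max_particles"])  # 200
```

The point of the pattern is that only the baseline is hand-tuned; higher tiers are derived, so nothing has to be reworked per version.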

To the left is a mesh we used to create the jittering hologram effects

The use of stereoscopic cameras in VR games produces issues for traditional VFX methods such as sprite sheets and billboards. Most effects using these methods look flat and lack depth, which is often most apparent with large-scale, slow effects. It’s less noticeable with fast effects, since the eyes don’t have enough time to focus on the object and perceive its flatness. For such effects and for holograms, meshes were used instead, to create the layer of depth needed to avoid the flatness, especially since many of the hologram effects are part of the player inventory and user interface, which sit close to the player’s face.

Bloom is one of the few technical limitations we ran into during production. It would have been a nice addition for creating light and blur on holograms, projectiles, and other light-emitting effects. While it was possible to use, we decided against it: it impacted performance so much that other, more valuable post-process effects, such as the depth pass, would have been too heavy to run. Depth pass effects were used to create a non-flat water surface, making it look and feel more like an actual lake. Using both bloom and the depth pass was too expensive for the non-high-end VR platforms. It would be possible to enable bloom for a PC release, although that would require a different effects setup in the game. Instead, we looked for a way to fake bloom for holograms and effects such as explosions. We found that rendering blurry sprites and spawning them on top of the desired effect created a convincing illusion of bloom.
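The faked-bloom trick can be sketched as follows: instead of a bloom post-process, a pre-blurred sprite is spawned on top of the bright effect. The class, field names, and scale factor are all hypothetical; the additive-blending detail is a common way to make overlapping sprites brighten like real bloom.

```python
from dataclasses import dataclass

@dataclass
class Sprite:
    texture: str
    position: tuple
    scale: float
    additive: bool  # additive blending makes overlaps brighten, bloom-like

def spawn_with_fake_bloom(texture: str, position: tuple) -> list:
    """Spawn an effect sprite plus a blurred halo that fakes bloom."""
    core = Sprite(texture, position, scale=1.0, additive=False)
    # The halo uses a pre-blurred version of the texture, sits at the same
    # position, is slightly larger, and blends additively to mimic light bleed.
    halo = Sprite(texture + "_blur", position, scale=1.4, additive=True)
    return [core, halo]

sprites = spawn_with_fake_bloom("explosion", (0.0, 1.0, 2.0))
print([s.texture for s in sprites])  # ['explosion', 'explosion_blur']
```

Because the blur is baked into a texture rather than computed per frame, the cost is just one extra sprite instead of a full-screen post-process pass.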

An early issue we encountered with particle effects and the HMD (head-mounted display) was that camera-facing particle billboards would rotate in the same direction as the camera, i.e. the player’s head. In VR, billboards align with the camera, making it impossible to maintain a consistent up vector, so the particles spin around in front of your eyes, an issue that regular flat screen games don’t encounter. Simple tricks, such as covering larger areas or intersections with smoke or fog to hide seams, were impractical since they didn’t look good in VR due to the HMD roll issue. This usually required a case-by-case solution or avoiding large-scale effects altogether; smaller effects suffered less from the roll.
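A common fix for this roll problem, sketched here with plain vector math rather than any engine API, is to build the billboard's orientation from the direction to the camera and the world up vector, instead of copying the camera's full orientation (which includes roll).

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def billboard_axes(particle_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Right/up/forward axes for a billboard that faces the camera
    but never inherits the headset's roll around the view axis."""
    forward = norm(sub(camera_pos, particle_pos))
    right = norm(cross(world_up, forward))
    up = cross(forward, right)  # re-orthogonalized, still roll-free
    return right, up, forward

right, up, forward = billboard_axes((0, 0, 0), (0, 0, 5))
print(up)  # stays aligned with world up no matter how the head rolls
```

Since only the camera position (not its rotation) enters the calculation, tilting the head changes nothing; a fully camera-aligned billboard would instead rotate with the roll.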

Hologram effect on upgrade machine indicates where the player should put their hand

UX

Incorporating the inventory, weapon selection, and a basic menu was already part of the early prototypes, both to avoid breaking immersion and to give quick access to everything you need with one button click. It was important that the transition from bow to shield and menus be swift and smooth; using holograms and Radiance as a power source was a great excuse to despawn and respawn certain parts of the interface without it appearing too unnatural.

The first prototype for the inventory, weapon selection, and menu had quite angular corners and fast snapping movements between the different menus. This looked great, but testing revealed that you had to twist your wrist too much, to a degree where it became uncomfortable and not as smooth as we wished. In the end, we adjusted the steepness and angles and tilted the menu upward to face the player, for better accessibility and smaller wrist movements.

An early prototype of the inventory and bow/shield use and switch function.

Final in-game inventory UX and bow/shield use and switch function. Meshes are used to create depth in the hologram effects.

Animation

Creating a stylized art style allows for the inclusion of more abstract, expressive, and creative characters and enemies. Enemies in Apex Construct have more stylized proportions than the environment, to make sure that exaggerated animations wouldn’t look off but rather blend in smoothly with the rest of the art, and to alert the player to changes in the enemies’ movement patterns.

The animation pipeline in VR is similar to that of regular flat screen games. The difference is that in VR the animations are exposed from every angle; nothing can be hidden from the player’s sight. In a regular FPS game, the camera has a fixed position, so the animator can choose which parts of the player’s weapon are exposed to the player, and it’s possible to hide certain movements, intersections between models, or parts that fall outside the camera’s view.

We had to make sure that the animations work and behave correctly from every angle, as the player can rotate the bow by turning their hand, exposing it from all sides. With a fixed camera, you can cheat and fake animation; in a first-person VR game, it’s almost impossible to cheat, as you have no control over how and what the player will view.

Prototyping for animation wasn’t just beneficial for striking the right balance between realism and stylization, it was also used for testing ideas for overall gameplay design and to quickly block out and try enemy movements and scale. Animation prototyping is like concept art for interaction with objects and finding the fun in movements and the living world.

In general, over-exaggerating animations is a must for stylization; however, we found that certain animations made you feel uneasy, as they felt absurd close up to your face. Stylized animations that look good in the editor won’t always behave the way you expect in VR. We found that keeping the animations realistic, then tweaking them toward more stylized, exaggerated silhouettes in the poses, worked best to strike a good balance in the style and to draw the player’s attention.

Early tests of enemy crab potential movements that ended up feeling too cartoony

Mid-production animation tests using a long, exaggerated stop pose in between to get a more stylized animation cycle rather than quirky movements. Left is without the stop, right with the alerted stop in between.

Final in-game animation: the walking speed has been increased as it was too slow, the walking pose has been changed to have no claw-grip motion until the crab is alerted, and the alerted pose is more exaggerated than in the previous iteration.

As previously mentioned, the camera in VR is not at a fixed angle, which makes it hard to focus on making one particular side of an animating object or cutscene interesting. The player has the freedom to look somewhere else while an animation is playing, missing important parts or scenes. It’s not sensible to polish only a certain angle of an animation in VR, or to skip animating a part that wouldn’t be seen in a regular game. In VR, it’s important to draw the player’s attention and to create a scene that involves them from all angles. This can be done by having visible wind flowing toward the animation, or by throwing objects around so the player becomes curious about where they came from. It’s better to build a full scene full of life and movement than to focus on cycles or on animating from a certain angle.

We also played with colors in the animation, which is why the complementary colors red and blue are used to signal that an enemy is alerted and aware of the player. It also serves as a small story note that these robot creatures are not hostile by nature but rather provoked by the player’s presence, and as a noticeable cue that can be spotted at a distance, seeing as headsets tend to have limited resolution and aliasing issues that generate jittering.
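The calm-to-alert color cue can be sketched as a simple blend between the two complementary colors. The exact RGB values below are invented; only the blue-for-calm, red-for-alert scheme comes from the text.

```python
# Illustrative RGB values (0..1), not the game's actual palette.
CALM_BLUE = (0.1, 0.4, 1.0)
ALERT_RED = (1.0, 0.15, 0.1)

def alert_color(t: float) -> tuple:
    """Blend from calm blue (t = 0.0) to fully alerted red (t = 1.0)."""
    t = max(0.0, min(1.0, t))  # clamp so overshoot stays on the red end
    return tuple((1.0 - t) * b + t * r for b, r in zip(CALM_BLUE, ALERT_RED))

print(alert_color(0.0))  # fully calm (blue)
print(alert_color(1.0))  # fully alerted (red)
```

In practice t would be driven by the enemy's AI state, ramping up as the robot notices the player so the color shift itself reads as the alert at a distance.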

Lighting

Mood and light are very important for making a world feel believable and for creating atmosphere. Post-apocalyptic art in general tends to be very dark and gritty, usually enhancing the feeling of desolation and/or overgrowth; it’s usually a desaturated world with an oppressive feeling. The key for Apex Construct has always been to enhance the colors, light, and mood to create something vibrant and colorful (this is explained in more detail in Part 1 of this article series).

Since we had to work with repeating elements, such as frequently reusing the factory setting and outdoor environments, it was necessary to make them feel different without creating new models and multiple skyboxes for each level. Playing with colors in lights and post-process effects was instead the main key to setting the mood in each level.

Colorful foliage in contrast with the sky was tested, and in an early prototype, we played around with the idea of using autumn as the main palette for creating accent colors in the environment and lighting.

An early prototype for light and colors before the core elements of the level design were set. It became redundant as the project moved to a more stylized art direction.

The idea of using accent colors continued throughout the project as an experiment to make the environment feel more vibrant and stylized, not just in the foliage but also in buildings and props.

 

Early Father Figure with red ivy in contrast to the cold blue light setting. Early Hello World with red buildings and blue skies.

In the end, the style required a connection to realism, and the heavy accent colors didn’t match the general approach of finding realism within stylization. To give each level its own identity, we relied heavily on post effects such as color grading, in combination with lighting, to create each level’s atmosphere. Accent colors were kept to a certain degree, but stayed true to the color palette of Stockholm. A basic rule of thumb was to adhere to the story and tie it to the surroundings.

Thanks to PBR, it’s possible to create quite harsh and varied lighting scenarios without breaking textures and materials. Along with the story’s progress and the intent of the gameplay, the light changes from bright and cheerful to moodier and more intimidating. In levels where the player is required to be more agile and the focus is less on combat, we use a darker scheme with one or two light sources that cast heavy shadows, while larger combat areas are more brightly lit, with one or several overall light sources, so as not to distract the player from their targets.

Summary

Making VR content poses a different type of challenge than regular games: competing with next-gen graphics while still working within technological limitations and physical constraints such as motion sickness. Developing an art style fitting for VR while committing to stylization has proven challenging but fun to work with. Immersing yourself in a virtual game world really is different from playing on a flat screen, and this is reflected in the tools and methods we use to deliver that experience to the players. I hope it’s been a fun and rewarding read.

If you wish to read the first part, you can find it here: Apex Construct. Part 1: Stylization in Realism. It covers the methods and know-how of our approach to stylization within realism in the art of Apex Construct.

Thanks to our artists working on Apex Construct: James Hunt (Animation), Joacim Lunde (Art), Karin Bruér (Art), Kim Aava (Art), Kristoffer Björnör (Animation), Andreas Glad (Freelance VFX), Mikael Eriksson (Concept Art Intern), Joakim Hellstedt (Freelance Concept Art), Max Huusko (Freelance Concept Art), Michael Manalac (Freelance Concept Art), and Rickard Westman (Freelance Concept Art).

Kim Aava, Asset and Environment Artist
