The Afterburner Studios team talked about the development of Dreamscaper – a game about a struggling lucid dreamer. Learn about UE4 game development, world building, character animation, VFX, shaders, and more.
We’re Afterburner Studios, based out of San Francisco (and Ontario). We’re three developers and a composer.
Paul Svoboda handles all environment and characters. He previously worked as an Environment Artist at Epic, Crystal and Outpost, and his very old and outdated portfolio can be found here. Robert Taylor handles all engineering. He most recently led the engineering efforts of the multiplayer social survival game SOS and previously worked at Zynga on casual games including FarmVille 2.
Ian Cofino handles design, animation, VFX, and UI. He is a designer by trade, although much of his experience is in the visual arts. He got his start working in graphic and motion design and film, before transitioning into gaming where he worked at Visual Concepts on NBA2K and Outpost Games.
We also brought Dale North in fairly early on because we loved his work. Dale, if you’re not familiar with him, has a myriad of fantastic projects that he’s worked on, but he may be best known for his terrific work as a composer for the Wizard of Legend soundtrack.
After we left our previous jobs at Outpost Games, where we met, Rob floated the idea of making our own title. We initially planned a title with a much smaller scope, but after working together for three months to create an initial prototype, we saw the potential in really taking a shot at making a full-fledged, high-quality indie title. The three of us then founded Afterburner Studios.
Dreamscaper is an Action RPG Roguelite with an emotional core.
We tell the story of Cassidy, a lucid dreamer, whose day-to-day struggles manifest themselves in her dreams. As a player, you are confronted by her nightmares as you explore different moments in her life.
Every slumber is a different “dungeon” and every dream is a deeper dive into Cassidy’s subconscious.
As this is our first indie title, many of our largest challenges came from learning to start a studio: forming our company, learning to promote our work, and taking on a million different tasks that were not within our wheelhouse but were essential to the process.
Early on we landed upon the fiction. We knew we wanted to tell a really personal story and use dreaming as a way to layer subtext and avoid a lot of the ludonarrative dissonance that comes from traditional action titles.
Prototyping the Game
Aside from the dream fiction, we also knew that we wanted to make a roguelite. We love the challenge and progression that are built into the genre, so we wanted to do our own take on it.
We started with core mechanics, and the design moved more and more into the ARPG space. The beginning was a process of exploring what kinds of things we thought would fit the context of the game and what mechanics we were excited about. We focused on building a “vertical slice” of gameplay in the first three months so we’d have a good understanding of whether this was worth taking further. We concentrated on core mechanics like combat, looting, and exploration and fleshed out the game feel and aesthetics as best we could in that time to create one “complete” level.
After we had this very early demo, we sent it out to some close friends and colleagues and used their feedback to drive some of the design decisions forward. We’ve followed this model fairly closely and we do regular testing with our community which has been a key part in improving the game.
Dreamscaper is built in Unreal Engine 4. We had all worked in UE4 previously (Paul even worked at Epic for a period of time), so the transition was very smooth. Having a solid framework to lay the game on top of has been tremendously helpful.
Our levels are a mixture of static/baked elements and dynamic/procedural elements. We usually start by crafting a ‘story’ level for each stage. These are handcrafted with a focus on creating an interesting composition with visual storytelling. Since our game is split into two halves, with the latter half taking place in a surreal nightmare, we wanted to keep the beginning more grounded in reality. Signs and small human elements (trash cans, mailboxes, etc.) help sell these spaces and create a sense of scale and life. After creating the story spaces, we can then reuse the props to create our procedural combat spaces.
Early on in the project, as we were prototyping looks, we settled on an overall style – simplified realism. This would allow us to create spaces that felt grounded while lacking the complexity of fully representational elements – something critical to being able to prototype spaces early on.
We wanted to convey how dreams “feel” – soft, hazy, half-forgotten. We found that feeling in “half-finished” concept art and impressionist paintings, where the hand of the artist is still visible, parts are left unfinished, and the world recedes into fog and light.
Although our tone is dramatically different, we also looked at the work of Beksinski. We were inspired by many of the elements that make his paintings so interesting – soft lighting, gradients of color with slight textural variation, shapes receding into a fog. We used these motifs as guidelines to establish our overall look.
We started by building up a master shader which would be used for everything outside of character/interactables. To achieve a soft fade, we use masked opacity and rely on UE4’s DitheredTemporalAA. With this, we can fake soft opacity while maintaining accurate lighting and keeping the shader optimized. We use this same technique for our “soft” Player Occlusion:
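The dithered masked-opacity trick can be sketched outside the engine. This is a minimal, illustrative C++ version of screen-door transparency (the idea behind DitheredTemporalAA): each pixel is fully kept or fully clipped based on a Bayer threshold, and temporal AA averages the on/off pattern into an apparent soft fade. It is a sketch of the technique, not Dreamscaper's actual shader code.

```cpp
// 4x4 Bayer ordered-dither thresholds, normalized to [0, 1).
static const float Bayer4x4[4][4] = {
    { 0.0f/16,  8.0f/16,  2.0f/16, 10.0f/16},
    {12.0f/16,  4.0f/16, 14.0f/16,  6.0f/16},
    { 3.0f/16, 11.0f/16,  1.0f/16,  9.0f/16},
    {15.0f/16,  7.0f/16, 13.0f/16,  5.0f/16},
};

// Returns true if the pixel at screen position (x, y) survives the
// masked-opacity test. At Opacity 0.5, exactly half the pixels in each
// 4x4 tile pass; averaged over frames this reads as 50% transparency
// while every surviving pixel keeps full, accurate lighting.
bool DitheredOpacityMask(int x, int y, float Opacity)
{
    return Opacity > Bayer4x4[y & 3][x & 3];
}
```

Because each pixel is still fully opaque, depth writes and lit shading stay correct, which is why this is cheaper than true translucency.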
Creating a World with a Single Texture
We knew we were going to use baked lighting to achieve the soft shadows in our world and wanted a way to eliminate the time spent UVing and texturing so we could focus on the modeling, lighting, and layout.
UE4 gives you access in the material to a generated Ambient Occlusion mask. We decided to use this to define areas of color. To achieve a ‘painterly’, unfinished look, we created a brush stroke texture which is projected in world space and then subtracted from the PrecomputedAOMask, giving us a ‘painterly’ mask to define areas of color on each model. We reuse the same texture to add color variation and apply other effects through the shader, including water ripples/movement, adding some textural variation to the soft fade of the world, etc. This is the only texture that exists in the world (apart from particle VFX and grass). It has also kept our shader optimized, giving us some freedom to do other, more complex transitions and effects.
The waterfalls were a simplified version inspired by the work shown in this great VFX talk by Simon Schreibt.
We reuse the same brush stroke texture, pan it vertically along a simple mesh and subtract it from a vertical gradient. UE4’s dithered opacity does the rest of the work.
Creating the meshes without typical diffuse/roughness/normal maps was a bit of a learning process. All of the detail had to be modeled in, and we rely on manipulating vertex normals to define hard edges/shapes. We try to simplify objects down to large silhouette-affecting shapes and medium details. The general rule of thumb: if it can cast a visible shadow, build it into the model. Polycounts can vary, but generally small props are < 1000 verts, vehicles < 6000 verts, and large buildings < 12000 verts. Everything is modeled in Modo.
Fading Reality Effect
We knew early on we wanted to create an effect where the world drops away and the dream “paints” in around you.
We also knew that in creating a rogue-styled game, players would be dying a lot, which means a lot of loading. We wanted to hide that as much as possible while fictionalizing it and creating a visually compelling sequence for players to play through over and over.
Ian did some very early (very rough) pre-viz in Cinema 4D, just to float the idea out. We ended up liking the idea so much that it became a core part of the thematic styling of movement and level transitions.
In our master shader, Paul had created a Soft Edge effect to fade out the world on the edges. This was done by subtracting the brush stroke texture from a sphere mask.
By adding a lerp between two values for the Radius, Paul created a way to fade levels in/out.
Global Parameters can then be used to set the location of the sphere mask for portal transitions.
We relied heavily on UE4’s sequencer to give us the flexibility we needed to create this seamless load. The transition is composed of three sequences:
- Intro – Falling into the bed.
- Loop – Falling through the black dream space; this needed a variable length to accommodate load times on slower machines.
- Outro – Landing in the new dream as it materializes around the character.
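The three-part structure above amounts to a small state machine: the loop sequence repeats until level streaming reports done, and only then does the outro play. This is a hedged sketch of that control flow, not the actual Dreamscaper code; all names are illustrative.

```cpp
// Phases of the Dreamfall transition.
enum class DreamfallPhase { Intro, Loop, Outro, Done };

// Advances the transition when the currently playing sequence finishes.
// The Loop phase replays itself until the async level load completes,
// which is what hides variable load times on slower machines.
DreamfallPhase TickDreamfall(DreamfallPhase Phase, bool SequenceFinished,
                             bool LevelLoaded)
{
    if (!SequenceFinished) return Phase; // current sequence still playing
    switch (Phase)
    {
    case DreamfallPhase::Intro: return DreamfallPhase::Loop;
    case DreamfallPhase::Loop:  // repeat until streaming has caught up
        return LevelLoaded ? DreamfallPhase::Outro : DreamfallPhase::Loop;
    case DreamfallPhase::Outro: return DreamfallPhase::Done;
    default:                    return DreamfallPhase::Done;
    }
}
```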
To do this visually, we first needed to wipe away the world using animated global material parameters. These material parameters were hooked to the world mask, so changing one value would wipe out or in all the materials in the scene.
In the background, Rob did a lot of heavy lifting to build out the systems that would utilize Unreal’s level streaming tech to support our procedural generation. Here you can see the loading that is occurring in the background as the Dreamfall sequence plays through during level generation.
The goal was to leave the game and rendering threads available to run the sequences and effects needed to let us create what you see here.
This was one of the more technically challenging and collaborative pieces that we worked on. All of us had large parts to deliver in order to achieve the effect we wanted.
Generally, the VFX is done fairly ad-hoc. In the beginning, we spent some time trying out different styles until we landed on a more particle heavy, ethereal vibe. Much of where the VFX styling comes from is through the world that Paul has built.
Having a strong visual touchstone has helped inform the other visual elements of the game, like the VFX and UI.
But ultimately we didn’t spend too much time getting the VFX styling perfect in the beginning. As a small team trying to work as fast as possible, we haven’t had much time to frontload the visualization process for VFX. Instead, we looked at it iteratively and have continued to update and tune the styling as our game has grown.
That being said, having some core pillars to hang our hats on, so to speak, has helped guide the direction throughout the process. For instance, we knew we wanted the VFX to pop because it was important that the game feel really satisfying, and VFX is a large component of that. We also knew that we wanted it to fit with the dreamy, more ethereal styling of the world, so there was some balancing to be done there.
We make heavy use of UV distortion, transparency, and smaller, softer particles to help define a style that feels otherworldly.
Lighting has been a relatively straightforward process, with some caveats for our procedural levels.
We wanted our shadows to be soft and ambient and objects to feel grounded in the world. UE4 makes this very easy. Using a directional light with Area Shadows gave us soft, diffuse shadows.
Since we use baked lighting for all the ‘background’ elements of each level and then place dynamic objects on top, we needed a way to keep the baked and non-baked elements visually similar – each object needed to cast an ambient shadow and affect the color of the underlying material.
We created a 64×64 RenderTarget Texture that is updated on entry to the level. This texture is channel packed, and each channel affects the material in a unique way. We use primitive shapes with an emissive color (R/G/B) that is captured by the Render Target. These are paired with our dynamic meshes when they are placed in the world and add to the Ambient Occlusion and PrecomputedAOMask in the master material. This was a cheap way to maintain visual consistency across dynamic and static objects.
Since we have two other color channels to play with, we could use them to build up a variety of other effects:
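The channel-packed render target lends itself to a small sketch. This illustrative, engine-free C++ shows the idea as I read it: each dynamic object's primitive renders an emissive color into one channel of a texel, overlapping blobs accumulate additively and saturate, and the material reads each channel back as an independent mask (here red stands in for the ambient-shadow contribution). All names and channel assignments are assumptions for illustration.

```cpp
#include <cstdint>

// One texel of the 64x64 capture target, with three packed masks.
struct Texel { uint8_t R, G, B; };

// Accumulate a blob's intensity into one channel, saturating at 255,
// mirroring additive emissive capture into the render target.
void AccumulateBlob(Texel& T, int Channel, uint8_t Intensity)
{
    auto Add = [](uint8_t A, uint8_t B) {
        int S = int(A) + int(B);
        return uint8_t(S > 255 ? 255 : S); // saturating add
    };
    if (Channel == 0)      T.R = Add(T.R, Intensity);
    else if (Channel == 1) T.G = Add(T.G, Intensity);
    else                   T.B = Add(T.B, Intensity);
}

// Material side: the red channel darkens the AO under dynamic objects,
// keeping them visually consistent with the baked background.
float DynamicAO(const Texel& T) { return 1.0f - T.R / 255.0f; }
```

A 64x64 target refreshed once on level entry keeps this essentially free at runtime, which matches how cheap the article describes the trick as being.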
Ian: I actually wrote up a post about my animation process recently, you can find it here.
In a nutshell: we start with the design of the move. What’s the intention? What gameplay implications does it have? How might that translate to animation? How would this specific character act?
After that’s fleshed out, I capture reference of myself for more complex animation, then work off of that reference video to keyframe in Maya. The animation is then brought in and tweaked so that it feels appropriate for the design of the move/weapon/interaction etc.
Since we’re a three person operation that’s spread fairly thin, there’s a tradeoff between quality and time. I try to be realistic about creating something that feels good for players and doesn’t take too long to execute.
In the engine, we use UE4’s montage system and some custom montages to determine much of the gameplay, such as active hit window or the time when the player can recover after attacking.
As you can see, we have a few different notifies and windows:
- Pivot. Determines when the player can pivot the character.
- ComboOpportunity. Determines the window in which another button press will be registered to trigger the next combo move.
- ComboTrigger. The earliest that the next combo move starts if a button has been pressed within the ComboOpportunity window.
- BonusOpportunity. Starts a discrete 0.15-second window in which the player can hit the attack button again to trigger satisfying VFX and receive bonus damage on their next combo attack.
- SpecialAttack. We also built a system that allows me to leverage UE4’s really powerful sequencer tool.
I create sequences where I layer in VFX, audio, hitboxes, and even camera movements if necessary. These sequences get pulled from a table and spawned at the time I specify via the animation montage notify (in this case, “SpecialAttack”).
Because these sequences are abstracted from the attacks themselves, they can be mixed, matched, and used in different ways to create powerful, flexible, additive VFX / gameplay effects.
An example sequence – here you can see the Cascade particle effects, the skeletal mesh, and the hitbox spawning in the world. This is what gets built in Sequencer itself, divorced from the attack.
It then gets triggered at the appropriate moment due to the animnotify in UE4’s animation montage.
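The montage windows above reduce to simple time comparisons against notify times authored on the attack. This is a hedged C++ sketch of that logic under my own assumed data layout – an input buffered inside the ComboOpportunity window fires at ComboTrigger, and a press inside the 0.15-second BonusOpportunity window flags the bonus. None of this is the actual Dreamscaper implementation.

```cpp
// Notify times (in seconds from the start of the attack montage).
struct AttackWindows {
    float ComboOpportunityStart, ComboOpportunityEnd;
    float ComboTrigger;           // earliest start of the buffered next move
    float BonusOpportunityStart;  // bonus window lasts 0.15 s
};

// A press inside this window buffers the next combo move.
bool InComboWindow(const AttackWindows& W, float PressTime)
{
    return PressTime >= W.ComboOpportunityStart &&
           PressTime <= W.ComboOpportunityEnd;
}

// A press inside this window grants bonus VFX and damage.
bool InBonusWindow(const AttackWindows& W, float PressTime)
{
    return PressTime >= W.BonusOpportunityStart &&
           PressTime <= W.BonusOpportunityStart + 0.15f;
}

// Presses buffered before ComboTrigger wait for it; later presses
// start the next move immediately.
float NextComboStart(const AttackWindows& W, float PressTime)
{
    return PressTime < W.ComboTrigger ? W.ComboTrigger : PressTime;
}
```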
In terms of inspiration, I have this rolodex of films, games, and animation in my head. Many of the designs for weapons and animations go hand in hand. I’m always thinking about how to convey movement and speed in a compelling way. Things like anime and hand-drawn animation are a fantastic inspiration because the fluidity and emphasis of key poses create a beautifully stylized animation I try to emulate in 3D.
One final note – as our camera is pulled fairly far out, it’s been helpful to go bigger and overexaggerate movements to really sell the motion and impact.
One of our biggest initial challenges was simply learning the right process for working together. We all work remotely, so we fell into a pattern of relying on Slack early on. This created tension – just messaging each other often leaves out too much context, especially when discussing important issues. We moved to daily calls to discuss progress and just chat for a bit, and we found that we were able to work much more smoothly.
Outside of that, there was a lot of learning on the individual level in each of our disciplines. We had to really stretch ourselves to cover all our bases, so it became a challenge to develop a title and learn some of the new skills required to do so at the same time.
Ian Cofino – 3D Designer, Animation, VFX & UI Artist
Robert Taylor – Engineer
Paul Svoboda – 3D Environment and Characters Artist
Dale North – Music Producer