Erasmus Brosdau, CEO and Director at Black Amber Digital, talked about the creative process of designing their own unique IP Zenobia, character animation, use of ray-tracing technology, target hardware, optimization, and more.
History of Zenobia Project
The Zenobia project (previously Origin Zero) has quite a bit of history behind it, which is typical of the creative process on any new IP.
When I originally came up with the idea in 2017, after having worked on my previous animation project “The Lord Inquisitor”, I knew that I wanted to work on a Sci-Fi project that was also designed for a transmedia purpose. The story would be narrated through a variety of media: animated series, video games, board games, books, and others.
At the very start, the project was called “Heart of Zenobia” and was built around a central core story: humanity has set out to build colonies on a new earth-like planet called Zenobia. However, after many prosperous years, a massive tower appeared in the first colony, bringing death and disease with it. It rose from the ground, leaving everyone puzzled as to where it came from, who built it, and what else might still be hidden beneath the planet’s surface. Creatures came out of this massive tower and captured many citizens of the city, dragging them into the depths of the gigantic monument. As the population had never prepared for any kind of alien invasion on their own planet, the police units suffered heavy casualties and were unable to protect all of the citizens.
Concept Art
This essential storyline never changed and was always the core pillar of the IP, as it carries so many things I like: mystery, science fiction, police units with cool robots, and a whole new world to design. My very small team and I worked on the first episodes for a release on YouTube, which allowed us to quickly build a small fan group. Shortly before the release, I changed the name to “Origin Zero”, as I was looking for something more Sci-Fi-sounding and felt that the first name had too much of a fantasy vibe to it.
As for characters, my plan was to have one female lead for the YouTube series and another female lead, her half-sister, as the protagonist of a video game. Both stories would interconnect and paint an even bigger picture of this world.
While this was all a great idea, I also spent two years on the business side of things, as I had opened my own company, Black Amber Digital. Of course, as an in-house IP, Origin Zero was meant to turn into a fully-funded production at some point; however, I quickly realized it was impossible to receive any support for a transmedia IP that would only really work with both a series and a video game. At that time, I had just finished Episode 02 on YouTube and received a lot of nice feedback, but I knew the time had come to shape Origin Zero into its final form for the best future it could have. With the success of the two episodes, more people joined the team, which made me confident that we could now improve the quality quite a bit, as I had met an incredibly talented character artist for facial animation. In Episode 02, I showed our female lead Luna for the first time, and you can clearly see the stylized approach I was taking at that point, as I was doing everything on my own.
As I’m a gamer at heart, I made the decision that Origin Zero would from then on focus only on the video game idea, and I rewrote the story so that the two storylines from the transmedia concept found their way into one single product. Besides that, we updated the characters and the environments and rebranded the entire product to shape it into its final form. Around the release of Episodes 01 and 02, some other games and projects with “zero” in the title were announced, and I felt the name had become too generic and confusing, so to finalize the plans for the project I went back to its roots and shortened “Heart of Zenobia” to “Zenobia”. There are a lot of meanings to that name that will all be unveiled in the full product, but in the end, it turned out to be the perfect name for the final form of this project and its first release.
Suit Designs
The most important aspect of the general design of the project was, of course, the suits for the military division called “EDEN”, an acronym for Earth DEfense Navy. There are a million different sci-fi suits out there, but I really don’t like most of them, as they’re either too generic or totally over-designed. Having worked on Warhammer 40,000 for a long time, I also have a strong interest in making suits work properly, as there were so many problems animating Space Marines in their heavy armor - they are basically unable to move. So my designs were based on functional design with a touch of fantasy elements.
I didn’t want the suits to look exactly how you would expect them to, but also not so crazy that they wouldn’t work at all. My designs are usually very round and smooth rather than full of blocky, hard-edged shapes, so the flow of the designs came together rather quickly. The soldiers are called “divers”, and the aspect of diving appears throughout the entire project, so I added all these straps and grid elements on top of the armor, which results in a really nice and unique look. It’s not a reinvention of the wheel, but it makes them stick out.
Actually, all soldiers originally had a robotic arm on their shoulders with a lamp that would move around to light the way in dark environments; however, after the release of Death Stranding, it looked like I had copied that design, so I removed it. The designs in Death Stranding also used the same color scheme as my soldiers (blue and orange), so I changed that as well. With every new trailer for Death Stranding, I was surprised by how similar the designs were and kept adjusting to avoid looking like a copycat. When you design an IP as an Art Director and it hasn’t been announced yet, you start to watch events like E3 with quite different eyes. Every new announcement could be a game that looks exactly like the one you have been developing for a few years, making you look like you stole all of it. I have a very strong urge to create original content, so I always try my best to make it stand out in all areas as much as possible.
Character Production
The recreation of Luna and Zoe was, of course, priority No. 1 for me. They are the face of Zenobia and absolutely essential. In Episode 02, I had very briefly shown the stylized Luna character, so I started with her, as she was basically already complete. For Luna 2.0, I think I reused only around 5% of the original character - a few pieces from the original suit were still usable. Everything else was completely reworked from the ground up, with the most attention spent on the face.
I had worked on digital characters for many years and had a lot of experience in fine-tuning all the details and knowing what's important, so I combined all of that in this new version of Luna. I started with a digital scan from Ten24 and reworked the face and the textures, added new eyelashes, and built completely new skin and eye shaders based on Epic’s own shaders from the Digital Character Demo. All these elements were great starting points that I developed further into our own materials, specifically tailored to Luna and Zoe. The hair assets were made by Airship Images. I knew it would take too much of my time and I’m not as good in this area as I need to be, so it was great to collaborate with them on these two hairstyles.
The face of Luna got better every day - her look, the amount of realism, and the details. After many tests, I finally had a final version, which became the starting point for her half-sister, Zoe. Luna and Zoe have the same father but different mothers, so they look quite different in skin color and overall appearance. Their personalities are quite different as well, which is also visible in the designs. During the production of Zoe, I was able to boost the shaders a little more, and those improvements could easily be applied back to Luna, as both characters use the exact same topology and UV spaces. In the end, my friend Max worked on all the blendshapes and the facial rigging to give them the final touch and breathe life into them. We’re still not 100% ready for all facial animations, but the entire system already works for the most part, which can be seen in the technical demo (see the beginning of the article).
Facial Animation
The rigs are actually fairly simple, as they run at 60 frames per second. On other productions, we worked with far more complex facial rigs, but those need additional plugins and are performance-heavy. We built a system that is easy to handle and maintain and gives great results. Our facial animation can very likely not beat the absolute state of the art, but we get pretty close for a fraction of the performance and creation/setup cost. It was very important to me that everything we develop actually runs in-game and that there is no difference between a game character and a cinematic character - it’s exactly the same model. Facial animation is created by having an actor perform the expressions in front of a camera; the footage is then analyzed by the software Dynamixyz to drive all the blendshapes. Of course, it usually requires a bit of cleanup from an animator, but even the raw results can often be used for simple idle animations.
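At its core, driving a face from solver output is just a weighted sum of blendshape deltas on top of a neutral mesh. Here is a minimal sketch of that idea in Python - the two-vertex "mesh", the "jawOpen" shape name, and the weights are invented purely for illustration and are not Dynamixyz's actual data format:

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """Deform a neutral face mesh by a weighted sum of blendshape deltas.

    neutral: (V, 3) array of vertex positions
    deltas:  dict mapping shape name -> (V, 3) offset from the neutral pose
    weights: dict mapping shape name -> weight in [0, 1], e.g. per-frame
             values coming from a video-based facial solver
    """
    result = neutral.copy()
    for name, w in weights.items():
        if w != 0.0:
            result += w * deltas[name]
    return result

# Hypothetical two-vertex "face" with a single jaw-open shape
neutral = np.zeros((2, 3))
deltas = {"jawOpen": np.array([[0.0, -1.0, 0.0], [0.0, 0.0, 0.0]])}

half_open = apply_blendshapes(neutral, deltas, {"jawOpen": 0.5})
print(half_open[0])  # first vertex has moved halfway down
```

In a real rig, the solver emits one weight per shape per frame, and this sum runs on the GPU for every vertex, which is why such a setup stays cheap enough to use identically in gameplay and cinematics.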
Already for the Origin Zero episodes, and now for the cinematics in production for Zenobia, we shot all sequences at our friends from metricminds in Frankfurt am Main, Germany. I have worked with metricminds for a long time, and they provide motion capture for all kinds of projects, no matter how complex or crazy your ideas are. We equipped the actors with head-mounted cameras and were able to do full performance capture on set. As metricminds provides a very large capture volume, it was great to be able to shoot all sequences with multiple actors and longer running distances. It’s amazing to receive the body and face animation at the same time, so they really feel connected and alive. However, for an early test before the production, it was me who recorded a short video of my own face looking around, which was then transferred to Luna and Zoe. Funnily enough, when I showed just the end result in UE4 to a friend, his first reaction was to ask whether I had performed that facial animation myself - he has known me for over 20 years, and it was very interesting to see that the performance carries over so well. We now have a really good facial animation system in place that captures the actors' performances, and another cinematic is already in the pipeline, so it will be great to show more of Zoe and Luna in the future.
Use of Ray-Tracing
Ray-tracing is a technology I've always been hoping to see available in game engines. I come from an offline rendering background and have made many different animations using V-Ray and Mental Ray. Going into the games industry, I had to give up all the nice fidelity of ray-tracing and figure out ways to fake the effects. Now, finally, I feel like these two worlds are starting to merge, and I can utilize the best of each industry for the best experience.
It’s still very early for ray-tracing in games, so it comes at a very high cost. Basically, using all ray-tracing features in a game running at 60 frames per second is simply impossible when striving for a AAA high-end look. It is, however, possible to run games like Minecraft with elementary Global Illumination, as the geometry is extremely simple.
Shadows and reflections are definitely among the best use cases for ray-tracing, as they are very affordable in render time and produce significant differences. Seeing your shadow penumbras fade out softly in the distance instantly gives you a high-quality look like in offline rendering. Accurate reflections are another way to greatly enhance realism - and a performance-saving aspect is that you can define when ray-traced reflections kick in based on the roughness of the materials. It makes a lot of sense to only use ray-tracing for roughness values from 0 to around 0.2 or 0.3. At higher roughness values, the reflections become so blurry that they’re hardly visible anymore and just eat up performance for no visible benefit. Ray-traced Ambient Occlusion is finally able to give accurate depth to the scene, something that Screen Space Ambient Occlusion cannot do. This effect is very expensive to render, so it’s more suitable for cinematics than actual gameplay. Fully ray-traced Global Illumination is by far the most expensive of all the features, and I did not use it for the Zenobia technical demo. The latest version of Unreal Engine (at the time, version 4.24) already offers another way of calculating GI (Final Gather) that reduces the rendering cost dramatically. I know, however, that it’s very likely not usable for any game production in the near future, which is the main reason I did not use it in this technical showcase.
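The roughness cutoff described above boils down to a simple per-material branch: ray-trace only near-mirror surfaces and fall back to cheaper approximations for everything rougher. A sketch of that decision (the function name, labels, and exact threshold are illustrative, not engine code; in UE4 this kind of cutoff is exposed through the `r.RayTracing.Reflections.MaxRoughness` console variable):

```python
ROUGHNESS_CUTOFF = 0.3  # above this, reflections are too blurry to justify the rays

def reflection_method(roughness: float) -> str:
    """Illustrative per-material choice: ray-trace sharp reflections,
    fall back to cheaper techniques (probes / screen space) otherwise."""
    return "ray_traced" if roughness <= ROUGHNESS_CUTOFF else "cheap_fallback"

print(reflection_method(0.05))  # polished visor  -> ray_traced
print(reflection_method(0.8))   # rough concrete  -> cheap_fallback
```

The payoff is that the expensive rays are only spent where the eye can actually tell the difference.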
What will be difficult for the majority of players to understand is the difference fully ray-traced games make, especially with global illumination. When you look at a game like Uncharted 4, it is full of very accurate Global Illumination achieved with traditional lightmap baking. Technically, Uncharted 4 is completely ray-traced - it just doesn’t happen in real-time; it has been pre-calculated and baked into textures. Achieving the same effect in real-time multiplies the rendering cost while outputting the same look we already had on PlayStation 4. The real benefit for every studio will shine once ray-traced GI becomes affordable in every scenario, which will then allow us to skip the entire process of lightmap baking. That process is time-consuming, produces lots of errors and technical problems, and, most importantly, eats up a massive amount of texture memory.
What we have now is a glimpse of what will soon be possible for many games as hardware and software improve over the next years. The first early adopters look great by making clever use of ray-traced shadows and reflections - those are usually the first features to find their way into PC games, as they have the lowest performance cost. Ray-tracing is one of the biggest game-changers for real-time productions next to AI: we finally have the chance for proper lighting that follows physical rules as closely as possible. While we can’t use it to its full potential right now, it will be amazing to see how fast it becomes more stable and usable, and then the times of lightmap baking will be over.
GPU Particles
Particles add a lot of life to the environments, but the more you use them, the heavier the performance cost. Traditionally, most particles are calculated on the CPU, which can be fairly limiting. Transferring that work to the GPU adds a lot of freedom for displaying a very large number of particles easily. GPU particles come with other limitations, but in general, it’s a very good way to showcase many particles the player can interact with. For example, say you have many trees and want to show them in an autumn scenario where all the leaves fall down. This is a great task for GPU particles, as you can display thousands of leaves falling, colliding with the floor, and even piling up to later interact with the player. It’s these small things that the audience is often not used to seeing in games, and they make everything feel just a little bit more alive. Players are very used to the fact that many things in video games are simply not possible because it’s a game - so the more you can surprise them with interactive elements, the more they can immerse themselves in the game world.
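The falling-leaves example maps naturally onto a per-particle update that a GPU runs in parallel, one thread per leaf. Below is a small CPU-side Python sketch of that logic; all the numbers, and the simplified "come to rest on the floor" behavior standing in for real collision and piling, are made up for illustration:

```python
import random

GRAVITY = -9.8
FLOOR_Y = 0.0

def spawn_leaf():
    # Random position somewhere in a tree canopy, zero initial velocity
    return {"pos": [random.uniform(-5, 5), random.uniform(8, 12), random.uniform(-5, 5)],
            "vel": [0.0, 0.0, 0.0],
            "resting": False}

def step(leaves, dt):
    """One simulation tick: integrate gravity, collide with the floor,
    and let leaves come to rest so they can pile up and be interacted
    with later. On a GPU, this loop body would be one thread per leaf."""
    for leaf in leaves:
        if leaf["resting"]:
            continue
        leaf["vel"][1] += GRAVITY * dt
        for i in range(3):
            leaf["pos"][i] += leaf["vel"][i] * dt
        if leaf["pos"][1] <= FLOOR_Y:    # hit the ground
            leaf["pos"][1] = FLOOR_Y
            leaf["resting"] = True       # stays put, available for interaction

leaves = [spawn_leaf() for _ in range(1000)]
for _ in range(600):                     # ~10 seconds at 60 ticks per second
    step(leaves, 1 / 60)
print(sum(l["resting"] for l in leaves)) # -> 1000, every leaf has landed
```

Because each leaf's update is independent of every other leaf's, the work parallelizes trivially, which is exactly why moving it to the GPU makes thousands of simultaneous particles cheap.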
Effects & Hardware
All the effects and shaders I created and modified are fairly simple to render. The eye shader is probably the most complex one, as it uses many different layers of rendering for the pupil, the reflections, the refractions, and the shadows of the eyelid, which are faked because real-time lights are unable to cast such super subtle shadows.
Having a very broad knowledge of all the rendering effects and general understanding of how to make images look nice and realistic helped me to fine-tune everything as much as possible instead of using performance-heavy effects. Often, game engines don't offer these features anyway, so you just have to find artistic ways to fake effects to make the result look realistic.
For the entire demo, I’m using an Nvidia RTX 2070 Super - not the most high-end card available. I actually started the demo with an RTX 2080 Ti but decided to go lower, as I wanted a more realistic scenario for a PS5 / Xbox Series X environment. This technical demo is not meant to show the best real-time rendering possible right now but to demonstrate the possibilities for a real game that runs in 4K at 60 fps. I think we have seen enough demos with ultra-realistic graphics, but that’s just not something players can expect anytime soon on the new consoles. So the entire Zenobia production is optimized to look as realistic as possible while actually working for a game production, without losing the benefits for high-end cinematics. Another important aspect is that there are no extra cinematic models of Luna and Zoe - everything you see is the actual in-game player model running at 60 fps. It’s a real demo where nothing is faked by using incredibly high-end machines; the entire demo is simply the in-game graphics of Zenobia with fully dynamic lighting.
Character & Prop Optimization
As I was aiming for a production on the next-generation consoles, I increased the polygon budgets a bit, but it actually doesn’t make a significant difference. Where it’s most obvious is perhaps the hair assets, as we spend around 100K to 150K triangles on them, which is quite a lot. However, it’s very similar to the hair of Aloy in Horizon Zero Dawn, so it’s okay to do that for your main character. Other characters would likely get hair assets with fewer triangles. For current-generation consoles, an AAA character usually has around 100,000 triangles, so I upgraded my budget to around 300,000 for each character. In the end, I think it’s easily possible to do that for your hero character, but it's not really required - I could make them much lower in polycount while still looking identical.
The buildings are all made using the medium-poly workflow, which means they don’t have any unique normal maps, but chamfered edges and face-weighted normals instead. I’m constantly looking for optimized workflows for characters and environments, and I have completely abandoned the old way of modeling a high poly and baking it down to a low poly mesh. It just takes too much time and leaves little room for quick changes. The buildings all use trim textures, decal sheets, and tileable materials - something that is incredibly powerful, time-saving, and a lot of fun to work with. It took a lot of time to build the libraries of decals and materials first, but once they were established, I was able to create assets and content so much faster than ever before. The natural environments of rocks and mud were a mix of Megascans and my own assets. The very big muddy cliffs were all hand-sculpted and textured by myself and then detailed with Megascans rocks and, I think, some assets from the Kite demo. I usually skip modeling trees, bushes, or rocks - there are so many great libraries available that it would make no sense to spend much time creating them yourself. Afterward, I worked on the LODs for the assets, as otherwise your scene becomes too heavy very quickly. I only used the LOD tools inside UE4, and they are great time-savers as well.
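Face-weighted normals, the basis of the medium-poly workflow mentioned above, can be computed by letting each face contribute to its vertices' normals in proportion to its area, so the large flat faces dominate over the tiny chamfer faces and edges appear smooth without any baked normal map. A rough sketch of the idea in plain Python, with a deliberately tiny illustrative mesh:

```python
import math

def face_weighted_normals(vertices, triangles):
    """Per-vertex normals where each adjacent face contributes in
    proportion to its area. The unnormalized cross product's length is
    twice the triangle's area, so summing it weights by area for free."""
    normals = [[0.0, 0.0, 0.0] for _ in vertices]
    for a, b, c in triangles:
        u = [vertices[b][i] - vertices[a][i] for i in range(3)]  # edge a->b
        v = [vertices[c][i] - vertices[a][i] for i in range(3)]  # edge a->c
        n = [u[1] * v[2] - u[2] * v[1],                          # cross product u x v
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        for idx in (a, b, c):             # accumulate the area-weighted normal
            for i in range(3):
                normals[idx][i] += n[i]
    for n in normals:                     # normalize the accumulated sums
        length = math.sqrt(sum(x * x for x in n)) or 1.0
        for i in range(3):
            n[i] /= length
    return normals

# A single triangle lying in the XZ plane, wound so it faces up (+Y)
verts = [(0, 0, 0), (0, 0, 1), (1, 0, 0)]
tris = [(0, 1, 2)]
print(face_weighted_normals(verts, tris)[0])  # -> [0.0, 1.0, 0.0]
```

In practice, DCC tools and engine importers compute this for you; the point is that chamfered geometry plus area-weighted normals replaces the high-to-low-poly bake entirely.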
Future Plans
While you're reading this, we are working on the actual gameplay demo for Zenobia. The final form of Zenobia is a video game - it will not be a movie or a series, as Origin Zero was once introduced. However, Zenobia has many cinematics, and within the next few months, we will publish the first one, which is around one minute long and focuses on Luna and Zoe. We have already captured around 10 minutes of motion capture for cinematics and hopefully will publish more of them. However, the most important task is to finalize the gameplay demo and push it to the next level.
My company Black Amber is working on many other projects as well, but we have completely self-funded this production so far - we will probably want to partner up with a good publisher for the final product, though. In the meantime, we will keep working on it and publishing progress on our YouTube channel Warpgazer so that people can get more impressions of Zenobia.
I founded Black Amber in 2019 with the goal of producing high-quality content using real-time technology. My team and I have a significant amount of experience in the games industry, having worked on games like Crysis, The Witcher 3, Ryse, Star Citizen, Resident Evil, Call of Duty, and many others. Besides that, I already have another base in Tokyo, Japan, where we collaborate with the CG studio Safehouse on many different Japanese projects - there’s just so much going on, it’s really exciting. I can’t announce anything yet, but anyone who is interested in the projects Black Amber is working on can subscribe to our Facebook page. Zenobia is, of course, the flagship of our in-house production, and I would be more than happy to transform it into a full AAA production.
Erasmus Brosdau, Founder & CEO at Black Amber Digital
Interview conducted by Ellie Harisova