Guillermo Moreno and Emilio Serrano shared a detailed breakdown of their latest game teaser, talked about various procedural systems used in the project, and explained how animations were set up.
Guillermo Moreno: Hello, I'm Guillermo Moreno and I'm currently working as a Level Environment Artist at Room 8 Studios. I've worked on games like Evil Dead: The Game, Call of the Sea, and Gylt, and last year I joined Room 8 Studios to develop awesome AAA games.
Emilio Serrano: Hi, I’m Emilio Serrano, a Principal Animator at Studio Gobo. My latest shipped game is Hogwarts Legacy. Previously, I was a Lead Animator at Tequila Works and I was working for over 10 years at Lionhead making most of the Fable franchise games.
After the development of Gylt at Tequila Works in 2019, a lot of new ideas appeared in my mind and I started to develop a small game with high-quality art, a bit like Inside or Little Nightmares. This small prototype took me 8 months to complete, and after evaluating the scope and the possibilities of making a complete game, I finally had to discard it because a few things related to the game design didn't work. The puzzles were very confusing, and most importantly, the prototype wasn't fun. The foundations, on the other hand, were pretty solid.
I redid the prototype entirely a few times. I made different pitches, GDDs with the pillars of the game, etc. Around this time, I started working on Call of the Sea with the talented team at Out of the Blue Games. In my spare time, I continued working on my idea, but I kept making the same mistake: the game wasn't fun! When the pandemic started, I was so tired from working on it that I needed to leave the idea in a drawer, and I moved on to some small portfolio projects like The Mandalorian Tribute, Wayne, etc.
During this period I spoke a few times about game design, puzzles, etc., with Tatiana Delgado (CEO of Out of the Blue). She is such an incredible and talented person. She gave me good tips, and after the awful personal experience that I lived through during Gylt's development, I tried to make something good on my own again, with more enthusiasm. It was then that I started working on “Project Witch”. For the project, I had a clear idea: it would need a dark ambiance and good mechanics, it should be fun, and – the most important part for me – working on it would help me improve my programming and technical skills.
So I started to prototype game mechanics, no graphics and no VFX at all. Only a cube and game mechanics to practice my programming knowledge, just like a test. I'm a big fan of Uncharted, Jak and Daxter, Zelda, Halo, and Portal 2 and I tried to recreate the mechanics of these incredible games. I prototyped a climbing system, cover system, puzzles, combat, vehicles, and some interaction with objects.
After all of these tests, I tried to integrate them into “Project Witch”, and, well, these systems that I was developing were for a different game, not this one.
“Project Witch” was a puzzle game with melee combat and horror cinematic sequences, and I was working on something different. Unconsciously, I was developing game mechanics for my original idea.
At this point, I thought that the moment to start again with the prototype was coming soon.
With the game mechanics ready, I developed the first small blockout area of the prototype. This area combined cinematic moments, on-rail cameras, and third-person/first-person cameras, which change depending on the situation – whether you are exploring the environment or solving a puzzle.
Plan & Gameplay Teaser
At this point, I had a lot of doubts about where to start. I had a few “concepts” – top-down views, schematic drawings, a general idea of the puzzles and enemies, and a minute-by-minute plan of this first area – but all of these ideas and schemes were useless without a real test in the game.
It was time to make a plan, and I had to organize everything in my mind to start creating tasks.
Those tasks were clear:
- Migrate game mechanics.
- Configure mechanics in the final project.
- Create level blockouts.
- Test game mechanics within the blockout.
- If all of the previous steps work well, start with art, lighting, dressing, etc.
The level design of this first small area took me a few days because I needed to redesign a few areas and correctly integrate the information that the player finds in the first steps of the prototype – the environmental storytelling and additional information for the puzzles.
After I redid the initial idea of the blockout, I checked that everything worked as I wanted. Then I got started with the final art, but for this project, I was keen on using new tools, not only 3D modeling software like 3ds Max or Maya. I wanted to learn a bit more about Houdini and discover its tools and how powerful they could be in order to help me achieve this.
Houdini & Teaser Trailers
The first teaser was a concept trailer. I used Houdini on other projects but for this one in particular, I wanted to make a short video using only Houdini.
It was my first experience combining procedural imagery with music, and the effect was so cool. I sent the geometry to Unreal Engine 4 and rendered it there. It was a perfect test to clearly understand the pipeline between Houdini and Unreal Engine.
But the peak of using Houdini was in the second teaser trailer and everything after.
A few days after finishing these small tests, I made some procedural systems to test in Unreal 4 (and Unreal 5): procedural sci-fi panels with a parametric texture, some procedural modeling, and particle system simulations. Obviously, a lot of these systems were too complicated for a real-time project. For example, the river fluid (that I used in the last video) was impractical for real production because the Alembic cache was 24GB. It might work as a VAT but would lack interactivity. For this cinematic trailer, however, it was enough. I knew that I could have gotten a similar result using the Niagara system, but I preferred to create this simulation out of love for working with Houdini and as a personal whim.
Adding Houdini to my pipeline has been such an awesome experience (and a bit frustrating too).
For this project, I also made a lot of procedural systems: procedural cable/pipe/scaffolding systems, destruction systems (that can be precalculated and integrated into Unreal to create scripted sequences), and small asset tools (to create simple assets quickly, like bottles, cans, dirt, etc.). Making all of these was an awesome experience – very intense, but at the same time very satisfying for me.
All of these systems helped me a lot to dress the environment faster because I had different variations in one system.
In the case of the cable system, it had a collision system that adapted automatically to the floor, walls, or objects in the environment. Clipping couldn't be avoided entirely, but it worked very nicely, and only in a few cases did I need to move these assets by hand.
The cable and ivy systems worked similarly. They adapted automatically to the environment in Houdini and in Unreal Engine, generated different LODs to optimize the meshes, and had a parameter to generate UVs automatically, paired with a trim texture that worked well.
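The core idea behind a cable tool like this can be pictured outside Houdini: sample a sagging curve between two anchor points and clamp each sample against the surrounding geometry so the cable rests on it instead of clipping through. A minimal pure-Python sketch of that idea, with the floor simplified to a single height plane and all names purely illustrative (this is not the actual tool's parameter set):

```python
def cable_points(start, end, slack, floor_y, samples=16):
    """Sample a sagging cable from start to end as (x, y, z) points.

    The sag is a simple parabola scaled by `slack`; each sample is then
    clamped against a flat floor at height `floor_y` as a crude stand-in
    for the collision step that snaps the cable onto the environment.
    """
    pts = []
    for i in range(samples + 1):
        t = i / samples
        x = start[0] + (end[0] - start[0]) * t
        y = start[1] + (end[1] - start[1]) * t - slack * 4 * t * (1 - t)
        z = start[2] + (end[2] - start[2]) * t
        y = max(y, floor_y)  # collision clamp: cable rests on the floor
        pts.append((x, y, z))
    return pts
```

A real tool would test each sample against actual collision geometry (and relax neighboring points so the cable drapes smoothly), but the clamp-after-sag structure is the same.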
The ivy system worked like a charm when mixed with SpeedTree. I know that in SpeedTree you can make realistic ivy too, but I decided to use Houdini again. The main reason was that loading a tree file on my PC took a few minutes, and the app often closed the session abruptly. Another reason was that I wanted to mix both packages and see the outcome. I think the final result looks pretty nice.
The fracture system is another system that I made for this teaser. It was also very simple to use: import an FBX of the asset you want to break, Houdini remeshes it, and you can fracture the asset manually or with Voronoi. If you want fine control, use it as a manual system; if you prefer debris fast and easy, go for Voronoi.
All the new assets generated by the fracture inherited the UV channels of the original assets, and vertex colors were generated on the new meshes too. In other words, you don't need to create extra textures or materials (and extra draw calls) for the project. I used layer materials that add a “break” material around the white areas of the automatic vertex color.
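At its core, a Voronoi fracture just partitions the mesh by nearest scatter point: every piece of geometry belongs to the cell of whichever seed it is closest to, and because each point keeps its original attributes, UVs carry over to the fragments for free. A toy sketch of that partitioning in plain Python (not Houdini's actual Voronoi Fracture SOP, and ignoring the cutting of faces that cross cell boundaries):

```python
import math

def nearest_seed(point, seeds):
    """Index of the closest seed: this defines the Voronoi cell the point falls in."""
    return min(range(len(seeds)), key=lambda i: math.dist(point, seeds[i]))

def voronoi_fracture(points, seeds):
    """Group mesh points into fragments, one list per Voronoi cell.

    Each point is carried over unchanged, which is why per-point
    attributes like UVs survive the fracture without new textures.
    """
    fragments = {i: [] for i in range(len(seeds))}
    for p in points:
        fragments[nearest_seed(p, seeds)].append(p)
    return fragments
```

Scattering more seeds inside a region produces smaller, denser debris there, which is how fracture tools typically let you art-direct the break.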
I carried on making more stuff in Houdini besides procedural systems. VFX like the stream in the subway was a fluid simulation in Houdini. As I said earlier, this stream could have been done using a Niagara fluid, but I preferred using Houdini to check the final result and the viability of using it in real production. The look of it was incredible, but the technical result – not so much. The stream's Alembic cache was 24GB, and the whole simulation took around 12 hours to generate. I needed that cache file plus 5 more to generate the final geometry cache, so it's definitely too much for real production. In addition, it had a few important problems: it was not interactive (Captain Obvious strikes back), and although the final result looked cool, getting an even better result would have meant increasing the file size a lot more. It worked out this time for a cinematic trailer, but there's not much sense in it otherwise.
A few effects were discarded due to time constraints and problems between Unreal Engine 5 and Houdini Engine – for example, light bulbs exploding when the boy walks close to them, a mirror breaking after the final pulse, etc.
Houdini and Next Teaser Trailer
For the next teaser trailer (and the final one!), I'll be making more of these systems – for example, for placing procedural assets in the environment by modifying the landscape in real time. I created a 6km x 6km landscape, and I'll replace it with a new 64km x 64km one generated in a single click to create different biomes or areas. I'll also be creating the most important extra tool: rivers, roads, and specific player paths that can be modified in real time. This tool will be very useful because it will allow me to modify the player's path when there isn't a clear one to follow and to create additional paths for "secret" or secondary areas. Looking forward to it!
I checked it out, and the possibilities blew me away – it is complete madness! I could create a procedural forest, an extra world extension, or a series of background landscapes dressed in “a few” clicks... absolutely stunning.
I know that I still need to learn a lot of Houdini to create more ambitious systems and polish the ones I've built so far, but at the moment I'm very happy with the results I'm getting.
Of course, not everything in this project was developed with Houdini. I used other software like SpeedTree, with which I made a lot of vegetation: grass, bushes, moss, and trees (in the last teaser you can see a small part of all of them). ZBrush, Megascans textures, and 3ds Max were also an important part of the pipeline for generating clusters and textures.
The vegetation ranged between 200 and 30K polys per asset. The overland area, featured in the next teaser, will have a lot of vegetation, so I'll have to be careful to keep good performance. It will be key not to exceed that polycount.
Unreal Engine 4 to Unreal Engine 5
After the launch of Unreal Engine 5 and its new tools and features, I decided to move to the new version straight away. To be honest, the process was a bit frustrating. Although the project migration itself went smoothly, a few things didn't work correctly (like Layer Materials or Material Tessellation). Overall, the transition was clean and easy, but some problems were waiting for me later on.
I experienced some problems with Blueprint compilation and with the landscape tools. Some problems were fixed with the latest Unreal 5 updates, but some of them are persistent.
I had to redo the climbing system because it did not work correctly (IK problems) and had some problems with AI behavior after the migration.
However, not everything was bad about using Unreal Engine 5. Thanks to it, I achieved the visual style I was looking for, and the procedural tools (Electric Dreams) look incredible – I'll use them for the final teaser trailer.
Guillermo’s idea for the teaser trailer was to keep a continuous camera shot for each of the 4 corridors. Without any camera cuts and with everything exposed, it's difficult to hide anything by framing it out of view.
The main focus was Bboy's inner journey through the corridors as he picks up and carries the container, following him closely all the way down to the final encounter with the girl.
Motion Builder Blockout
Guillermo provided some storyboards containing the rough idea for each corridor. Before doing anything real, we talked about each shot and agreed on how to proceed with animation.
When it comes to creating long motion capture animation sequences, there’s no other software like Motion Builder – it is the perfect tool for the job. I didn't have any specific motion capture for any of the shots, so I carefully stitched mocap data captured with an Xsens suit in Story mode, blending clips and creating transitions to get the boy moving forward through the environment.
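Conceptually, that stitching is a crossfade: over a transition window, each pose is a weighted blend of the outgoing and incoming clips. A toy version with one scalar value per frame standing in for a full pose (illustrative only – real mocap blends rotations, which would need slerp rather than a linear mix):

```python
def stitch(clip_a, clip_b, blend_frames):
    """Concatenate two clips, crossfading over the transition window.

    The last `blend_frames` poses of clip_a are blended with the first
    `blend_frames` poses of clip_b, with the blend weight ramping 0 -> 1.
    """
    out = list(clip_a[:-blend_frames])
    for i in range(blend_frames):
        w = (i + 1) / (blend_frames + 1)          # ramp across the transition
        a = clip_a[len(clip_a) - blend_frames + i]
        b = clip_b[i]
        out.append(a * (1 - w) + b * w)           # linear blend of the two poses
    out.extend(clip_b[blend_frames:])
    return out
```

Story mode effectively does this per channel across every clip boundary, plus time-warping and foot-contact alignment that a sketch like this leaves out.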
Although a big portion of the clip transitions came out pretty well, I still relied on a heavy clean-up pass once back in Maya on the mGear rig.
All shots were pretty long, from the shortest corridor 01 with a length of 38 seconds to the longest corridor 04 with 1 minute and 13 seconds.
Maya, mGear Rigging, Keyframe Animation & Mocap
For quite a few years already I have been relying on mGear as a rigging framework. mGear is an open-source and super powerful rigging solution for Maya that, in combination with some Python scripts, allows you to solve most of the rigging needs for all sorts of assets, from complicated characters to simple props.
mGear is fully compatible with Maya’s parallel mode and keeps Maya’s performance steady at 30 FPS, even when multiple characters are on display. It features amazing animation tools like flawless IK/FK matching. It also comes with a powerful range switch tool for editing mocap quite easily, limitless space switch setups, soft IK, and a very useful contextual viewport menu – right-clicking on a control gives access to most of the functionality linked to it, such as reset, mirror, selections, etc.
mGear's Shifter contains a series of Python components – arms, legs, spines, springs, cables, chains, etc. – that can be combined by laying out a guide from which the rig is built.
After the whole guide placement is set, you can add some post-scripts to finalize the rigging process. The whole rig can be rebuilt on demand, preserving all rigging features and the skinning, which allows quick iteration – modifying or adding new components, controls, or bones to the characters is not a hassle.
For this teaser I rigged a bunch of characters: Bboy, the girl, the cyborg, and the main prop – the surge container.
mGear and HumanIK:
I'm not sure how many animators out there know how easy it is to use mGear in combination with HumanIK. You can apply mocap to the control rig and easily clean, fix, polish, and enhance the animation.
Since the rig has a simple skeleton hierarchy, you can characterize the mGear control rig instead of the bones.
Set the T-pose to achieve the all-green characterization, and I'm ready to go:
Import the Motion Builder FBX animation into Maya and set it as the source, and HumanIK will retarget the motion onto the mGear rig automatically.
mGear comes with a space transfer tool that can convert mocap data applied to the FK controls over to the IK ones. You can transfer arms and legs easily, and you'll have all the rig functionality available when keyframing on top of it.
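Under the hood, this kind of FK-to-IK transfer largely amounts to baking world-space goals: per frame, the IK hand target takes the wrist's world position, and a pole vector is derived from the elbow relative to the shoulder-wrist line. A schematic sketch with a hypothetical data layout (not mGear's actual API):

```python
def fk_to_ik_bake(frames):
    """Derive per-frame IK goals from FK world positions.

    `frames` is a list of dicts with 'shoulder', 'elbow', and 'wrist'
    world positions. The IK target is simply the wrist position, and
    the pole vector points from the chain's midpoint toward the elbow.
    """
    baked = []
    for f in frames:
        sx, sy, sz = f['shoulder']
        wx, wy, wz = f['wrist']
        mid = ((sx + wx) / 2, (sy + wy) / 2, (sz + wz) / 2)
        ex, ey, ez = f['elbow']
        pole = (ex - mid[0], ey - mid[1], ez - mid[2])
        baked.append({'target': f['wrist'], 'pole': pole})
    return baked
```

Keying those baked goals onto the IK controls reproduces the mocap motion while leaving you free to layer keyframes on top, which is exactly what makes editing mocap on a full rig pleasant.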
A rough render from Maya's Viewport 2.0 can be watched here:
Exporting Assets Data
All characters are deformed 100% by joints. They use between 137 and 400 bones each, including many extra bones for details like the face, the hat, the sleeves, and cloth simulation.
Skeleton export data from mGear is game-friendly. The skeletons are separated joint hierarchies that are easily exported as FBX using Maya’s game exporter tool.
The only requirement is having the skeleton structure at the scene level:
Cloth and Cables Simulation
Once all the animation was finished, I set up the cloth for the girl's gown. I used Maya’s nCloth with a few hundred frames of pre-roll.
I set the girl’s body as a collider and some default nucleus wind and noise for some nice-looking fabric motion.
The 4 cables attached to the girl’s back were also simulated with nCloth. I used a simple plane as a proxy mesh for each, which kept the simulation running pretty much in real time. The planes were quite long to allow the girl to hang and move around. Once the planes were simulated, I stored the cache and used a wrap deformer to bind the high-poly cable geometry to the simulated planes:
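The reason a thin proxy runs in near real time is that a hanging cable is cheap to solve: a chain of particles stepped with Verlet integration plus distance constraints, with the top particle pinned to the attachment point. A minimal, illustrative 2D version (not Maya's nCloth solver, and all parameters are made up):

```python
import math

def simulate_rope(n=10, length=1.0, steps=100, gravity=-9.8, dt=0.02):
    """Pin particle 0 at the origin and let the rest hang under gravity."""
    seg = length / (n - 1)
    pos = [[i * seg, 0.0] for i in range(n)]       # start laid out horizontally
    prev = [p[:] for p in pos]
    for _ in range(steps):
        for i in range(1, n):                      # Verlet step (particle 0 stays pinned)
            x, y = pos[i]
            vx, vy = x - prev[i][0], y - prev[i][1]
            prev[i] = [x, y]
            pos[i] = [x + vx, y + vy + gravity * dt * dt]
        for _ in range(10):                        # relax the distance constraints
            for i in range(n - 1):
                dx = pos[i + 1][0] - pos[i][0]
                dy = pos[i + 1][1] - pos[i][1]
                d = math.hypot(dx, dy) or 1e-9
                diff = (d - seg) / d * 0.5
                if i == 0:                         # don't move the pinned particle
                    pos[1][0] -= dx * diff * 2
                    pos[1][1] -= dy * diff * 2
                else:
                    pos[i][0] += dx * diff
                    pos[i][1] += dy * diff
                    pos[i + 1][0] -= dx * diff
                    pos[i + 1][1] -= dy * diff
    return pos
```

nCloth adds collisions, wind, bending stiffness, and proper 3D cloth behavior on top, but for a dangling cable this particle-and-constraint core is most of the work, which is why the proxy-plane approach stays so fast.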
The FBX and Alembic data files and sizes for the corridor 04 scene:
Overall, the whole animation process was pretty much effortless and quite enjoyable.
Motion Builder was, once more, key to successfully crafting the complicated transitions of the boy interacting with the environment, and in the end Maya and tools like mGear and animBot were essential for fine keyframe animation and mocap editing.
I have been using the Motion Builder>Maya>Unreal Engine workflow for years now. I’m trying to learn some Python in order to get a more sophisticated pipeline for easing up some tedious processes like transferring data to the HumanIK control rig, browsing assets, and exporting animations in a better way.
I think you can really see the level of quality, passion, and dedication that Guillermo put into this and I definitely had a blast helping him get his vision out.
Hope you enjoyed the teaser, see you in the next one.