Kyodai: Inside Game Production

Students from New3dge Art School talked about the VFX work on their graduation project Kyodai and shared their team workflow.

Introduction and Career

I’m Jonathan Hars. I was in charge of the environment, lighting, and level/game design on the Kyodai project. I’m from a small town near Paris. I got into 3D art soon after finishing my degree; I had just done five years in graphic design and print and wanted something different. I’ve been playing video games since I was a kid, so I naturally went this way. I found out about New3dge, it seemed very cool, I tried to get in, and it worked.

I’m Jérémy Bergot, the character artist of the project. I’m 24 years old, and I live in the Paris suburbs. I like manga, drawing, music, and video games. I come from Alfortville, a city in the Paris suburbs. Right now, I’m applying to companies in Japan as a character artist. I’ve worked on a small project that was meant to be a DLC for Deus Ex, I made some mobile games with a friend, and I worked on some augmented reality games as a prop artist at a company. At first, I was more attracted to drawing, but after learning some 3D at New3dge, I came to enjoy working in 3D.

I’m David Quoniam, the character artist who worked on the robot enemies. I come from America and was raised in France. Video games have been my passion ever since I could play them, so I chose to try to make a career out of them.

I’m Michel Lozes, the technical artist who worked on the Kyodai student project. I love video games to the point of dreaming of making my own. I come from the French countryside, in Normandy; something that looks like this. As a technical artist, I have one foot in the scripting world and one foot in the art world. When I first talked with my group, I told them to give me any job they couldn’t do for whatever reason. At first, I was making props, objects that our environment artist would use to decorate the game. Here are some examples.

The need to have someone scripting in each group became clear in December 2018; from then on, Jonathan Hars worked on the environment on his own while I did the scripting on my own.

As a New3dge student, I worked on many small projects used as exercises. The big one from the previous year was an Assassin’s Creed Origins DLC fan art, where I also made props, doing modeling and texturing.

I worked on many personal projects, but none were really made public. They are video game designs and concepts I made in the hope of finding someone to script them. I know people who could do it, but they always had other priorities, so we never really got to work on them. Kyodai is the first group project I worked on as a technical artist.

At 16 years old, I knew I wanted to work in the video game industry and make my own games. I grew up playing video games: tons of varied worlds, expressions, stories, music, and the freedom to be part of it, to act the way you want in a whole new world. But I was bad at scripting and drawing. I figured that if I worked hard, I would get better; I didn’t think I would get really good, but at least better than where I stood. 3D art seemed a lot easier, so I thought I could enter the video game industry by working in 3D. It quickly became clear that some drawing knowledge would help me make better 3D. So even though I chose 3D to avoid drawing, I began learning how to draw, and the more I worked in both 2D and 3D, the clearer my understanding of 3D art became. This led here, eventually.

About the New3dge Art School

Jonathan: New3dge is a very good school in the south of Paris. I started by learning Photoshop and 3ds Max, and over five years they teach you absolutely everything you need to know. The teachers are all great, and a lot of professionals from the industry came to teach us amazing techniques.

Michel: You can learn how to draw and how to work in 3D there. The course lets you grow at a reasonable pace, and the school has a great spirit: the will to share and have fun spreads from instructors to students, and after a while, the students themselves talk and act the same way, which makes the whole school a great place to work in. Not all art schools are like that. In the last part of the course, you can choose between the Game Art track and the Visual Effects track. One has lessons built mostly for video game art production (it doesn’t include much scripting yet, mostly art), and the other has lessons built for short animated film production. Both revolve around 3D work. During the development of our graduation project, many instructors helped us with the project and our presentation at the theater: Pierre Gabriel Chouzy, Fabien Roumazeilles, Michaël Baratian, Arthur Trouslard, Boris Miszczak, Guillaume Rebender, Yannick Gombart, Damien Brisson, Anthony Lemétayer, Michaël Floury, Alexandre Vong, Guillaume Deschamps, Jeremy Vitry, Camille Delmeule, Olivier Lafay, Laurie Maigne.

Alexandre Vong, Michaël Floury, Anthony Lemétayer, and Arthur Trouslard helped me with scripting and VFX whenever I could meet with them.

The Idea of Kyodai

Jonathan: The main thing I wanted to do was create a game based on what Jérémy Bergot and I both liked. We created a story about a gang and a mega-corporation in a cyberpunk universe, gameplay close to what we enjoy, and most importantly, something we could do with a team of four students in one year. In the beginning, we didn’t know whether real gameplay was possible, but very early in development we discovered that Michel was actually really good at tech, so we made a fully playable demo. It started with “the game has to look good” and quickly became “the game has to be playable”.

Michel: I wanted to make a demo of a video game that would be playable, with mechanics following actual game design, setting rules for the players as they play. I wanted it to be more of a video game than a visual demo. At first, I was the only one interested in working that way, but as we kept going, everyone in the group had fun suggesting new game design ideas. That’s also how I wanted the production to go; I wasn’t expecting everyone to have that much fun, which was great.

Gathering the Reference

Jonathan: We made one choice very early in production: we wanted a fully playable demo, so I had to create a pretty large level. Since I couldn’t spend too much time on every asset, I searched for games made with simple but impactful graphics, realized I had already played Ruiner, and tried to understand how they worked on it. It became the main reference.

Jérémy: For characters, I used references of cyberpunk clothing that I was able to find. I had a clear idea of my goal, so I took many references here and there to compose my idea.

Michel: References came into play whenever the group made a suggestion for a VFX. Since it’s hard to express how some abstract FX should look, I would ask for either a reference or a detailed explanation before starting to work on it. The more information I was given, the less time I had to spend adjusting the effect to please the group.

As for the references, there weren’t a lot:

  • The counter mechanic of Akari is inspired by Vella’s XCounter in Vindictus. I picked it to check if the group’s concept was in line with what I had in mind.
  • The switch FX idea is from Agents of Mayhem; they look similar, but I built the material completely differently. Jonathan got this reference for me.
  • The trail system setup for the latest dash of Kaneda and Akari was inspired by the dash FX in Devil May Cry 4. It’s hard to see in gameplay footage, but if you pay a bit of attention, you can guess how they made it, or at least something that works in a similar fashion. Jeremy found this reference for me. The reference showed this action:
  • The red laser behavior the Cobra turret uses when aiming at the player is inspired by the Guardians’ red aiming behavior in Zelda: Breath of the Wild, basically the shaking of the laser. Explanations could have been enough, but I had videos attached to them. David brought this reference to help me. Like this one:
  • The boss version of Akari sometimes fires pellets in circular patterns. This specific pattern was inspired by Touhou gameplay footage. The way it works is quite simple (a minimal sketch follows this list), so you may see it in a lot of games, not just Touhou, but that’s the reference I was given. Jeremy wanted me to use this reference:
  • We didn’t really need references for LilB’s shoe trails since they’re so simple, and in the end, they don’t even use the same settings as the reference. But the group gave me one anyway: some gameplay footage of Hover: Revolt of Gamers, where the characters always leave trails from their shoes. Ours fade quickly since they would otherwise pollute the screen with our top-down camera; in Hover they last really long because they can’t disturb the player, the trails being behind the camera. Jeremy had this reference in store for me.
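
Here is roughly what the circular pellet pattern boils down to, as a minimal C++ sketch; the pellet count, wave rotation step, and function names are my own assumptions for illustration, not values from the project:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Vec2 { float X, Y; };

    // One ring of pellets: N directions evenly spaced around a circle.
    // Rotating each successive wave by a small offset gives the spiral feel.
    std::vector<Vec2> RingDirections(int PelletCount, float WaveOffsetRadians)
    {
        std::vector<Vec2> Directions;
        const float Step = 2.0f * 3.14159265f / static_cast<float>(PelletCount);
        for (int i = 0; i < PelletCount; ++i)
        {
            const float Angle = WaveOffsetRadians + Step * static_cast<float>(i);
            Directions.push_back({ std::cos(Angle), std::sin(Angle) });
        }
        return Directions;
    }

    int main()
    {
        // Three waves of 12 pellets, each wave rotated a bit further.
        for (int Wave = 0; Wave < 3; ++Wave)
            for (const Vec2& D : RingDirections(12, 0.15f * static_cast<float>(Wave)))
                std::printf("wave %d: (%.2f, %.2f)\n", Wave, D.X, D.Y);
        return 0;
    }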

If something seemed easy enough to figure out, I didn’t need a reference. Sometimes I was given both explanations and references; I would work from the explanations alone, without studying the references. If the group was satisfied, I’d consider it done; if not, I would check their reference to better understand what they meant.

Project Direction

Jérémy: I made the concept design of the characters, their 3D models, textures, and animations.

Michel: All the scripts, most FX, most materials for the FX, some parent materials, and I did some voice acting for an enemy. You can find most of the tech art done on Kyodai in this YouTube playlist; it contains more than 90 videos.

Jérémy: For the characters, I wanted spectators to recognize them at first glance. On top of that, I also wanted the characters to appeal to people and make them want to play the game.

Michel: I wanted the game to be really fast-paced and explosive, full of energy, but we also wanted it to be playable. I didn’t take into account how different one player is from another; I was mostly tuning the game for myself at first. If the team hadn’t persuaded me not to do it my way, nobody at our booth could have finished the game. I managed to reduce the difficulty while still finding it interesting to play, so it keeps some energy and is playable for most people. You can see some bits of our booth in the following video after 1:44.

2019 Game Art Jury

Two weeks ago we had the great honour to welcome worldwide professionals for our 2019 Game Art Jury. The event was an amazing experience and took place in a theater and then back at the school. We are very proud of our students who made extraordinary games. Thank you very much!! We also want to thank all the companies who answered our call: Airborn Studios, Allegorithmic, Arkane Studios, Asobo Studio, BackLight, Black Forest Games, CCP Games, Cyanide, CyberConnect2 Montréal Studio Inc., DONTNOD, GFactory, Guerrilla Games, Lightbulb Crew, Louis Vuitton, Machinegames, Nki, Riot Games, Sloclap, Tsume-Art and Ubisoft Studios Paris. We hope to see you again!

Posted by New3dge on Friday, July 19, 2019

 

The reference I had for the explosive energy I wanted, assuming I could get there, was Nelo: explosions, fast movements, every inch of the screen busy during a fight. That’s probably too much even for me, but the energy inside this game is a great motivation. Here’s an old trailer of theirs, which I kept checking once in a while during the production of Kyodai to reinvigorate myself.

Challenges in Building the Levels

Jonathan: To create the level, I first started with primitive shapes to block out the layout:

Then some repeatable assets such as ground/wall tiles and generic props:

By far the biggest challenge for me was locking down the final level design and mood of the game. We needed it very fast so that everyone could start working, knowing what environment their characters, robots, and FX would live in.

Then I slowly created every room of the game, improving assets and lighting and modifying some of the level design along the way:

and some other iterations:

I kept at it until the level was done; it’s what you see in the gameplay and presentation of the project.

David: We first drafted the level with colored blocks on a drawing, red ones for “fight” zones and blue ones for “rest” zones where the player could have some downtime; we quickly ditched the latter. Our goal was to make a set of assets we could place and repeat across the level in a modular fashion, to make level-building easier without creating different assets for every single room. All the nice looks and effects were made by Jonathan and Michel with many post-effect filters, special lights, and shaders.

Michel: Whenever Jonathan Hars, our level designer, needed a custom tool to help him build the level, I would make it as soon as possible so he wouldn’t get slowed down. Most of the custom tools I made can be seen in the playlist, starting here:

The first idea we had on the very first day of production was to have multiple pieces or objects that could be combined with one another; that way, with fewer objects, you achieve a more complex look. We wanted everything to be modular. We couldn’t apply this to everything, but we tried whenever we could.

Texturing

Jonathan: The materials are actually very simple. There are only a few shaders used across the entire game (with some exceptions for very specific cases), and I used lighting, normal/roughness/emissive decals, and varied positioning to create dynamics in the environment. I kept everything light so we could have a high frame rate and smooth gameplay; besides, working on a big level at low FPS is hell.

Michel: All the textures and most of the materials you see on the objects in the environment were made by Jonathan. Only three of them were upgraded by me at the end of production, for some pipes in the Boss Room. I made the various FX placed in the environment, the things that don’t look like constructions.

Creating the Metaballs

Michel: The merging balls you saw are commonly called metaballs, though programmers might prefer to call them isosurfaces. They were made by following this tutorial from Epic Games. When the Epic staff sits down for more than an hour, it’s not a straightforward tutorial anymore: they’re not teaching you one thing, but many things in one video. That makes it really valuable, but really hard to follow as well. I had to learn a whole lot of things just to understand the content of this video.

I didn’t know any other way of making them, and at the time I thought this one would be simpler. It’s all coding within an Unreal material, using a specific node in the material editor that lets you add HLSL code to your material, and I knew none of it. To understand HLSL (shader coding for DirectX), I also read GLSL (shader coding for OpenGL) documentation. So now I know about both, and they’re quite similar. I even have some materials coded on my old phone and rendered using GLSL.
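
For the curious, the math underneath is the classic metaball field: each ball contributes an influence that falls off with distance, and the visible surface is wherever the summed field crosses a threshold, which is why nearby balls appear to merge. Below is a minimal C-style sketch of that field function; it is not code from the Epic tutorial, just an illustration of the idea (HLSL in a material node would look very similar):

    #include <cmath>

    struct Vec3 { float X, Y, Z; };

    // Classic metaball field: each ball adds influence proportional to
    // r^2 / d^2. The shader draws the surface where the total crosses a
    // threshold (e.g. 1.0), so balls that come close smoothly merge.
    float MetaballField(const Vec3& P, const Vec3* Centers, const float* Radii, int Count)
    {
        float Field = 0.0f;
        for (int i = 0; i < Count; ++i)
        {
            const float dx = P.X - Centers[i].X;
            const float dy = P.Y - Centers[i].Y;
            const float dz = P.Z - Centers[i].Z;
            const float DistSq = dx * dx + dy * dy + dz * dz + 1e-6f; // avoid division by zero
            Field += (Radii[i] * Radii[i]) / DistSq;
        }
        return Field;
    }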

The metaballs here are not optimal because they’re not objects; they’re spheres rendered inside one big 3D sphere. Every bit of screen space covered by this big sphere costs something, even where nothing visible is drawn, and I’m using all that just to render three little spheres on screen. On top of that, the three spheres themselves have a cost.

But I had spent a whole week learning about all this and didn’t have any more time to give this FX, so I had to move on to the next thing. Even though it isn’t optimal, it still looks the way I wanted it to.

Visual Effects

Michel: My favorite visual effect is Kaneda’s dash. I went through different versions, and the latest one is quite rich; multiple effects take place for the dash to happen. Cutting enemies is nice, too, but that’s more related to the blueprint script.

Although there is something I like even more than the VFX: the parent material I made to distort a simple 250×250-pixel noise texture in various ways, producing the different abstract textures used to make VFX.

From videos 32 to 41 and 55 to 58, the effects use the same parent material as the one shown in video 32 of this playlist:

In fact, the texture is so small that I even remade the material in GLSL on my phone, for fun and practice. I can safely say that by now this material is more like a tool for creating textures. It looks a little complex at first.

I could have used Unreal’s render target system to simply “print” these textures; that would probably be more optimal most of the time, but since everything was still running fine, I decided to use the material as it was. It also gave me the option to adjust every parameter of the material at different times to “animate” the render.
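
The core trick of a distortion material like this can be sketched in a few lines: one panning noise lookup warps the UVs of a second lookup, and varying the pan speed, scale, and strength yields very different abstract results from one tiny texture. This is only my illustration of the principle, with made-up parameter names, not Michel’s actual node graph:

    #include <cmath>
    #include <cstdio>

    // Cheap hash-based value noise standing in for the 250x250 noise texture;
    // in the real material this would be a texture sample.
    float SampleNoise(float U, float V)
    {
        const float S = std::sin(U * 12.9898f + V * 78.233f) * 43758.5453f;
        return S - std::floor(S); // fractional part, in [0, 1)
    }

    // One noise lookup pans over time and warps the UVs of a second lookup.
    float DistortedNoise(float U, float V, float Time,
                         float PanSpeed, float DistortionScale, float DistortionStrength)
    {
        const float WarpU = SampleNoise(U * DistortionScale + Time * PanSpeed, V * DistortionScale);
        const float WarpV = SampleNoise(U * DistortionScale, V * DistortionScale + Time * PanSpeed);
        return SampleNoise(U + (WarpU - 0.5f) * DistortionStrength,
                           V + (WarpV - 0.5f) * DistortionStrength);
    }

    int main()
    {
        // Print one animated sample to show the parameters in use.
        std::printf("%.3f\n", DistortedNoise(0.25f, 0.75f, /*Time=*/1.0f, 0.1f, 4.0f, 0.3f));
        return 0;
    }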

Here’s the workstation I would use to build the base for a custom FX. In the video, I’m randomly playing with the parameters of the previously mentioned material; you may spot interesting shapes in the mix that we could have used for our game, too. The video is quite long, so don’t hesitate to skip parts; it was recorded recently as a demonstration for this specific interview.

Challenges in the Animation

Jérémy: I’m not an animator, so to make the animations I took reference from videos that looked stylish, because I wanted the animations to look cool.

Kaneda is a blade master, so I found a YouTube channel with many katana attacks to use as a reference, and combined that with some Devil May Cry animations. LilB can slide, so I had to find video references of real roller skating and of roller games.

The main priority was to combine the coolest look with readability from the top-down view. But I knew perfectly well what I wanted to do, so I was able to choose the right approach.

Michel: Most animations are played just the way our character artists made them. My job in handling them was to combine them so they play at the right time and to interpolate from one state to another. That’s the easy part.

The challenges were having physics interact with Kaneda’s coat, having pieces of the robots fly off with physics when they get destroyed, and, hardest of all, LilB.

Kaneda’s coat has thickness, which means we can’t simply use Unreal Engine’s cloth physics system; that system works better with planes, geometry with no thickness involved. I could have asked for a plane-like coat; it would have been more optimal, and the difference wouldn’t be noticeable with our top-down camera, but I decided to take this opportunity to learn more about cloth physics.

To keep the thickness of the coat, we rigged it with additional bones and put the physics on the bones themselves. Getting familiar with Unreal’s physics settings proved difficult.

For the pieces of the robots to fly off, I had to make sure the animations stop playing after “death”: find a way to freeze their animation when they die, replace the main body area of the skeletal mesh with a static mesh copy of that part, animate the static mesh through blueprint script as if it were being cut in anime style, with the parts slowly separating before falling, and make sure the static mesh copy has the same origin as the main body bone of the skeletal mesh.

But we can’t use a bone’s axes directly, since they get altered by the bone’s parent; we have to take the values of each axis used by the bones and apply them to the correct axes of the main body’s static mesh copy to get the same rotation.

Then: find a way to detach each bone from its skeleton hierarchy, find a way to add an impulse to each bone from the mesh origin, find a way to stop the physics simulation once the pieces are barely moving anymore (for optimization), make them all disappear, and finally destroy the actor.
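
In the project this was all blueprint script, but the same sequence in rough Unreal C++ might look like the sketch below; the function name, impulse strength, and the lifespan-based cleanup (instead of the velocity check described above) are my own simplifications:

    #include "Components/SkeletalMeshComponent.h"
    #include "GameFramework/Actor.h"

    // Rough C++ equivalent of the described blueprint logic: freeze the pose,
    // hand the bones over to physics, and push the pieces outward.
    static void ExplodeRobot(AActor* Robot, USkeletalMeshComponent* Mesh, float ImpulseStrength)
    {
        // Freeze the current animation pose so the robot stops mid-motion.
        Mesh->bPauseAnims = true;

        // Detach the bones from animation by letting them all simulate physics.
        Mesh->SetAllBodiesBelowSimulatePhysics(Mesh->GetBoneName(0), true);

        // Impulse each piece away from the actor's origin.
        const FVector Origin = Robot->GetActorLocation();
        for (int32 i = 0; i < Mesh->GetNumBones(); ++i)
        {
            const FName Bone = Mesh->GetBoneName(i);
            const FVector Direction = (Mesh->GetBoneLocation(Bone) - Origin).GetSafeNormal();
            Mesh->AddImpulse(Direction * ImpulseStrength, Bone, /*bVelChange=*/true);
        }

        // Simplification: clean up after a fixed delay rather than polling
        // each piece's velocity as the original logic does.
        Robot->SetLifeSpan(5.0f);
    }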

LilB was the hardest of them all because he doesn’t have one set of animations, but two: one for sliding, which happens at high speed, and one for running, which happens when he hasn’t reached high speed. Not only that, but each set had more animations than the other characters had, because running forward, to the sides, and backward, and aiming left, front, and right were all different states that needed their own animations. He even had three ways of aiming: while standing, while running, and while sliding. All of that had to be set up to interpolate smoothly between every state and action. It’s a lot easier when the transition to the next state is a single step instead of many smooth ones.
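
To give an idea of the branching involved, here’s a minimal sketch of the state selection; in the project this logic lived in the animation setup, and every name and threshold below is hypothetical:

    // Hypothetical mirror of LilB's two animation sets and aim modes.
    enum class ELocomotionSet { Running, Sliding };
    enum class EAimMode { None, Standing, Running, Sliding };

    struct FLilBAnimState
    {
        ELocomotionSet Locomotion;
        EAimMode Aim;
    };

    // Made-up thresholds: above SlideSpeed the sliding set takes over,
    // below IdleSpeed aiming counts as standing.
    constexpr float SlideSpeed = 600.0f;
    constexpr float IdleSpeed  = 10.0f;

    FLilBAnimState SelectAnimState(float Speed, bool bIsAiming)
    {
        const bool bSliding = Speed >= SlideSpeed;
        FLilBAnimState State;
        State.Locomotion = bSliding ? ELocomotionSet::Sliding : ELocomotionSet::Running;

        if (!bIsAiming)
            State.Aim = EAimMode::None;
        else if (Speed < IdleSpeed)
            State.Aim = EAimMode::Standing;
        else
            State.Aim = bSliding ? EAimMode::Sliding : EAimMode::Running;

        return State;
    }

Each returned pair still has to blend with the directional movement and aim-direction animations, which is where the real volume of animation work came from.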

And for all of this to work, you have to make sure the bone axes and hierarchy in each animation file are the same as in the original skeleton. If they’re different, you may waste time fixing things. It’s quite delicate when you think about it; we’re all kind of used to it now, but back then, it was quite tough on us.

Detailing

Michel: The VFX used for the Cobra turret’s projectiles and for LilB’s mechanical arm were based on Epic’s free Paragon assets, and some smoke and explosions are also based on Paragon assets. Their settings were edited to fit our pacing; since we didn’t need that much detail, I removed whatever parts of the FX weren’t noticeable in our game, making them cheaper to use. We used these to save time: since I had to script everything, I couldn’t afford to spend much time learning how to make such cool FX, so I used some Paragon assets as placeholders. At the end of production, there wasn’t much scripting left to do, so I could afford to work on my own FX and materials.

That’s when I learned how to work on materials, as I mentioned earlier. After editing and making my own FX, I can safely say that the hardest part of this job is making the materials. Once the materials are ready, setting up a particle system, whether in Unreal’s Cascade or Niagara, is quite easy. The particle tools in Unreal are flexible and simple enough to let you achieve what you want step by step, combining one function with the next. As long as you know what you want to achieve and have planned it, you should be fine.

The way to plan an FX is to get the essential parts working first, so it behaves as described, then find ways to polish it and make it more convincing: work on the pacing, find ways to make all the steps flow better, maybe add an FX to mark each step, or add some to interpolate between the steps. Find a way to mark the anticipation, the climax, and the aftermath of the FX.

Since I had other stuff to make, I most likely didn’t exploit this mindset enough, but I tried to keep this in mind when making them.

I gave Niagara a try during production. It’s a really nice concept, but a lot of situations could make Unreal crash, and Niagara’s strengths weren’t as easy to exploit as I first thought. I might need to learn a lot more before I can handle it properly; I know Niagara will keep improving, so it will get easier later. The VFX I made using Niagara were still usable, but after giving it enough tries, and seeing that I wasn’t making really complex FX anyway, I decided to stick with Cascade for the rest of the VFX.

All the Paragon assets were also made in Cascade, and by viewing, studying, and editing them, I got a basic understanding of how they were made and why. I then felt more comfortable working in Cascade, mostly because I knew the probability of it crashing on me was far lower than with Niagara.

I used my abstract FX material to render textures for the projectiles of the machine gun turrets. I had planned to do more but unfortunately had to stop, so the current render isn’t what I wanted. But I kept working on this abstract material and was able to use it properly for Boss Akari’s projectiles. Her projectiles have a brighter spot; that’s where I spawn particles, whether smoke or little shiny dots. The shape of the projectile itself could be either 2D or 3D: if I could make the shape in 2D using my abstract material, I did; if not, I used a cheap 3D model with the material on it.

This shape is a 2D texture:

This one is a cheap 3D sphere, scaled up on one axis:

Particles in Unreal are displayed either as planes with materials on them or as meshes with materials on them; in any case, a material is needed to display colors. If you can achieve a specific look with only one plane and a material, go for it. If you need many little planes to render shiny dots flying in a specific motion, you may not need a complex material, and you’ll focus mostly on the functions the particle system uses to move everything. If you need the particles to be meshes, then you use meshes.

A particle is called a particle if it’s spawned by a particle system; Unreal itself only considers something a particle if it’s spawned by the built-in Cascade or Niagara systems. Other than that, a particle is just a plane or a mesh. Remembering this lets you think outside the box: technically, you could make your own “particle system” using blueprints only, by spawning meshes or planes in a specific pattern, even though Unreal won’t recognize it as one. Most of the time you won’t want to, since the built-in particle systems offer a lot of useful functions. But one regrettable aspect of Unreal’s particle system assets is that you can’t have a “parent-child” relationship between them, while blueprint assets can. So depending on the FX you want to make, if it doesn’t need all the neat Cascade or Niagara functions to animate or spawn particles, simply having a blueprint handle it may be a better idea; you can then iterate easily through child blueprints.
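
To make the “hand-rolled particle system” idea concrete, here’s a toy Unreal C++ actor that spawns mesh components in a ring and drifts them outward every frame; the class, the PieceMesh and Pieces members, and all the numbers are hypothetical, and the project’s version of this would have been a blueprint:

    #include "Components/StaticMeshComponent.h"
    #include "GameFramework/Actor.h"

    // AMeshBurst is a hypothetical AActor subclass (with ticking enabled) that
    // declares a UStaticMesh* PieceMesh property and a
    // TArray<UStaticMeshComponent*> Pieces member.
    void AMeshBurst::BeginPlay()
    {
        Super::BeginPlay();
        const int32 Count = 12;
        for (int32 i = 0; i < Count; ++i)
        {
            // Place the "particles" evenly on a small ring around the actor.
            const float Angle = 2.0f * PI * i / Count;
            UStaticMeshComponent* Piece = NewObject<UStaticMeshComponent>(this);
            Piece->SetStaticMesh(PieceMesh);
            Piece->SetupAttachment(RootComponent);
            Piece->RegisterComponent();
            Piece->SetRelativeLocation(FVector(FMath::Cos(Angle), FMath::Sin(Angle), 0.0f) * 50.0f);
            Pieces.Add(Piece);
        }
    }

    void AMeshBurst::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);
        // Drift each "particle" away from the actor's center each frame.
        for (UStaticMeshComponent* Piece : Pieces)
        {
            const FVector Out = (Piece->GetComponentLocation() - GetActorLocation()).GetSafeNormal();
            Piece->AddWorldOffset(Out * 200.0f * DeltaSeconds);
        }
    }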

Right now, the only way to iterate with particle systems is to duplicate them: you make a copy and modify part of the copy. That’s fine if you do it once or twice, but if a specific pattern is used in more than 30 different particle systems and you have to modify that pattern, it gets tough: you have to modify each of those 30 particle systems one by one to fit the new pattern. With a parent-child relationship, you’d simply modify the parent, and all 30+ children would follow.

Cascade and Niagara systems can be customized through blueprints, so it may be possible to prepare all the settings in a blueprint and iterate on them there. But it wasn’t always easy to do, and sometimes it didn’t even work in my project. I’ll experiment with this some other time, as it would save a lot of time and effort if it worked properly.
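
One form this customization can take is spawning an emitter from code or blueprint and pushing named parameters into it; this only affects modules that were explicitly authored in Cascade to read those parameter names. A small sketch, with the asset and parameter names being my own inventions:

    #include "Kismet/GameplayStatics.h"
    #include "Particles/ParticleSystem.h"
    #include "Particles/ParticleSystemComponent.h"

    // Spawn a Cascade emitter and override its named parameters at runtime.
    // "TeamColor" and "BurstScale" only do something if the particle system's
    // modules were set up to read parameters with those exact names.
    void SpawnTurretMuzzleFlash(const UObject* WorldContext, UParticleSystem* MuzzleFX,
                                const FVector& Location, const FLinearColor& TeamColor,
                                float Scale)
    {
        UParticleSystemComponent* Emitter =
            UGameplayStatics::SpawnEmitterAtLocation(WorldContext, MuzzleFX, Location);
        if (Emitter)
        {
            Emitter->SetColorParameter(TEXT("TeamColor"), TeamColor);
            Emitter->SetFloatParameter(TEXT("BurstScale"), Scale);
        }
    }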

Lighting

Jonathan: There is only one dynamic (movable) light in the entire game, a very discreet light that follows the character; otherwise, all the lights are baked. The directional light (the sun) is almost at zero; the demo takes place at night, so I used it to bring a very small amount of dark blue into the scene. Places that aren’t meant to be played in, and lights used to set a “general lighting mood” for the different rooms, always use baked static lights, and I used stationary lights to get both baked and dynamic shadows in specific spots.

Using lights in an action-packed game is pretty straightforward: always make sure the player knows where they are and which way to go. I tried to do that without keeping the level uniformly lit; I made some places darker and others brighter to create variation.

The Biggest Challenges Overall

Jonathan: I think the biggest challenge was making sure everyone knew what they had to do and did it in time. The longer you work on a project, the more new ideas you get, and sometimes it’s very hard to say “no, we can’t add this, we don’t have enough time.” Some ideas were added during development, but many would have taken far too long and never made it into the game.

David: My biggest challenge with the project was that, coming into the last year of school, I still had a fair number of weaknesses, and I had to spend a lot of time addressing them before I could work much more reliably. Without the help and support of my team in learning and facing my difficulties, I wouldn’t have made it; Kyodai has been a huge learning experience. My special thanks go to Michel for sticking with me and introducing me to Blender, which turned out to be infinitely more useful and reliable than 3ds Max.

Michel: Mine was mostly technical: I had to verify every piece of information I could find in the documentation. Whatever you read online or share with your peers, every situation, project, and engine version can make things behave differently. So whatever information you gather from outside sources, you have to verify that what you heard matches what actually happens on your workstation. You can waste a whole lot of time if the information is wrong, since you’ll then have to figure out how a function works on your own, through testing, opinions, trial, and error.

Almost every time, the information I found wasn’t enough, for various reasons: it wasn’t fully detailed; the examples were so situational that it was hard to apply the same plan elsewhere; a vital chunk of information for my own situation was missing. There may have been a mistake in the explanation. The information may be outdated, and you have to work out the updated version on your own. There may be a bug in the specific area you’re working with; maybe in this engine version the tool doesn’t work properly anymore.

And it’s all up to you, the person just using the engine, to figure out that you did not make a mistake, the theory was correct, and the tool itself is faulty. This is probably the worst thing that can happen, since doubting the tool you’re using is the last idea that comes to mind; hours may pass before that idea becomes plausible. But it is something to consider: when no explanation can be found, maybe the tool has an issue. Sometimes the issue is even temporary: a problem shows up in a specific spot, no explanation can be found for why it isn’t working, and simply restarting the engine, deleting an asset, and importing a fresh copy of it solves the issue. It’s not perfect.

Afterword

Doubt everything. Verify everything. Nothing can really be trusted in such an environment, but what you have personally seen working on your own workstation is the absolute truth, as long as you’re still on the same project. Once you switch projects or engine versions, be ready for things to work differently; when that happens, you have to adapt. Build flexibility, don’t be surprised, stay focused.

Jonathan Hars, Jérémy Bergot, David Quoniam, and Michel Lozes, 3D Artists

Interview conducted by Kirill Tokarev.
