
VFX for Games Explained

VFX master Francisco García-Obledo Ordóñez talked about the way he blows stuff up, creates lighting and does other cool stuff in UE4. Some very nice breakdowns for UE4 users inside.

Introduction

Hi, my name is Francisco García-Obledo Ordóñez and I’m a Senior VFX Artist working at King in Barcelona, the city where I was born 34 years ago. I started working in the industry in 2005 as a 3D artist in a small company here in Barcelona. At that time video game productions weren’t that big, so being an artist meant having to deal with a lot of different tasks: modeling, UV mapping, rigging, texturing, lighting, etc.

My transition to VFX Artist started at my second company, the Barcelona studio of Grin, the Swedish company behind Bionic Commando; the game I worked on was Wanted: Weapons of Fate. This was my first time moving away from intensive art tasks towards more technical ones: I was responsible for helping the environment/props team export assets and dealing with programmers to implement new features (shaders, gameplay, tools, etc.). When I joined MercurySteam in 2011 I officially became a VFX Artist. I worked on Castlevania: Lords of Shadow 2 and was responsible for almost all the VFX work: power-ups, magic, character effects, gameplay effects, etc. I learned tons from this experience, as it allowed me to get deep into the VFX community, learning the terminology, plugins and software and, most importantly, meeting people in the industry who were also working as VFX Artists.

When we finished the game I decided to move out of my comfort zone and try a new country, joining a team of 4 VFX Artists. I was hired at Splash Damage (London) in 2014, where we developed Gears of War: Ultimate Edition. It was my first experience working with a commercial engine (the original Unreal Engine) from the start to the end of a project, and it gave me another perspective on how VFX should be done. I took over tasks such as UnrealScript, shader creation, etc. It was a huge responsibility working on such an iconic game, but I think it paid off because, after working on Gears of War 4, I decided to join Rockstar North (Edinburgh) to work on a game as huge as Red Dead Redemption 2. I was part of a team of 10-12 people including FX Artists and graphics/gameplay programmers. It was very challenging and rewarding going back to a proprietary engine and being able to implement new features, shaders, etc.

For personal reasons I moved to King in Barcelona, and at the moment I’m working as a VFX Specialist on games currently in development. I’m taking on a new role within the company, trying to bring specialization to mobile game development, where roles have traditionally been more generalist.

Main Tasks of VFX Artists

When talking exclusively about VFX, I would distinguish two major types of tasks: gameplay effects and environmental effects. How these are divided depends a lot on the project we are working on. For example, in a game like Castlevania (a hack and slash), 90% of the VFX tasks consisted of character powers, magic and similar effects that really make an impact on the gameplay. These kinds of tasks require a big understanding of the game mechanics and involve constant communication with the design team, with whom you have to constantly negotiate. Take a flamethrower as an example: gameplay designers define a damage area for the attack and then you have to create the effect on top of the debug cylinder. As the fire lingers while it’s dissipating, it will not follow the damage area exactly and designers will come and complain about it. This is one example of “conflict” between two disciplines, because if they are very strict the fire won’t feel like fire, so you have to come up with alternatives to convince them that the player won’t notice that a tiny lingering flame doesn’t cause any damage.

Other genres where the gameplay effects are just as “important” would be fighting games or RPGs, amongst others.

There are other game genres, like shooters (especially realistic ones), where the environmental effects are just as important as the gameplay ones. Here, the VFX Artist in charge of environmental effects almost becomes an environment artist, and most of the collaboration will be with that team. Examples of environmental effects are waterfalls, mist, rain, etc. The game for which I created the most environmental effects was Gears of War 4, where we were working on the multiplayer maps and had to care a lot about the performance hit of these effects because the game had to run at 1080p and 60 fps.

Realistic Effects

The more powerful the technology becomes, the fewer limitations we have to achieve realistic visuals. Especially in cutscenes, we are reaching quality similar to movie VFX.

This is partly thanks to the amount of software we’ve had access to during the last few years to create amazing textures. Here are just a few examples of the tools:

And for all of the above: Houdini. The guys at SideFX are doing awesome work, especially in their latest releases, bringing their amazing software closer to game development. Given the vast range of possibilities the software offers, I can see Houdini becoming a standard in the game industry, at least in more realistic projects. Once they manage to ease the steep learning curve they’ll get more people using it, exactly as happened a while ago with Allegorithmic’s suite.

But all this software is mainly focused on precooked asset creation (textures, meshes, animations) that we’ll use later in our game. You get high-quality, realistic results but no variation whatsoever: a precooked explosion texture will always show the same look and motion. And this is where the real challenge comes in: we have to achieve that quality in real time. We are getting there; companies keep improving real-time techniques and we are already seeing everything from real-time physics and fluid simulations to complex shaders, such as the motion vector frame interpolation by Guerrilla (Anatomy of a Nuke) or the real-time raymarching implemented for the fire/smoke in Uncharted 4 (SIGGRAPH 2015 volume shader).
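As a rough illustration of what motion vector frame interpolation buys you, here is a minimal sketch of the idea: each flipbook frame stores per-pixel motion vectors towards the next frame, and the shader warps two adjacent frames towards each other before blending, so a low-frame-count flipbook still plays back smoothly. The structs and the two Sample* helpers below are stand-ins for the engine's texture fetches, not any particular API.

```cpp
// Minimal sketch of flipbook frame interpolation with motion vectors.
// Only the blending math is the point here.
#include <cmath>

struct Float2 { float x, y; };
struct Float4 { float r, g, b, a; };

// Stand-ins for real texture fetches (these would be texture samples in a shader).
Float4 SampleFlipbookFrame(int /*frame*/, Float2 /*uv*/) { return {0, 0, 0, 0}; }
Float2 SampleMotionVector(int /*frame*/, Float2 /*uv*/)  { return {0, 0}; } // motion to the next frame, in UV space

Float4 Lerp(const Float4& a, const Float4& b, float t)
{
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t, a.a + (b.a - a.a) * t };
}

// normalizedTime: 0..1 over the particle's life, numFrames: frames in the flipbook.
Float4 InterpolateFlipbook(float normalizedTime, int numFrames, Float2 uv)
{
    float framePos = normalizedTime * (numFrames - 1);
    int   frameA   = (int)std::floor(framePos);
    int   frameB   = (frameA + 1 < numFrames) ? frameA + 1 : frameA;
    float blend    = framePos - (float)frameA;   // 0..1 between the two frames

    // Warp frame A forward along its motion vector and frame B backward,
    // so both line up at the in-between moment before blending.
    Float2 motion = SampleMotionVector(frameA, uv);
    Float2 uvA = { uv.x + motion.x * blend,          uv.y + motion.y * blend };
    Float2 uvB = { uv.x - motion.x * (1.0f - blend), uv.y - motion.y * (1.0f - blend) };

    return Lerp(SampleFlipbookFrame(frameA, uvA), SampleFlipbookFrame(frameB, uvB), blend);
}
```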

Particle Effects

A good way of understanding how particle effects work in real time is to analyze how a popcorn machine works:

• First you have to decide how much popcorn you want. This would be our spawn rate.

• The corn kernels represent the points where our particles will be located.

• The pop would be the render of our actual particle. It’s up to us to decide if we want it to be a mesh (3D) or a sprite (traditionally a two-triangle plane).

• After the pop, we get a popcorn flying and eventually falling. This represents the initial speed plus gravity (dynamics).

• The difference between the popcorn machine and real-time particle effects is that, while in real life you can never have enough popcorn and can even choose between butter-flavoured, sweet, salty, etc., in a real-time engine you have to make a compromise: the more sophisticated the shader, the fewer particles you can have on screen without harming the framerate. A minimal code sketch of the analogy follows this list.
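To make the analogy concrete, here is a minimal CPU-side emitter sketch, not tied to any particular engine: the spawn rate decides how many kernels pop per second, the spawn is the pop, and initial velocity plus gravity drive the flight. Real engines do this on the GPU or in tightly packed arrays, so this is only an illustration.

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

struct Particle {
    float pos[3] = {0, 0, 0};
    float vel[3] = {0, 0, 0};
    float age = 0.0f;
    float lifetime = 2.0f;
};

struct Emitter {
    float spawnRate = 50.0f;        // particles per second (how much popcorn we want)
    float spawnAccumulator = 0.0f;
    std::vector<Particle> particles;

    static float Rand(float lo, float hi)
    {
        return lo + (hi - lo) * (std::rand() / (float)RAND_MAX);
    }

    void Update(float dt)
    {
        // Spawn: accumulate fractional spawns so low rates still emit over time.
        spawnAccumulator += spawnRate * dt;
        while (spawnAccumulator >= 1.0f) {
            Particle p;
            p.vel[0] = Rand(-1, 1);
            p.vel[1] = Rand(-1, 1);
            p.vel[2] = Rand(4, 8);   // the initial "pop"
            particles.push_back(p);
            spawnAccumulator -= 1.0f;
        }

        // Simulate: initial speed plus gravity, then retire particles past their lifetime.
        const float gravity = -9.8f;
        for (Particle& p : particles) {
            p.vel[2] += gravity * dt;
            for (int i = 0; i < 3; ++i) p.pos[i] += p.vel[i] * dt;
            p.age += dt;
        }
        particles.erase(std::remove_if(particles.begin(), particles.end(),
                            [](const Particle& p) { return p.age >= p.lifetime; }),
                        particles.end());
    }
};
```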

    Production

I like to see us VFX Artists as a kind of magicians, and I cannot help but see all the tricks I used to create this effect.

    This effect was made as a result of an art test. The requirements of the test were:

    The purpose of this test is to show your ability to create “AAA” effects.

• Create a multi-layered, magic-based attack that destroys a cube.
• The effect must consist of: anticipation, attack and impact, followed by the deconstruction of the cube.
• All textures must be of your own making.
• Make use of single-frame textures and/or, at most, one animated flipbook.

With those guidelines, the first thing I do is create a library of references (videos, still images, Pinterest, etc.). Some people who are better at drawing than me prefer to proceed with storyboards, drawings or paintovers, but for me it’s much quicker to gather references and assemble everything in my mind.

Once I decided on the overall look of the effect, I identified the main element: the rays. I decided I would go for the same technique I used to create the Teleport Smoke effect in Castlevania LoS2.

    3ds Max

    Directional rays

    First, we create one spiral-like mesh that will define the trajectory from the attacker to the victim. As the effect can be seen from different angles and I prefer not to use trails or beams, the mesh will consist of three planes rotated 60 degrees each.

Without the texture/shader it seems odd, but the end result will look like a three-dimensional ray.

    The next step is to create a spline that will define the trajectory of the ray. We’ll use this spline to apply a path deform modifier to the mesh.

Now we have the static ray. The way to make it appear and disappear is as simple as scrolling the UVs of the texture vertically via the shader. The trick here is to set the texture in Unreal to Clamped. This way, when the UVs are outside the 0-1 space, the mesh will be transparent. Otherwise we would see the ray repeating.

    You can check the differences here between the texture set to tile:

    And the texture set to clamp:
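In code terms, the difference between the two address modes boils down to what happens to V outside the 0-1 range. This tiny sketch (names are illustrative, not any engine API) mirrors what the vertical panning in the material plus the texture's address mode do:

```cpp
// Sketch of the clamped-UV trick: with a clamped address mode, any V outside 0..1
// keeps sampling the edge row of the texture, so if that row is fully transparent
// the mesh is invisible there. With tiling, the same V wraps and the ray repeats.
#include <algorithm>
#include <cmath>

float WrapV(float v)  { return v - std::floor(v); }                  // "Tile" address mode
float ClampV(float v) { return std::min(std::max(v, 0.0f), 1.0f); }  // "Clamp" address mode

// meshV is the V baked into the ray mesh; time * panSpeed is the vertical scroll,
// equivalent to a panner scrolling the UVs in the material.
float ScrolledV(float meshV, float time, float panSpeed, bool clamped)
{
    float v = meshV + time * panSpeed;
    return clamped ? ClampV(v) : WrapV(v);
}
```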

    Arc rays

OK, now we have the main element of the effect. Let’s dress it up a bit more. We need some kind of anticipation from the attacker, to make the player aware that something is going to happen. For this purpose I decided I would use the same rays but with a different trajectory. This time we’ll create a 200-degree arc and apply a path deform to the cross-section mesh.

    Fracture

The next element of the sequence is the explosion of the cube. There are several ways we could approach this in 3ds Max, but I did it with the RayFire plugin. I exported it as an animation to be used later in Unreal.

Unreal

    Now is the time to put it all together. I started by adding the anticipation effect. To do that I created a particle effect using the arcs we created. Making those meshes spawn randomly rotated 360 degrees and moving their UVs creates a surrounding effect on the sphere which resembles a shield. To break the stiffness of the meshes I decided to add GPU particles with a Vector Field. At first, with these two elements I didn’t see very clearly when to trigger the directional rays.

I realised I was missing something very important, and that turned out to be a status change. In order to trigger an attack I always like to give more punch to the anticipation for a short amount of time, so I decided I would change the color of both the meshes and the GPU particles to something more harmful. I chose orange-ish because it reminded me of fire, and red hues always help to express that something is dangerous. Also, to give more power to the effect I added an orange light, which helped to transmit the feeling of accumulated energy.

With the anticipation effect done it was time to emit that energy forward, towards the enemy. I wanted the effect to be powerful, so I created a particle effect consisting of three emitters, each one with a copy of the directional ray rotated 60 degrees from the previous emitter. I adjusted the emitter delays and the scrolling of the UVs so that the three didn’t spawn at the same time or with the same speed, but they all impacted on the same frame. This subtlety is something a VFX Artist has to pursue and spend time on. Most viewers/players won’t notice the tiny adjustments, but if you show this effect with and without the variations in delays/scrolling/sizes they will notice the difference.
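The timing relationship itself is simple. As a hypothetical sketch (not the actual Cascade setup), if each ray's UV scroll takes a different amount of time to reach the target, the delay for each emitter can simply be derived from a shared impact time:

```cpp
// Lining up staggered emitters so they still impact on the same frame.
// If ray i starts after delay d_i and its UV scroll takes t_i seconds to reach the
// target, picking d_i = impactTime - t_i makes every ray land at impactTime.
struct RayEmitter {
    float travelTime;   // seconds the UV scroll takes from spawn to impact
    float delay;        // computed start delay
};

void AlignImpacts(RayEmitter* emitters, int count, float impactTime)
{
    for (int i = 0; i < count; ++i)
        emitters[i].delay = impactTime - emitters[i].travelTime; // assumes travelTime <= impactTime
}
```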

The last part is the impact. The first step in Unreal is to trigger the fracture animation at the exact time via Blueprints. Then I created a particle effect for the destruction, consisting of different elements: light, sparks (GPU particles, and not all of them colliding, for performance reasons), lightnings, aftermath smoke, and screen distortion (which shrinks the cube for 0.1 seconds and adds extra power to the explosion). The special element in this effect was the lightnings. I created a single sprite texture with UV distortion to animate it via shader:
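The idea behind the UV distortion, sketched below with purely illustrative names (the Sample* helpers stand in for texture reads in the material), is to offset the lightning sprite's UVs with a scrolling noise texture so a single still frame appears to flicker and crawl:

```cpp
struct Float2 { float x, y; };
struct Float4 { float r, g, b, a; };

Float4 SampleLightning(Float2 /*uv*/) { return {0, 0, 0, 0}; } // stand-in texture fetch
Float2 SampleNoise(Float2 /*uv*/)     { return {0, 0}; }       // stand-in, values in -1..1

Float4 DistortedLightning(Float2 uv, float time, float scrollSpeed, float strength)
{
    // Scroll the noise over time, then push the lightning UVs around by a small amount.
    Float2 noiseUV  = { uv.x, uv.y + time * scrollSpeed };
    Float2 offset   = SampleNoise(noiseUV);
    Float2 warpedUV = { uv.x + offset.x * strength, uv.y + offset.y * strength };
    return SampleLightning(warpedUV);
}
```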

I also added Erosion, which is a very handy shader technique to make particles disappear with something better than a simple fade-out:
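Conceptually, erosion replaces the plain fade with a moving threshold on the particle's alpha, so the sprite dissolves away from its thinnest areas first. A minimal sketch of that math, with illustrative parameter names:

```cpp
#include <algorithm>

// alphaSample: the particle texture's alpha at this pixel (0..1)
// erosion:     0 at spawn, 1 at the end of the particle's life
float ErodedAlpha(float alphaSample, float erosion, float edgeSoftness = 0.1f)
{
    // Everything below the moving threshold disappears; a small soft edge avoids hard steps.
    float a = (alphaSample - erosion) / edgeSoftness;
    return std::min(std::max(a, 0.0f), 1.0f);
}
```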

This way we have the whole sequence done. Of course, after putting everything together there are some tweaks to make, both on the art side and in the timing.

    Tools

To visualize the effects, the most obvious choice would be to get started with Unreal 4. Not because I think it’s the best, but because it has the biggest community, so you will find tons of tutorials, examples, a marketplace, etc. to get started with, and from there you can search YouTube for many more tutorials.

Then we have Unity, which I haven’t had the chance to use much because most of my career has been spent with proprietary engines, and the only times I worked with a commercial engine it was either Unreal 3 or 4.

Talking about asset creation, there are some pieces of software which are very useful for us VFX Artists. FumeFX, for example, is very good for getting animated fire/explosion/smoke textures, and I have to thank the developers because it was one of the first plugins I found when I started in this field, and it’s pretty impressive how they managed to build a tool with which you can get very nice results in a very short amount of time.

Lately I’ve been using Blender a lot. I started just because it is free and I didn’t have to mess with licenses and such, but now I really like it. You can get the same as FumeFX plus fluids, rigid body simulations, fracture tools, etc. With it I managed to get ocean meshes done for some multiplayer maps in Gears 4, blood textures for Red Dead Redemption 2, oil splashes, etc.

    Blood splatter simulated and rendered in Blender

My most used software is 3ds Max, mostly for creating low-poly meshes, UV mapping, rigging, etc. Then there’s also Adobe Photoshop, to tweak rendered images or create textures from scratch to apply to the particles.

    Animation

Most of my experience as a VFX Artist so far has been on the gameplay side of effects. That means most of the time I was given an animation I had to stick to, adding the extra layer of effects (blood, magic, impacts, explosions, etc.) on top. The animation I’m referring to could be an in-game animation or a bespoke one for a cutscene.

When working with animations, VFX are usually attached to bones, helpers, objects, etc. that are in the scene. This is a back-and-forth workflow between the animation and VFX teams, where the latter can ask for changes or additions to the sequence, such as adding extra keyframes to make a sword trail smoother, rotating a helper to use its rotation as the direction of a blood splatter, or changing a camera cut to avoid a compromising point of view for a mesh particle.

Also, there are some features of the game engine that can help with the animation or the dynamics of the effects. For example, when I was working at MercurySteam, as we were using a proprietary engine and I was very close to the core technology team, I was lucky enough to be allowed to request new engine features. I benefited a lot from this, as I could ask for new ways to link effects to a bone: I was able to link an effect to a bone using its position but not its rotation, or take the bone’s position and rotation but always use the Z (height) of the ground (very handy for linear fire attacks in Castlevania).
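Purely as an illustration of what those attachment modes look like, here is a sketch with made-up names and types; it is not the actual engine feature, just the transform logic the paragraph describes:

```cpp
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };
struct Transform { Vec3 position; Quat rotation; };

enum class AttachMode { PositionOnly, FullTransform, FullTransformGroundZ };

Transform ResolveEffectTransform(const Transform& bone, AttachMode mode, float groundZ)
{
    Transform out = bone;
    switch (mode) {
    case AttachMode::PositionOnly:
        out.rotation = {0, 0, 0, 1};   // keep the bone's position, ignore its rotation
        break;
    case AttachMode::FullTransformGroundZ:
        out.position.z = groundZ;      // follow the bone but stay glued to the ground
        break;
    case AttachMode::FullTransform:
    default:
        break;
    }
    return out;
}
```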

I remember it was especially satisfying when one of my requests was implemented without the programmer being very convinced that what I was asking for was going to work. I asked for a way of setting the initial velocity of the particles over the lifetime of the emitter. Using this method with a trail, I was able to get a blood spurt that never disconnected from the point of impact. This extra layer of meticulousness added the feeling of viscosity I was after, and I was very proud of it. In fact, in all the projects I’ve worked on (I kind of got obsessed with blood after working on Castlevania) I’ve always requested that feature, which is also very useful for other things such as water, oil, etc.

I remember one programmer approached me one day and asked: “How the hell did you do that blood effect that comes from the hand of the enemy in this cutscene?” I had to remind him he was involved in the implementation of the feature, and that it had finally paid off.

    Framerate 

One of the big tasks of a VFX Artist when it comes to implementing assets in the game is optimisation. The word we say and hear the most is overdraw. Usually game engines render the final image in passes (albedo, roughness, metalness…), and every time there is a particle on the screen, depending on its rendering type, it impacts the rendering pipeline. For example, if we have two translucent particles on screen which are overlapping, the rendering cost of the overlapped pixels will be double. This is especially bad for performance when we have full-screen particles, as the engine would have to render 1920×1080 pixels (at Full HD) that many extra times. Here is a good thread to learn more about overdraw.
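As a back-of-the-envelope illustration (the numbers here are made up, not from any real profile), the cost of a translucent effect scales roughly with how much of the screen it covers times how many layers overlap:

```cpp
// Rough overdraw estimate: every translucent layer covering a pixel is an extra
// shading pass for that pixel.
#include <cstdio>

int main()
{
    const long long screenPixels = 1920LL * 1080LL;  // Full HD
    const float coverage = 1.0f;                      // full-screen particle
    const int overlappingLayers = 2;                  // two translucent sprites stacked

    long long shadedPixels = (long long)(screenPixels * coverage) * overlappingLayers;
    std::printf("Pixels shaded for this effect: %lld (vs %lld for one opaque layer)\n",
                shadedPixels, screenPixels);
    return 0;
}
```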

Most of the time our elements are not the most complex ones: mesh particles have a low triangle count, shaders are (usually) not very complex, particles are lit in a much cheaper way than characters or environments, and sometimes particles are not lit at all. Even with that in mind, particles can get very expensive at runtime due to the way they are rendered. Roughly, there are three major ways of rendering particles (called Blend Modes in Unreal):

Opaque: this is the cheapest, but it’s barely used unless we are talking about mesh particles. It is the cheapest method because particles overlaying others occlude them, so the engine can discard hidden pixels more easily.

Alpha masked: this is also called alpha test, where transparency values are either 0 or 1. So we either have visible or invisible pixels, with no semi-transparency at all. This is the cheapest way of achieving transparency, at least in console games. The opaque pixels on the sprite occlude the pixels underneath, just like the opaque method explained above.

Translucency: this is also called alpha blend. In this case we don’t have any occlusion whatsoever, which makes it the most expensive method to render a particle. On top of the performance cost, it’s hard for engines to sort the rendering of semi-transparent pixels because they usually don’t write Z depth (opaque and alpha-masked particles do). This is why you’ve most likely seen popping in some games (fog, dust, windows, etc.).

    We also have two lighting methods:

• Unlit: particles are not lit at all. All the lighting has to be baked into the texture or faked via the shader (not totally accurate).

    • Lit: particles get similar lighting to the environment and props. This method is trivial on opaque/alpha masked particles because they store Z depth information. However, by default translucent particles don’t store that info, therefore the engine doesn’t know where these particles belong in the world. Enabling lit shading method on translucent particles adds an extra layer of complexity to the render of every particle.

Adding the lit shading method to translucent particles allows semi-transparent particles to store Z depth info. This is expensive per se, but nowadays graphics cards can handle it if used carefully.

    Best practices:

• Use the texture space wisely. When possible, avoid too many transparent pixels in the texture by getting the opaque pixels as close as possible to the borders.

    • Use alpha masked when possible.

    • If the engine supports it, use clipping or hexagonal shapes when possible.

    • Use spawnrate with caution and make a compromise.

    • VFX Optimization guide

    Example:

When we were working on the Gears of War 4 multiplayer maps, we had to be especially careful with the performance impact of our work. Our goal was 1080p at 60 fps, so we had to be clever to achieve the expected quality while always keeping the gameplay smooth.

There was a level with a grass field, and after the profiling phase the technical art team pointed to the grass as the most expensive element on the map. So after some discussions between the environment art, technical art and VFX art departments we decided to try a technique: I was in charge of making the grass sprites so that the further they were from the camera, the fewer transparent pixels they had. I created a diffuse texture covered with grass blades, but the opacity was authored so that I had groups of blades with different grey values. The shader then increased that grey value the further away the camera was, so that more blades became visible. As the shader was set to alpha masked, those blades were only visible when the alpha reached 1. This way we saved tons of overdraw at long distances.
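A minimal sketch of that trick, with illustrative names and thresholds (not the actual Gears 4 material): each blade group's painted grey value gets boosted with camera distance, and only pixels that reach the alpha-mask threshold survive.

```cpp
#include <algorithm>

bool GrassPixelVisible(float opacityGrey,      // 0..1 grey painted per blade group
                       float distanceToCamera,
                       float fadeStart,        // distance where the boost begins
                       float fadeEnd,          // distance where every group is fully boosted
                       float maskThreshold = 1.0f)
{
    float t = (distanceToCamera - fadeStart) / (fadeEnd - fadeStart);
    t = std::min(std::max(t, 0.0f), 1.0f);
    float boosted = std::min(opacityGrey + t, 1.0f);
    return boosted >= maskThreshold;   // alpha-masked: the pixel is either kept or discarded
}
```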

Another example of optimisation and tricking the player is deleting particles when they get close to the camera. The particles don’t actually get deleted; they shrink so that they become invisible. We did this on Gears 4 too with smoke particles, light shafts, etc. In the shader we made the particles become gradually transparent once they were within a certain distance from the camera (set in the shader), and when they were totally transparent we shrank the vertices. This last step was needed because Unreal wasn’t clever enough to skip rendering a particle that was totally transparent, so we were getting overdraw issues.
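Sketched out with made-up parameter names (this is only the math the paragraph describes, not the actual shader), the near-camera fade plus vertex shrink could look like this:

```cpp
#include <algorithm>

struct NearCameraFade {
    float fadeEnd;    // distance at which the particle is fully transparent
    float fadeStart;  // distance at which fading begins (fadeStart > fadeEnd)
};

// Returns the opacity multiplier and writes the vertex scale to apply to the sprite.
float ApplyNearCameraFade(float distanceToCamera, const NearCameraFade& fade, float& vertexScale)
{
    float t = (distanceToCamera - fade.fadeEnd) / (fade.fadeStart - fade.fadeEnd);
    float opacity = std::min(std::max(t, 0.0f), 1.0f);

    // Once fully transparent, collapse the quad so no pixels are rasterized at all.
    vertexScale = (opacity <= 0.0f) ? 0.0f : 1.0f;
    return opacity;
}
```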

    Advice

I don’t know any VFX Artist who doesn’t know about imbueFX. Don’t worry about the content being a bit “old”; what these guys gave us all with those videos is still invaluable.

There are also some breakdowns out there that can easily be followed by anyone who is interested in the field and has basic knowledge of a 3D package. The official Unreal Engine 4 website is also a good spot to learn:

    Unreal 4 Particles Systems

Some years ago I was invited to a Facebook group called Real Time VFX. I never imagined I would get so much knowledge from this platform. Recently some guys decided to create its own domain and, although we still share our knowledge on Facebook, we have more structured content now.

Real Time VFX

There you’ll find the best VFX artists in the industry, people from Naughty Dog, Epic, Infinity Ward, etc., all of them sharing their tips and tricks and always open to help.

Francisco García-Obledo Ordóñez, Senior VFX Artist at King

    Interview conducted by Kirill Tokarev

