
Sharks and Blood: How Tripwire Interactive Developed Maneater

Bill Munk, Game Director at Tripwire Interactive, shared the story behind Maneater: open-world and underwater lighting, shark design, blood VFX above and in water, fish behavior driven by the particle LOD system, and more.


Hello, my name is Bill Munk. I’m the game director for Maneater, the very first action-RPG where you play as a shark! Prior to Maneater, I’ve had the good fortune to be a partner at Tripwire Interactive for 15 years, working on titles like the original Killing Floor and Killing Floor 2. Tripwire Interactive is both developing and self-publishing Maneater.

Maneater: Concept

The original concept for Maneater came from Alex Quick, who folks may recognize as the lead for the original Killing Floor mod. And he actually also worked on Depth! For readers who might not be familiar, Depth is a multiplayer title that pits sharks against human divers.

While both games feature playable sharks, that’s really where the similarities end. The biggest differences stem from the genre. Maneater is a single-player, open-world, action-RPG (we’re calling it a shArkPG). With that comes a variety of different environments to explore, our combat system, evolution system, and a story! Lots of people forget that Maneater has a plot, which finds the player starting off as a baby bull shark, torn from its mother’s belly at the hands of a shark hunter named Scaly Pete. It’s a dueling tale of revenge and we hope players enjoy the ride!


The Environment Team worked very hard to make the world of Maneater look as good as possible and provide players with an interesting place to explore. The city in the background is actually all modeled and exists in the world, so it was honestly pretty straightforward. Everything done by the Environment Team was modeled in 3D Studio Max and textured using Substance Painter and Designer.


With stylized outdoor spaces like what was created for Maneater, you want lighting to be based on reality without becoming too photo-real, which might create a weird disconnect between the lighting and world art. The key issue to solve is creating a general lighting palette that works for every area of the world. As far as actually populating the thousands of light actors across the game, that was made possible thanks to auto-placement tools created by Michael Kissinger.

The two main lights to consider for open-world lighting would be the influence of the sunlight and skylight. In our case, we pushed the sunlight exposure a bit to make things above water a little "hotter" and to expand contrast, while the skylight (ambient) lighting used hemispherical HDR images based on sunrise, morning, noon, afternoon, sunset, and night time situations that were sampled from reality. Having real lighting data drive ambient lighting and reflections allowed reflective materials and the water surface to respond more like what you'd expect. 


Lighting is a little different for underwater since so much red light is scattered away in reality. This was a realistic behavior we chose to emulate by subtracting the red channel contribution using post-process color grading. It's difficult to make underwater feel realistic without this step! Heavy use of post diffuse blooming helped soften the look in addition to the direct interaction of world and local lighting with the water thanks to the use of volumetric fog and shadowing. You can see the volume fog interaction clearly in sewers where you have strong shafts of light along the sides of the tunnels, or during sunset times when long tree shadows pierce the swamp water. 
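The red-subtraction step can be sketched as a simple depth-based attenuation. This is a minimal, engine-agnostic illustration of the idea in Python (in UE4 it lives in post-process color grading, not gameplay code), and the falloff constant here is hypothetical rather than a value from the game:

```python
import math

def underwater_grade(rgb, depth_m, red_falloff=0.15):
    """Attenuate the red channel with depth to mimic how water scatters
    red light away first; green and blue are left to the fog system."""
    r, g, b = rgb
    r *= math.exp(-red_falloff * depth_m)  # red fades fastest with depth
    return (r, g, b)
```

Swapping the attenuation target (more amber for swamps, more blue for open ocean) is how the per-region tweaks described below would slot into the same scheme.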


With these base behaviors dialed in, the only major changes we made for the different regions were volume fog extinction scale (depending on the water type and how dirty it was) and tweaking the red subtraction behavior; more amber-colored in muddy swamp waters due to suspended dirt particles, and more blue for open ocean water.

Character Design

Designing the shark and all the underwater lifeforms players will encounter in Maneater sure is a different beast! Getting a realistic look and movements was definitely important to the team, but our overarching goal was to make the shark in Maneater feel like a culmination of all the things everyone loves about sharks – and make it feel fun to play.

Part of that comes through in combat, especially when players come across a rival apex predator. In these encounters, players will need to keep an eye on their enemy, looking for “tells” or hints at upcoming attacks. This back-and-forth dance when facing a stronger predator in the wild feels much different when preying upon weaker targets or human prey. And that variety is important for making sure the game feels fresh and exciting with each new encounter. 

The feeling of weight and mass was also really important to capture. We wanted players to feel more powerful when controlling a larger, more evolved shark and more vulnerable during the earlier life stages. In this sense, mass plays a big role in how it feels swimming through the deep blue sea as a large and powerful shark as opposed to a baby bull shark exploring shallower waters. Mass also affects the players' ability to grapple and damage other wildlife and humans. How players approach rival apex predators, underwater lifeforms, and human threats will change depending on the context of the situation and stage of life of the player’s shark.


As an aside, while we did use motion capture for human characters in the game, weight and resistance still came into play. We had a lot of funny moments trying to capture the actions of humans moving in shallow water. We wound up using a rope tied around the actors’ waists to simulate resistance. It was great fun to watch and direct. We also had a colleague dress up as a shark so we could capture his mocap as the shark. You can see the full thing in our latest Maneater development diary here:

Shark Animation

Actually, all of the shark and wildlife animations were hand-animated. We worked very hard on creating the illusion of water resistance and floating through keyframed animation. This entails thinking about force, momentum, where force is created, and where it's going. I got deep in the weeds reading academic studies on shark fins, what sharks use each fin for and why, just trying to understand HOW sharks move. We watched a ton of footage of real sharks in their element and then got to work on creating that illusion.

Blood VFX

Maneater is a game where you take control of one of the world’s deadliest marine predators. So you can imagine that blood would play a vital role in the development of the game. Working on the blood for Maneater was a unique experience that came with its own set of challenges. We needed to keep in mind that the blood effects had to have a distinct look to them above and below the water. We also had to be aware of the multiple size variations that our player shark would have access to (Pup to Mega). One of the biggest challenges that we faced throughout the production was getting particles to “play nice” with the large transparent ocean plane that spanned the world. Particle sorting, particle performance, and overall particle interactions with the ocean plane were a constant challenge for our VFX team.

For Maneater we decided to take the blood and turn up the dial to eleven. We had to separate the blood into two styles, above water and below water. For the underwater blood, we used a combination of volumetric cloud emitters, animated smoke ribbons, small masked particulate particles, and animated smoke textures for impacts. We used the volumetric material particles because they gave us the ability to blend the blood with the underwater fog lighting. This made the blood puffs look shadowed and 3-dimensional when they were in the correct lighting scenario. It was a big win for us because of the constant lighting changes from the game’s night and day cycle. These particles also helped us fill more space with fewer particles, which was a big plus since we were on a tight budget.

The animated blood ribbons were particularly tricky to get looking the way we wanted. In Unreal Engine 4, we could not get animated textures to read correctly on ribbon particles. You are usually able to control texture animation values with a module inside Cascade, but we needed to have the texture animate in the material instead of the emitter itself. So we created a material that could have the number of frames set inside and have its values managed through material instances. The material would be able to run through the animation frames (both advancing and cropping) without the assistance of the subimage index module inside Cascade. We used this material on many different types of underwater particles like blood, smoke, oil, and radioactive goop. This removed the need to get into Cascade, which made it much easier for any other artist who was not familiar with the tool to go in and add underwater liquid particles to the levels while still keeping the fidelity we originally wanted.
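The frame-advancing-and-cropping math a flipbook material like this performs is straightforward. Here is a sketch of it in Python (the real version would be built from material nodes evaluating the same arithmetic per pixel; the function name and parameters are illustrative):

```python
def flipbook_uv(u, v, time_s, fps, columns, rows):
    """Crop a base UV into the sub-rectangle for the current frame of a
    sprite sheet, advancing frames from elapsed time -- the same math a
    material can run instead of Cascade's subimage index module."""
    frames = columns * rows
    frame = int(time_s * fps) % frames      # which frame to show now
    col, row = frame % columns, frame // columns
    return ((u + col) / columns, (v + row) / rows)
```

Because the frame count and playback rate live in the material, a material instance can retune them per effect without anyone opening Cascade.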

For the blood above water we used a more traditional approach. We relied mostly on dissolving masked materials and animated textures. The dissolving particles gave the illusion that the blood was dissipating and helped us dramatically with performance costs. These particle types were also reused when we needed to create any kind of water splashes because of our concerns with overdraw on the ocean plane. At this point, we realized that things were going to get a little tricky with drawing particles in front of the ocean. If we drew underwater particles in front of the ocean then the above water particles would sort behind the ocean. We decided that we had to write a sorting system that would know where the particles were located relative to the ocean plane. Depending on their location, the particles were given a sort value that would draw them in front of the ocean plane regardless of them being under or above water. During this time, our tech team was also busy with the look and performance of the ocean, and our environment team was busy making changes to the look of the world. We all had to coordinate and make sure that particles and sorting values were always up to date with the new art changes. The look of our ocean changed often during production so there was a lot of back and forth until we were happy with the final product. 
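The sorting rule such a system applies can be sketched like this. Note this is my simplified reconstruction of the idea in Python, not Tripwire's actual code; the camera-side test and the sort values are assumptions:

```python
OCEAN_SORT = 0  # the ocean plane's own translucency sort value (illustrative)

def particle_sort_value(particle_z, ocean_z, camera_z):
    """Sort a translucent particle system relative to the ocean plane.
    Systems on the camera's side of the surface must draw after (in
    front of) the ocean; systems on the far side draw before it."""
    same_side = (particle_z >= ocean_z) == (camera_z >= ocean_z)
    return OCEAN_SORT + 1 if same_side else OCEAN_SORT - 1
```

The key point is that the decision depends on where the particle sits relative to the waterline, which is why the values had to be kept in sync as the ocean's look (and height) changed during production.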

The second issue we had was dealing with the transition from underwater to above water and back. If the player has an enemy in their mouth which is spewing blood underwater, what would it look like when the player jumped out of the water? Blood has a very distinct look when in air vs being in a liquid. We needed to make sure that the underwater effects only played underwater and the above effects only played on the surface. So, we answered this question by creating two systems. First, we created a material function to separate the above water materials from below so that no underwater particle would cross the waterline and vice versa. It used the absolute world position node to test if the particle was above or below the waterline, then it blended between opaque and transparent. This would make the particles fade out and disappear as they were crossing the waterline. We then had to set up the timing on both ends of the waterline so that when one particle system would start to disappear the other would start to appear. This helped give us the illusion of a seamless transition between the waterline. The second thing we had to do was to create a custom Cascade module that would stop the particles from spawning when transitioning through the waterline (to help with performance concerns). With both of these systems, we were able to make the jumping in and out of water feel way more natural. When you play, you can see that the underwater mist, bubbles, and blood disappear and change into above water blood and splashes, then transition back into underwater effects when you land back into the water. It was a very satisfying time for all of us on the VFX team when we were able to get this system looking good. 
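The waterline material function's behavior can be sketched as a clamped fade on signed distance to the surface. This Python version is a minimal illustration (the real material uses the absolute world position node and blends between opaque and transparent; the fade distance here is hypothetical):

```python
def waterline_opacity(particle_z, water_z, underwater_fx, fade_range=25.0):
    """Fade a particle out as it approaches the waterline so underwater
    effects never show in air and vice versa. Mirrors a material that
    compares absolute world position against the water height."""
    d = particle_z - water_z          # signed distance: + above, - below
    if underwater_fx:
        d = -d                        # underwater FX fade as they rise
    return max(0.0, min(1.0, d / fade_range))
```

Timing the above-water system to fade in over the same band where the underwater system fades out is what sells the seamless transition described above.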

When it came to the blood pools on the surface of the water, we tried various methods, but we could not get them to look the way we wanted because of the constant movement of the ocean surface. We decided to use a material simulation on the ocean surface itself so that it would follow the waves and movement more closely and give us a more believable overall effect. The simulation code was able to read where the action had happened, and then it would alter the surface color channel of the ocean plane using a mask. After spending time dialing in the values for this system, we reached a point where we were all happy with the overall look.
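Conceptually, that mask works like a persistent splat buffer the ocean material samples. The Python sketch below stands in for the idea only; the grid substitutes for a mask texture, and the resolution, falloff, and function name are all hypothetical:

```python
W = H = 64                               # stand-in for a mask texture
mask = [[0.0] * W for _ in range(H)]

def splat_blood(mask, u, v, radius_px, strength=1.0):
    """Accumulate a radial-falloff stain at the UV where the action
    happened; the ocean material would tint its surface color by this
    mask, so the pool rides the waves with the surface it is part of."""
    cx, cy = int(u * W), int(v * H)
    for y in range(max(0, cy - radius_px), min(H, cy + radius_px + 1)):
        for x in range(max(0, cx - radius_px), min(W, cx + radius_px + 1)):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d <= radius_px:
                fall = 1.0 - d / radius_px
                mask[y][x] = min(1.0, mask[y][x] + strength * fall)
```

Because the stain lives in the surface's own UV space rather than as a separate mesh, it deforms with the waves for free, which is exactly what floating decal meshes failed to do.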

Our next issue was that all of the blood effects needed to run smoothly on PC and all of the consoles the game would be ported to. Since we were working on them in tandem, we needed to create separate resolution-scaled emitters (high, medium, and low) within each particle system so that code could switch between them when needed. We ended up using the Unreal Cascade detail mode bitmask to separate the emitters by resolution and used additive or dithered masked materials on the lowest detail. This way we could easily scale the FX up or down depending on which platform they were on. It also helped us remove emitters that were too expensive for lower-end consoles but keep them in for the higher-end ones. This technique was used for almost all of the effects in the game.
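The bitmask selection itself is simple; here is a Python sketch of the pattern (the tier flags and emitter names are made up for illustration, and Cascade evaluates its detail mode internally rather than through code like this):

```python
# Hypothetical tiers mirroring a high/medium/low detail-mode bitmask.
HIGH, MEDIUM, LOW = 0b100, 0b010, 0b001

EMITTERS = [
    ("blood_volumetric", HIGH),                  # too heavy below high-end
    ("blood_ribbons",    HIGH | MEDIUM),
    ("blood_mist_cheap", HIGH | MEDIUM | LOW),   # dithered masked fallback
]

def active_emitters(platform_tier):
    """Keep only the emitters whose bitmask includes this platform's
    tier, so one particle system scales itself across PC and consoles."""
    return [name for name, bits in EMITTERS if bits & platform_tier]
```

Packing all three tiers into one particle system, instead of authoring three systems, is what keeps the scaling a data change rather than a content change.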

Creating Fish/Bird Behavior with Particle LODs

One other major challenge that we faced was finding a way to fill the world with life other than enemy AI. We were looking for a way to populate the environment with dynamic fauna but did not have the technical bandwidth to accomplish it. The team had experimented with different methods of adding ambient fish and sea life but unfortunately, because of the size of the world, everything that was tried ended up being too heavy on performance. The main processing cost had to be saved for all the enemy AI that would need to interact with the player shark. So the VFX team decided we would try and tackle this issue. In the end, the idea we came up with consisted of us “hijacking” Unreal’s LOD system. The LOD system works by creating multiple Levels of Detail relative to how close or far away you are from the emitter. Normally, the farther away you are from said emitter the lower the number of particles you will see. This is meant to keep particles from using too many resources when far away. For our solution, we used this system in the opposite way. Let’s take the fish, for example; when you swim around in the world of Maneater you might notice small fish that swim off and disperse when the player shark gets near. Or you might see a group of birds that fly away if you get too close. Other things you might see that behave this way are frogs, squid, anemones, and some garbage clumps. 

We achieved this by having two systems working together. First, we set up the particle LODs in the opposite way you normally do. Since each environmental effect would be first encountered at a distance, we started with the farthest LOD, which would reset or turn on the particle system. Then when you get closer to the emitter, the next LOD would kick in. In this second LOD, the full fish school is spawned and the fish idly swim around in a designated volume; we call this second LOD the idle state. As the player shark approaches the fish school, the next LOD would switch to an emitter with a shorter lifespan so it could constantly “check” whether the player was close enough to trigger the action. Once the player reached the nearest LOD, the emitter would turn on an acceleration module making the idle fish “escape” in various directions. In the case of birds, it would kill the idle particles and spawn an “escaping” particle (a bird flying, squid swimming, or frog jumping away). Once the subject “escaped”, the particles would die and this nearest LOD would have no more particles spawned until the shark was far enough away to spawn the idle particles again (farthest LOD). Then the cycle would repeat if you swam close to the emitter again. This method saved us a lot of code work and computation time since it was all being done within the particle system itself. We were essentially able to create fake “interactive” particles that used the LOD system we were already paying resources for.
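The ladder of states can be summarized as a distance-to-behavior mapping. The Python sketch below is illustrative only; the distance bands are invented, not the game's tuned values:

```python
def fish_lod_state(distance_to_shark):
    """Map player distance to the behavior each LOD triggers -- the
    'hijacked' use of a system meant only to thin out particle counts."""
    if distance_to_shark > 120.0:
        return "reset"    # farthest LOD: re-arm the particle system
    if distance_to_shark > 40.0:
        return "idle"     # full school spawned, swimming in its volume
    if distance_to_shark > 15.0:
        return "check"    # short-lifespan emitter polling for the player
    return "escape"       # nearest LOD: acceleration fires, fish scatter
```

Since the LOD system already evaluates these distance bands every frame, the "state machine" costs nothing beyond what the particles were paying anyway.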

The last bit of the puzzle was making sure that emitters that were far away or not visible were turned off. We couldn't have tons of emitters spawning particles in the distance because of performance issues. So we worked with the programming department to help create a culling system that was designed to make sure that these particles were only active if you were within a certain range from them. There was a lot of time spent getting the values for each LOD range set up correctly. We needed to make sure that fish and birds didn’t spawn or pop in while you were close to them, as this would kill immersion. Once we got this all working, the world of Maneater felt more alive than ever before.
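A range cull like the one described reduces to a squared-distance check per emitter. This Python sketch is an assumption of how such a check would look, not Tripwire's implementation; the important constraint, reflected in the comment, is that the cull radius must sit outside the farthest LOD band:

```python
def emitter_active(emitter_pos, shark_pos, cull_radius):
    """Tick an ambient-life emitter only while the player is in range.
    cull_radius must exceed the farthest LOD distance, or fish would
    pop in right next to the player and break immersion."""
    dx = emitter_pos[0] - shark_pos[0]
    dy = emitter_pos[1] - shark_pos[1]
    dz = emitter_pos[2] - shark_pos[2]
    # Compare squared distances to avoid a square root per emitter.
    return dx * dx + dy * dy + dz * dz <= cull_radius * cull_radius
```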


It was a wild ride working on Maneater and the bumps we ran into made us better developers in the end. Every game is a new learning experience, each one offering different puzzles to solve. On to the next one!

Maneater was released on May 22, 2020, and is available on PlayStation 4, Xbox One, and PC via the Epic Games Store. It will also be heading to Nintendo Switch soon, though we haven’t formally announced a release date for that platform yet.

Bill Munk, Game Director for Maneater at Tripwire Interactive

Interview conducted by Daria Loginova
