Simulating Liquids in Bottles with a Shader

Gil Damoiseaux recently shared the result of his attempt to recreate the liquid-in-a-bottle effect from Half-Life: Alyx with a surface shader in Unity. In this article, he shares the technical details of his experiments with different kinds of liquids.

Introduction

Hi everyone, my name is Gil Damoiseaux, I’m a lecturer at HEAJ (Haute Ecole Albert Jacquard) in Namur, Belgium. I mainly teach technical art, specifically shader development and the usage of technical art tools like Substance Designer and Houdini, as well as provide guidance on projects. I’m also working with JangaFX as a technical advisor and R&D developer; we are currently working on EmberGen, a real-time volumetric fluid simulation software for games and film. Initially, I studied computer science during a master's program at the local university in Namur, the usual database- and OS-related matters.

I then worked for a game company called Appeal where we released a couple of games, including Outcast in 1999, a voxel-based adventure game. I also spent 8 years at NeuroTV, developing a real-time rendering engine for television used for virtual sets, avatar animation, or just show branding. I went back to game development with AMA studio where I worked on Fighter Within, a Kinect fighting game and one of the Xbox One launch titles. Alongside this, I was already giving some courses, and I then decided to focus on that and become a full-time lecturer in video games. Six years ago, I started working as a self-employed freelancer and created Grand Designer, a real-time procedural planet generator, which I spent 4 years improving and adding features to. I joined JangaFX a year and a half ago because we shared common goals for tools development, and it made sense to work with them since I was building a similar product on my own.

Inspiration

I’ve always been fascinated by art creation and the creative process in general, so when I had some free time, I learned different DCC tools from my art colleagues and on my own, but I kept coming back to programming every time. And so, naturally, as the technology evolved, I embraced shader development as my main center of interest.

I was and still am inspired by the demoscene in general; it’s always extremely creative and impressively technical in many ways. I was part of it 30 years ago, at the time of the Archimedes scene (the first desktop computer to use an ARM processor), when I coded in assembly language... And more recently, the whole creative coding scene, with people like Etienne Jacob, or generative art in general is a huge source of inspiration. Of course, the whole game industry inspires me too: I can't help but analyze the effects I see on the screen and sometimes try to reproduce them.

Alyx VFX Experiment: How It Started

Like many of us, I was quite impressed by that VFX as soon as I saw it, wondering how it was done to look that natural and real. Then, I saw people here and there on Twitter and Discord trying to reproduce it, and I took up the challenge to reproduce the effect too, as a fun weekend project. It is always a good exercise to try to reproduce a nice effect, and always an opportunity to push things further, to learn new techniques, and to see how close to the original you can get. I’ve done that previously with some other effects, like the root animation in God of War (see the final effect here and the breakdown here) or the interior mapping in Spider-Man on PS4 (available on my GitHub). I always learn new things by doing this kind of exercise.

Previous Work on the Liquid in a Bottle Effect

Some time ago, I made a liquid-in-a-bottle effect, but a much, much simpler one, without any physics, simply a liquid level that you can change, but even that had limitations. The idea was to render the inner part of the container and collapse the highest vertices onto the surface of the liquid. This has a big advantage compared to the usual alpha-test technique used to cut an object: you get the actual surface of the liquid, with proper lighting. For that, I simply used a planar projection of a noise texture, with a normal texture matching that noise. This shader can be found in the community samples of Amplify Shader Editor.
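
To illustrate the basic idea, here is a minimal HLSL sketch of that vertex collapse (the `_FillLevel` property and the object-space setup are my assumptions, not the original shader's parameters):

```hlsl
// Minimal sketch of the vertex "collapse" idea, not the original shader.
// _FillLevel is a hypothetical material property: the liquid height in object space.
float _FillLevel;

float3 CollapseToLiquidLevel(float3 positionOS)
{
    // Any vertex of the inner mesh that sits above the liquid level is flattened onto the
    // liquid plane, which gives an actual, lit surface instead of an alpha-tested cut.
    positionOS.y = min(positionOS.y, _FillLevel);
    return positionOS;
}
```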

But there is also a big restriction with this technique: you can’t have a container that is narrower at the bottom, as the liquid would hang out of the container. You can see in the following example that I’m not lowering the liquid too much to avoid the problem.

You can see the original effect here, with the wireframe version to watch the collapsing in action: 


Last year, I also made some shaders during a workshop held at school; they were using compute buffers to store values on a per-vertex basis. The main problem with shaders (vertex shaders, in this case) is that they are stateless: you cannot store anything in one frame to get it back in the next frame. So, it’s not really straightforward to have, for instance, some physics going on inside a single shader… unless you use compute buffers in read and write modes. This allows you to store and recall as much data as necessary per vertex. Shaders may also be called multiple times per frame, for the Z pre-pass, shadows, or other reasons, so I had to implement double buffering on the compute buffers: one was in read mode, the other one in write mode.
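
A minimal sketch of that read/write split could look like this in HLSL (the buffer names and the `float4` payload are placeholders of mine, not the original code):

```hlsl
// Minimal sketch of the double-buffering idea. The C# side binds the buffers (e.g. via
// ComputeBuffer and Graphics.SetRandomWriteTarget for the UAV) and swaps which one is read
// and which one is written every frame, so extra invocations of the vertex shader
// (Z pre-pass, shadows...) never corrupt the state. Requires shader model 5.0, i.e. a
// platform that allows UAV writes from the vertex stage.
StructuredBuffer<float4>   _StateRead;   // state written last frame, read-only this frame
RWStructuredBuffer<float4> _StateWrite;  // state being written this frame

float4 LoadState (uint vertexId)           { return _StateRead[vertexId]; }
void   StoreState(uint vertexId, float4 s) { _StateWrite[vertexId] = s; }
```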

This is the shader done during the workshop in action, a simple pendulum effect on every vertex with a normal pointing down:

So, the original idea was to mix those two previous experiments into a new one. The surface of a liquid basically reacts like a network of springs. To get more control over it, I decided to use springs moving along one axis, the up-down one, as it is the most visible movement of the liquid, and it also prevents the surface of the liquid from going out of control, outside of the container. It replaces the noise texture I was using before and adds physics, although it is not a real liquid simulation. I was also expecting that I could get something decent without having to propagate the spring movement across the surface, just keeping the movement local… and it was indeed the case.

So, when the object is translated or rotated, I simply push the position of each spring up or down, depending on its position relative to the pivot of the mesh: if I move the object to the right, vertices on the left are raised and vertices on the right are lowered, in order to simulate the accumulation of liquid in the direction opposite to the movement. After that, the spring physics kicks in and tries to bring each spring back to its rest position. The further away the spring is from its rest position, the stronger the force pulling it back. In addition to that, the current velocity is dampened every frame to avoid springs that never come to rest. That was the first implementation of the surface physics, leading to the following result:
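
As a rough sketch of that per-vertex spring update (my own simplified reconstruction, with hypothetical parameter names, not the original code):

```hlsl
// prevPos is the vertex's world position from the previous frame (read from the compute buffer),
// worldPos is its position this frame; springOffset/springVel are the stored spring state.
void StepSpring(float3 worldPos, float3 prevPos, float3 pivotWS, float dt,
                float stiffness, float damping, float injectionScale,
                inout float springOffset, inout float springVel)
{
    // Inject the object's motion into the spring position: vertices on the side the object
    // moves away from are pushed up, the opposite side down, faking liquid piling up
    // against the direction of the motion.
    float2 velocity  = (worldPos.xz - prevPos.xz) / dt;
    float2 fromPivot = normalize(worldPos.xz - pivotWS.xz);
    springOffset -= dot(velocity, fromPivot) * injectionScale;

    // Spring force pulls the offset back toward its rest position (zero):
    // the further away, the stronger the force.
    springVel -= springOffset * stiffness * dt;
    // The higher the damping, the more velocity is lost each frame, the thicker the liquid feels.
    springVel *= saturate(1.0 - damping * dt);
    springOffset += springVel * dt;
}
```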

So far, I’m storing in the compute buffer: 1) the previous position of the vertex, from which I can get its local velocity (from translations and rotations); 2) the position of the spring, either above or below the liquid surface; 3) the velocity of the spring, whose sign tells whether it's moving upward or downward.
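
In other words, the per-vertex state could be laid out roughly like this (the struct and field names are mine; the original layout may differ):

```hlsl
// Hypothetical layout of the per-vertex state kept in the double-buffered compute buffers.
struct VertexState
{
    float3 previousPosition; // world position last frame, used to derive the local velocity
    float  springPosition;   // signed offset of the spring above/below the liquid surface
    float  springVelocity;   // current speed of the spring; the sign gives the up/down direction
};

StructuredBuffer<VertexState>   _VertexStateRead;  // last frame's state (read)
RWStructuredBuffer<VertexState> _VertexStateWrite; // this frame's state (write)
```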

There were still some issues. The first one was that I was limited to container shapes that prevented the liquid from hanging out, as mentioned before, hence the first tests with spherical containers. But I have partly reduced the issue by projecting the vertices toward the inside of the container instead of simply projecting them onto the plane. It’s not a perfect solution, but if you choose your container shape wisely, it will not be visible.
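
I can't speak for the exact correction in the original shader, but for a spherical container the idea could be sketched like this, assuming the liquid plane and the sphere are expressed in the same space with the plane horizontal in that space (see also the discussion in the comments below):

```hlsl
// Hedged sketch: after a vertex has been collapsed onto the liquid plane, pull it back
// toward the container's vertical axis so it cannot end up outside the glass.
// Assumes a sphere of radius 'radius' centered at the origin.
float3 ClampToContainer(float3 positionOS, float radius)
{
    float h       = positionOS.y;                            // height of the liquid plane
    float maxDist = sqrt(max(radius * radius - h * h, 0.0)); // container radius at that height
    float d       = length(positionOS.xz);
    if (d > maxDist)
        positionOS.xz *= maxDist / d;                        // project the vertex back inside the glass
    return positionOS;
}
```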

Another important issue was that, at that point, I was not able to compute a proper normal on the surface of the liquid and was only using a normal pointing up, hence the lack of nice specular highlights.

As I was still trying to keep all of this in a single shader, I ran into the issue that vertices are not aware of the geometry and do not know their neighbors. This is true unless you give them the information: you can store plenty of static information in a vertex (normals, tangents, or mapping coordinates). So, I made a little script that augments the mesh by storing, in an unused texture coordinate channel, the indices of two of each vertex's neighbors. As the position of each vertex is also stored in the compute buffer, any vertex is now able to get the positions of two neighbors and is therefore able to recompute its normal. In fact, it’s the normal from the previous frame, but it’s just impossible to tell.
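
A sketch of that normal reconstruction, reusing the hypothetical `VertexState` layout from above (which UV channel holds the indices is my assumption):

```hlsl
StructuredBuffer<VertexState> _VertexStateRead; // same read buffer as in the earlier sketch

// neighborIndices are assumed to have been baked into an unused UV channel (e.g. TEXCOORD3)
// by the small preprocessing script mentioned above.
float3 RecomputeNormal(float3 selfPos, float2 neighborIndices)
{
    // The positions come from the compute buffer, so they are last frame's displaced
    // positions, which is visually indistinguishable.
    float3 n0 = _VertexStateRead[(uint)neighborIndices.x].previousPosition;
    float3 n1 = _VertexStateRead[(uint)neighborIndices.y].previousPosition;
    return normalize(cross(n0 - selfPos, n1 - selfPos));
}
```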

To improve the shape and movement of the liquid surface, I modulated the injected forces with some sine waves based on the world position of the vertices. That immediately added lots of detail and created a much more believable liquid surface. I also increased the mesh density at that time for more detail.
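
For instance, a sketch of that modulation could look like the following; the frequencies and the blend amount are arbitrary choices of mine, not values from the original shader:

```hlsl
// Modulate the injected force with sine waves based on world position so the surface
// breaks up into smaller, more liquid-looking ripples instead of moving as a whole.
float ModulateInjection(float injection, float3 worldPos, float frequency, float amount)
{
    float wave = sin(worldPos.x * frequency) * sin(worldPos.z * frequency * 1.37);
    return injection * lerp(1.0, wave, amount);
}
```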

And after that, the usual screen-space refraction was added to get translucent liquids. I also sampled multiple times during the refraction to add some roughness to the translucency and avoid perfectly clear liquids. You can see here the regular refraction and the multi-sampled one. Notice that the glass is also using that technique for the frosted glass band.
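
A sketch of the multi-sampled refraction, where `_SceneColor` stands in for whatever grab/opaque texture holds the scene behind the bottle, and the jitter pattern, sample count, and scale are my own arbitrary choices:

```hlsl
sampler2D _SceneColor; // hypothetical grab/opaque texture with the scene behind the liquid

float3 RoughRefraction(float2 refractedUV, float roughness)
{
    static const float2 offsets[4] = { float2(1, 1), float2(-1, 1), float2(1, -1), float2(-1, -1) };
    float3 color = 0;
    [unroll]
    for (int i = 0; i < 4; i++)
        color += tex2D(_SceneColor, refractedUV + offsets[i] * roughness * 0.01).rgb;
    return color * 0.25; // averaging several offset samples blurs the refraction into a frosted look
}
```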


Bubbles & Foam

I got awesome feedback from the Twitter community after posting the first version of the liquid, but a comment I often got was ‘When will you add the bubbles?’. 

To do the bubbles and foam, I needed to keep track of how agitated the liquid was. The simplest thing to do was to add a variable, stored in the compute buffer, that accumulates all external forces applied locally, by adding each frame the absolute value of the impulses mentioned earlier. This variable also decreases each frame, which lets me simulate the dissipation of the foam and bubbles, and I used it as my mask.
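
A sketch of that agitation mask update, run per vertex each frame (the decay parameter and the 0..1 clamp are my assumptions):

```hlsl
// 'impulse' is the force injected into the spring this frame; 'decay' is a hypothetical
// parameter controlling how fast foam and bubbles dissipate.
float UpdateAgitation(float agitation, float impulse, float decay, float dt)
{
    agitation += abs(impulse);   // any shaking adds to the mask
    agitation -= decay * dt;     // ...and it slowly fades away over time
    return saturate(agitation);  // keep it in a usable 0..1 range
}
```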

For the rendering part of the bubbles, I computed a 3D noise which scrolls upward and uses a smoothstep driven by the mask to make the bubbles bigger or smaller. And for the foam, the mask is used on the surface of the liquid to add some extra displacement on top of it, color the surface differently, and modify the local translucency.
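
A sketch of how the agitation mask could drive the bubbles; the hash-based noise below is a crude stand-in for whatever 3D noise is actually used, and the scales and thresholds are assumptions:

```hlsl
// Cheap stand-in 3D value noise, just for the sketch.
float Hash3D(float3 p)
{
    return frac(sin(dot(p, float3(12.9898, 78.233, 37.719))) * 43758.5453);
}

float BubbleMask(float3 worldPos, float agitation, float time)
{
    float3 p = worldPos * 8.0 + float3(0.0, -time * 0.5, 0.0); // scroll the noise field upward
    float  n = Hash3D(floor(p));                               // one noise "cell" per bubble
    // The more agitated the liquid, the lower the threshold, so bubbles appear bigger and denser.
    return smoothstep(0.8 - agitation * 0.3, 0.9, n);
}
```

On the liquid surface itself, the same agitation mask is what drives the extra displacement, the foam color, and the reduced translucency described above.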

Here is the first version of the bubbles and foam:

As the original effect was already doing something quite convincing, with bigger, lively bubbles, I had to improve what I had, especially coming from Belgium, the land of beer. I modeled a beer bottle in Fusion 360 and found a nice fake brand texture to finish it properly.

I used another 3D noise, in the vertex shader this time, to randomize the dissipation of the foam, which looked a bit too artificial at first.

Working with Viscous Liquids

I wanted more of this, so after digging into the possibilities, I thought about thicker, more viscous liquids, like syrup or blood. I disabled the foam for these, and in order to keep track of the sticky part on the interior of the container, I added another variable maintaining the local stickiness of the liquid. It is simply set to the maximum when the vertex is under the liquid surface, and it slowly and randomly (using the same 3D noise as before) fades away when above the surface.
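
A sketch of that stickiness update per vertex; the `fadeRate` parameter is a placeholder of mine, and `noiseValue` would come from the same 3D noise used for the bubbles:

```hlsl
// Fully sticky while submerged, then drying out at a randomized rate above the surface.
float UpdateStickiness(float stickiness, float noiseValue, bool belowSurface, float fadeRate, float dt)
{
    if (belowSurface)
        return 1.0;                                            // under the liquid surface: fully wet
    return max(stickiness - fadeRate * noiseValue * dt, 0.0);  // above the surface: fades away over time
}
```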

But the rendering had to be done a couple more times, as the vertices of the liquid were already used to display the surface. So, for that, I had to do 5 passes: the back glass of the container, the back of the sticky part of the liquid, the liquid itself, the front part of the sticky liquid, and the front glass of the container. The sticky pass is quite lightweight, only reading the mask value in the vertex shader, but it’s still 2 extra passes anyway.

Here is an early version of that technique:

The shader is quite flexible and has some parameters, like the strength of the spring: the higher the value, the quicker the surface goes back into place. Another useful parameter is the damping of the spring, defining how much of the speed from the previous frame is lost: the higher the damping, the thicker the liquid will feel, as it takes more time to move since most of the velocity is lost between frames.

Here is a denser and more viscous liquid:

Having a liquid that is too viscous may be a problem for the projection correction mentioned earlier, as you have a higher chance of the liquid hanging out of the container.

Theory: Bottle of Champagne

If I were to make a bottle of champagne, the main issue would again be the liquid hanging out of the container, or you'd have to cap the fill level as low as possible. That’s what I’ve done with the beer bottle, using the label on the neck of the bottle to hide it.

Pouring the liquid out of the bottle raises more issues: the first one is that going under half the capacity of the bottle would put emphasis on the fake physics of the liquid; the second one is that you would need better physics to handle the pouring, with particles at least or a proper liquid simulation at best. Ryan Brucks has made some really cool experiments in that direction recently, but his approach is quite different from mine. So, that would be a much more challenging setup, and I would probably go toward a different technique than this one to do it.

Theory: Color Changing Liquid

If you want to change the color of the liquid when shaking it, there is an easy way of doing that using the agitation variable mentioned above: the more agitated the liquid, the more the secondary color is visible. It works, as you can see in the following test, but it’s a bit rough.
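
In shader terms, the simple version is just a blend driven by that agitation value, something like the following sketch (names are assumptions):

```hlsl
// Blend toward a secondary color using the same agitation value that drives foam and bubbles.
float3 ShakenColor(float3 baseColor, float3 agitatedColor, float agitation)
{
    return lerp(baseColor, agitatedColor, saturate(agitation));
}
```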

A proper mix of multiple colors would probably require some color diffusion, or even a low-resolution real-time fluid simulation at best.

The simple solution will probably be enough in the majority of cases though, as in shader development, making it look complex while staying simple and fast is preferable most of the time.

Afterword

The challenges of this effect are mainly tied to how it will be used and to your target platform. In my case, I did not put any limitation on what I was doing, I just wanted it to look convincing and cool. This technique is certainly usable in a project if you accept the limitations that come with it. To give you a rough idea, the current implementation, with all the bells and whistles, takes around 1.5ms on my RTX 2080 Ti, using a 50K-triangle model. This experimentation was not meant to be production-ready, so I did not spend time on optimization, but there are plenty of things that can be improved.

I made and improved it to see if that specific technique could be used to get something convincing, and I have lots of other techniques that I’d like to experiment with to make it less expensive or less restrictive. There is always room for improvement, for new ideas, for new uses of a given technique. There will always be creative people who will do better-looking or faster implementations, and that is what keeps driving me to try new things and to continue looking for inspiration everywhere. Once you’re addicted to creative programming, or any other domain, you never stop asking for more, learning, and experimenting with new things to open new doors.

Gil Damoiseaux, Tech Art Lecturer at HEAJ & R&D/Tech Advisor at JangaFX

Interview conducted by Arti Sergeev

Comments (3)

  • titangate1202

    Great read! I've been trying to reproduce your result as a way of learning ComputeShader and its interaction with the vertex shader.

    I couldn't figure out what you did to "partly reduce the issue by projecting the vertices toward the inside of the container instead of projecting simply on the plane." Do you mind elaborating a bit further on this?

  • titangate1202

    @Andy I've tried various ways of collapsing the vertices, but all had issues. I don't think what you described can work well for a slim shape, like a beer bottle: the radius would vary widely, and you'd get an uneven distribution of vertices if you tilt the bottle sideways.

    Anyway, I'd totally forgotten about my attempt, and it's great to see other people experimenting with this.
    You can reach me on Twitter @titangate, and I look forward to more discussion.

  • . Andy

    @titangate1202 I might be able to shed some light on that, I've been looking into my own implementation of this technique recently.

    Now it might not be the same approach, and as far as I know it only works with primitive shapes, but when Gil says projecting the vertices back into the centre of the mesh, I presume he means this:

    To stop the water falling out of the container, you'd move the verts back towards the centre of the mesh, at the height of the fluid. With a sphere this is quite easy as it's simply Pythagoras.

    First, you construct a triangle (abc): the vector from the center of the sphere to the vertex you wish to constrain is the hypotenuse ‘c’.

    Side ‘a’ is the height of the water from the origin along your up axis. This assumes the origin is in the centre of the shape.

    Side ‘b’, the perpendicular distance from the up axis, is then b = sqrt((c * c) - (a * a)). This gives us the horizontal distance from the up axis to the vert’s position at the height of the water plane.

    We then construct another triangle (def), where the hypotenuse 'f' is the radius of the sphere, and ‘d’ once again is the height.

    We calculate ‘e’ in the same way as 'b' above, i.e. e = sqrt((f * f) - (d * d)), to get the furthest distance our vert can be placed from the axis before it hits the wall of our spherical container.

    Finally, translate the vertex towards the centre line by the difference in length between sides 'b' and 'e'.

    I've not looked into other shapes yet, but I hope this is helpful for anyone else curious.

