Shapeshifting Object VFX in Unity
4 September, 2017

We’ve talked with Michał Piątek about how he created his entry for The Great Transmutator VFX contest.


Initially, I did not want to make VFX at all. I always wanted to become a film director. When I was 15 or so I started making silly short movies. My idea was that I should make a lot of them to gain experience and increase my chances of passing the exams for a film school. I started with the comedy genre and slowly drifted towards action with each movie I made.

Then I discovered four things at roughly the same time: the Freddie Wong and Corridor Digital channels on YouTube, the “Escape from City 17” fan movie and Andrew Kramer’s tutorials. I was shocked by what these people were capable of. The Corridor guys and Freddie Wong were doing exactly what I wanted – short action movies packed with low-budget but high-quality VFX. Andrew Kramer made me realize that great effects can be made by a single person and that I could learn that for free. And then Escape from City 17 – it was ten times better than anything Freddie or Corridor could do, and it was still made by just two people. And they spent zero dollars on it. I was blown away. I thought that if I wanted to impress people, if I wanted to make it into a film school, I needed great VFX in my movies.

A few years later I finished my first music video. It looks very corny and cringe these days, but I put all the knowledge I had into it. I tried to make as much VFX as I possibly could. This was very eye-opening, and I remember enjoying creating special effects more than anything else. After finishing high school I tried my luck at the film directing department of one of the Polish film schools. I failed, so the next year I tried the animation and VFX department at a different school, and I failed again. This made me think that maybe this was too hard at this point in my life and I should try something else. I found a job offer in Warsaw at a new company called CreativeForge Games. They were looking for a VFX Artist with any amount of experience. I thought that if I could do effects for movies, maybe I could do the same for games. I sent an application, they liked what I had in my portfolio, and so my gamedev journey began.

Creating Effects

I think the general idea of creating effects is the same. Both in films and in games, what matters at the end of the day is a great-looking effect which is both artistically and technologically groundbreaking. And what can be done in both media is limited by the technology available and by time constraints. But there are a lot of differences too. Effects in games are not restricted by budget so much. A lot of effects in movies are practical, and each practical effect is a major expense which needs to be considered. If you don’t own your render farm you need to rent it, which costs money. You need to buy very high-spec machines to do simulations for you. This is all a lot of money. Whereas in games there is almost nothing stopping you from creating the effect you want except time itself. Tools are so cheap these days that I don’t think this is any concern anymore. You rarely have to simulate anything, and most of these things can be done on a single, high-end home computer.

But the biggest difference in my opinion lies in the workflow and the role of effects. In film, an effect does not have to be functional. You don’t have to make it match specific gameplay requirements. It does not have to reflect any design ideas or be prepared with some code inputs in mind. In most cases it needs to match artistic needs, not functional ones. In games this is what VFX artists very often have to do. They not only make effects look good. They also have to incorporate certain design elements in them. There can be a gun with different gameplay states such as charging up, ready to fire, overheated, out of ammo, etc. All these things need to be communicated to the player, and very often this is the VFX artist’s responsibility. Think of aura effects in MOBAs or hack & slash games. They need to tell you visually whether they are offensive, defensive, passive, or active, whether they are healing you or damaging you, etc. The amount of health points your spaceship has left can be communicated by the intensity of the damage effects playing. If it’s a flying fireball, you know you are about to die. In movies you just make an effect which looks great and feels right in this particular scene. It still needs to communicate emotions or some ideas, but this does not have to be systemic. But then again, all animation principles which can be applied to film VFX can also be applied to realtime VFX. Anticipation, readable shapes and motion, squash & stretch – all these things still work in games.


When the competition was announced I started googling for inspiration. I found some great macro shots of acrylic paints. They were creating awesome shapes and colours, and very often they had some small air bubbles. They really caught my attention, and I started experimenting with ways of replicating these bubbles on a mesh. There are a few possible ways of doing this type of mesh displacement which I am aware of. One is a baked Houdini/Alembic animation, but I had never done that before. I tried blend shapes, but they were very limited in many ways. Then I thought about creating custom textures for vertex displacement, similar to the rain system made by Sebastien Lagarde, in which each channel of a texture represents some piece of data which generates procedural, UV- and mesh-independent rain ripples. That didn’t work very well either and was too time-consuming. With each approach I tried something simpler and simpler. I ended up with a shader which was just bulging the mesh. This can be done with one line of code in the vertex function: v.vertex.xyz += v.normal * _Amount;

This gave me an idea. I thought that maybe I could use Shuriken particles as reference points and bulge the mesh around these points. I also needed some falloff and strength for each bulge region. Shuriken was perfect for this task, as you can read any particle parameter from code and just send it to a shader. And I did just that: I sent each particle’s size, position and color parameters into a mesh shader to drive my displacement. I limited the number of particles which could do the bulging to 8 for optimization purposes and simplicity. I made a shader function which takes this particle data and bulges the mesh around each particle’s position, with a radius equal to the particle’s size and a strength equal to the particle’s alpha. A by-product of this approach is that it is fully procedural. It works on any mesh which has my shader applied to it. This came in handy later on, as I could just swap the mesh from an ugly cube to a nice torus knot without any problems. You can see how the particles are set up here:
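The per-particle bulge described above can be sketched language-agnostically. The Python below models the core math only – the quadratic falloff shape and all names are my illustrative choices, not the author’s actual shader code; in the real effect this runs per vertex on the GPU with particle data uploaded as shader parameters.

```python
import math

def bulge_offsets(vertices, normals, particles, max_particles=8):
    """Displace each vertex along its normal, summing a smooth falloff
    contribution from up to `max_particles` bulge centers.
    Each particle is (position, size, alpha): size is the bulge radius,
    alpha the bulge strength, mirroring the setup described above."""
    out = []
    for v, n in zip(vertices, normals):
        amount = 0.0
        for center, size, alpha in particles[:max_particles]:
            d = math.dist(v, center)
            if d < size:
                t = 1.0 - d / size          # 1 at the center, 0 at the radius
                amount += alpha * t * t     # smooth quadratic falloff (an assumption)
        out.append(tuple(vi + ni * amount for vi, ni in zip(v, n)))
    return out
```

A vertex sitting exactly on a particle with alpha 1 is pushed one full unit along its normal; vertices outside every particle’s radius are untouched, which is what makes the technique mesh-independent.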

Dissolve Texture

The dissolve texture is used in the most classical way possible. I used a grayscale texture to hide or show a mesh in a more organic way. My effect can be broken up into three stages: the torus knot exploding and turning into bubbles, the bubbles traveling to a new location, and the bubbles turning into a teapot. I wanted to make this effect as seamless as possible, and sometimes simple is better than sophisticated. So in order to turn the torus into a bunch of bubbles, I used the dissolve texture to hide the torus mesh, and I used the same texture to unravel the bubbles. Then I did it again with the bubbles and the teapot. The texture itself is not that great actually, it’s just Photoshop’s “Render->Clouds” with some contrast adjustments.
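The shared-texture trick above boils down to a single threshold test. This minimal Python sketch (my own illustration, not the author’s shader) shows why reusing one noise sample makes the hand-off seamless: the regions where one mesh vanishes are exactly the regions where the other appears.

```python
def crossfade(noise_value, cutoff):
    """Classic dissolve: a fragment survives while its grayscale noise
    sample exceeds the animated cutoff. Driving both meshes from the
    same sample means the torus is hidden in exactly the regions where
    the bubbles are revealed."""
    torus_visible = noise_value > cutoff
    bubbles_visible = not torus_visible
    return torus_visible, bubbles_visible
```

Animating `cutoff` from 0 to 1 over the transition sweeps visibility from one mesh to the other, following the pattern of the noise texture.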


I made a shader which I used for both the torus and the teapot. It works on any mesh out there, as long as it has proper, tiled UVs on which I can apply a noise texture. Here are all my exposed settings seen in the material inspector:

Basically, all these parameters are driven by a particle system. In the material itself I am only setting default values or some multipliers for the particle-driven parameters. I won’t go through every single parameter, but I made a short video showing some of them in action.

I can explode a mesh and apply a fake gravity force to it. The shader squashes the mesh the closer it gets to the ground. This is a very naive implementation which assumes that the ground is flat. I could improve the shader by using a terrain heightmap as the ground level, etc., but this was not needed for the contest.
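The flat-ground squash can be sketched as a simple remapping of a vertex’s height above a ground plane. This Python is my own illustration of the idea under the stated flat-ground assumption; the parameter names and the linear compression curve are guesses, not the author’s shader.

```python
def squash(vertex_y, ground_y=0.0, squash_height=1.0, strength=0.5):
    """Naive flat-ground squash: the closer a vertex is to ground_y,
    the more its height above the ground is compressed. Vertices more
    than squash_height above the ground are untouched; vertices at or
    below ground level are clamped to the ground plane."""
    h = vertex_y - ground_y
    if h <= 0.0:
        return ground_y                  # never sink below the ground
    if h >= squash_height:
        return vertex_y                  # out of range: unaffected
    t = 1.0 - h / squash_height          # 1 near the ground, 0 at the limit
    return ground_y + h * (1.0 - strength * t)
```

Swapping the constant `ground_y` for a heightmap lookup at the vertex’s XZ position would be the terrain-aware improvement mentioned above.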

Other parts of the shader shown in the video are very simple – a texture-based dissolve and a noise texture sampled in the vertex shader.

One thing you probably noticed is that the distorted mesh is not very smooth. I tried implementing tessellation, but Unity has a bug which prevented me from using tessellation while passing data from the vertex to the fragment program. I would have had to rewrite the whole shader and implement deferred rendering from the ground up to get it running. Obviously, I skipped this step and simply subdivided the mesh in 3ds Max.

Now we come to particle-driven data. I am using Size and Color to drive some shader parameters. I also made a custom module with a few curves – these drive the mesh cutout, the light emission strength multiplier and the collapse strength (gravity), and let me reference the specific mesh I want to modify.
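Curve-driven parameters like these are typically sampled at a particle’s normalized lifetime each update. The Python below is a hedged sketch of that evaluation with linear interpolation between keyframes – curve names and the keyframe representation are illustrative; in Unity this role is played by `AnimationCurve.Evaluate`.

```python
def evaluate_curves(curves, normalized_lifetime):
    """Sample each custom-module curve at the particle's normalized
    lifetime (clamped to 0..1), interpolating linearly between sorted
    (time, value) keyframes. Returns one value per named parameter."""
    t = min(max(normalized_lifetime, 0.0), 1.0)
    results = {}
    for name, keys in curves.items():
        value = keys[-1][1]                       # default: last keyframe
        for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
            if t0 <= t <= t1:
                f = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
                value = v0 + (v1 - v0) * f
                break
        results[name] = value
    return results
```

Each frame, the sampled values would then be pushed into the referenced mesh’s material as shader parameters.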

For the final version I disabled particle rendering, as the particles do nothing except pass data to the shader. But for breakdown purposes I enabled rendering so you can see what’s happening:


In terms of the motion itself, I wanted it to resemble chemical reactions – very explosive and rapid, but smooth, liquidy and soft. I decided to use a mixture of classic particle explosions and a bit of shader work. I used the one-line-code trick shown before to explode the mesh, and added particles on top of that to incorporate some secondary motion. For the bubbles I used a sine wave to move the vertices and add wobbliness to them. They also squash when they are close to the ground, just like the main mesh. The teapot uses exactly the same shader and setup as the knot, but in reverse. So instead of exploding, it is imploding; instead of disappearing, it is appearing, etc.
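A sine-wave wobble of this kind usually offsets each vertex by a sine of time plus a position-dependent phase, so neighbouring vertices move slightly out of sync and the surface reads as liquid. This Python sketch is my own illustration – the axis choice, phase function and constants are assumptions, not the author’s shader:

```python
import math

def wobble(vertex, time, amplitude=0.05, frequency=6.0):
    """Sine-based vertex wobble: offset the vertex vertically by a sine
    of time plus a per-vertex phase derived from its position, so the
    displacement varies smoothly across the surface."""
    x, y, z = vertex
    phase = x * 2.0 + z * 2.0            # vary the phase across the surface
    return (x, y + amplitude * math.sin(frequency * time + phase), z)
```

In a real shader this would run in the vertex program, with `time` supplied by the engine each frame.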


I didn’t create all these things for the sake of creating them. The project evolved organically. Whenever I noticed that some functionality was missing, I tried to add it. I knew Unity didn’t have a good air resistance module, so I couldn’t make particles slow down over time in a natural way. I googled for a drag formula and made a custom module using it. I also needed a way of driving all my shader parameters in a very fast and iterative way. I used curves for that because I knew how to hook them up into the Particle System update easily. I thought that it would be cool to have an attractor module which could pull particles from position A to B, so I implemented that too. I was not succeeding every single time, though. I had some crazy ideas which didn’t look good or were too complicated, so I had to drop them. For example, I spent quite a lot of time developing fake 3D bubbles rendered on a billboard. I ended up not using them.
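The interview doesn’t say which drag formula was used, but a common choice for a custom air-resistance module is quadratic drag: deceleration proportional to speed squared, applied explicitly each frame. The Python below is one plausible sketch of such a step, not the author’s implementation:

```python
import math

def apply_drag(velocity, drag_coeff, dt):
    """One explicit quadratic-drag integration step: scale the velocity
    down by a factor proportional to speed * dt, clamped so the particle
    never reverses direction from drag alone."""
    speed = math.sqrt(sum(c * c for c in velocity))
    if speed == 0.0:
        return velocity
    factor = max(0.0, 1.0 - drag_coeff * speed * dt)
    return tuple(c * factor for c in velocity)
```

Because the factor depends on the current speed, fast particles shed velocity quickly and slow ones coast, which gives the natural-feeling deceleration the stock modules lacked.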

True 3D bubbles were something I added very late. I knew that I needed to bind the two separate objects together. I first tried a lot of particles mimicking “atoms”, changing colors from the dominant hues of the torus to the hues of the teapot. This felt very thin and not very connected with the rest of the visuals, so I decided to try paint droplets. I looked into how the bubbles were made in Portal 2 and how screen-space fluids work. Both felt very complex and not achievable in this tight time frame. In the end I just did what I had already tried in the past – a few simple shader tricks layered one on top of another to create an impression of very polished tech. The reality is that the methods I used are very bare-bones, naive and primitive.

What you actually see is the teapot and a lot of 3D bubbles all in the same place, all being bulged a lot and imploding at the same time. Ambient occlusion and HDR glow help to bind these elements together. Without post-processing this looks very bad. I also used a tiny bit of refraction on top of that to give it a slight blur.

Using Unity

It is a love-hate relationship which is biased more towards love with each new version of Unity. I started using this engine back in 2012, and at that time it was very basic and limited. It has evolved a lot since then, and by a lot I mean a lot. It has some unique features which I really enjoy. It is fast and very flexible. Creating custom modules is very straightforward and well documented. I don’t think I could do the same effect in any other engine. Unity has its flaws, but there are no perfect tools; the key is to understand the limitations and know how to work around them. I think Unity is very good at this.

Michał Piątek, VFX Artist

Interview conducted by Kirill Tokarev


