As you may have heard, Nicolas Pirot has recently released a great project showing dynamic decay in UE4. It's a great demonstration of how you can make real-time changes to an environment with some Blueprints magic. Especially for 80.lv, Nicolas talked about his project and the way he achieved these amazing results. You can purchase the Dynamic Decay project for just $5.
Destructive Time-lapse Project
The project is currently focused on material deterioration, although it has a lot of other interesting things going on as well. The concept was an idea I decided to undertake as part of my graduation work as a game graphics student. I've always had a strong interest in visual storytelling through interesting environment design, in both games and film. As a student, my graduation work was the perfect opportunity to combine my love for weathered environments with a strong interest in Unreal Engine blueprints and Substance Designer materials.
The unique aspect of this tech showcase is giving the user the possibility to completely transform the environment during runtime. Right now it’s based on a set of open parameters, but the gameplay possibilities using this type of workflow are endless.
The scene has a bunch of different materials, which change in real time depending on the surrounding circumstances. Could you talk about how the whole thing is organised in Unreal Engine 4?
Everything in the scene is done using UE4 blueprints and materials. Both have been absolutely fantastic to work with, giving me as an artist the tools to create awesome things even without a coding background. This combination of visual beauty and back-end ease of use is why I chose Unreal Engine 4 over other popular game engines.
The framework I’ve built creates a link between the UI, the level blueprint, and the actual materials/shaders, in order to pass on input information to the materials.
One of the challenges I faced early on was finding a way for materials to be editable at runtime, and to control them without having a hardcoded sequence of material changes over time. The issue was that regular materials or material instances could not be edited once the game starts running, which was quite problematic at first. The solution came in the shape of Unreal's Dynamic Material Instances. Even though they can only be created and assigned during runtime, they have the major advantage of being editable on the go. Once I got them integrated into the system, I could create material instances as normal, and use the UI input as parameters.
The materials themselves use a custom, home-grown greyscale map I'll be referring to as a decay map. As pictured in the accompanying image, a decay map is a greyscale map that defines different grunge masks, depending on an input parameter. This input parameter is a scalar value, calculated from the UI values defined by the user. Using some clever shader tricks, a certain area of the decay map is converted into a greyscale mask, which can be used to grunge up areas. The input parameter drives the lower limit of the brightness range that should be interpreted as masked. When the input parameter is 0, the lower limit sits at 1, so only pure white values (255,255,255) are masked; the more time progresses, the further the limit drops and the larger the mask range becomes. This newly created mask is then passed on, and the masked areas that were once clean materials become totally grunged up. This process repeats a few times for each different effect, to prevent obvious tiling and repetitive damage.
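The thresholding logic described above can be sketched in a few lines. This is a hypothetical stand-in, not the author's actual shader graph: the function name and the exact mapping from time to threshold (lower limit = 1 − t) are assumptions that match the description, where at t = 0 only pure white texels are masked and the range grows as time advances.

```python
def decay_mask(grey_value, t):
    """Return 1.0 if a decay-map texel is masked at time t, else 0.0.

    grey_value: decay-map brightness in [0, 1]
    t:          UI-driven input parameter in [0, 1]
    """
    lower_limit = 1.0 - t  # threshold drops as time progresses
    return 1.0 if grey_value >= lower_limit else 0.0
```

At t = 0 only pure white texels pass the threshold; at t = 0.5 the brighter half of the decay map is masked, so more of the surface gets grunged up.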
The materials were all made using Substance Designer 5, Substance Painter 1 and 2, and Bitmap2Material. Working with Substance is particularly great because of the immense amount of control I get as an artist in creating and fine-tuning details in my materials. Some of the materials, such as the clean tiles, are fully procedural in Substance Designer, while others (such as the moss) use base maps created in Bitmap2Material.
How did you approach the creation of the materials and their gradual deterioration? From a visual perspective, how did you change the qualities of the materials to make them feel more influenced by time and other factors?
For each surface, a set of materials was created. Each set represented the far end of a certain type of destruction (plus a clean set, of course). All surfaces have at least a clean set, a grungy one, and a mossy one. All materials seen in the scene use these material sets and blend them over each other to ensure a convincing result at any stage of decay.
Each material was sized to 1K and uses the standard PBR workflow. Each set has three maps: one for base colour, a greyscale one for roughness, and a normal map. The metalness was handled by using the decay mask in combination with a set of shader nodes, which saves three 1K maps for each surface.
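The idea of deriving metalness from the decay mask instead of storing textures can be illustrated with a minimal sketch. This is an assumption about the shader nodes, not the author's actual graph: a clean surface keeps its scalar metalness value, and decayed areas fall toward 0 (rust, grime and moss are non-metallic), which is what lets each surface skip the three stored 1K metalness maps.

```python
def derive_metalness(clean_metalness, mask):
    """Derive per-pixel metalness from the decay mask at runtime.

    clean_metalness: scalar metalness of the undamaged surface, in [0, 1]
    mask:            decay mask value at this pixel, in [0, 1]
    """
    # Fully masked (decayed) areas become non-metallic.
    return clean_metalness * (1.0 - mask)
```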
Once I had the different versions of each map, they went into a single dynamic material instance, which could then be created, assigned and edited during runtime.
Could you walk us through the process of the creation of changes in the materials? How did you manage it all there with Blueprints? It just seems like a crazy task.
The big catch was to divide the huge task of user-input-driven dynamic textures into smaller bits. The first step was to get the user input from the sliders and convert it into something I could use in blueprints for events and the like. Once I had those values, I used blueprints to pass them on to the dynamic material instances, which can only be created during runtime. Once the materials had the 0-1 values, I could start playing around with them the same way regular material instances work.
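The slider-to-material flow can be mimicked with a small stand-in. To be clear, this is not the Unreal API: the class and the "DecayAmount" parameter name are invented for illustration, and the real version would call `SetScalarParameterValue` on a dynamic material instance from a blueprint or C++.

```python
class FakeDynamicMaterialInstance:
    """Stands in for a runtime-editable material instance."""
    def __init__(self):
        self.scalar_params = {}

    def set_scalar_parameter(self, name, value):
        self.scalar_params[name] = value

def on_slider_changed(slider_value, slider_max, material):
    # Convert raw UI input into the 0-1 range the shaders expect,
    # clamping so out-of-range drags cannot break the mask maths.
    t = max(0.0, min(1.0, slider_value / slider_max))
    material.set_scalar_parameter("DecayAmount", t)
    return t

mat = FakeDynamicMaterialInstance()
on_slider_changed(75.0, 100.0, mat)  # user drags the "time" slider
```

The point of the indirection is that the UI never touches the shaders directly; it only writes normalised parameters, and the materials react to whatever values they are handed.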
The real challenge in creating the shaders was making something that both looked good and blended in a natural, logical way. This is why I created the decay map, which has served its purpose well.
Could you talk about the way you’ve worked with the post-processing and fog? What did you use here and how did you add it all into the general system of the environment?
After I had completed basic versions of the materials, I moved on to create the atmosphere. While weathered textures do a lot for the mood, things such as atmospheric depth fog, colour grading, and lens effects also contribute massively.
My favourite of the above was of course the exponential depth fog. Usually used for large-scale outdoor environments, it worked like a charm on this indoor scene. The main benefit was again that I could edit a ton of things at runtime, such as distance to camera, strength and colour. Using a set of relatively simple blueprints, I rigged the colour up to the heat/humidity in the scene, and the opacity threshold and strength to all parameters.
As for the lighting, the same reasoning led to some experimentation. After researching how to efficiently tinker with movable/dynamic lights at runtime, I chose to use LinearColor lerps, blending between white and the colour the situation required. The same thought process went into the fog, and both worked like a charm.
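The colour blend for the lights amounts to a component-wise linear interpolation. Here is a minimal sketch, assuming a plain RGB-tuple helper rather than Unreal's actual FLinearColor type; the example colour is invented:

```python
def lerp_color(a, b, alpha):
    """Component-wise linear interpolation between two RGB tuples,
    with alpha in [0, 1]: 0 returns a, 1 returns b."""
    return tuple(ca + (cb - ca) * alpha for ca, cb in zip(a, b))

WHITE = (1.0, 1.0, 1.0)
MOSSY_GREEN = (0.3, 0.8, 0.4)  # invented example target colour

# Halfway through the decay, the light sits between neutral and green.
light_color = lerp_color(WHITE, MOSSY_GREEN, 0.5)
```

Driving alpha from the same 0-1 decay parameters as the materials keeps the lights, fog and surfaces changing in lockstep.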
The best thing about this is that, while the blueprints may be very simple, the gameplay possibilities are immense. The environment can quite literally change around a player, depending on his/her moral choices in the game.
How did you work with the destructible meshes here? Could you talk a little bit more about how these elements are integrated into the general flow of things?
The props were added to add more of a dynamic feel to the scene. Some events were scripted to be exactly the same every time something happened, while others were dynamic and physics based, to improve the diversity of the outcome.
The destructible meshes were mostly fragile props such as wall tiles, lights and bottles, which were set to become visible physics actors after a certain point in time. The tiles themselves were part of a larger system of events too, as each was attached to a specific decal above it. The point of this was to alter the base textures so that every time a tile dropped, a strong stain would become visible. I wasn't too happy with the way the tiles were falling, so I added invisible physics props to bounce them around a bit and create the illusion of dynamic tile falls.