VFX artist Andreas Glad talked about his approach to visual effects production, how effects are built for games, and where they are used. Andreas worked on a bunch of cool games, including Battlefield 1 and Battlefield 4. He is also the author of some great courses, which you can check out here.
My name is Andreas Glad and I’m a realtime VFX artist. Stating that sounds like I’m suffering from some kind of addiction, and thinking about it, maybe that’s not too far off the mark. I’ve been doing VFX for games since 2008 at a couple of different studios. My first “real” job in the industry was at Eurocom in the UK, where I started by hand-placing god-ray meshes in the canopies of Ice Age: Dawn of the Dinosaurs – The Video Game. Eurocom did a lot of smallish games, so I got to jump around and try different things. After my last day on Ice Age, where I had just made a leafy flower destruction effect, I got thrown onto Dead Space: Extraction and asked to decapitate a guy. Weird contrasts…
The last few years I’ve worked on larger games though, namely Battlefield 4 and Battlefield 1. That was an amazing experience. The first time I could tell my fiancée while watching the E3 stream, “Look, the skyscraper is going to collapse! I worked on that!” was a really cool feeling. And seeing the view count on the BF1 reveal trailer climb over 50 million and knowing that many people have seen my explosion/gas cloud/blood effects is still hard to grasp.
After BF1 though, I left DICE and started my own company, Partikel, where I provide remote contract work and try to train the next generation of realtime VFX artists. Oh, and I’m studying mechanical engineering at university as some sort of a masochistic hobby.
Peculiarities of VFX for Games
Well, if I put it in film terms, I’m an FX TD with no budget. Where a film simulation can take minutes or hours per frame, and the rendering even longer, I have 2 milliseconds to do both. That means the way I work is quite different. I can’t afford to run an epic smoke simulation at runtime, but I can afford to render one out and use it as a texture on some camera-facing sprite particles. Everything needs to be hacked and cheated in some way. On top of that, game effects need to look good from all angles, so you can’t tailor them to just one camera and ignore the rest. Effects in games are very dynamic, as any number of players could trigger effects at the same time, and you must build them in such a way that they interact believably and don’t kill the framerate. Not an easy task when you have 64 players equipped with gas grenades…
There are many different types of effects in games, but they can be summarized into three categories.
- Gameplay effects – Muzzle flashes, explosions, destruction, power-ups and so on.
- Environment effects – Weather, backdrops, wildlife.
- Cinematic effects – Custom events that happen at only one time in the game. These are the most similar to film as they often have a locked or at least controlled camera.
Combining Houdini and UE4
Houdini allows me to build setups that I can reuse. I’ve used Maya a lot in the past, and a lot of the time you have to start over from scratch if you want to tweak something early in the workflow. Take the example of a destruction simulation. Let’s say you are simulating the destruction of a barrel to bake and bring into the game. In Maya you go ahead and split it up, build the constraints, simulate it and bake it. In Houdini you do the same. So far, so good. Now the AD comes in and says: “Sorry, it’s going to be a crate, not a barrel. My bad.” In Maya, you start the process over, curse the AD and work late. In Houdini, you plug the crate in, bake and go home. The setup stays, and if you took care when making it the first time, you don’t need to redo it when creating similar effects.
In my course I show how to set up a looping smoke ball to use as a texture in Unreal. Say you finish the course and realize the settings I used for the simulation were crap: “I want much thicker smoke!” You simply fix the simulation; the rest of the setup, like the looping and masking, is already done. I’m a very strong believer in being able to reuse assets to keep a high velocity.
Smoke Ball and Explosion Material
The course is aimed at people who aren’t super experienced with Houdini, so it relies a lot on using presets and the shelf tools to get the setups up quickly. That way I can point at things to tweak instead of spending lots of time building the basics. So in the course you start by simulating a smoke pillar. Then you make the volume loopable, just like you would in After Effects, except this is in 3D so you can relight and shade it afterwards. Then you mask out the top and bottom of the pillar and you are left with a ball that fits nicely on a sprite.
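The looping step can be sketched in code. Below is a minimal NumPy version of the idea, assuming the simulation is already baked to an array of per-frame grids; the function name, the linear cross-fade and the overlap length are my own illustration, not Houdini’s actual setup:

```python
import numpy as np

def make_loopable(frames, overlap):
    """Turn a baked simulation into a seamless loop by cross-fading
    the last `overlap` frames into the first ones. `frames` is any
    (num_frames, ...) array, e.g. density grids baked per frame."""
    n = len(frames)
    loop_len = n - overlap
    out = frames[:loop_len].astype(float).copy()
    for i in range(overlap):
        w = i / overlap  # 0 right at the seam, ramping toward 1
        # near the seam, show mostly the tail of the sim so the last
        # loop frame flows smoothly back into the first
        out[i] = frames[loop_len + i] * (1.0 - w) + frames[i] * w
    return out
```

The same blend works per voxel on a volume or per pixel on a rendered flipbook, and because it is a reusable step, swapping in a thicker smoke simulation later doesn’t touch it.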
This can then be used as the base for an explosion. You use it for smoke and fire by applying some shader love. Combine this with a healthy helping of Cascade and you end up with the base for a pretty cool large-scale explosion.
Exporting Houdini Work into UE4
Yes and no. You can’t use Houdini in the traditional sense to make realtime effects; there are always steps in between Houdini and the engine, same as with any 3D package. Houdini is just a very good way of creating the building blocks you need to create effects in engine. That said, with the new vertex animation tools, that’s about to change. Now you can bake out some really impressive-looking stuff and pretty much play it back in engine. The technique works by recording the delta position of every vertex and storing it as a colour in a texture. In engine you simply do the opposite to get the 3D shape back. Oversimplified, yes, but that’s about all you need to know to start using it these days. This is a really good tool for creating the cinematic effects I talked about. You can get some really impressive stuff done with the technique. The drawback is that it’s prebaked, and therefore not as dynamic as you often need gameplay and environment effects to be.
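The encode/decode round trip described above can be sketched in a few lines. This is a simplified stand-in that assumes deltas fit in known bounds and ignores the texture-precision and packing details the real Houdini/UE4 vertex animation tools handle:

```python
import numpy as np

def encode_vat(rest, animated, bounds):
    """Pack per-frame vertex deltas into 'colour' values in [0, 1].
    rest: (verts, 3) bind pose; animated: (frames, verts, 3) baked sim;
    bounds: max absolute delta on any axis, used to normalize."""
    deltas = animated - rest[None, :, :]
    return deltas / (2.0 * bounds) + 0.5  # one RGB texel per vertex per frame

def decode_vat(rest, texture, bounds, frame):
    """The engine-side inverse: unpack one row of the texture and
    offset the rest pose back to the animated shape."""
    deltas = (texture[frame] - 0.5) * (2.0 * bounds)
    return rest + deltas
```

In the engine the decode step runs in the vertex shader, sampling one texture row per frame of playback.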
The End Result
The end result is everything. You always need to keep the bigger picture in mind. A sprite-based effect on its own VERY rarely looks good. But if you add all the necessary details, put it in the environment it’s built for, add some camera shake and some nice lighting, suddenly it can start looking really convincing. A common mistake I see juniors make is creating their effect in a vacuum. Even worse, on a black background. You can take a cube, give it a bright color, put it on a black background and it will look good. But once it’s placed in game, where the player will see it, it will lose a bit of that impact. My lead at DICE once told me that when I did my work test during the application process, I was neck and neck with someone else. The test was to create an explosion. What got me the job was my use of camera shake. I did this in an engine where I didn’t know how to add camera shake, so I had to hand-animate the camera move to get the effect I wanted. Watch a film with explosions in it and count how many are shown with a perfectly still camera. The shake makes the effect feel a lot more powerful.
It’s the same with decals and debris. They ground the effect and make it feel part of the world. Perhaps the player was looking the wrong way when the explosion happened. But if there’s a scorch mark on the ground and debris bouncing around, he’ll understand what just happened behind him.
Of course there can be too much of a good thing. We had an instance during the development of Battlefield 1 where the camera shake from artillery impacts had no distance limit so whenever there was an explosion anywhere on the map, everybody shook. Guess how often there are explosions happening somewhere on the map in a game like that… It’s a balancing act. You want the effect to feel impactful, but you don’t want to overwhelm the player. Same with debris. It’s cool when there are pebbles raining down around you after a big explosion. But if it does so continuously for ten minutes you are going to start looking for an umbrella.
Start out simple and iterate your ass off. Make a quick and dirty simulation, render it out and stick it on a sprite in Cascade. Does it look good? No? Why? Analyze what’s not working and what is, and start tweaking. Finish the setup before you start polishing. A common mistake is to spend hours getting the simulation perfect in Houdini, only to realize it doesn’t work in game for one reason or another. Don’t be scared to mix in real elements as well. Grab some stock footage and make sprite textures from it. Maybe that can work as a base, and you only need to simulate the one little piece that’s missing. Mix and match techniques. A lot of cool stuff can be done in shaders. A lot of games rely on a single texture frame and then create the movement with UV distortion or flow maps in engine. Your effect is done when it looks good (and performs) in the situation it will be used in, in game. Don’t fall in love with it in the preview lighting/setting. Always be prepared to kill your darlings.
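That single-frame trick (one static texture, animated UVs) is normally a handful of shader nodes. As a rough CPU-side illustration of what the shader does per pixel, with the scroll speeds and noise layout invented for the example:

```python
import numpy as np

def distorted_uv(uv, noise, strength, time, scroll=(0.1, 0.05)):
    """Offset a lookup UV by a scrolling 2-channel noise texture.
    Sampling a single static smoke frame through these UVs makes it
    appear to churn and drift. noise: (H, W, 2) offsets in [-1, 1]."""
    h, w, _ = noise.shape
    # scroll the noise lookup over time so the distortion animates
    x = int((uv[0] + scroll[0] * time) * w) % w
    y = int((uv[1] + scroll[1] * time) * h) % h
    return (uv[0] + noise[y, x, 0] * strength,
            uv[1] + noise[y, x, 1] * strength)
```

Flow maps refine the same idea by sampling the frame twice along a painted flow direction and blending, which hides the reset when the distortion loops.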