Nikita Shilkin did a wonderful talk about his work on visual effects for VR games.
Hello! My name is Nikita Shilkin and I’m a Senior VFX Artist at 4A Games (the Metro games). Before that, I worked on films and ads as a Generalist Artist, and then as a VFX/On-set Supervisor on sci-fi and other types of films.
Some of the scenes I worked on:
At the moment, I am working on effects for the ARKTIKA.1 project. This is a sci-fi VR shooter with the company’s traditional focus on immersing the audience through story and high-quality visuals, which makes it fair to talk about it as an AAA product.
To begin with, I would like to note that making effects for VR is essentially no different from producing them for ordinary games, with the exception of a few nuances that I have noticed during production.
1. The first and the most important one is the player’s freedom and, as a consequence, the unpredictability of almost all of their actions.
2. Focus on performance. The requirement of a constant 90 frames per second limits your technical and creative freedom, forcing you to constantly balance game quality against player comfort.
3. The final checkpoint is the headset. Due to differences in resolution, gamma and the peculiarities of virtual reality, what looked wonderful and beautiful in the editor might not look so good in the headset.
Based on these three rules, we can start analyzing the production. So, let’s begin with some core things.
Since we are talking about VR, we don’t have a fixed camera, animations, timings or other constant values, which means we can never know how the player will shoot or from which side they will see the weapon. The only way out is to make the effect work beautifully from all sides.
And the first standard mistake is trying to make one mind-blowing sequence which, unfortunately, only works with a classic fixed camera and becomes ridiculous as soon as the weapon is turned.
The solution is quite simple: no matter how complex the effect is, break it into simple fixed parts along all three axes. This way you get not only volume, but also a visual randomness that makes each shot unique.
Since VR does not feature a classic gun sight or a screen center, and aiming with iron sights or a scope is not a common thing, the weapon’s projectiles should be clearly visible. Most players will rely on this, making corrections by the bullets and their impacts.
In this regard, there are several tips:
1. The muzzle flash must not block the sight of the bullet.
2. The bullet should be clearly visible (size, brightness, length). The lower the rate of fire, the better bullets read with trails behind them; the higher the rate, the brighter they should be.
3. Don’t be lazy: create different bullets with variable impacts for all weapons, as this will also help the player read the shooting direction.
And finally, a little piece of advice: if you have firearms (or any other weapons that produce smoke particles), put the smoke into a separate system, detached from the flame and released into the world; that looks interesting.
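The rule of thumb from tip 2 can be sketched as a simple mapping from rate of fire to tracer visuals. This is purely an illustrative assumption on my part, not code or values from the 4A Engine; the function name, range and constants are all hypothetical.

```python
def tracer_params(rounds_per_second: float) -> dict:
    """Map rate of fire to tracer visuals (all units are arbitrary)."""
    slow, fast = 2.0, 20.0  # assumed range of fire rates we care about
    t = max(0.0, min(1.0, (rounds_per_second - slow) / (fast - slow)))
    return {
        # slow weapons: a long, readable trail
        "trail_length": 1.0 - 0.7 * t,
        # fast weapons compensate with a brighter tracer
        "brightness": 1.0 + 2.0 * t,
    }
```

Under these made-up constants, a slow revolver gets a full-length trail at base brightness, while a fast SMG gets a short but much brighter one.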
I love using distortion in different situations: with the right approach you can achieve a sense of additional volume through the refraction of other particles, as well as a liquid effect, which helped me when I was working on plasma and related effects. You should know, however, not to add distortion to the muzzle flash, as it will make your players feel dizzy.
Even if it seems safe in the editor, the headset might give a different feeling.
And here are some examples of non-aggressive distortion, comfortable in VR.
Now that we’ve sorted out some weapon stuff, I’d like to share my experience of using effects in different situations with a little explanation for each.
It’s not a good idea to take control of the player’s camera in VR, since this always leads to unavoidable motion sickness and separation from the world inside the headset. The same goes for camera shake, which might seem like a good idea for a robot landing. It turns out that using two vertical crossing movements (for example, stones falling from top to bottom and a cloud of dust rising from floor to ceiling) while the player is in the center is a nice way to fake this effect without shaking the player’s camera. You just use the world around the player instead of a static floor and ceiling.
Let’s talk about performance. The ability to reuse content is essential, since every extra texture hurts the frame rate. All the steam effects in this room used the same static texture, but with different processing. The most important thing is to understand the nature of the effect’s behavior, its speed and inertia; then, with conventional tools like rotation, motion, the alpha channel and color curves, you can get anything, both beautiful and cheap for the engine.
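To make the reuse idea concrete, here is a minimal sketch of deriving several looks from one grayscale texture with cheap transforms. In a real engine this happens per-emitter in a shader; here a “texture” is just a 2D list of 0..1 values, and the two transforms (a quarter-turn rotation and a power-function color curve) are my own illustrative stand-ins.

```python
def rotate90(tex):
    """Rotate the texture a quarter turn: free variety, no new asset."""
    return [list(row) for row in zip(*tex[::-1])]

def apply_curve(tex, gamma):
    """A simple color/alpha curve: remap values with a power function."""
    return [[v ** gamma for v in row] for row in tex]

base = [[0.0, 0.5],
        [1.0, 0.25]]

steam_a = apply_curve(base, 1.0)            # soft, unmodified look
steam_b = apply_curve(rotate90(base), 2.0)  # rotated, darker midtones
```

Both variants come from the same `base` asset, so only one texture has to live in memory.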
I want to show another example: a completely static room where, thanks to quite simple effects and correct lighting, you get a sense of dynamics, which is generally cheaper than animating individual objects. Animation would look much better, of course, but it would take the lion’s share of the available resources.
Talking about effects, it’s hard not to discuss the software side. Since I’ve worked on movie projects, I love Houdini, Maya and even 3ds Max. Unfortunately, we do not use Houdini in our pipeline. Just look at version 16 and you’ll see how easily everything can now be done for game development; I’m very happy about how the tool keeps evolving and becoming absolutely universal. I still use it out of old habit (I just love the nodes), in the free version, to make references. For example, I had to show an animator what kind of animation I wanted from a robot charging through a wall, and instead of a thousand words it was easier to render a flipbook in Houdini.
VR makes it extremely interesting to study small details, play with physics and much, much more, so do not forget about this opportunity when creating your own worlds!
The most important thing to remember is that creating effects for VR is the art of faking and optimizing: on the one hand, effects should feel voluminous, and on the other, any real simulation will damage performance, so balance is important if you really want quality.
There are several tips for optimizing a game:
1. Use the RGB channels wisely. Often, for the same smoke, you only need one of the channels, while the remaining ones can be used, for example, for a brightness or transparency map.
2. Try not to use lit particles too often; save them for cases where they are really noticeable.
3. Remember overdraw. Favor the quality of each particle over quantity, minimizing the overall particle count.
4. Do not make highly recognizable unique sprites; it is better to assemble the picture inside the editor than to render a beautiful sequence up front. This way you can reuse one texture repeatedly without it being deliberately noticeable.
5. If there is an opportunity to do something procedurally in a shader, do it. Fewer textures used, fewer problems with streaming.
6. If your particles collide with static geometry, try to minimize it by all means, or, when the travel distance is constant, break the effect into two separate ones, a drop and a rebound; this way you cut down collision calculations.
7. Do not use distortion with shots.
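Tip 1 above (channel packing) can be sketched in a few lines. This is a generic illustration, not engine code: the function names are my own, and the engine-side shader would simply sample each channel for a different purpose.

```python
def pack_rgb(smoke, brightness, alpha):
    """Zip three grayscale maps (2D lists of 0..1) into (r, g, b) texels."""
    return [
        [(s, b, a) for s, b, a in zip(r1, r2, r3)]
        for r1, r2, r3 in zip(smoke, brightness, alpha)
    ]

def channel(tex, idx):
    """Recover one grayscale map from the packed texture."""
    return [[texel[idx] for texel in row] for row in tex]

smoke      = [[0.2, 0.8]]   # R: smoke shape
brightness = [[1.0, 0.5]]   # G: brightness map
alpha      = [[0.0, 1.0]]   # B: transparency map

packed = pack_rgb(smoke, brightness, alpha)
```

One packed asset now serves three independent maps; `channel(packed, 0)` gives back the smoke map unchanged.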
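Tip 6 (splitting a collision-heavy effect in two) can also be sketched. The idea is that when particles always fall the same known distance, the impact time can be precomputed once, so the drop phase needs no per-particle collision queries and the rebound is just a second emitter spawned on schedule. All names and numbers here are illustrative assumptions.

```python
def drop_lifetime(height, speed):
    """Time for a particle to reach the floor; constant for the effect."""
    return height / speed

def simulate(height=2.0, speed=4.0, dt=0.1):
    """Run the drop phase, then hand off to the rebound emitter."""
    events = []
    t, impact = 0.0, drop_lifetime(height, speed)
    while t < impact:        # no collision query anywhere in this loop
        t += dt
    events.append(("spawn_rebound", round(impact, 3)))  # second effect
    return events
```

The per-frame collision test is replaced by one division at setup time, which is exactly the kind of trade the tip is about.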
The game is still in production, so I was only able to show a small part of the content. If the topic seems interesting to you, I can share more in the future.