At GDC 2017, we had an amazing opportunity to see some closed-door demos that Amazon Lumberyard was showing to the press. One demo was entirely devoted to achieving better, more realistic rendering in real time.
Rendering has developed considerably over time. We’ve made huge leaps from Pixar’s pre-rendered experiments in the movie industry in 1987 to modern-day 3D, real-time rendered video games like Destiny and For Honor. However, there are still issues with jaggies, broken shadows, specular aliasing, and texture noise that need to be addressed in the rendering world. These issues have become the sworn enemies of Hao Chen, Senior Principal Engineer at Amazon.
Over the years, Chen has worked on solving complex problems in video game rendering. He was the Senior Graphics Architect on Bungie’s Destiny, where he worked alongside some of the best real-time graphics engineers in the industry. If you’re interested in his work, check out the interview he gave at GDC 2017.
We often sacrifice pixel quality. PC games usually have that typical ‘PC game look’: very noisy, very aliased, full of jaggies. Gamers have actually come to accept these defects. But I think now is the time to tackle pixel quality, because the power efficiency of the geometry engine is very, very good and it’s going to help improve the throughput.
If you take a look at my career, at all the games I’ve worked on over the years, you can see that the throughput has progressed immensely. We’ve observed a 5000x increase in polygon count. In terms of pixel resolution, we have 54x more pixels. And it’s still growing with the introduction of 4K and 8K displays.
However, this incredible progress doesn’t solve the most basic issues with pixel quality, which are jaggies, broken shadows, messy hair, noisy textures, pixelated foliage, strobing highlights, and more. And these defects are especially irritating for artists who want to get that perfect V-Ray look in their real-time renders but simply can’t.
Some believe that frame rate and higher resolution are the only aspects of rendering that truly matter to gamers and that they don’t notice other defects. This, however, is not entirely true. First, hiding some of these defects will hurt the throughput, and second, these defects are bad for VR experiences.
But it’s fairly straightforward to implement changes that get rid of these digital artifacts. Rather than requiring individuals to remake content, all developers would need is the right engine for the job. And Amazon believes Lumberyard is the solution.
To show these improvements, we’ve picked a scene called The Bistro. It demonstrates our latest renderer, which runs on DX12 with HDR and features a lot of pixel quality improvements.
Let’s look at some of the math behind it and figure out what aliasing actually is. From a signal processing perspective, a signal takes a number of steps to get from a computer game to our eyes: discretization, sampling, reconstruction, and digital-to-analog (display) conversion. Each of these steps can cause digital artifacts. The main source of aliasing is not sampling the signal at a high enough frequency, which is what makes the wheel appear to spin backward in this YouTube video.
As the wheel turns faster and faster while the video keeps sampling at 30 frames per second, you start to see a false signal disguised as the real one (an alias), which makes the wheel in the video appear to spin slowly backwards.
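The backward spin follows directly from folding the true frequency into the sampled range. A minimal sketch of the math (the function name is illustrative, not from Lumberyard):

```python
# Hedged sketch of why undersampling makes the wheel spin backwards:
# a sampled signal is indistinguishable from its alias, i.e. the true
# frequency folded into the range [-fs/2, fs/2].

def aliased_frequency(true_hz: float, sample_hz: float) -> float:
    """Return the frequency an observer perceives after sampling."""
    folded = true_hz % sample_hz       # wrap into one sampling period
    if folded > sample_hz / 2:
        folded -= sample_hz            # fold into [-fs/2, fs/2]
    return folded

# A wheel spinning at 29 turns/s, filmed at 30 fps, appears to spin
# backwards at 1 turn/s; at 31 turns/s it appears to crawl forwards.
print(aliased_frequency(29.0, 30.0))   # -1.0
print(aliased_frequency(31.0, 30.0))   # 1.0
```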
There are only two strategies that can help you combat aliasing: taking more samples and pre-filtering the input signal. In film, we generally see plenty of samples and expensive filtering. It may take an hour or even days to render a single frame. Games, on the other hand, can’t work like that. They have to hit 60 FPS. So, games use fewer samples and very cheap filtering. These two factors are the source of jaggies seen in games.
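The two strategies can be illustrated on a 1-D “edge” signal: take more samples per pixel, then filter (here, a cheap box average) them down. All names below are illustrative, not from any engine:

```python
# Sketch: supersampling plus a box filter applied to a hard edge,
# the kind of signal that produces jaggies at one sample per pixel.

def shade(x: float) -> float:
    """A hard edge at x = 0.4."""
    return 1.0 if x >= 0.4 else 0.0

def render_pixel(px: int, width: int, samples: int) -> float:
    """Average evenly spaced sub-pixel samples (a box pre-filter)."""
    total = 0.0
    for s in range(samples):
        x = (px + (s + 0.5) / samples) / width  # sub-pixel position
        total += shade(x)
    return total / samples

# 1 sample/pixel: each pixel is fully on or off, so the edge is jaggy.
print([render_pixel(p, 4, 1) for p in range(4)])  # [0.0, 0.0, 1.0, 1.0]
# 8 samples/pixel: the edge pixel gets a fractional coverage value.
print([render_pixel(p, 4, 8) for p in range(4)])  # [0.0, 0.375, 1.0, 1.0]
```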
So, how many samples are enough for games? Where do you place these samples? How do you pre-filter them? What’s the support of the filter? These are hard questions that need answers, and Amazon Lumberyard worked with NVIDIA Research to find them and eliminate bad pixels from the rendering world.
When we first started out, we wanted to get the coolest and latest techniques as well as work with the best researchers in the field. We found Yves Lafont’s team, and we implemented a number of the latest techniques from NVIDIA Research in Lumberyard.
To come up with the solution, Amazon Lumberyard first had to look at the existing techniques of anti-aliasing.
For Lumberyard, the development team picked up a special form of Temporal AA.
Temporal AA is very cool. Instead of spreading the samples spatially, it spreads them over time: you render some samples in one frame, other samples in the next frame, and blend them together. This is both efficient and effective against temporal artifacts.
Below is how this solution works.
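In spirit, temporal accumulation amounts to an exponential moving average over a reprojected history buffer, with the camera jittered to a different sub-pixel offset each frame (a low-discrepancy Halton sequence is a common choice). A hedged sketch with scalar colors, not Lumberyard’s actual implementation:

```python
# Hedged sketch of a temporal AA accumulation step. Real engines
# jitter the projection matrix, reproject history with motion
# vectors, and work on full RGB buffers; names are illustrative.

def halton(index: int, base: int) -> float:
    """Low-discrepancy sequence driving the per-frame sub-pixel jitter."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def taa_blend(history: float, current: float, alpha: float = 0.1) -> float:
    """Exponential moving average: the current frame contributes a small
    share, so samples rendered over many frames get blended together."""
    return (1.0 - alpha) * history + alpha * current

# Sub-pixel jitter offsets in [-0.5, 0.5) for the first eight frames:
jitters = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
```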
To battle ghosting, Lumberyard relied on a new technique from NVIDIA Research, which was introduced last year.
This approach allows for less ghosting and helps to achieve particularly interesting results. Have a look at this video.
On the left, you can see Temporal AA with Variance Clipping, and on the right, the original technique. It does look pretty amazing: the picture is very smooth and detailed, with no blurring or jaggies. But that’s not all!
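The core idea of variance clipping can be sketched in a few lines: rather than clamping the reprojected history to the neighborhood’s min/max, clip it to a bounding box built from the neighborhood’s mean and standard deviation. The scalar, per-component clamp below is a simplification; production code typically works per channel (often in YCoCg) and clips along the ray toward the history color:

```python
# Sketch of variance clipping on one color channel, assuming the
# history sample has already been reprojected into the current frame.

from statistics import mean, pstdev

def variance_clip(history, neighborhood, gamma=1.0):
    """Clip reprojected history to mean +/- gamma * stddev of the
    current frame's neighborhood, which suppresses ghosting."""
    mu = mean(neighborhood)
    sigma = pstdev(neighborhood)
    lo, hi = mu - gamma * sigma, mu + gamma * sigma
    return min(max(history, lo), hi)

# A stale history value far outside the current neighborhood gets
# pulled back toward it instead of ghosting across the frame:
print(variance_clip(10.0, [0.9, 1.0, 1.1, 1.0, 0.9]))
```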
Another technique the company uses in the engine’s renderer is designed to fight specular aliasing.
There are a number of specular anti-aliasing techniques, but for their engine the Lumberyard team used Anton Kaplanyan’s most recent work (published in 2017). Here’s an example of what this technique can do: on the left, you see the shot with specular anti-aliasing on, and on the right it’s turned off. The technique even takes the curvature of the geometry into account to blur the specular highlight.
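The core idea behind specular anti-aliasing: when normals vary a lot inside a pixel’s footprint, widen the specular lobe (raise roughness) so the highlight blurs instead of sparkling. The sketch below is a simplified, Toksvig-style variance-to-roughness mapping under common GGX conventions, not Kaplanyan’s exact estimator:

```python
# Hedged sketch: fold geometric normal variance into material
# roughness. Assumes the alpha = roughness^2 parameterization, where
# lobe variances combine approximately additively in the alpha^2 domain.

def filtered_roughness(base_roughness: float,
                       normal_variance: float) -> float:
    """Widen the specular lobe by the pixel's normal variance."""
    alpha_sq = base_roughness ** 4   # alpha = r^2, so alpha^2 = r^4
    alpha_sq += normal_variance      # curvature widens the lobe
    return min(1.0, alpha_sq ** 0.25)

# A mirror-smooth material on strongly curved geometry still gets a
# visibly softened highlight instead of a strobing one:
print(filtered_roughness(0.05, 0.01))
```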
One of the final advancements made in the world of rendering is Order-Independent Transparency. Here’s a quick overview of the problem and the solution from Marco Salvi, who discussed these topics in Lumberyard’s GDC 2017 presentation.
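The problem itself fits in a few lines: the standard “over” operator gives a different color depending on the order in which transparent surfaces are drawn. One well-known order-independent approximation is weighted blended OIT (McGuire and Bavoil); Salvi’s multi-layer approach attacks the same problem differently. A hedged sketch with one color channel:

```python
# Sketch: order-dependent "over" compositing vs. a weighted-blended
# order-independent approximation. Real implementations also weight
# each fragment by depth; plain alpha weighting is the simplest form.

def over(front, back):
    """Standard alpha compositing; the result depends on draw order."""
    c_f, a_f = front
    c_b, a_b = back
    return (c_f * a_f + c_b * a_b * (1.0 - a_f),
            a_f + a_b * (1.0 - a_f))

def weighted_blended(fragments):
    """Weighted average of all fragments: identical for any draw order."""
    num = sum(c * a for c, a in fragments)
    den = sum(a for _, a in fragments)
    transmit = 1.0
    for _, a in fragments:
        transmit *= (1.0 - a)        # product is order-independent
    avg = num / den if den > 0 else 0.0
    return avg * (1.0 - transmit)    # composited over black

red, green = (1.0, 0.5), (0.2, 0.5)
print(over(red, green))              # differs from over(green, red)
print(weighted_blended([red, green]) == weighted_blended([green, red]))  # True
```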
In the end, all these techniques combined were instrumental in achieving a very high level of visual fidelity for Amazon Lumberyard. With global illumination on, PBR engaged, and very complex scenes loaded, the images we got were almost movie-like!
Although not entirely there yet, it does look incredible and is reminiscent of the beautiful visuals seen in a handful of Sony Computer Entertainment games (such as The Order: 1886). It also looks to be a very promising area for exploration within the artistic community. We’d love to see its potential explored by industry leaders, to see how it works with various environments and scenes. We believe the open-world side of the engine will benefit from these new developments.