Neural Network Ambient Occlusion

A technical paper from Siggraph Asia 2016 shows a new way to use neural networks in CG production.

Siggraph 2017 is right around the corner, promising a lot of exciting stuff. The biggest stars of the show are the researchers, who keep finding new ways to build and render 3D content. With so many talks, however, it can be hard to keep track of everything. Only today did we learn about a remarkable paper that was actually presented back in 2016 at Siggraph Asia in Macao. It describes a new way to use neural networks to calculate ambient occlusion.

It’s a very fresh and interesting look into the future of technical art and how the AI revolution might influence the way we approach 3D. The authors, Daniel Holden, Jun Saito and Taku Komura (The University of Edinburgh and Method Studios), turned to machine learning because it offers two big benefits: it can make things both faster and more accurate.

The methodology of this research is worth noting: the researchers took scenes from the FPS Black Mesa and rendered them offline with Mental Ray to produce ground-truth data. You can read more about it in the released presentation file. While the results are not universally groundbreaking, the proposed approach does prove to be faster and more accurate than previous methods in many cases.
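To make the setup more concrete, here is a minimal sketch of how training pairs for a learned AO model might be assembled from such offline renders. This is not the paper's actual pipeline: the function name, the buffer layouts, and the patch size are all illustrative assumptions; the general idea is that local screen-space geometry (depth and normals around a pixel) becomes the input, and the offline-rendered AO value becomes the target.

```python
import numpy as np

def make_training_pair(depth, normals, ao_ground_truth, x, y, radius=4):
    """Collect a local neighborhood of depth/normal samples as the
    input feature vector, paired with the offline-rendered AO value.
    (Hypothetical helper; patch size and layouts are assumptions.)"""
    patch_d = depth[y - radius:y + radius + 1, x - radius:x + radius + 1]
    patch_n = normals[y - radius:y + radius + 1, x - radius:x + radius + 1]
    features = np.concatenate([patch_d.ravel(), patch_n.ravel()])
    target = ao_ground_truth[y, x]
    return features, target

# Illustrative buffers: depth (H x W), packed 2D normals (H x W x 2),
# and a ground-truth AO image (H x W) from the offline renderer.
H, W = 64, 64
rng = np.random.default_rng(0)
depth = rng.random((H, W))
normals = rng.standard_normal((H, W, 2))
ao = rng.random((H, W))
feats, target = make_training_pair(depth, normals, ao, x=32, y=32)
print(feats.shape, target)  # (243,) for a 9x9 patch of depth + 2D normals
```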

What this means for artists is that instead of simulating an entire physical process like AO, computers are “learning” what makes it “look” the way it does. This can be cheaper to compute while also being more accurate. Not to mention the opportunities for interface simplification and tool accessibility that it provides.

Andrew Maximov
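For readers curious what “learning the look” might amount to in practice, here is a minimal sketch of the inference side: a small fully connected network mapping a feature vector like the one above to a single AO value. The layer sizes, weight names, and activations here are placeholders, not the network from the paper.

```python
import numpy as np

def predict_ao(features, w1, b1, w2, b2):
    """One hidden layer with ReLU, sigmoid output to keep AO in [0, 1].
    (Illustrative architecture; the paper's network may differ.)"""
    hidden = np.maximum(0.0, features @ w1 + b1)    # ReLU hidden layer
    ao = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))  # sigmoid output
    return ao

# Random placeholder weights for a 243-dim feature vector, matching
# the illustrative patch above; a real model would load trained weights.
rng = np.random.default_rng(0)
features = rng.standard_normal(243)
w1, b1 = rng.standard_normal((243, 64)) * 0.1, np.zeros(64)
w2, b2 = rng.standard_normal(64) * 0.1, 0.0
print(predict_ao(features, w1, b1, w2, b2))
```

Once trained, evaluating such a network per pixel is a fixed, predictable cost, which is what makes the approach attractive compared to sampling-based screen-space AO.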

This year the same team is going to show how neural networks can be used to animate characters. We talked briefly about this tech here.
