An Article on Compressing GI With Neural Networks

Jure Triglav described a method of compressing 1.8 GB of baked imagery into 200 kB worth of neural networks.

Back in early September, computer scientist Jure Triglav published a comprehensive piece describing a neat method of working with lighting. In an in-depth article titled "Compressing Global Illumination with Neural Networks," he demonstrated an uncommon use of neural networks for lighting, showing how they can compress 1.8 GB worth of baked images into a roughly 600 kB demo, "of which 400 kB are dependencies."

According to Jure, the idea was to train a neural network on all the different lighting scenarios for a specific scene and then run inference in real-time to get the desired fragment color. To test it out, he modeled a scene in Blender with one rotating light and trained neural networks to produce ten individual fragment shaders that change the scene's lighting. The final result can be seen in the author's interactive demo.
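To make the idea concrete, here is a minimal, hypothetical sketch of the technique: a tiny MLP learns to map a fragment's position plus a light parameter to an RGB color, standing in for the baked lighting images. The architecture, sizes, and training setup here are illustrative assumptions, not the author's actual implementation (which runs the trained network inside a GLSL fragment shader).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for "baked lighting": color varies smoothly with
# fragment position (x, y) and the rotating light's angle.
def baked_color(x, y, angle):
    r = 0.5 + 0.5 * np.sin(angle + x)
    g = 0.5 + 0.5 * np.cos(angle + y)
    b = 0.5 + 0.5 * np.sin(x * y + angle)
    return np.stack([r, g, b], axis=-1)

# Training data: random fragments under random light angles.
X = rng.uniform(0, 1, size=(4096, 3))            # (x, y, normalized angle)
Y = baked_color(X[:, 0], X[:, 1], X[:, 2] * 2 * np.pi)

# One small hidden layer: a few hundred weights, i.e. under a kilobyte,
# which is the whole point of the compression scheme.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 3)); b2 = np.zeros(3)

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                     # forward pass
    P = H @ W2 + b2
    E = P - Y                                    # squared-error gradient
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H ** 2)               # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# "Inference in the shader": one fragment, one light angle -> one color.
frag = np.array([[0.3, 0.7, 0.25]])
color = np.tanh(frag @ W1 + b1) @ W2 + b2
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(color.shape, mse)
```

In the real demo the weights would be exported as shader constants or a texture, and the forward pass above would be a handful of matrix multiplies per fragment, which is what makes real-time evaluation feasible.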

"Given the mentioned constraints it is unlikely that this technique in its current state will be useful beyond a very narrow scope of projects, but I firmly believe that making pixels smart is just beginning to show its potential," commented the author. "The ability to successfully embed a useful neural network in a fragment shader and still have it perform well was quite a surprise for me."

You can read the full article here.
