Facebook has open sourced the code behind DeepFocus, its research into rendering ultra-realistic visuals for VR headsets. The DeepFocus approach is said to produce “natural blur” by way of a neural network architecture that supports “the ultrasharp image resolutions necessary for high-quality VR.”
One of the projects related to DeepFocus is the Half Dome varifocal hardware prototype, which physically moves the display panels of a VR headset to produce visuals that address the “vergence-accommodation conflict.” The goal is to extend how long people can wear a VR headset without discomfort.
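To give a sense of the kind of depth-dependent defocus such a system has to reproduce, here is a minimal sketch of the classical thin-lens circle-of-confusion model, which relates how blurred a point appears to its distance from the focal plane. This is only an illustrative approximation, not Facebook's DeepFocus method (which uses a neural network rather than an analytic model), and the focal length and aperture values are arbitrary assumptions.

```python
import numpy as np

def coc_diameter(depth_m, focus_m, focal_mm=17.0, aperture_mm=4.0):
    """Circle-of-confusion diameter (mm) for points at the given depths,
    using the thin-lens formula: c = A * f * |S2 - S1| / (S2 * (S1 - f)).
    S1 is the focus distance, S2 the object distance, A the aperture
    diameter, f the focal length. Larger c means a blurrier point."""
    f = focal_mm / 1000.0          # focal length in metres
    A = aperture_mm / 1000.0       # aperture diameter in metres
    S1 = float(focus_m)
    S2 = np.asarray(depth_m, dtype=float)
    c = A * f * np.abs(S2 - S1) / (S2 * (S1 - f))
    return c * 1000.0              # convert back to mm

# Eye focused at 1 m: the in-focus point has zero blur, and blur grows
# for points nearer or farther than the focal plane.
depths = np.array([0.3, 1.0, 2.0, 10.0])
blur = coc_diameter(depths, focus_m=1.0)
```

A varifocal display like Half Dome changes the effective focus distance, so a renderer must recompute this kind of blur in real time for every pixel, which is the problem DeepFocus tackles with deep learning instead of an analytic model.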
A recent research paper presented at SIGGRAPH Asia describes the DeepFocus approach and explains how it can be applied to “support high-quality image synthesis for multifocal and light-field displays.”
“Though we’re currently using DeepFocus with Half Dome, the system’s deep learning–based approach to defocusing is hardware agnostic. Our research paper shows that in addition to rendering real-time blur on varifocal displays, DeepFocus supports high-quality image synthesis for multifocal and light-field displays. This makes our system applicable to the entire range of next-gen head-mounted display technologies that are widely seen as the future of more advanced VR,” states the team.
With this open sourcing effort, the team wants to “accelerate development in this area to benefit the industry as a whole.”
“Facebook Reality Lab is pursuing many ‘feature prototypes’ to explore the potential future for VR immersion – Half Dome is one of those,” said their representative. “The Display Systems Research (DSR) team at FRL continues to develop advanced display technologies, including DeepFocus, to explore the visual frontier of VR/AR. Half Dome and many other feature prototypes are constantly under development at FRL.”
You can learn more here.