The improved performance is achieved through foveated rendering and eye tracking.
Sony might not have shown its PlayStation VR2 at GDC 2022, but there was a curious panel from Unity Technologies called "Building Next-Gen Games for PlayStation VR2 with Unity." According to Android Central, Unity developers talked about the differences in coding VR games for PS VR2 compared to current headsets like the Quest 2.
As previously reported, PS VR2 will use eye tracking technology to support foveated rendering, which reduces image quality in the player's peripheral vision to decrease the rendering workload. Unity provided the numbers behind the technology, saying that GPU rendering is 2.5x faster with foveated rendering alone and up to 3.6x faster when it is combined with eye tracking.
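Conceptually, foveated rendering keeps full shading resolution only in a small region around the gaze point and lowers it toward the edges of the view. A minimal sketch of that idea (the function name, falloff curve, and all parameters here are illustrative assumptions, not Unity's or Sony's actual API):

```python
def foveation_scale(dist_from_gaze_deg, fovea_deg=10.0, min_scale=0.25):
    """Illustrative resolution scale for a pixel at a given angular
    distance (degrees) from the tracked gaze point."""
    # full resolution inside the foveal region
    if dist_from_gaze_deg <= fovea_deg:
        return 1.0
    # linear falloff outside it, clamped to a minimum quality floor
    falloff = max(0.0, 1.0 - (dist_from_gaze_deg - fovea_deg) / 60.0)
    return max(min_scale, falloff)
```

With eye tracking, the high-resolution region follows the player's gaze each frame; without it, the region is fixed at the lens center, which is why the eye-tracked variant yields the larger speedup.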
Running the popular VR Alchemy Lab demo with demanding graphics such as dynamic lighting and shadows dropped frametime from 33.2ms to 14.3ms, as reported by Android Central. In another demo, a 4K spaceship scene, CPU thread performance was 32% faster, while GPU frametime went down from 14.3ms to 12.5ms.
According to the Unity developers, eye tracking can also provide multiple UI benefits, such as magnifying and interacting with whatever a player is looking at.
The PS VR2 tracks "gaze position and rotation, pupil diameter, and blink states." That means PS VR2 could know if you wink or stare at an NPC, triggering some kind of personalized reaction if the devs program it.
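From per-frame gaze and blink data, a game could derive higher-level events like a wink or a sustained stare. A minimal Python sketch of that logic (the data shape, field names, and thresholds are assumptions for illustration, not the PS VR2 SDK):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    """One frame of (hypothetical) eye-tracking data."""
    target_id: Optional[str]  # object the gaze ray hit this frame, if any
    left_blink: bool
    right_blink: bool

def is_wink(sample):
    # a wink: one eye closed while the other stays open
    return sample.left_blink != sample.right_blink

def is_staring(samples, npc_id, threshold=90):
    # a stare: `threshold` consecutive frames of gaze on the same NPC
    # (roughly 1 second at 90 fps)
    run = 0
    for s in samples:
        run = run + 1 if s.target_id == npc_id else 0
        if run >= threshold:
            return True
    return False
```

An NPC script could then react whenever `is_wink` or `is_staring` fires for its own ID.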
Unity also plans to use eye tracking to create in-game heat maps that show developers where players look in a given scene. Developers can use them to see what interests players in an environment or why they are struggling with a puzzle.
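A gaze heat map of this kind boils down to bucketing per-frame gaze coordinates into a coarse grid and counting hits per cell. A minimal sketch, assuming gaze points arrive as 2D pixel coordinates (the function names and cell size are illustrative, not Unity's tooling):

```python
from collections import Counter

def gaze_heatmap(points, cell=64):
    """Bucket 2D gaze points (pixel coords) into a grid of hit counts."""
    counts = Counter()
    for x, y in points:
        counts[(int(x) // cell, int(y) // cell)] += 1
    return counts

def hottest_cell(counts):
    # the grid cell players looked at most often
    return max(counts, key=counts.get)
```

Aggregated over many playtest sessions, the hottest cells reveal what draws players' attention in a scene, while cold cells over a key object may explain a puzzle players fail to solve.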
Combined with the PS VR2 controllers' haptic feedback, finger tracking, and trigger resistance, this should let devs build more realistic and varied reactions, making the player's experience more immersive.