The team proposed a method that uses head-tracked motion parallax to bring 3D viewing to everyday devices.
Daniel Habib, founder of True3D Labs and a former Meta engineer, presented an alternative way to watch 3D content using only a device's camera, with no additional hardware required.
The idea is simple but powerful. By using the front-facing camera to track the viewer’s head position, the system reprojects the scene in real-time, allowing the display to behave like a window into a 3D world. No glasses are required, and the illusion of depth is created entirely through motion parallax. As you move, nearby objects shift more than distant ones, which is how our eyes naturally perceive depth.
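The parallax cue described above can be reduced to a toy calculation: when the eye translates laterally by some offset, a point at distance d sweeps through an angle of roughly atan(offset / d), so nearby points shift through a larger angle than distant ones. A minimal sketch (all numbers are illustrative, not from the article):

```python
import math

def angular_parallax(eye_shift, distance):
    """Angular displacement (radians) of a point at the given distance
    when the eye translates laterally by eye_shift (same units)."""
    return math.atan2(eye_shift, distance)

# A 5 cm head movement: a nearby object (0.5 m) appears to shift
# through a much larger angle than a distant one (5 m).
near = angular_parallax(0.05, 0.5)
far = angular_parallax(0.05, 5.0)
```

This is the depth cue the system exploits: by re-rendering the scene for each new head position, the display reproduces exactly this depth-dependent shift.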
Among other demos, Habib showed Disney's Steamboat Willie rendered in this mode. Instead of feeling like a regular video, the scene responds to the tilt of your head, giving the impression that the animation is anchored in your room. Habib noted that most 3D media on personal devices collapses to flat playback because it expects the viewer to drag or scrub the camera.
Technically, the system detects facial landmarks and iris centers through the front camera, estimates a six-degree-of-freedom head pose relative to the display, and treats the viewer's eye as the virtual camera origin. Each frame is reprojected accordingly, with temporal smoothing to reduce jitter. The screen thus acts as a literal window: the projection math ensures that parallax and occlusion behave exactly as they would if you were looking through glass.
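The "screen as a window" idea corresponds to the well-known generalized (off-axis) perspective projection: the frustum is anchored to the physical screen edges and skews as the eye moves. The article doesn't publish True3D's exact math, so the following is an illustrative version of the standard construction, computing asymmetric frustum bounds at the near plane from the eye's position relative to the screen center:

```python
def window_frustum(eye, screen_w, screen_h, near):
    """Asymmetric frustum bounds (left, right, bottom, top) at the near
    plane, for an eye at (ex, ey, ez) relative to the screen centre.
    The screen lies in the z = 0 plane; ez is the eye-to-screen
    distance (> 0). All values are in the same physical units."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# Eye centred 0.6 m from a 0.3 x 0.2 m screen: a symmetric frustum,
# i.e. an ordinary perspective camera.
l, r, b, t = window_frustum((0.0, 0.0, 0.6), 0.3, 0.2, 0.1)
# Eye shifted right: the frustum skews, so rendered geometry shifts
# the opposite way on screen, exactly like looking through a window.
l2, r2, b2, t2 = window_frustum((0.1, 0.0, 0.6), 0.3, 0.2, 0.1)
```

These bounds plug directly into a standard off-axis projection matrix (e.g. the `glFrustum`-style formulation), which is what makes parallax and occlusion consistent with a real window.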
Habib said the approach builds on Johnny Lee's 2007 Wii Remote demo, which popularized head-coupled perspective rendering. The difference now is that modern on-device models and GPUs make it practical on everyday hardware. Habib's team has refined the pipeline, focusing on latency budgets, filtering, and privacy. Keeping the motion-to-photon delay short makes the virtual world feel solid, while rejecting noisy outliers prevents edges from appearing to swim.
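The jitter-versus-lag tradeoff mentioned here is a classic tracking problem, often handled with adaptive filters such as the One Euro filter. The article doesn't describe True3D's actual filter, so here is a deliberately simple fixed-gain sketch of the two ideas it does mention, smoothing and outlier rejection, applied to a single pose coordinate (thresholds are made up for the example):

```python
class PoseFilter:
    """Illustrative 1-D head-pose filter: exponential smoothing for
    jitter plus a crude outlier gate. A production pipeline would tune
    these against a latency budget, since heavier smoothing adds
    perceived lag."""

    def __init__(self, alpha=0.5, max_jump=0.2):
        self.alpha = alpha        # smoothing factor (1.0 = no smoothing)
        self.max_jump = max_jump  # reject samples further than this from the estimate
        self.estimate = None

    def update(self, sample):
        if self.estimate is None:
            self.estimate = sample           # first sample: accept as-is
        elif abs(sample - self.estimate) > self.max_jump:
            pass                             # outlier: keep the previous estimate
        else:
            # Move a fraction of the way toward the new sample.
            self.estimate += self.alpha * (sample - self.estimate)
        return self.estimate

f = PoseFilter(alpha=0.5, max_jump=0.2)
f.update(0.0)   # initialise
f.update(0.02)  # small move: smoothed toward the sample
f.update(5.0)   # implausible jump: rejected, estimate unchanged
```

An adaptive filter improves on this by smoothing heavily when the head is still (hiding jitter) and lightly when it moves fast (hiding lag), which is why it is a common choice for head-coupled displays.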
Under the hood, True3D’s platform uses volumetric video techniques, voxels, and Gaussian splats to deliver view-dependent rendering efficiently. For web developers, there’s a drop-in player component and simple APIs that make it possible to bring this effect to custom applications, game captures, or even live renders from engines like Unity and Blender.
You can try a demo here and read about the project here.