Former Meta Engineer Builds Tech That Turns Any Screen 3D With Just a Camera

The team proposed a method that uses head-tracked motion parallax to bring 3D viewing to everyday devices.

Daniel Habib, founder of True3D Labs and a former Meta engineer, presented an alternative way to watch 3D content using nothing but a device's camera, with no additional hardware required.

The idea is simple but powerful. By using the front-facing camera to track the viewer's head position, the system reprojects the scene in real time, allowing the display to behave like a window into a 3D world. No glasses are required; the illusion of depth is created entirely through motion parallax. As you move, nearby objects shift more than distant ones relative to your line of sight, which is how our eyes naturally perceive depth.
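
To make the window metaphor concrete, here is a minimal sketch (not True3D's code; names and units are illustrative) of where a point behind the screen should be drawn for a tracked eye position:

```typescript
type Vec3 = { x: number; y: number; z: number };

/**
 * Intersect the eye-to-point line with the screen plane z = 0.
 * The eye sits in front of the screen (eye.z > 0); scene points
 * sit behind it (point.z < 0). Units are arbitrary but consistent.
 */
function projectThroughWindow(eye: Vec3, point: Vec3): { x: number; y: number } {
  // Parametric line eye + t * (point - eye); solve for z = 0.
  const t = eye.z / (eye.z - point.z);
  return {
    x: eye.x + t * (point.x - eye.x),
    y: eye.y + t * (point.y - eye.y),
  };
}

// Move the eye 10 units to the right: a point on the window plane (z = 0)
// stays fixed on screen, while deeper points have their projections follow
// the eye, exactly like scenery behind a real window.
console.log(projectThroughWindow({ x: 10, y: 0, z: 50 }, { x: 0, y: 0, z: -20 }));  // x ~ 2.9
console.log(projectThroughWindow({ x: 10, y: 0, z: 50 }, { x: 0, y: 0, z: -200 })); // x = 8.0
```

The depth-dependent difference between those two projections is the parallax cue the system exploits.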

Among other things, the engineer showed Disney’s Steamboat Willie rendered in this mode. Instead of feeling like a regular video, the scene responds to the tilt of your head, giving the impression that the animation is anchored in your room. Habib noted that most 3D media on personal devices collapses to flat playback because it expects the viewer to drag or scrub the camera. 

Technically, the system works by detecting facial landmarks and iris centers through the front camera, estimating a six-degree-of-freedom head pose relative to the display, and treating the viewer's eye position as the camera origin. Each frame is reprojected accordingly, with temporal smoothing to reduce jitter. The screen is treated as a literal window, and the projection math ensures that parallax and occlusion behave exactly as they would if you were looking through glass.
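
That projection is the standard off-axis (head-coupled) perspective construction, with Kooima's "Generalized Perspective Projection" as the usual reference; the sketch below shows the general technique, not True3D's actual code. The screen is assumed to lie in the z = 0 plane, with the tracked eye position given relative to the screen center:

```typescript
interface FrustumBounds {
  left: number; right: number; bottom: number; top: number;
}

/**
 * Compute off-axis frustum bounds for a physical screen of size
 * screenW x screenH (same units as the eye position), given a
 * near-plane distance `near`.
 */
function offAxisFrustum(
  eye: { x: number; y: number; z: number },
  screenW: number,
  screenH: number,
  near: number
): FrustumBounds {
  // Scale the screen edges, as seen from the eye, down to the near plane.
  const s = near / eye.z;
  return {
    left:   (-screenW / 2 - eye.x) * s,
    right:  ( screenW / 2 - eye.x) * s,
    bottom: (-screenH / 2 - eye.y) * s,
    top:    ( screenH / 2 - eye.y) * s,
  };
}
```

These bounds feed a glFrustum-style projection matrix, and the view transform is then just a translation by the negated eye position, which is what makes parallax and occlusion line up with a physical window.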

The engineer shared that this approach builds on Johnny Lee's 2007 Wii Remote demo, which popularized head-coupled perspective. The difference now is that modern on-device models and GPUs make it practical on everyday hardware. Habib's team has refined the pipeline, focusing on latency budgets, filtering, and privacy. Keeping the motion-to-photon delay short makes the virtual world feel solid, while rejecting noisy outliers prevents edges from appearing to swim.
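
The article doesn't spell out which filter True3D uses; a common choice for exactly this jitter-versus-lag trade-off is an adaptive low-pass in the spirit of the one-euro filter, sketched below with illustrative parameter names and values:

```typescript
// Smooths hard when the head is nearly still (killing jitter) and lightly
// when it moves fast (keeping motion-to-photon latency low).
class AdaptiveLowPass {
  private prev: number | null = null;
  private prevRaw: number | null = null;

  constructor(
    private minCutoffHz = 1.0, // smoothing applied when nearly still
    private beta = 0.02        // how quickly the cutoff rises with speed
  ) {}

  filter(raw: number, dtSec: number): number {
    if (this.prev === null || this.prevRaw === null) {
      this.prev = this.prevRaw = raw;
      return raw;
    }
    const speed = Math.abs(raw - this.prevRaw) / dtSec;
    this.prevRaw = raw;
    // Faster motion -> higher cutoff -> less smoothing -> less lag.
    const cutoff = this.minCutoffHz + this.beta * speed;
    const alpha = 1 / (1 + 1 / (2 * Math.PI * cutoff * dtSec));
    this.prev = this.prev + alpha * (raw - this.prev);
    return this.prev;
  }
}

// One filter per head-pose axis, run at the camera's frame rate (dt = 1/60).
```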

Under the hood, True3D’s platform uses volumetric video techniques, voxels, and Gaussian splats to deliver view-dependent rendering efficiently. For web developers, there’s a drop-in player component and simple APIs that make it possible to bring this effect to custom applications, game captures, or even live renders from engines like Unity and Blender.
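
True3D's actual player API isn't shown in the article, so as an illustration only, here is how a tracked eye position could drive an off-axis camera in Three.js (a widely used WebGL library); the helper below is hypothetical, not True3D's component:

```typescript
import * as THREE from "three";

const camera = new THREE.PerspectiveCamera();

/** Update the camera so a screen of size w x h in the z = 0 plane acts as a window. */
function updateCamera(eye: { x: number; y: number; z: number }, w: number, h: number) {
  const near = 0.1, far = 100;
  const s = near / eye.z; // scale screen edges down to the near plane
  camera.projectionMatrix.makePerspective(
    (-w / 2 - eye.x) * s, // left
    ( w / 2 - eye.x) * s, // right
    ( h / 2 - eye.y) * s, // top
    (-h / 2 - eye.y) * s, // bottom
    near, far
  );
  camera.projectionMatrixInverse.copy(camera.projectionMatrix).invert();
  // The eye is the camera origin, looking through the screen plane.
  camera.position.set(eye.x, eye.y, eye.z);
  camera.updateMatrixWorld();
  // Note: calling camera.updateProjectionMatrix() would overwrite this
  // custom off-axis matrix with a symmetric one, so it must be avoided.
}
```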

You can try a demo here and read more about the project here.


Comments (4)

  • Anonymous user, 30 days ago:
    This was done with a Wii Remote 17 years ago by Johnny Lee. Search for it on YouTube...

  • Anonymous user, 28 days ago:
    Head-tracking 3D has been a thing since about 2012. There was an Android app that looked identical to the demo with the targets. Pretty sure it's the exact same.

  • Anonymous user, 29 days ago:
    I had it on my to-do list to remake Johnny Lee's demo in the browser using vibe-coding. And then someone did it for me...

  • Anonymous user, 30 days ago:
    Yeah, I jumped the gun. Sorry moderators, feel free to remove both my comments.
