
Official website

radicalmotion.com

AI-powered technology for detecting and reconstructing 3D human motion from 2D content


RADiCAL was founded to develop the world’s most powerful computer vision technology focused on detecting and reconstructing 3D human motion from 2D content.

 

AI-powered 3D motion capture — in the cloud and on every device.

 

To achieve this, the team combines multiple scientific disciplines in a proprietary configuration, including computer vision, deep learning, and anthropometry. The system continually learns and improves.

 

The company's goal is to see the technology deployed across mobile, web, and enterprise environments, seamlessly powering 3D animation in film, TV, art, gaming, AR/VR, as well as industrial and health applications.

Any device

From smartphones to professional cameras, you can shoot on any device.

Any environment

Non-human objects are automatically ignored. Outdoors, at home, or in the studio: record in any environment and under any lighting conditions.

Recent articles

Using RADiCAL Mocap Tool & Unreal Engine to Create Music Video

3D designer and music artist Julian Basurto, also known as Jeulo, told us how he used RADiCAL motion capture and Unreal Engine 5 to bring choreography into 3D for his music video.

RADiCAL Motion Capture Now Supports Finger Tracking

Along with an Upper Body mode and enhanced facial animation.

Creating a Weekly Episodic Show With UEFN & RADiCAL's Motion Capture Tools

ASTUDIOCALLDYO's Don Tyler told us about the World of Montezuma, Fortnite's first playable animated sitcom, getting started with Unreal Engine's UEFN, and working with RADiCAL's mocap solution.

HUE Team on Boosting Their Animation Workflow With RADiCAL

Team members AndyHood, Zebulon, and BananaZ told us about the studio, its projects, and how they use RADiCAL, software that uses AI to generate 3D motion capture data in real time from a single consumer-grade camera.
