
AI Motion Capture Startup Meshcapade Now Part Of Epic Games

Check out Pooya Moradi M.'s demo.

Epic Games recently acquired Meshcapade, a toolkit for markerless motion capture that automatically generates realistic human models in an accessible 3D format using a wide variety of data sources. The core technology is expected to be integrated into MetaHuman and Unreal Engine, while Epic is also establishing a presence in Cyber Valley.

According to Pooya Moradi M., his team has spent the past few months building a robust pipeline around Meshcapade, integrating it with Cinema 4D, Maya, iClone, Marvelous Designer, and Unreal Engine. They have actively incorporated Meshcapade's technology across multiple production projects, focusing on precise body part isolation, including wrists and fingers, accurate ground elevation and foot contact, and the handling of multi-person interactions.

The technology developed by Meshcapade is based on the Max Planck Institute for Intelligent Systems' SMPL body model, a realistic 3D representation of the human form trained on thousands of 3D body scans. The acquisition is reportedly closing in April, after which Meshcapade's tools will become part of Epic Games.

It will be interesting to see how this develops, especially given there's a somewhat similar AI-powered animation solution called Cascadeur. We've spoken with its founder, Eugene Dyabin, about the AI vs. anti-AI debate and the ethics of AI training.

You can learn more about Epic Games' acquisition of Meshcapade here, and explore Meshcapade for yourself.

Also, subscribe to our Newsletter and join our 80 Level Talent platform, follow us on Twitter, LinkedIn, Telegram, and Instagram, where we share breakdowns, the latest news, awesome artworks, and more.
