Faceware Technologies, the leading provider of markerless 3D facial motion capture solutions, today announced that its Faceware Interactive division has upgraded and released version 2.5 of its real-time facial mocap and animation product, Faceware Live. The latest upgrades are the result of advancements in the consumer-grade facial tracking technology developed by Faceware’s parent company, Image Metrics, which has been used in successful apps like L’Oreal’s MakeupGenius and Nissan’s Die Hard Fan, and which has now been incorporated into Faceware Live.
Faceware Live produces facial animation in real time by automatically tracking a performer’s face and instantly applying that performance to a facial model. Faceware Live requires just a single camera for tracking. That camera can be a computer’s built-in camera or webcam, the Faceware GoPro or Pro HD Headcam Systems, or any other video capture device.
Use Cases for Live 2.5
- Live performances that incorporate digital characters. Digital characters can be “puppeted” in real time, allowing them to interact with live audiences.
- Digital characters interacting in real time on kiosk screens in theme parks and shopping malls.
- Creating facial animation content instantly for previz purposes.
- Person-driven avatars in VR and AR. Users can stream their own personas into digital and virtual worlds—perfect for training applications as well as interactive chat functionality.
With technology advancements in CG, VR/AR/MR, interactive marketing, and graphics processing, we’re seeing a growing number of inquiries about live CG performances and the ability to ‘drive’ digital characters in real time. To deliver on that, the market needs face tracking technology that is completely stable and tracks facial movements across a range of different conditions. With Live 2.5, we’re taking another important step toward making that future a reality. The quality of real-time, markerless tracking we can do now is absolutely mind-blowing.
Peter Busch, vice president of business development at Faceware Technologies
Upgrades to version 2.5 include:
- Advancements to Face Tracking Technology – Face tracking is even more stable than before and better able to detect different types of faces (different skin tones, heavy facial hair, glasses, etc.) in different lighting conditions. Once a face is detected, Live 2.5 captures 180 degrees of motion with significantly less jitter than before.
- New Animation Tuning Workflow – Animators can customize the live streaming animation to, for example, amplify or suppress a specific motion, such as an eyebrow raise or smile. These fine-tune adjustments are made on a pose-by-pose basis, and the animations (along with all settings) can be saved to a profile. Animators can also isolate controls to pinpoint areas of the face that require extra attention. Lastly, animators can simulate data without the need for a live camera feed, which significantly helps in the character setup process.
- Command Line Calibration – Animators can now trigger and toggle calibration via the command line for improved automation in their setup.
- Animation Preview Characters – Live 2.5 now includes animation preview characters, which animators can use to determine how certain types of motions might look on various facial structures.
Nearly 20 other improvements have also been included in this upgrade. A full list can be found here.
Technical articles and videos can be found in the Live section of Faceware’s website.
Faceware Live 2.5 is available free of charge to current Live customers starting today. New customers should contact Faceware sales for pricing and information at email@example.com. For more product information, please visit the Faceware Live page.