A Production-Ready Method for Face Re-Aging in Videos

The method preserves facial identity across variable expressions, viewpoints, and lighting conditions.

The Disney Research team has presented FRAN, a fully automatic, production-ready face re-aging neural network capable of aging and de-aging human faces in videos without identity loss across variable expressions, viewpoints, and lighting conditions.

Trained on a dataset of photo-realistically re-aged synthetic face pairs, the network produces temporally stable results on videos and incorporates simple, intuitive mechanisms that let artists tweak and customize the re-aged results. According to the team, the network "reformulates re-aging as a simple image-to-image translation task that is naturally and effectively solved using the familiar U-Net architecture."

"Our first key insight is in addressing the problem of collecting longitudinal training data for learning to re-age faces over extended periods of time, a task that is nearly impossible to accomplish for a large number of real people. We show how such a longitudinal dataset can be constructed by leveraging the current state-of-the-art in facial re-aging that, although failing on real images, does provide photoreal re-aging results on synthetic faces," commented the team. "Our second key insight is then to leverage such synthetic data and formulate facial re-aging as a practical image-to-image translation task that can be performed by training a well-understood U-Net architecture, without the need for more complex network designs."

You can check out the full research paper here. Also, don't forget to join our Reddit page and our Telegram channel, and follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.
