A New Neural Network for Creating 3D Outfit Deformations

The team used a new training scheme that removes the need for ground-truth samples, enabling self-supervised training of dynamic 3D garment deformations.

Developers Igor Santesteban, Miguel A. Otaduy, and Dan Casas have introduced SNUG, a new neural network that adds 3D deformations to outfits worn by parametric human bodies. According to the team, the network was trained with a new scheme that removes the need for ground-truth samples, enabling self-supervised training of dynamic 3D garment deformations. The method lets them interactively manipulate the shape parameters of the subject while producing highly realistic garment deformations, without using any supervision at training time.
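
At its core, this kind of system is a regressor from body parameters to garment geometry. The snippet below is a rough, hypothetical sketch of that idea in PyTorch, not the authors' architecture: the class name, layer sizes, input dimensions, and the choice of a plain MLP are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class GarmentDeformationNet(nn.Module):
    """Hypothetical sketch: map body pose/shape parameters to per-vertex garment offsets."""
    def __init__(self, num_verts: int, pose_dim: int = 72, shape_dim: int = 10, hidden: int = 256):
        super().__init__()
        self.num_verts = num_verts
        self.mlp = nn.Sequential(
            nn.Linear(pose_dim + shape_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_verts * 3),  # one 3D offset per garment vertex
        )

    def forward(self, pose: torch.Tensor, shape: torch.Tensor) -> torch.Tensor:
        # Concatenate the body parameters and regress offsets over the garment template.
        offsets = self.mlp(torch.cat([pose, shape], dim=-1))
        return offsets.view(-1, self.num_verts, 3)
```

At runtime, predicted offsets like these would typically be added to the garment template and posed along with the body, which is what makes interactive editing of the body shape possible.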

"The key to our success is realizing that the solution to the equations of motion used in current physics-based methods can also be formulated as an optimization problem. More specifically, we show that the per-time-step numerical integration scheme used to update the vertex position (e.g., backward Euler) in physics-based simulators, can be recast as an optimization problem, and demonstrate that the function for this minimization can become the central ingredient of a self-supervised learning scheme," writes the team.

You can learn more about SNUG here and get the code over here.
