New AI Project Can Relight Portraits After the Fact

A group of researchers and engineers from UC San Diego and Google set up a neural network that can relight portraits according to any provided environment map.

The team states that their system can take any photo and adjust the lighting, changing the direction, temperature, and quality of the light, and more.
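As a rough illustration of the kind of interface such a system implies, here is a minimal sketch in PyTorch: an encoder-decoder that takes only an RGB portrait plus a target environment map and predicts the relit image. The class name, layer sizes, and environment-map resolution are assumptions made for the sketch, not the authors' actual architecture.

```python
# Illustrative sketch only: shows the input/output contract described in the
# article (RGB portrait + target environment map -> relit portrait).
# Layer sizes and the 16x32 environment-map resolution are assumptions.
import torch
import torch.nn as nn


class PortraitRelighter(nn.Module):
    """Toy encoder-decoder conditioned on a target environment map."""

    def __init__(self, env_h: int = 16, env_w: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # The target lighting is injected as a flat conditioning vector.
        self.light_fc = nn.Linear(env_h * env_w * 3, 64)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, image: torch.Tensor, env_map: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(image)                # (B, 64, H/4, W/4)
        light = self.light_fc(env_map.flatten(1))  # (B, 64)
        feats = feats + light[:, :, None, None]    # broadcast the lighting code
        return self.decoder(feats)                 # (B, 3, H, W)


# Usage: a 640x640 RGB portrait and a 16x32 environment map.
model = PortraitRelighter()
portrait = torch.rand(1, 3, 640, 640)
target_light = torch.rand(1, 3, 16, 32)
relit = model(portrait, target_light)              # (1, 3, 640, 640)
```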

The technique is described in a research paper submitted to SIGGRAPH 2019, and you have to admit the results are impressive.

They trained the network on photos of 18 people captured under different directional light sources in a studio. The team noted that each person was photographed from 7 viewpoints while “a densely sampled sphere of lights” illuminated them from every direction.
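That “densely sampled sphere of lights” points at the classic light-stage idea: because light adds linearly, photos of a subject captured one light at a time can be weighted by an environment map and summed to simulate the subject under arbitrary lighting. The NumPy sketch below illustrates only that idea; the light count and array shapes are placeholders, and the paper's actual data pipeline may differ.

```python
import numpy as np

# Placeholder shapes: stand-ins for one-light-at-a-time ("OLAT") captures of a
# single subject, downscaled so the example runs instantly.
num_lights = 300                              # placeholder count of light directions
olat = np.random.rand(num_lights, 64, 64, 3)  # one photo per light direction
env_weights = np.random.rand(num_lights)      # environment map sampled at each direction

# Light is additive, so a weighted sum over the light axis yields the subject
# as if lit by the target environment.
relit = np.tensordot(env_weights, olat, axes=1)  # shape (64, 64, 3)
relit = np.clip(relit, 0.0, None)                # radiance cannot be negative
```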

The researchers demonstrate the relighting on several different photos starting around the 1:55 mark in the video above, with a further demonstration starting around 4:20.

It might remind you of the Portrait Lighting feature on iPhones, but this project doesn’t use a depth map or any other data beyond a basic RGB image. The tech “can produce 640 × 640 images in only 160 milliseconds.” The researchers note that their “model may enable compelling consumer-facing photographic relighting applications.”

You can find more details in the full research paper.
