NVIDIA Teaches AI to Fix Grainy Photos

NVIDIA has presented a new deep learning-based approach that automatically removes noise and artifacts from photos originally taken in low light.

The project is said to have been developed by researchers from NVIDIA, Aalto University, and MIT.

This method might not seem unique, but the key point is that it needs only pairs of noisy, grainy input images to get great results. The AI never needs to study a noise-free image in order to remove artifacts, noise, and grain and enhance photos automatically.

“It is possible to learn to restore signals without ever observing clean ones, at performance sometimes exceeding training using clean exemplars,” the researchers pointed out. “[The neural network] is on par with state-of-the-art methods that make use of clean examples — using precisely the same training methodology, and often without appreciable drawbacks in training time or performance.”
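To make the idea concrete, here is a minimal, hypothetical sketch of how a denoiser can be trained on noisy data alone: both the input and the regression target are independently corrupted copies of the same image, so no clean photo ever enters the loss. This is not the researchers' actual code; the toy network, Gaussian noise model, random stand-in data, and hyperparameters below are illustrative assumptions, written in PyTorch.

```python
# Minimal sketch (assumed, not the authors' implementation) of training a
# denoiser without clean targets: input and target are two independently
# corrupted copies of the same underlying image.

import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Toy convolutional denoiser; the paper uses a much deeper network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def add_noise(img, sigma=0.1):
    """Return an independently corrupted copy of img (Gaussian noise here)."""
    return img + sigma * torch.randn_like(img)

model = TinyDenoiser()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in for a real dataset: random "clean" images the network never sees.
clean_batches = [torch.rand(8, 1, 64, 64) for _ in range(100)]

for clean in clean_batches:
    noisy_input = add_noise(clean)   # first noisy observation  -> network input
    noisy_target = add_noise(clean)  # second noisy observation -> training target
    optimizer.zero_grad()
    loss = loss_fn(model(noisy_input), noisy_target)  # no clean image in the loss
    loss.backward()
    optimizer.step()
```

Under a zero-mean noise model, minimizing the L2 distance to a second noisy copy pushes the network toward the same solution as training against the clean image, which is the core observation behind the result.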


The team validated the neural network on three different datasets to confirm that the approach works. The method is also said to be capable of enhancing MRI images, which points to promising applications in medical imaging.

“There are several real-world situations where obtaining clean training data is difficult: low-light photography (e.g., astronomical imaging), physically-based rendering, and magnetic resonance imaging,” the team stated. “Our proof-of-concept demonstrations point the way to significant potential benefits in these applications by removing the need for potentially strenuous collection of clean data. Of course, there is no free lunch – we cannot learn to pick up features that are not there in the input data – but this applies equally to training with clean targets.”

You can learn more about the paper here.
