While announcing the iPhone 11 Pro, Phil Schiller revealed a new feature coming to Apple's flagship smartphones. Deep Fusion is a new image-processing system he described as "computational photography mad science," broadly comparable to Google's Night Sight.
When a user is about to take a photo with the new iPhone 11 Pro, the camera captures eight frames before the shutter is even pressed. It then takes one long exposure and stitches the images together "pixel-by-pixel" to produce a single shot with lots of detail and very little noise.
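Apple has not published how Deep Fusion works internally, but the general idea of multi-frame fusion can be sketched in a few lines. The function below is a toy illustration under stated assumptions (the function name, the simple averaging strategy, and the blend weight are all mine, not Apple's): averaging several short frames suppresses sensor noise, and the result is blended pixel-by-pixel with a long exposure.

```python
import numpy as np

def fuse_frames(short_frames, long_exposure, detail_weight=0.5):
    """Toy multi-frame fusion sketch (NOT Apple's actual pipeline).

    Averaging N noisy short frames reduces random noise by roughly
    sqrt(N); the denoised stack is then blended pixel-by-pixel with
    a long exposure that contributes shadow detail.
    """
    stack = np.stack(short_frames).astype(np.float64)
    denoised = stack.mean(axis=0)  # noise shrinks as more frames are averaged
    fused = detail_weight * denoised + (1.0 - detail_weight) * long_exposure
    return np.clip(fused, 0, 255).astype(np.uint8)

# Simulate eight noisy short frames of a flat gray scene plus one
# clean long exposure, then fuse them.
rng = np.random.default_rng(0)
clean = np.full((4, 4), 100.0)
short = [clean + rng.normal(0, 10, clean.shape) for _ in range(8)]
result = fuse_frames(short, clean)
```

The real system reportedly does far more per-pixel work (machine-learned detail selection across frames), but the averaging-plus-blend structure above captures why a burst of frames yields less noise than any single one.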
Deep Fusion was not specifically designed for shooting in the dark, but it's still clear that Apple is answering Google. Schiller didn't say exactly when we can expect Deep Fusion, only that it will launch in the coming months.
Discuss the new feature in the comments.