iPhone 11 Deep Fusion mode reportedly in testing


The Verge is reporting that the previously announced Deep Fusion feature has shown up in the developer betas for iOS 13. The image-processing feature is exclusive to the iPhone 11 and iPhone 11 Pro models.

Beyond noting that the feature may be getting closer to release, The Verge article has some good background on Deep Fusion beyond what Apple covered in last month's announcement. The article also includes sample photos shot with Deep Fusion; however, there are no side-by-side comparisons showing the same scene without it. The article breaks the processing down into five steps (a rough code sketch follows the list):


1 - By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots and then one longer-exposure shot to capture detail.

2 - Those three regular shots and long-exposure shot are merged into what Apple calls a "synthetic long." This is a major difference from Smart HDR.

3 - Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.

4 - The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail: the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are the highest level. This generates a series of weightings for how to blend the two images, taking detail from one and tone, color, and luminance from the other.

5 - The final image is generated.
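
Apple hasn't published the actual algorithm, and The Verge's description is high-level, so the following Swift sketch is purely illustrative: it assumes grayscale frames as 2-D arrays of Double, uses a simple Laplacian response as a stand-in detail measure, and collapses Apple's four detail bands into a single per-pixel blend weight. All of the names here (Frame, syntheticLong, deepFusionSketch) are invented for this sketch.

```swift
import Foundation

// One grayscale frame: rows of luminance values in 0...1.
typealias Frame = [[Double]]

// Laplacian response at (y, x): a crude local-detail measure.
func laplacian(_ f: Frame, _ y: Int, _ x: Int) -> Double {
    let h = f.count, w = f[0].count
    guard y > 0, x > 0, y < h - 1, x < w - 1 else { return 0 }
    return abs(4 * f[y][x] - f[y-1][x] - f[y+1][x] - f[y][x-1] - f[y][x+1])
}

// Total detail in a frame; used to pick the sharpest short exposure (step 3).
func sharpness(_ f: Frame) -> Double {
    var total = 0.0
    for y in 0..<f.count {
        for x in 0..<f[0].count { total += laplacian(f, y, x) }
    }
    return total
}

// Step 2: merge the three regular shots and the long exposure into a
// "synthetic long". Plain per-pixel averaging here; Apple's actual
// weighting is not public.
func syntheticLong(regular: [Frame], long: Frame) -> Frame {
    let frames = regular + [long]
    let h = long.count, w = long[0].count
    var out = Frame(repeating: [Double](repeating: 0, count: w), count: h)
    for y in 0..<h {
        for x in 0..<w {
            out[y][x] = frames.map { $0[y][x] }.reduce(0, +) / Double(frames.count)
        }
    }
    return out
}

// Steps 3-5: pick the most detailed short frame and blend it with the
// synthetic long, pixel by pixel. Where local detail is high, favor the
// short frame; elsewhere take tone and luminance from the synthetic long.
func deepFusionSketch(shortFrames: [Frame], regular: [Frame], long: Frame) -> Frame {
    let base = syntheticLong(regular: regular, long: long)
    guard let detail = shortFrames.max(by: { sharpness($0) < sharpness($1) }) else {
        return base
    }
    let h = base.count, w = base[0].count
    var out = base
    for y in 0..<h {
        for x in 0..<w {
            // Map local detail to a blend weight in 0...1 (one band,
            // instead of Apple's four).
            let weight = min(1.0, laplacian(detail, y, x) * 4.0)
            out[y][x] = weight * detail[y][x] + (1 - weight) * base[y][x]
        }
    }
    return out
}

// Tiny usage example with uniform 3x3 toy frames.
let flat = Frame(repeating: [0.5, 0.5, 0.5], count: 3)
print(deepFusionSketch(shortFrames: [flat], regular: [flat, flat, flat], long: flat)[1][1])
// Uniform input has no detail to preserve, so the output stays 0.5.
```

The key idea the sketch tries to capture is the division of labor The Verge describes: the synthetic long supplies tone, color, and luminance, while the sharpest short exposure supplies fine detail, with per-pixel weights deciding how much of each to keep.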