How to use Deep Fusion with iPhone SE 3, iPhone 13, and more

Apple’s Deep Fusion tech, which the company describes as “computational photography mad science,” first arrived with the iPhone 11. It’s now supported on the iPhone SE 3 as well as the iPhone 12 and 13 lineups. Here’s how to turn on Deep Fusion on iPhone, including how it works and when the feature kicks in.

Deep Fusion is an image processing system that works automatically behind the scenes in certain conditions. Apple says the feature is able to produce “images with dramatically better texture, detail, and reduced noise in lower light.”

Unlike the iPhone’s Night mode and other camera options, there’s no user-facing signal that Deep Fusion is being used; it’s automatic and invisible (on purpose).

However, there are a few instances when Deep Fusion won’t be used: any time you’re shooting with the ultra wide lens, any time “Photos Capture Outside the Frame” is turned on, and when shooting burst photos.

How to turn on Deep Fusion on iPhone cameras

Keep in mind that Deep Fusion is only available on iPhone 11, 12, 13, and SE 3.

  1. Head to the Settings app, then swipe down and tap Camera
  2. Make sure Photos Capture Outside the Frame is turned off
  3. Make sure you’re using the wide (standard) or telephoto lens, 1x or greater
  4. Deep Fusion is now working behind the scenes when you shoot photos (won’t work with burst photos)
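
There’s no visible toggle for Deep Fusion itself, but for the curious, the feature does surface in Apple’s developer APIs: since iOS 13, AVCapturePhotoSettings exposes a photoQualityPrioritization option, and requesting .quality allows the system’s heaviest processing tier, which is where Deep Fusion runs on supported hardware. Here’s a minimal sketch of a capture configured that way (the function name and surrounding setup are illustrative):

    import AVFoundation

    // Sketch: request maximum-quality processing for a single capture.
    // Assumes `photoOutput` is an AVCapturePhotoOutput already attached
    // to a configured AVCaptureSession and `delegate` handles the result.
    func captureHighestQualityPhoto(from photoOutput: AVCapturePhotoOutput,
                                    delegate: AVCapturePhotoCaptureDelegate) {
        // Raise the output's ceiling so individual captures may ask for .quality.
        photoOutput.maxPhotoQualityPrioritization = .quality

        let settings = AVCapturePhotoSettings()
        // .quality lets the system take extra time on processing (the tier
        // where Deep Fusion lives on supported devices); .speed and .balanced
        // trade that away for shutter responsiveness.
        settings.photoQualityPrioritization = .quality
        photoOutput.capturePhoto(with: settings, delegate: delegate)
    }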

How does Deep Fusion work?

As described by Phil Schiller, Apple’s marketing chief at the time:

So what is it doing? How do we get an image like this? Are you ready for this? This is what it does. It shoots nine images, before you press the shutter button it’s already shot four short images, four secondary images. When you press the shutter button it takes one long exposure, and then in just one second, the Neural Engine analyzes the fused combination of long and short images picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimize for detail and low noise, like you see in the sweater there. It’s amazing, this is the first time a Neural Engine is responsible for generating the output image. It is computational photography mad science.
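
Stripped of the showmanship, that’s a multi-exposure merge: the short frames freeze motion and preserve detail, the long exposure gathers light, and a trained model picks the best source for every pixel. The sketch below is a deliberately crude, hand-written stand-in for that per-pixel selection; Apple’s real pipeline uses a learned Neural Engine model, not a rule like this:

    // Toy per-pixel fusion over flat grayscale pixel buffers.
    // Purely illustrative of the idea Schiller describes.
    func fuse(shortFrames: [[Float]], longExposure: [Float]) -> [Float] {
        var fused = longExposure
        for i in fused.indices {
            // Crude "best candidate": the short-frame value that agrees most
            // with the long exposure, standing in for lowest-noise detail.
            let candidates = shortFrames.map { $0[i] }
            let best = candidates.min {
                abs($0 - longExposure[i]) < abs($1 - longExposure[i])
            } ?? longExposure[i]
            // Merge: the long exposure contributes light, the short frame detail.
            fused[i] = 0.5 * longExposure[i] + 0.5 * best
        }
        return fused
    }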

When does it work?

Apple told The Verge that it made Deep Fusion invisible to users for a seamless experience:

There’s no indicator in the camera app or in the photo roll, and it doesn’t show up in the EXIF data. Apple tells me that is very much intentional, as it doesn’t want people to think about how to get the best photo. The idea is that the camera will just sort it out for you.

Here are more specifics about when Deep Fusion is active:

  • With the wide (standard) lens, Smart HDR handles bright to medium-lit environments, Deep Fusion activates for medium to low-lit scenes, and Night mode naturally kicks in for dim shots
  • The telephoto lens generally uses Deep Fusion, except in very brightly lit shots, where Smart HDR takes over
  • The ultra wide lens never uses Deep Fusion; Smart HDR is used instead
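
Condensed into code, that routing looks roughly like the sketch below. The light-level cases and their boundaries are illustrative; Apple doesn’t publish the actual thresholds:

    // Rough sketch of which pipeline handles a shot, per the behavior above.
    enum Lens { case ultraWide, wide, telephoto }
    enum SceneLight { case bright, medium, low, veryDim }
    enum Pipeline { case smartHDR, deepFusion, nightMode }

    func pipeline(for lens: Lens, in light: SceneLight) -> Pipeline {
        switch (lens, light) {
        case (.ultraWide, _):       return .smartHDR    // Deep Fusion never runs here
        case (.wide, .bright):      return .smartHDR
        case (.wide, .medium),
             (.wide, .low):         return .deepFusion
        case (.wide, .veryDim):     return .nightMode
        case (.telephoto, .bright): return .smartHDR    // only very bright scenes
        case (.telephoto, _):       return .deepFusion
        }
    }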
