iPhone 11 and iPhone 11 Pro adding Deep Fusion camera feature with upcoming iOS 13 beta

A forthcoming developer beta of iOS 13 will add Apple’s new Deep Fusion photography system, according to The Verge. Deep Fusion is the new camera processing system Apple introduced last month for the iPhone 11 and iPhone 11 Pro. [Update: iOS 13.2 beta 1 is coming tomorrow, according to Daring Fireball.]

Deep Fusion was billed as an iPhone 11 and iPhone 11 Pro feature, though it was not available when the devices launched last month. Apple says that Deep Fusion processing will be most important for “medium-to-low light images.”

What differentiates Deep Fusion from Night mode, however, is that Deep Fusion is completely invisible to the user. That means you won’t be able to tell when it’s running in the background, but you should notice considerably better image quality. Apple told The Verge that this is part of its strategy.

The Verge offers a quick rundown of how Deep Fusion works; a toy code sketch of the same pipeline follows the list:

  1. By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots, and then one longer exposure to capture detail.
  2. Those three regular shots and the long-exposure shot are merged into what Apple calls a “synthetic long,” a major difference from Smart HDR.
  3. Deep Fusion picks the short exposure image with the most detail and merges it with the synthetic long exposure — unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.
  4. The images are run through four detail-processing steps, pixel by pixel, each tailored to increasing amounts of detail: the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are in the highest. This generates a series of weightings for how to blend the two images, taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
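
Apple hasn’t published how Deep Fusion is actually implemented, but the reported pipeline above can be sketched in toy Swift code. Everything below (the Frame type, the naive averaging used for the synthetic long, the contrast-based detail metric, and the single fixed blend weight) is an invented stand-in for Apple’s far more sophisticated machine-learning processing:

    // Toy sketch only: Apple has not published Deep Fusion's implementation.
    // Frame, the averaging, the contrast metric, and the blend weight are all
    // invented stand-ins that just mirror the steps reported above.
    import Foundation

    struct Frame {
        var pixels: [Double]   // stand-in for real image data
        var exposure: Double   // relative exposure time
    }

    // Step 2: merge the three regular shots with the long exposure into a
    // "synthetic long" (a naive per-pixel average stands in for Apple's merge).
    func syntheticLong(regular: [Frame], longExposure: Frame) -> Frame {
        let all = regular + [longExposure]
        let merged = (0..<longExposure.pixels.count).map { i in
            all.reduce(0.0) { $0 + $1.pixels[i] } / Double(all.count)
        }
        return Frame(pixels: merged, exposure: all.reduce(0.0) { $0 + $1.exposure })
    }

    // Step 3: score a short exposure for detail (toy metric: local contrast).
    func detailScore(_ frame: Frame) -> Double {
        zip(frame.pixels, frame.pixels.dropFirst()).reduce(0.0) { $0 + abs($1.0 - $1.1) }
    }

    // Step 4: blend the two chosen frames pixel by pixel, taking detail from one
    // and tone, color, and luminance from the other. Deep Fusion reportedly
    // derives per-pixel weightings across four detail bands; a single fixed
    // weight stands in for that here.
    func fuse(detail: Frame, tone: Frame, detailWeight: Double = 0.7) -> Frame {
        let pixels = zip(detail.pixels, tone.pixels).map {
            detailWeight * $0 + (1.0 - detailWeight) * $1
        }
        return Frame(pixels: pixels, exposure: tone.exposure)
    }

    // Step 1 stand-ins: six short frames plus one longer exposure.
    let shorts = (0..<6).map { _ in
        Frame(pixels: (0..<16).map { _ in Double.random(in: 0...1) }, exposure: 0.01)
    }
    let longShot = Frame(pixels: (0..<16).map { _ in Double.random(in: 0...1) },
                         exposure: 0.1)

    // Steps 2-5: build the synthetic long, pick the sharpest short, fuse.
    let synthetic = syntheticLong(regular: Array(shorts.prefix(3)), longExposure: longShot)
    let sharpest = shorts.max { detailScore($0) < detailScore($1) }!
    let fused = fuse(detail: sharpest, tone: synthetic)
    print("Fused frame with \(fused.pixels.count) pixels")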

When you first go to your Camera Roll after taking an image, you’ll see a “proxy image” while Deep Fusion runs in the background. This process should take only about a second longer than Smart HDR. It means, however, that Deep Fusion won’t work in burst mode.
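
Apple exposes no public API that signals when Deep Fusion is running, so the proxy-then-final swap can only be modeled, not observed. Here is a minimal Swift sketch of that reported behavior; the function name, the string stand-ins for images, and the one-second delay are all assumptions:

    // Illustrative only: the function name, the string stand-ins for images,
    // and the one-second delay are invented to model the reported behavior.
    import Foundation

    func showProxyThenFinal(proxy: String,
                            process: @escaping () -> String,
                            display: @escaping (String) -> Void) {
        display(proxy)  // the proxy image appears in the Camera Roll right away
        DispatchQueue.global(qos: .userInitiated).async {
            display(process())  // the fused result silently replaces it
        }
    }

    showProxyThenFinal(proxy: "proxy.jpg",
                       process: { Thread.sleep(forTimeInterval: 1); return "fused.jpg" },
                       display: { print("Displaying \($0)") })
    Thread.sleep(forTimeInterval: 2)  // keep this script alive for the async work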

Phil Schiller described Deep Fusion as “computational photography mad science” when it was first introduced last month. At that time, Apple only said that it was coming sometime later this fall. While the early results from the iPhone 11 and iPhone 11 Pro cameras are impressive, Deep Fusion has the potential to take things to the next level.

The Verge has a couple of sample Deep Fusion images, courtesy of Apple, in its report.

While the iOS 13 beta with Deep Fusion isn’t available just yet, it should arrive soon, likely in the form of iOS 13.2.
