If you’re looking forward to using the upcoming depth of field effect on the iPhone 7 Plus, you should be aware of its potential limitations. Although it appears to be a great tool to have in the bag, it’s not going to leave your DSLR collecting dust, depending on the type of photographer you are.
Limitations of the depth of field feature on iPhone 7 Plus
For starters, the depth of field feature, which isn’t scheduled to launch until later this fall via a software update, appears to work only with real people. Apple’s website explains what the depth of field effect is for (emphasis mine):
Depth of field allows you to keep faces sharp while creating a blurred effect in the background. When you take a shot with iPhone 7 Plus, the dual-camera system uses both cameras and advanced machine learning to make your subject sharp while creating the same out-of-focus blur in the background — known as the bokeh effect — previously reserved for DSLR cameras. So no matter what’s behind your subject, it’s easy to create a great portrait.
Portrait: noun 1. a painting, drawing, photograph, or engraving of a person, especially one depicting only the face or head and shoulders.
Although we didn’t receive an expansive demonstration of the impressive new feature, the wording of the press materials suggests that the depth of field effect will be limited to shots of real people, not inanimate objects.
This means that product photography, for instance, won’t be able to take advantage of the depth of field feature. Although Apple hasn’t fully expounded on the ways that the feature can be wielded, it was very careful in its message on its website.
Phil Schiller, Apple’s SVP of Worldwide Marketing, was also careful in his wording as he explained the advantages of the 7 Plus’ camera system:
There’s one other use of this camera that we challenged our engineering team to do as an extra credit project. It really was, it’s something that is incredibly challenging and takes a lot of amazing invention, but what they’ve been doing is astounding. And it’s something that’s a big breakthrough in photography, and we want to give you a sneak peek of this feature.
What they’re able to do, when we take a picture, is to use the ISP to scan the scene. To use machine learning to recognize people and faces and then create a depth map of that image from the two cameras and the software. Keep the people in front sharp in focus, and apply a beautiful blur to the background.
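The pipeline Schiller describes — recognize the subject, build a depth map, keep the foreground sharp, and blur everything behind it — can be sketched in miniature. To be clear, this is a toy illustration in Python/NumPy and not Apple’s implementation; the box blur, the hard depth threshold, and the tiny synthetic frame are all my own simplifications of the idea.

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur: average each pixel over its neighborhood (edge-padded)."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size ** 2

def portrait_effect(img, depth, threshold=0.5):
    """Keep near pixels (depth < threshold) sharp; blur everything farther away.

    `depth` holds normalized distances: 0.0 = closest, 1.0 = farthest.
    """
    foreground_mask = depth < threshold
    blurred = box_blur(img)
    return np.where(foreground_mask, img, blurred)

# Toy 6x6 grayscale frame: bright "subject" on the left, textured background on the right.
img = np.zeros((6, 6))
img[:, :3] = 1.0       # the subject
img[::2, 3:] = 0.8     # striped background texture
depth = np.ones((6, 6))
depth[:, :3] = 0.2     # subject is close; background is far

result = portrait_effect(img, depth)
```

The real system obviously does vastly more — the two lenses supply true stereo disparity, machine learning segments faces, and the blur is shaped to mimic optical bokeh — but the segment-then-composite structure is the same.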
This feature, once it ships, will be available via a new Portrait style within the stock Camera app on the iPhone 7 Plus. After you select Portrait style, the software automatically switches to the 56mm telephoto lens and displays a preview of the depth effect, generated in real time as you’re looking at the screen. That’s mighty impressive, and it’s something Apple notes DSLRs aren’t currently capable of doing.
For the record, Apple isn’t claiming that its cameras replace DSLRs, as Schiller explained at the end of the 7 Plus camera demonstration:
Now we are not saying to throw out your DSLRs and that iPhone replaces all of the DSLRs. What we are saying is this is the best camera we have ever made in an iPhone. This is the best camera ever made in any smartphone. For many of the customers who have it, it’ll probably be the best camera they’ve ever owned to date. But more importantly it allows them to create beautiful pictures with incredible creative tools.
Apple’s new Portrait style in the iPhone 7 Plus’ Camera app appears to be, at least at this stage in the game, limited to portrait photography of human subjects. In other words, don’t expect to be able to use the iPhone 7 Plus’ camera to pull off stunning high-bokeh scenery pictures devoid of human subjects, or photos of inanimate objects with drastically blurred backgrounds yet. Such functionality could theoretically be incorporated by means of a software update, but it doesn’t appear that the camera will gain those capabilities initially.
Right now, it’s unclear how much you’ll be able to adjust the shallow depth of field effect, if at all. You’ll also apparently be limited to using the effect inside the stock Camera app, as there was no mention of third-party-accessible APIs. Of course, all of this is subject to change, and most likely will as the software grows and matures.
DSLRs are safe
Apple’s ability to bring us DSLR-like functionality is very impressive, but one must remember that it’s the software, not so much the hardware, that makes this feature possible in such a diminutive package.
This presents both advantages and disadvantages for Apple. It’s an advantage because Apple is a far more nimble and capable software developer than any camera manufacturer can hope to be. At the same time, Apple is limited by physics: there’s simply no way for a smartphone to reproduce the true bokeh made possible by fast glass and camera bodies with full frame sensors.
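The physics here can be made concrete with the standard thin-lens depth-of-field approximation. The quick calculation below compares a full-frame 56mm ƒ/1.8 lens with the iPhone 7 Plus telephoto at a typical 2 m portrait distance; the phone figures (roughly 6.6 mm actual focal length at ƒ/2.8, circle of confusion around 0.003 mm for its tiny sensor) are ballpark estimates I’m assuming for illustration, not official specs.

```python
import math

def depth_of_field_mm(f, N, c, s):
    """Total depth of field (mm) via the thin-lens approximation.

    f: focal length (mm), N: f-number, c: circle of confusion (mm),
    s: subject distance (mm). Returns math.inf at or beyond the hyperfocal distance.
    """
    H = f * f / (N * c) + f  # hyperfocal distance
    if s >= H:
        return math.inf
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s)
    return far - near

subject = 2000  # 2 m portrait distance, in mm

# Full-frame 56mm f/1.8; c = 0.030 mm is the usual full-frame convention.
ff = depth_of_field_mm(f=56, N=1.8, c=0.030, s=subject)

# Assumed iPhone 7 Plus telephoto figures (approximate, for illustration).
phone = depth_of_field_mm(f=6.6, N=2.8, c=0.003, s=subject)

print(round(ff), round(phone))  # roughly 13 cm vs. nearly 2 m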
Point-and-shoot manufacturers will continue to suffer as Apple eats their lunch — the new slightly wider 28mm lens, ƒ/1.8 aperture, and updated glass and sensor will make videos and photos even more impressive — but true photo enthusiasts who want control over every aspect of their shots won’t be fazed and will continue to buy standalone cameras. Software tricks aside, nothing can replace hardware built solely for taking great photos, not to mention videos, which Apple’s new shallow depth of field feature doesn’t appear to support either.
Schiller sums it up nicely:
It’s not for every style of picture you’re going to take, but for the ones you want to use it on, it’s a pretty big breakthrough.