
Opinion: How long before an iPhone completely replaces standalone cameras?

There’s a photographer’s saying that the best camera is the one you have on you at the time. It’s no use having the best pro DSLR in the world if it’s sitting at home when you spot a photo op.

The camera most people have on them at the time is … the iPhone. Apple has stated on more than one occasion that the iPhone is the world’s most popular camera, and the evidence supports this claim. A year ago, for example, Flickr shared the top 20 cameras used by its members, and not only did the iPhone 6 top the list, but various iPhones took eight of the 20 slots.

For many, an iPhone is their only camera. But there are still many others who have one or more standalone cameras – myself among them. What is it those cameras can do that the iPhone can’t, and how long before an iPhone is the only camera we’ll ever need … ?

Shallow depth of field

Until recently, there was one immediate answer: only a standalone camera could offer shallow depth of field. The iPhone could manage it in very limited circumstances, but for the most part, if you wanted what is colloquially known as the bokeh effect, you needed to use a camera with a larger sensor.

With the iPhone 7 Plus, of course, Apple introduced Portrait mode, which offers a shallow depth of field effect on photos taken with the 2x lens. The effect is artificial, and in beta form it’s far from perfect.

If you compare these two photos taken by my friend Julian Perry, you’ll see exactly what I mean. This is a normal shot, with the blur created naturally by the lens (as per my earlier demonstration).

If we zoom in on the left-hand edge of the cup, the background blur is limited, but all is fine.

Here’s the same shot with the beta Portrait mode:

Much more background blur, but if you look closely at the left-hand edge of the coffee cup in the Portrait mode shot, there’s a very obviously artificial break.

The software copes much better in some shots, but there’s still quite an artificial look to the bokeh at present. While it tries to make the focus fall off gradually, as with the real thing, it’s so far pretty crude.

But that’s why Apple labels it a beta. It will improve, and it will – at some point within the next few years – reach the point where the artificial effect is completely indistinguishable from the natural one. Here’s an example of the real thing, with graduated focus fall-off over quite a short distance:

Now sure, professional photographers will point out that different lenses have different bokeh, and you may sometimes choose a lens for the quality of the bokeh. But there’s also no reason why Apple can’t emulate different lens effects in software. So this argument for a standalone camera will, in time, disappear.
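
To make the idea concrete, here’s a rough sketch of how depth-dependent blur works. Everything below is my own illustration (Apple hasn’t published its pipeline): the idea is simply to blur each part of the image by an amount that grows with its distance from the focal plane, using a per-pixel depth map like the one the 7 Plus estimates from its two lenses.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth, focus_depth, max_blur=8.0, layers=8):
    """Crude depth-dependent blur; illustrative only, not Apple's method.

    image:       HxWx3 float array, the sharp source photo
    depth:       HxW float array in [0, 1], estimated per-pixel depth
    focus_depth: the depth value (0-1) that should stay sharp
    max_blur:    blur sigma, in pixels, at maximum defocus
    """
    depth = np.clip(depth, 0.0, 1.0 - 1e-6)
    result = np.zeros_like(image)
    weight = np.zeros(depth.shape)
    edges = np.linspace(0.0, 1.0, layers + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Blur grows with distance from the focal plane, so focus
        # falls off gradually rather than switching on abruptly.
        sigma = max_blur * abs((lo + hi) / 2.0 - focus_depth)
        mask = ((depth >= lo) & (depth < hi)).astype(float)
        blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
        soft_mask = gaussian_filter(mask, sigma=max(sigma, 0.5))
        result += blurred * soft_mask[..., None]
        weight += soft_mask
    return result / np.maximum(weight[..., None], 1e-6)
```

Emulating a particular lens would then be a matter of swapping the blur kernel: a Gaussian here, where real lenses render out-of-focus highlights as discs or polygons shaped by the aperture blades.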

Macro shots

Extreme close-up shots require a macro lens: one that can focus at extremely short distances, which in turn produces an extremely shallow depth of field. In this shot, for example, I had the lens literally a couple of centimetres away from the surface of the eye.

This isn’t something current-generation iPhone cameras can do, but they can focus at about twice that distance, and Apple could apply a more extreme artificial bokeh effect to get the extremely shallow DOF.

Again, then, Apple isn’t there yet, but I don’t see this as an enormous challenge.

Long exposure (static shots)

Sometimes, for creative reasons, you’ll want to have a long exposure – where the shutter remains open longer than usual, like this 30-second exposure on a bridge over a road.

A 30-second exposure is also my default setting for ‘blue hour’ shots – the time around 30-40 minutes after sunset when the sky takes on a blue glow. Only a long exposure can capture this.

I adore the blue hour, especially if I can get up high, so I take a lot of these.

On a standard camera, this type of shot is taken by leaving the sensitivity of the sensor* at ISO 100 – the sensitivity normally used for a shot taken in bright light – and leaving the mechanical shutter open for a long time to gather the light needed. This is not something the iPhone – with its purely electronic shutter – can do.

*Just to avoid a bunch of comments correcting me, yes, technically a sensor only has one sensitivity and all you can do in low light is amplify the signal from it.

There are apps that take multiple exposures and stack them. I’ve tried a bunch of these and not found one that can properly capture the blue hour. If you know of one that can, do please let me know in the comments.

But again, this is solvable by software. The iPhone can’t do it today, but there’s no reason a future one couldn’t do it via stacked shots if the demand is there and Apple throws enough effort at it.
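
The core of the stacking trick is simple enough to fit in a few lines. Here’s a minimal sketch (my own, not from any particular app), assuming the phone is held steady on a tripod so the frames already align:

```python
import numpy as np

def simulate_long_exposure(frames):
    """Approximate one long exposure by averaging many short ones.

    For example, 30 x 1-second frames stand in for a single 30-second
    exposure: moving elements such as clouds and water smear out across
    the stack, and averaging also cuts random sensor noise by roughly
    the square root of the number of frames.
    """
    stack = np.stack([frame.astype(np.float64) for frame in frames])
    return stack.mean(axis=0)
```

The hard parts, and presumably where the apps I’ve tried fall short, are aligning handheld frames, choosing per-frame exposures, and merging the results without clipping highlights or crushing the blue-hour sky.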

Long exposure (panning shots)

But there’s a second situation where you may want a long exposure – when you want to give a sense of movement.

Here, you track the moving subject with the lens – like this motorcycle – and use a long exposure to create motion blur in the background. This is known as panning, and typically these exposures will be in the 1/30th to 1/4 second range.

This was a sunny day, so to avoid the long exposure letting in too much light, I had to reduce the aperture. That’s something the iPhone, with its fixed-aperture lens, can’t do.

In principle, this too could be handled by software: taking rapidly-stacked exposures, then using Portrait-style processing to separate subject from background and artificially add motion blur to the latter. It is, though, a whole different order of complexity! I can’t see this one happening anytime soon.
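
To give a sense of the scale of the problem, here’s the easy half as a sketch (the function and mask are hypothetical; the genuinely hard part is extracting a clean subject mask from a fast-moving scene in the first place):

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def fake_panning(image, subject_mask, streak_px=40):
    """Sketch of simulated panning: motion-blur the background along
    the pan axis, then composite the sharp subject back on top.

    image:        HxWx3 float array
    subject_mask: HxW array in [0, 1], 1 on the subject (in practice
                  this would come from depth or segmentation data,
                  Portrait-style)
    streak_px:    length of the horizontal blur streak, in pixels
    """
    # A horizontal box filter approximates linear motion blur along the pan.
    background = uniform_filter1d(image, size=streak_px, axis=1)
    mask = subject_mask[..., None]
    return image * mask + background * (1.0 - mask)
```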

Low light

The final major photographic challenge is low light photos in general. The iPhone has better low-light performance than almost any other smartphone camera on the market, but it’s still leagues away from a professional DSLR.

I mentioned above that a camera sensor has a fixed level of sensitivity, and that if you want a brighter image for any given shutter speed and aperture, you have to amplify the signal from the sensor. But amplifying the signal has a major downside: you also amplify all the ‘noise’ in the image. Here’s an iPhone shot taken in low light:

It looks reasonably OK viewed at that small size, but if we take a closer look at the sky, we can see that the noise is horrendous.
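
A toy calculation shows why amplification can’t help here. If we model a pixel as signal plus random noise, multiplying by the ISO gain scales both equally, so the signal-to-noise ratio doesn’t improve at all (this ignores the messier reality that some noise is added after the amplifier, but the principle holds):

```python
import numpy as np

rng = np.random.default_rng(0)

signal = 10.0                          # light actually collected by a pixel
noise = rng.normal(0.0, 2.0, 100_000)  # random sensor noise samples

low_iso = signal + noise               # base ISO reading
high_iso = 64 * (signal + noise)       # 'ISO 6400': everything amplified x64

# Both print ~5.0: the image is 64x brighter, but the noise is
# exactly 64x more visible along with it.
print(signal / low_iso.std())
print((64 * signal) / high_iso.std())
```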

For static photos, we can use the stacked shot approach described above, but that isn’t any use for a shot like this, where we’re moving.

It’s also no use for the most common requirement of a low-light shot: photographing people at parties, concerts, bars and so on. People are not very good at remaining perfectly still for 30 seconds, even without any alcohol involved …

One of the key features that distinguishes a professional DSLR from a cheaper camera is its low-light performance. As I say, you have to amplify the signal from the sensor to achieve high ISO settings, but a pro camera copes with this well. This shot was taken in dark conditions at ISO 6400 (the signal amplified 64x) on a Nikon D3, a pro body:

If we go in as close as I did on the shot from the plane, you can see there is some noise, but it’s in a completely different league.

(The focus looks a little soft as I’ve left the RAW shot unsharpened to avoid adding any sharpening artifacts. If you shoot in JPEG, sharpening is added in-camera.)

This is actually the toughest challenge for Apple to solve. For any given level of technical know-how applied by a manufacturer, the larger the sensor, the lower the noise level. In the better DSLRs, known as full-frame cameras, there’s room for a large sensor – the same size as a 35mm negative. You only find these in pro bodies like the Nikon D5 and Canon 1DX, plus prosumer bodies like the Nikon D750 and Canon 6D.

More portable mirrorless cameras, like the Sony a6000 and its more expensive 4K-shooting brother the a6300, have smaller but still decent-sized APS-C sensors. I’m happy enough with the performance of my Sony a6000 that I use it as my standard travel camera these days, leaving my Nikon D3 at home.

But an iPhone has a tiny sensor compared to any standalone camera. I think it could be a very long time indeed before technology has advanced sufficiently to allow DSLR levels of low-light photography in a smartphone.
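
How tiny? Here are some back-of-the-envelope numbers. The dimensions below are approximate, and the iPhone figure assumes the roughly 1/3-inch class of sensor used in recent models:

```python
# Approximate sensor areas in mm^2
full_frame = 36.0 * 24.0   # 35mm negative size:  864 mm^2
aps_c = 23.5 * 15.6        # e.g. Sony a6000:    ~367 mm^2
iphone = 4.8 * 3.6         # ~1/3-inch class:     ~17 mm^2

print(full_frame / iphone)  # ~50x the light-gathering area of the iPhone
print(aps_c / iphone)       # ~21x
```

All else being equal, fifty times the area means fifty times the light in the same exposure, a gap no amount of amplification can close on its own.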

Other factors

Professional photographers and real enthusiasts would, of course, raise a whole bunch of additional differences between a cameraphone and a DSLR – not least among them removable, high-capacity storage. But for the vast majority of amateur shooters, I think these five factors are the ones that count. I could see Apple solving the first three. The fourth and fifth, though, I think may take very much longer.

Do you still have a standalone camera? If you do, how much usage does it get? Please take our poll and share your thoughts in the comments.


