I should open by saying I’m a tough sell where cameraphones are concerned. My primary camera is a Nikon D3 full-frame 35mm DSLR with a set of lenses that takes the total cost well into ‘let’s never do the sums’ territory, so the bar is set rather high.
But camera technology advances, and I judge by results rather than reputations, so I did recently switch to using a Sony a6000 compact camera for most shots – including travelling. This is a lot smaller and lighter, and also attracts less attention. It has an APS-C sensor, which isn’t quite in D3 territory, but is a lot larger than an iPhone sensor and has proven itself remarkably capable.
I’d love to have that kind of performance in an iPhone, but it’s not there yet in two respects: shallow depth of field, and low-light performance, both of which I’ll address below. So the question for now is: is the 12MP camera in the iPhone 6s a worthwhile improvement on the 8MP version in the iPhone 6 … ?
Let’s start with the shallow depth of field business. I’m sure some people reading my comment on that were immediately ready to disagree and pull out some flower shots to prove it. So yes, if you get the iPhone very close to the subject – as you would when taking a flower shot – then you can get reasonably shallow DOF.
But only with the iPhone positioned very close to the subject. As soon as you move further away, you get almost infinite depth of field. You can’t, for example, take a shot like this, where I’m about twelve feet away from the subjects and still able to isolate them from the background.
But for daylight shots where you don’t need shallow depth of field, both sensors are extremely capable. In all honesty, at the typical sizes at which photos are viewed these days, you’re not going to see a noticeable difference between an iPhone shot and a good compact camera, and even the DSLR offers only a marginal benefit in this situation.
Of course, you can’t tell anything from downsized photos viewed online, so let’s start pixel-peeping, starting with a daylight shot …
With all the comparison shots, the iPhone 6 is on the left, the 6s on the right.
Here, if we weren’t pixel-peeping, there would be nothing to choose between the two. If we do a 100% crop, then we can see that there is just a tiny bit of noise in dark areas of the iPhone 6 shot, while the iPhone 6s version is cleaner (and also larger, due to the extra pixels – more on this in a moment).
But really, this is nothing that is going to show up even at desktop screen sizes, and it would be exceedingly unlikely to show in a print (we’ve been at the stage for years where screens show more detail than even pro prints).
12MP vs 8MP
The iPhone 6s does, of course, have more pixels to crop from if we want to zoom in, either when we take the photo or when editing afterwards. So what I’ve done below is take the earlier photo of the Gherkin shot through the arch, and zoom right in to the very top of the building. That’s a very severe crop indeed, losing around 90% of the total photo! I’ve done both crops proportional to the resolution so you can get an idea of what the difference between 8MP and 12MP means in real life.
Is this significant? In all honesty, I’d have to say not. How often do you really need to zoom in to that kind of extreme, either in the camera or in editing? I’d say for most people the answer is going to be hardly ever.
And let’s be real here: the vast majority of iPhone shots are viewed online: Facebook, Twitter, Instagram, Flickr, whatever. They are viewed at small size, and even zoomed right in to the top of the building, the 8MP sensor still gives us an acceptable size photo for most online use (I still had to reduce it in size for this piece). Your mileage may vary, but for typical use of the camera, I’d say the extra pixels are not a big deal.
But there is, of course, one very big difference between the two cameras: the Live Photos functionality of the iPhone 6s. As you probably know, the way the iPhone camera eliminates shutter-lag – the time between pressing the button and taking the photo – is that it’s actually taking photos the whole time the camera app is open. Most of the time, it’s silently deleting them immediately afterwards, but when you press the shutter button it keeps the frame it shot just as you touched the button and throws away the rest.
What Live Photos does is save a couple of seconds’ worth of these auto-taken photos and turn them into a very short video. When you swipe between photos in the Photos app, it gives you a very brief preview of the animation, and if you 3D Touch, you can view the entire clip. Here’s an example.
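The continuous-capture mechanism described above is, in essence, a rolling buffer. Here’s a minimal sketch of the idea – the class, frame rate and buffer length are hypothetical illustrations, not Apple’s actual implementation:

```python
from collections import deque

# Hypothetical sketch of zero-shutter-lag capture: the camera keeps a
# rolling buffer of recent frames, silently discarding the oldest ones.
# Pressing the shutter keeps the frame captured at that instant - or,
# for Live Photos, the surrounding frames as a short clip.
class RollingCapture:
    def __init__(self, fps=30, seconds=1.5):
        # deque with maxlen drops the oldest frame automatically
        self.buffer = deque(maxlen=int(fps * seconds))

    def on_frame(self, frame):
        self.buffer.append(frame)

    def shutter_pressed(self, live_photos=False):
        if live_photos:
            return list(self.buffer)   # keep the whole short clip
        return self.buffer[-1]         # keep only the instant of the press

cam = RollingCapture()
for t in range(100):                   # simulate frames arriving continuously
    cam.on_frame(f"frame-{t}")
print(cam.shutter_pressed())           # the most recent frame
print(len(cam.shutter_pressed(live_photos=True)))  # frames kept for the clip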
It’s too early to tell what I’m going to make of Live Photos. At one end of the scale, it could be a feature I’d play with a few times then switch off, never to use again. At the other end, it could be seen as a must-have feature that all cameras would offer within a year or so. My jury is still out, but so far at least, I’ve left it on. Apple claims it only roughly doubles the storage size for a photo, and my tests show this to be true (typically a bit more than double, but not significantly so). I have a 128GB phone, meaning space is not at a premium, so I’m guessing I’ll switch it on for people and cat shots, just in case.
I am, though, already pretty sure that 90%+ of the Live Photos people are going to show will be of just two subjects: kids or cats …
Oh, and I also tested the selfie flash. I do my best to stay on the correct side of cameras, so I’ll simply say that it works.
Ok, this is where we get to the stuff that separates the men from the boys: low-light photography. As I mentioned earlier, DSLRs and high-level compact cameras deal with low-light situations in three ways. First, they have large sensors which have more widely-spaced sensor pixels that are less vulnerable to noise, so you can boost the ISO (aka amplify the signal) a lot before it degrades significantly. Second, they have wide-aperture lenses, which let in as much of the available light as possible. Third, they allow long exposures, keeping the shutter open longer to allow in more light in total (this of course requires a tripod or similar to hold the camera steady).
Cameraphones have small sensors, wide-ish aperture lenses (though still not in DSLR territory), and they can’t do long exposures. So what they do instead is amplify the signal from the sensor a lot. This works, but amplification boosts the noise along with the signal – and a densely-packed small sensor is noisy to begin with.
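The exposure arithmetic behind this is simple: the light a camera gathers scales with shutter time and with the inverse square of the f-number. Here’s an illustrative calculation – the specific apertures and shutter speeds are example values, not measured figures from either phone:

```python
import math

# Total light gathered scales with shutter time and with 1/f-number^2.
# Each doubling of gathered light is one photographic "stop".
def stops_more_light(t1, f1, t2, f2):
    """How many stops more light exposure 1 gathers than exposure 2."""
    ratio = (t1 / t2) * (f2 / f1) ** 2
    return math.log2(ratio)

# A tripod-mounted 30s exposure at f/4 versus a handheld 1/30s at f/2.2:
print(f"{stops_more_light(30, 4.0, 1 / 30, 2.2):.1f} stops more light")
```

Even with the cameraphone’s wider aperture, the long exposure gathers around eight stops more light – which is exactly the gap a phone has to bridge with signal amplification, and why the noise appears.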
All photos were taken with Live Photos off, which improves the quality by allowing (somewhat) slower shutter speeds.
This is a reasonably-challenging shot for a camera. We’re in relatively low light, and we’re shooting directly into the light, which washes out detail. Cheap cameraphones tend to shrug and give up when faced with this kind of situation.
But both iPhone cameras are up to the task. If we take a 100% crop, we do see a tiny difference, but there’s very little in it.
Ok, let’s let the sun dip a little further, and see how the two cameras cope with that.
Now this is where things start to get interesting. Light levels outside are falling, and this is the point at which you normally start to see noise. Again, both cameras are coping really well. Viewed at normal screen size, it’s a perfectly acceptable photo – and even viewed at 100%, the noise level is extremely low. Both sensors are impressive.
But … if you compare closely in the 100% crop above, there’s actually slightly less detail in the iPhone 6s shot. I think what’s going on here is that the more densely-packed 12MP sensor is starting to get noisier, so Apple is applying more noise reduction to compensate – and this is where we lose a little detail.
I stress, this is only visible here when pixel-peeping – it’s not something we’d ever worry about in real life in this level of light. So let’s see what happens as things get darker.
Here we have bright light from the sun, and most of the rest of the shot is dark. This is a really tough challenge for any camera. As we’d expect, detail at street level is washed out. In real-life use, I wouldn’t bother including anything at street level, I’d just show the sunset and the reflection on the Cheesegrater (the foreground building), but it’s interesting to include for test purposes. (The diagonal streaks, incidentally, are on the window, not an issue with the photos.)
Now, at this point, looking at the photos on my iPad, I was convinced that the 6s shot was better. There seemed to be less noise. But, when I looked at 100% crops the next day, it confirmed my theory about the more aggressive noise-reduction necessary to compensate for the denser pixels. There is indeed less noise, but it’s achieved at the expense of loss of detail. You can see this in both the street-level buildings and the reflections in the windows – there’s a muddier look to the 6s shot.
So in a sense, my initial impression of the iPhone 6s camera being better in low light was completely wrong. But I’ll return to this point shortly.
Let’s now give it the ultimate test of a true night shot. There’s still a glow in the sky, but the city is essentially in darkness and all the lights are on. I wouldn’t normally even dream of attempting a cameraphone shot in this light, but let’s see what we get.
Again, viewed on the phone and at iPad size, the 6s shot looked noticeably cleaner (look at the sky around the Shard, top left). But again, when we look at 100% crops, it’s because the stronger noise-reduction on the 6s is simply wiping out detail. Look at the latticework in the arched arcade lit in yellow, for a good example. The 6s shot is noticeably less sharp due to the greater level of noise-reduction applied by the phone.
This reinforces what I’d long said: that Apple was right to refuse to enter the megapixel race and concentrate instead on quality rather than quantity. The more densely-packed sensor in the higher megapixel camera requires more aggressive noise-reduction to overcome the increased noise – and that is achieved at the expense of detail. So the higher resolution image does, in low-light conditions, end up less detailed than the lower resolution version.
This is, unfortunately, what happens when people who know nothing about photography simply count pixels and criticize Apple for falling behind. The company refused to play that game for a long time, but I guess this is the point at which it feared it would be panned for remaining with an 8MP camera for a fifth generation (after the iPhone 4S, 5, 5S and 6).
At a pixel-peeping level, then, the iPhone 6s sensor is actually a retrograde step, sacrificing detail for pixel-count. But … real-life viewing for most people maxes out at either a 15-inch MacBook or a 27-inch iMac. And the vast majority of photo viewing these days is far smaller than that, downsized by Facebook and its ilk. At any of those sizes, the iPhone 6s shots look better. So on balance, Apple made the right decision: in real-life use, your iPhone 6s photos are going to look better to almost everyone who views them.
For me, though, I care about quality, and have some of my photos blown up on my walls at 30×20 inches. So I’m going to be sticking with real cameras for now. If you want to see why, here’s a photo taken on my Sony a6000 camera with APS-C sensor: a 30-second exposure at ISO 100. You would never guess it, but this is actually taken in very similar light to the final shot above – it just lets in so much more of it! The difference between 30 seconds of light and a fraction of a second is … night and day.
Maybe we’ll get there by the iPhone 10s.
If you’ve bought the iPhone 6s and tried out the camera, do take our poll – and as ever, please share your thoughts in the comments. Oh, and if you have any long-exposure apps you’d recommend, do please let me know – I’m going to be testing one or more of these shortly. Update: I did, and unfortunately none of them impressed me. Several were better than the native camera app, but not by a sufficient margin to make it worth the hassle.