If you were wondering why Apple has ignored the megapixel race and stuck to a modest 8MP camera in its latest iPhones when almost every other manufacturer is cramming in as many pixels as physically possible, it’s all about image quality. While more pixels allow you to blow up photos to larger sizes, that comes at a cost. Squeezing more pixels into a tiny sensor means more noise, reducing quality, especially in low-light situations like bars and parties.
The secret is effectively to use burst-mode to shoot a series of photos, using an optical image stabilization system – like that built into the iPhone 6 Plus – to shift each photo slightly. Combine those images, and you have a single, very high-resolution photo with none of the usual quality degradation. Or, in patent language:
A system and method for creating a super-resolution image using an image capturing device. In one embodiment, an electronic image sensor captures a reference optical sample through an optical path. Thereafter, an optical image stabilization (OIS) processor adjusts the optical path to the electronic image sensor by a known amount. A second optical sample is then captured along the adjusted optical path, such that the second optical sample is offset from the first optical sample by no more than a sub-pixel offset. The OIS processor may reiterate this process to capture a plurality of optical samples at a plurality of offsets. The optical samples may be combined to create a super-resolution image.
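The patent doesn't specify how the samples are combined, but the simplest version of the idea can be sketched as interleaving: if the OIS shifts the optical path by exact sub-pixel fractions, each captured frame samples a different phase of a finer grid, and the frames slot together into one higher-resolution image. The function name and the uniform 2×2 shift pattern below are illustrative assumptions, not Apple's implementation.

```python
import numpy as np

def combine_subpixel_samples(samples, factor=2):
    """Interleave a factor x factor grid of sub-pixel-shifted low-res
    frames into one super-resolution image (a simplified sketch).

    samples[dy][dx] is assumed to be the frame captured after the OIS
    shifts the optical path by (dy/factor, dx/factor) of a pixel.
    """
    h, w = samples[0][0].shape
    hi = np.zeros((h * factor, w * factor), dtype=samples[0][0].dtype)
    for dy in range(factor):
        for dx in range(factor):
            # Each shifted frame fills one phase of the high-res grid.
            hi[dy::factor, dx::factor] = samples[dy][dx]
    return hi
```

In this idealized model, four 8MP frames shifted by half a pixel in each direction would yield one 32MP image without shrinking the individual photosites, which is why noise doesn't increase the way it would with a denser sensor.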
The principle of combining multiple photos into one isn’t new: it’s how the HDR function works. What’s new here is shifting the image on the sensor between shots. It does mean, though, that the technique will only work for static scenes, not for moving subjects or video.
As ever with Apple patents, there’s no telling if or when this one might make it into an iPhone, but with 4K and 5K monitors fast becoming mainstream, the ability to shoot high-resolution photos would certainly be handy.