The photography experts behind the Halide Camera app for iPhone have published a new blog post today diving deep into the new iPad Pro’s camera system. The post covers the new ultra wide angle camera and the LiDAR Scanner, and explains why the iPad still lacks Portrait mode support.
Halide offers some in-depth info on the ultra wide angle camera in the new iPad Pro:
The iPhone 11 and 11 Pro pack a significantly larger (and better) sensor in their wide-angle cameras compared to the iPad. The ultra-wide sensor on the iPhone is comparable in quality to the ultra-wide on the iPad, but the iPad’s is lower resolution.
It appears the hardware just isn’t there to support Night mode, Deep Fusion, and even Portrait mode.
Then, there’s the LiDAR Scanner:
The LIDAR sensor, also known as a 3D ‘Time of Flight’ (ToF for short) sensor, is exceptionally good at detecting range.
Regular camera sensors are good at capturing focused images, in color. The LIDAR sensor doesn’t do anything like this. It emits small points of light, and as they bounce off your surroundings, it times how long the light takes to come back.
This sounds crazy, but it’s timing something moving at the speed of light. These windows of time amount to hundreds of picoseconds. Pico? Yes, pico: that’s three orders of magnitude smaller than a nanosecond! A picosecond is 0.000000000001 seconds. Count the zeros.
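To put those numbers in context, here is a minimal Swift sketch of the time-of-flight arithmetic. The 2-meter distance and 5 cm depth step are made-up figures for illustration, not values published by Halide or Apple.

```swift
// Back-of-the-envelope time-of-flight arithmetic (illustrative figures only).
let speedOfLight = 299_792_458.0  // meters per second

// Round-trip time for a surface 2 meters away: the light travels out and back.
let distance = 2.0  // meters (hypothetical)
let roundTripSeconds = (2 * distance) / speedOfLight
print(roundTripSeconds * 1e9)  // ≈ 13.3 nanoseconds

// Telling apart two surfaces 5 cm apart means resolving the difference in
// their round-trip times, and that difference is only a few hundred picoseconds.
let depthStep = 0.05  // meters (hypothetical)
let timeDelta = (2 * depthStep) / speedOfLight
print(timeDelta * 1e12)  // ≈ 334 picoseconds
```

Run as a Swift script, this prints roughly 13.3 and 334, which is the scale Halide is describing.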
One interesting tidbit is that the iPad Pro doesn’t support Portrait mode on the rear camera. Halide explains why the new LiDAR Scanner isn’t necessarily designed with Portrait mode in mind:
The Face ID sensor is trying to resolve only an area the size of a human face, with enough accuracy that you can use it as a security device. The LIDAR sensor is made for room-scale sensing. It’s basically optimized for scanning rooms in your house.
The only reason we won’t say it could never support Portrait mode is that machine learning is amazing. The depth data on the iPhone XR is very rough, but combined with a neural network, it’s good enough to power Portrait mode. But if Portrait mode were a priority, we’d put our money on Apple using the dual cameras.
Another problem Halide notes is that there are currently no APIs that let developers access the LiDAR Scanner’s new depth data, so the app can’t make use of it. The Halide developers did, however, build a proof-of-concept app called Esper that rethinks photographic capture.
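For context, the sketch below shows how depth capture is typically wired up on iPhone through AVFoundation’s existing depth pipeline (TrueDepth or dual-camera devices). It is a generic example under the assumption that, at the time of Halide’s post, no equivalent capture device type exposed the iPad Pro’s LiDAR depth data; it is not Halide’s code.

```swift
import AVFoundation

// Generic AVFoundation depth-capture setup (TrueDepth / dual-camera path).
// At the time of Halide's post, there was no comparable API surface for
// reading depth data from the iPad Pro's LiDAR Scanner.
let session = AVCaptureSession()
session.beginConfiguration()

// Pick a depth-capable rear camera; .builtInDualCamera derives depth from disparity.
guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
      let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input) else {
    fatalError("No depth-capable capture device available on this hardware")
}
session.addInput(input)

// AVCaptureDepthDataOutput is where AVDepthData frames arrive for supported cameras.
let depthOutput = AVCaptureDepthDataOutput()
if session.canAddOutput(depthOutput) {
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = true  // temporally smooth the depth map
}

session.commitConfiguration()
session.startRunning()
```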
While the LiDAR Scanner can’t yet “augment our traditional photography,” Halide says that it “opens the door to new applications that are powerful and creativity-enabling in their own right.”
The full blog post from Halide is well worth a read and can be found here.