The Future of Smartphone Cameras

A Google employee has released a set of images he produced from bursts of pictures taken with his Google Pixel.

Taking all this into account, I wrote a simple Android camera app with manual control over exposure time, ISO and focus distance. When the shutter button is pressed the app waits a few seconds and then records up to 64 frames with the selected settings. The app saves the raw frames captured from the sensor as DNG files, which can later be downloaded onto a PC for processing.

To test my app, I visited the Point Reyes lighthouse on the California coast some thirty miles northwest of San Francisco on a full moon night. I pointed a Nexus 6P phone at the building and shot a burst of 32 four-second frames at ISO 1600. After covering the camera lens with opaque adhesive tape I shot an additional 32 black frames. Back at the office I loaded the raw files into Photoshop. The individual frames were very grainy, as one would expect given the tiny sensor in a cellphone camera, but computing the mean of all 32 frames cleaned up most of the grain, and subtracting the mean of the 32 black frames removed faint grid-like patterns caused by local variations in the sensor's black level. The resulting image, shown below, looks surprisingly good.
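The calibration described above — averaging the light frames to suppress random noise, then subtracting the average of the dark frames to remove the sensor's fixed-pattern black-level variations — can be sketched in a few lines of NumPy. This is a minimal illustration, not the author's actual pipeline, and it assumes the raw DNG data has already been loaded into arrays (the synthetic 4×4 "frames" below are stand-ins for real sensor data):

```python
import numpy as np

def stack_and_calibrate(light_frames, dark_frames):
    """Average a burst of raw frames, then subtract the dark-frame mean.

    Averaging N frames reduces random noise by roughly sqrt(N);
    subtracting the mean dark frame removes fixed-pattern offsets,
    such as grid-like black-level variations.
    """
    light_mean = np.mean(np.stack(light_frames), axis=0)
    dark_mean = np.mean(np.stack(dark_frames), axis=0)
    # Clip so the subtraction can't push pixel values negative.
    return np.clip(light_mean - dark_mean, 0.0, None)

# Synthetic demo: a flat "scene" plus a fixed sensor pattern and random noise.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)                      # true signal
pattern = rng.normal(0.0, 5.0, size=(4, 4))         # fixed sensor offset
lights = [scene + pattern + rng.normal(0.0, 10.0, (4, 4)) for _ in range(32)]
darks = [pattern + rng.normal(0.0, 10.0, (4, 4)) for _ in range(32)]

result = stack_and_calibrate(lights, darks)
```

With 32 frames, random noise drops by roughly a factor of √32 ≈ 5.7, which is why the averaged image looks so much cleaner than any individual grainy frame.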

Surprisingly good is probably underselling it. These pictures are pretty much incredible, including this one featuring the Milky Way!

The constellations Scorpius and Sagittarius are clearly visible, and squinting hard enough one can just barely make out the horizon and one or two rocks in the ocean, but overall, this is not a picture you'd want to print out and frame. Still, this may be the lowest-light cellphone photo ever taken.

The final comparison holds up extremely well. Still, the post absolutely buries the lede:

Trying to find out if phone cameras might be suitable for outdoor nighttime photography was a fun experiment, and clearly the result is yes, they are. However, arriving at the final images required a lot of careful post-processing on a desktop computer, and the procedure is too cumbersome for all but the most dedicated cellphone photographers. However, with the right software a phone should be able to process the images internally, and if steps such as painting layer masks by hand can be eliminated, it might be possible to do point-and-shoot photography in very low light conditions. Almost - the cellphone would still have to rest on the ground or be mounted on a tripod.

This will probably first launch as a special mode in which the phone takes tens of seconds' worth of longer exposures and then compiles them on board into an aesthetically pleasing result: the future of portrait mode. I'd be interested to know how much optical image stabilization can compensate for the need for a tripod, as the Pixel does not yet have the floating camera found in devices such as the iPhone 7, which by Apple's own admission improves low-light images.

Few things are as exciting to me as the smartphone camera industry. I'm sure Snapchat agrees.