Two years ago, Google’s release of the first Pixel smartphone radically raised the bar for the image quality we could expect from mobile cameras. Today, even as everyone else struggles to catch up, Google is extending its lead with the introduction of an equally revolutionary new camera. It’s called Night Sight, and it effectively lets your phone camera see in the dark. Only it doesn’t require any additional hardware or cost: Night Sight is a new camera mode for Pixel phones.
By now, you may have seen my testing with a pre-release version of Night Sight that was uncovered by the Android enthusiast community. The things that beta software could do were truly unprecedented and awe-inspiring, and the proper Night Sight release that Google is serving up to all Pixel models today keeps that high quality while providing an easier way to access the mode. This week, I spoke with Google’s Yael Pritch, lead researcher on Night Sight, about how the company built its new night mode and the constant improvements it is implementing.
Night Sight is momentous because it’s a software change that delivers a leap in performance that previously only new hardware could bring.
To be clear, Night Sight is not merely a long-exposure mode for your phone. What Google has built is a vastly more intelligent sibling to the brutish long exposure. In the past, you’d have needed a tripod to stabilize your camera to obtain multiple seconds’ worth of light information and thus get a brighter image at night than the human eye can see. Google is achieving similar results with a handheld Pixel by segmenting the exposure into a burst of consecutively taken frames, which are then reassembled into a single image using the company’s algorithmic magic. It’s an evolution of the HDR+ processing pipeline that’s used in the main Pixel camera, with some unique upgrades added in.
Before a shot is even taken, Google’s Night Sight camera does a ton of multifactorial calculations. Using what the company calls motion metering, the Pixel takes into account its own movement (or lack thereof), the movement of objects in the scene, and the amount of light available to decide how many exposures to take and how long they should be. Night Sight will spend up to six seconds and up to 15 frames capturing a single image. Google has placed a limit of one second per exposure if the phone is perfectly still, or a third of a second if it’s handheld. That means you can get six one-second exposures with a Pixel on a tripod, or up to 15 briefer exposures when holding the phone, all of them feeding into one final photo.
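The exposure-budget arithmetic described above can be sketched in a few lines. This is a hypothetical reconstruction based only on the numbers in this article: the real motion-metering algorithm also weighs scene motion and measured light, which this sketch ignores, and the function name `plan_burst` is invented for illustration.

```python
def plan_burst(is_stationary: bool,
               total_budget_s: float = 6.0,
               max_frames: int = 15) -> tuple[int, float]:
    """Sketch of Night Sight's stated exposure limits (not Google's code).

    Night Sight spends at most ~6 seconds and ~15 frames per shot.
    Per-frame exposure is capped at 1 s when the phone is still
    (e.g. on a tripod) and 1/3 s when handheld.
    """
    per_frame_cap = 1.0 if is_stationary else 1.0 / 3.0
    # Use as many frames as fit in the total time budget, capped at 15.
    frames = min(max_frames, round(total_budget_s / per_frame_cap))
    # Each frame gets an equal slice, never exceeding its cap.
    exposure = min(per_frame_cap, total_budget_s / frames)
    return frames, exposure
```

Under these assumptions, a tripod-mounted phone yields six one-second frames, while a handheld one yields fifteen third-of-a-second frames, matching the figures Google quotes.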
To judge white balance in Night Sight, Google is using a new, more sophisticated learning-based algorithm that’s been trained to discount and discard the tints cast by unnatural light. Google’s computational photography experts like Pritch and Marc Levoy have fed the algorithm loads of images in both a tinted state and with a corrected white balance and taught it to prefer the latter. On a technical level, the software is looking at how the log-chrominance histogram of each photo shifts with varying tints. Google calls this method Fast Fourier Color Constancy (FFCC) and has published a white paper on the subject. Here’s a quote from an earlier paper that FFCC builds on, summarizing the core technique:
“Tinting an image affects the image’s histogram only by a translation in log chrominance space. This observation enables our convolutional approach to color correction, in which our algorithm learns to localize a histogram in this 2D space.”
In more poetic terms, the machine is learning more than just colors, with Pritch describing it as having “learned something inherent to pictures.” Google isn’t yet confident enough in this alternative approach to color correction to deploy it as the default on the Pixel camera, but the company is delighted with how it works in night photos. Moreover, Pritch tells me Google is looking to make it the universal white balance default by this time next year.
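The quoted observation, that a global tint merely translates the image’s histogram in log-chrominance space, is easy to see in code. Below is a minimal sketch of building that 2D histogram; the function name, bin count, and range are assumptions for illustration, not details from Google’s FFCC implementation.

```python
import numpy as np

def log_chrominance_histogram(rgb: np.ndarray,
                              bins: int = 64,
                              lim: float = 4.0) -> np.ndarray:
    """Build a normalized 2D log-chrominance histogram (illustrative sketch).

    Each pixel maps to the point (log(G/R), log(G/B)). Multiplying a
    channel by a constant tint factor shifts every point by the same
    offset, so the whole histogram translates without changing shape.
    """
    eps = 1e-6  # avoid log(0) on black pixels
    r = rgb[..., 0] + eps
    g = rgb[..., 1] + eps
    b = rgb[..., 2] + eps
    u = np.log(g / r).ravel()
    v = np.log(g / b).ravel()
    hist, _, _ = np.histogram2d(u, v, bins=bins,
                                range=[[-lim, lim], [-lim, lim]])
    return hist / max(hist.sum(), 1)
```

Because tinting only translates this histogram, estimating the tint reduces to localizing the histogram’s position in the 2D plane, which is exactly the kind of task a small convolutional filter handles well.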
You’ll notice in the above skateboard shot that the Night Sight photo doesn’t just brighten the conventional Pixel image, but it also cleans up a ton of ugly noise in the sky and brings in color that would otherwise be absent. The white of the skateboard loses its murky yellow-green tint, and the sky gains a natural blue shade (as well as an entire palm tree, thanks to the extended exposure). Details such as the condensation on the glass and the smooth surface of the table are made sharper and more apparent. Similar improvements can be noticed in the tree image below, which sheds image noise, a greenish tinge, and a lot of softness in its Night Sight transition.
This tree scene illustrates one of the few limitations of Google’s Night Sight: the photo no longer looks like it was taken at night. This was a deliberate choice by Google. The company had to pick between the most faithful image, which would keep the shadows intact, or the most detailed one, which brightens the scene so that the camera captures the most information possible. Google chose the latter, justifying it on the grounds that editing shadows back in is trivial compared to trying to edit detail into shadows.
Every aspect of Google’s Night Sight is dynamic and automatic. If the phone detects that a scene is dark enough, it’ll surface a suggestion to try night mode, you tap on that, and then it mostly takes over from there. The only controls offered to the user are tap-to-focus and the usual exposure slider. You can’t tell the camera how many frames you want it to capture or set your own shutter speed.
The Pixel’s selfie camera benefits from Night Sight too.
Night Sight is a poor fit for trying to capture anything in motion. It accounts for small movements of objects in the frame, but it will blur things like cars driving by. It also doesn’t deal especially well with bright lights in the frame, as illustrated by the comparison below.
Night Sight’s use should be limited to truly low-light situations, which are actually quite difficult to find in a big city. Walking around a place like London or San Francisco at night, you’ll quickly realize that streetlights and storefronts keep most places permanently illuminated with a day-like glow. But go into an unlit park, a smoky bar, or a dark room, and you’ll find yourself amazed by what this new camera mode can do.
Google is releasing Night Sight today as an update to the Pixel camera app for the latest Pixel 3, last year’s Pixel 2, and even the original 2016 Pixel. It’s commendable that the company is supporting its older phones like this, though OG Pixel users won’t get quite the same quality as owners of the later models. Because the first Pixel lacks optical image stabilization, Google can’t do the same length of exposures as on the other two. The learning-based white balancer is also trained specifically for the Pixel 3, so Pixel 2 users — as well as anyone else keenly awaiting a hacked version of the app for their Pocophone or Nokia — will not get the absolute best. In all cases, though, you can be sure that Night Sight will be a vast, monumental upgrade to the nighttime photography you used to know.
Sample photography by Dieter Bohn / The Verge
Comments
This is game changing
By Dr Strange on 11.14.18 12:01pm
But what about the mood? You lose the mood! /s
By OpssYourBad on 11.14.18 1:15pm
Came to the comments just to see how people were trying to downplay it.
By KSulli on 11.14.18 3:06pm
Haha yeah. I actually think it looks more realistic to how humans see in lower light, so it really portrays the mood more accurately.
By BausFight on 11.15.18 10:37am
I don’t know why everyone is amazed by this when Huawei introduced it with the P20 Pro months before Google did.
By The Proper Gentleman on 11.14.18 2:07pm
Because Huawei’s isn’t as good?
Also pixels are known for camera quality
By theratchetnclank on 11.14.18 2:30pm
The camera on the Huawei is better, the software of the Pixel is better… guess what, you can have the software on your Huawei
By EveryDayIs on 11.15.18 4:42am
The camera on the Pixel is definitely better
By ujwalsoni on 11.15.18 6:05am
No, the camera (hardware) on the Huawei is better; the software on the Pixel is better, resulting in great shots. That’s a fact. Using Google’s camera software on other Android phones produces insane results as well; it’s just too much work for most people.
By EveryDayIs on 11.15.18 6:27am
It’s true; it leaped my lackluster Essential Phone camera into a whole new class.
By BausFight on 11.15.18 10:39am
I didn’t even know Google allowed its camera app (and algorithms?) to be used on other phones
By ujwalsoni on 11.15.18 3:11pm
If it’s on Android there’s an APK for it out there somewhere
By zeer0 on 11.15.18 4:44pm
It’s just an APK in the end. On platforms like XDA you’ll find altered versions that work on your phone, including all of the AR stuff as well. I’ve been using the Google Camera app by default on my Galaxy S8.
By EveryDayIs on 11.16.18 4:15am
I just returned the Huawei Mate 20 Pro last week… you know why? My Pixel 2 with this Night Sight mode blew away the night mode of the Mate 20 Pro… I also assumed it was the other way around before comparing them while I had the Huawei… this Pixel thing is really a game changer. It produces sharp, beautiful, non-blurry night photos… Huawei’s, on the other hand, are oversaturated, blurry, washed out, and not that sharp. Pixel wins… and I am now planning to upgrade my Pixel 2 if there’s a nice Pixel 3 deal during Black Friday… otherwise, I’ll wait a bit longer to find that deal and jump to the Pixel 3 XL.
Huawei night mode, forget about it… this is a far superior night mode.
By AndrewGa on 11.14.18 2:31pm
Couldn’t you sideload this camera onto your Huawei? People have on the OnePlus 6T.
Did you return it before trying that?
By aarontsuru on 11.14.18 7:17pm
Software can only go so far. Run it on the Mate 20 Pro and you can throw away the Pixel.
By tytung on 11.15.18 2:29am
Literally what everyone else already said. You should’ve just loaded the Pixel Camera app on your Huawei for insane results.
By EveryDayIs on 11.15.18 4:41am
It’s not always the best to be the first at something though.
By anymunch on 11.14.18 3:33pm
there is a difference between having a feature and having a feature that works well.
By Robot-Hotdog on 11.14.18 3:49pm
Ha! The Pixel takes incredible photos with little light, and the Huawei’s is a gimmick.
By jacksmith21006 on 11.14.18 5:21pm
This is creepy… very Google. Imagine being at a campsite in darkness while you get kinky with your "special other" and someone pulls this night vision on steroids phone on you.
I can see google figuring out a way to do this with video. The video may be trippy and have motion blur, but it would still be wicked cool.
By Oldarney on 11.14.18 2:13pm
That’s just idiotic. The human eye is much better than cameras… so don’t get kinky where people can see you.
By mrkite on 11.14.18 2:39pm
…unless that’s your thing.
By Winklemeier on 11.14.18 2:55pm
This is the beta version of its first generation. Dieter even said it could see more of the pier than his eye. With DSLRs, a long exposure on people will always result in blurriness, especially if you are handholding the camera. Old-school night vision turns everything green. This is supernatural, better-than-your-eyeball sci-fi stuff.
By Oldarney on 11.14.18 4:07pm
There’s nothing supernatural about this…
I can take images in the dark on my iPhone in RAW mode with a really high ISO and a somewhat slow shutter (but fast enough that there isn’t motion blur), then use an app that can edit the RAW to boost the exposure way up and do a bunch of noise reduction, and get a result that’s pretty close to what I’m seeing from Google’s Night Sight.
The nice thing is that Google has automated and optimized this process.
By SirMaster on 11.14.18 5:34pm