Google gives the Pixel camera superhuman night vision

The mighty Night Sight mode is being released to Pixel phones today

Two years ago, Google’s release of the first Pixel smartphone radically raised the bar for the image quality we could expect from mobile cameras. Today, even as everyone else struggles to catch up, Google is extending its lead with the introduction of an equally revolutionary new camera. It’s called Night Sight, and it effectively lets your phone camera see in the dark. Only it doesn’t require any additional hardware or cost: Night Sight is a new camera mode for Pixel phones.

By now, you may have seen my testing with a pre-release version of Night Sight that was uncovered by the Android enthusiast community. The things that beta software could do were truly unprecedented and awe-inspiring, and the proper Night Sight release that Google is serving up to all Pixel models today keeps that high quality while providing an easier way to access the mode. This week, I spoke with Google’s Yael Pritch, lead researcher on Night Sight, about how the company built its new night mode and the constant improvements it is implementing.

Night Sight is momentous because it’s a software change that delivers a leap in performance that previously only new hardware could bring.

Pixel 3 (left) versus Pixel 3 with Night Sight on. Shot is taken handheld, and the only light in the room comes from the phone lighting up Vjeran’s face.

To be clear, Night Sight is not merely a long-exposure mode for your phone. What Google has built is a vastly more intelligent sibling to the brutish long exposure. In the past, you’d have needed a tripod to stabilize your camera to obtain multiple seconds’ worth of light information and thus get a brighter image at night than the human eye can see. Google is achieving similar results with a handheld Pixel by segmenting the exposure into a burst of consecutively taken frames, which are then reassembled into a single image using the company’s algorithmic magic. It’s an evolution of the HDR+ processing pipeline that’s used in the main Pixel camera, with some unique upgrades added in.
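To get an intuition for why merging a burst helps, here’s a minimal, hypothetical sketch in Python. It is not Google’s actual pipeline, which aligns frames and handles motion far more carefully; it only demonstrates the underlying statistics: averaging several noisy captures of the same scene cuts random sensor noise by roughly the square root of the frame count.

```python
# Toy illustration only: Night Sight's real merge (an evolution of HDR+)
# is motion-aware and far more sophisticated than a plain average.
import numpy as np

def merge_burst(frames):
    """Average a burst of pre-aligned frames into one cleaner image."""
    return np.stack(frames, axis=0).mean(axis=0)

# Simulate a dim scene captured 15 times with heavy sensor noise.
rng = np.random.default_rng(0)
scene = np.full((480, 640, 3), 0.05)  # very dark "ground truth" in [0, 1]
burst = [np.clip(scene + rng.normal(0, 0.02, scene.shape), 0, 1)
         for _ in range(15)]
merged = merge_burst(burst)

print("single-frame noise:", np.std(burst[0] - scene))   # ~0.02
print("merged-frame noise:", np.std(merged - scene))     # ~0.005
```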

Before a shot is even taken, Google’s Night Sight camera does a ton of multifactorial calculations. Using what the company calls motion metering, the Pixel takes into account its own movement (or lack thereof), the movement of objects in the scene, and the amount of light available to decide how many exposures to take and how long they should be. At most, Night Sight will spend six seconds and 15 frames capturing one image. Google has placed a limit of one second per exposure if the phone is perfectly still, or a third of a second if it’s handheld. That means you can get six one-second exposures with a Pixel on a tripod or up to 15 briefer exposures when holding the phone, all of them feeding into one final photo.
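The exact motion-metering heuristics are Google’s own and not public, but as a rough, hypothetical sketch, here’s how the published limits (six seconds total, 15 frames, one second per frame on a tripod, a third of a second handheld) could be turned into an exposure plan:

```python
# Hypothetical sketch; the constants come from Google's stated limits,
# but the planning logic here is an illustrative guess, not Google's code.
MAX_TOTAL_SECONDS = 6.0
MAX_FRAMES = 15
MAX_EXPOSURE_TRIPOD = 1.0       # seconds per frame when the phone is still
MAX_EXPOSURE_HANDHELD = 1 / 3   # seconds per frame when handheld

def plan_exposures(handheld: bool, needed_light_seconds: float):
    """Return (frame_count, per_frame_exposure) within Night Sight's limits."""
    per_frame_cap = MAX_EXPOSURE_HANDHELD if handheld else MAX_EXPOSURE_TRIPOD
    total = min(needed_light_seconds, MAX_TOTAL_SECONDS)
    frames = min(MAX_FRAMES, max(1, round(total / per_frame_cap)))
    per_frame = min(per_frame_cap, total / frames)
    return frames, per_frame

print(plan_exposures(handheld=False, needed_light_seconds=6.0))  # (6, 1.0)
print(plan_exposures(handheld=True, needed_light_seconds=6.0))   # (15, 0.333...)
```

On a tripod, that works out to the six one-second exposures described above; handheld, it caps out at 15 exposures of a third of a second each.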

Pixel 3 (left) versus Pixel 3 with Night Sight on.

To judge white balance in Night Sight, Google is using a new, more sophisticated learning-based algorithm that’s been trained to discount and discard the tints cast by unnatural light. Google’s computational photography experts like Pritch and Marc Levoy have fed the algorithm loads of images in both a tinted state and with a corrected white balance and taught it to prefer the latter. On a technical level, the software is looking at how the log-chrominance histogram of each photo shifts with varying tints. Google calls this method Fast Fourier Color Constancy (FFCC) and has published a white paper on the subject. Here’s a quote from an earlier paper that FFCC builds on, summarizing the core technique:

“Tinting an image affects the image’s histogram only by a translation in log chrominance space. This observation enables our convolutional approach to color correction, in which our algorithm learns to localize a histogram in this 2D space.”
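As a rough illustration of that idea (not FFCC itself, which adds learned filters and FFT-based inference; the function and parameters below are invented for the example), here’s a minimal sketch of building a log-chrominance histogram from an RGB image. A uniform color tint multiplies every pixel by the same per-channel gains, which only shifts this histogram around the 2D log-chroma plane:

```python
# Minimal sketch of the log-chrominance histogram underlying FFCC's approach;
# the real algorithm learns a convolutional filter over this space.
import numpy as np

def log_chroma_histogram(image, bins=64, span=4.0):
    """2D histogram of u = log(g/r), v = log(g/b) over all pixels."""
    eps = 1e-6
    r, g, b = image[..., 0] + eps, image[..., 1] + eps, image[..., 2] + eps
    u = np.log(g / r).ravel()
    v = np.log(g / b).ravel()
    hist, _, _ = np.histogram2d(u, v, bins=bins,
                                range=[[-span, span], [-span, span]])
    return hist / hist.sum()

# Tinting (per-channel gains) shifts the histogram rather than reshaping it.
rng = np.random.default_rng(1)
img = rng.uniform(0.1, 0.6, (120, 160, 3))
tinted = img * np.array([1.4, 1.0, 0.7])   # warm, unnatural cast
h_clean, h_tinted = log_chroma_histogram(img), log_chroma_histogram(tinted)
print(h_clean.shape, h_tinted.shape)       # (64, 64) (64, 64)
```

Estimating and removing the tint then becomes a matter of finding how far the histogram has been translated, which FFCC does efficiently with convolutions in the frequency domain.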

In more poetic terms, the machine is learning more than just colors, with Pritch describing it as having “learned something inherent to pictures.” Google isn’t yet confident enough in this alternative approach to color correction to deploy it as the default on the Pixel camera, but the company is delighted with how it works in night photos. Moreover, Pritch tells me Google is looking to make it the universal white balance default by this time next year.

You’ll notice in the above skateboard shot that the Night Sight photo doesn’t just brighten the conventional Pixel image, but it also cleans up a ton of ugly noise in the sky and brings in color that would otherwise be absent. The white of the skateboard loses its murky yellow-green tint, and the sky gains a natural blue shade (as well as an entire palm tree, thanks to the extended exposure). Details such as the condensation on the glass and the smooth surface of the table are made sharper and more apparent. Similar improvements can be noticed in the tree image below, which sheds image noise, a greenish tinge, and a lot of softness in its Night Sight transition.

Pixel 3 XL (left) versus Pixel 3 XL with Night Sight on. Photos by Vlad Savov.

This tree scene illustrates one of the few limitations of Google’s Night Sight: the photo no longer looks like it was taken at night. This was a deliberate choice by Google. The company had to pick between the most faithful image, which would keep the shadows intact, or the most detailed one, which brightens the scene so that the camera captures the most information possible. Google chose the latter, justifying it on the grounds that editing shadows back in is trivial compared to trying to edit detail into shadows.

Every aspect of Google’s Night Sight is dynamic and automatic. If the phone detects that a scene is dark enough, it’ll surface a suggestion to try night mode; tap that, and the camera mostly takes over from there. The only controls offered to the user are tap-to-focus and the usual exposure slider. You can’t tell the camera how many frames you want it to capture or set your own shutter speed.

The Pixel’s selfie camera benefits from Night Sight too.

Pixel 3 selfie with Night Sight off.
Pixel 3 selfie with Night Sight on.

Night Sight is a poor fit for trying to capture anything in motion. It accounts for small movements of objects in the frame, but it will blur things like cars driving by. It also doesn’t deal especially well with bright lights in the frame, as illustrated by the comparison below.

Pixel 3 (left) versus Pixel 3 with Night Sight on.

Night Sight’s use should be limited to truly low-light situations, which are actually quite difficult to find in a big city. Walking around a place like London or San Francisco at night, you’ll quickly realize that streetlights and storefronts keep most places permanently illuminated with a day-like glow. But go into an unlit park, a smoky bar, or a dark room, and you’ll find yourself amazed by what this new camera mode can do.

Pixel 3 (left) versus Pixel 3 with Night Sight on.

Google is releasing Night Sight today as an update to the Pixel camera app for the latest Pixel 3, last year’s Pixel 2, and even the original 2016 Pixel. It’s commendable that the company is supporting its older phones like this, though OG Pixel users won’t get quite the same quality as owners of the later models. Because the first Pixel lacks optical image stabilization, Google can’t do the same length of exposures as on the other two. The learning-based white balancer is also trained specifically for the Pixel 3, so Pixel 2 users — as well as anyone else keenly awaiting a hacked version of the app for their Pocophone or Nokia — will not get the absolute best. In all cases, though, you can be sure that Night Sight will be a vast, monumental upgrade to the nighttime photography you used to know.

Sample photography by Dieter Bohn / The Verge
