Google’s Pixel phones have already improved smartphone photography dramatically, but the latest addition to them might be the biggest leap forward yet. Night Sight is the next evolution of Google’s computational photography, combining machine learning, clever algorithms, and up to four seconds of exposure to generate shockingly good low-light images. I’ve tried it ahead of its upcoming release, courtesy of a camera app tweak released by XDA Developers user cstark27, and the results are nothing short of amazing. Even in this unofficial state, before Google is happy enough to ship it, the new night mode makes any Pixel phone that runs it the best low-light smartphone camera.
Let’s take a look at a few examples, shall we? All of the shots below are taken with the Pixel 3 XL: first with the default settings and second with the night mode toggled on. Google claims Night Sight will save you from ever having to use the flash again, and so naturally, I didn’t use it with any of these images.
If you listen closely, you might be able to hear every other phone camera engineer flipping their desk and resigning in disgust. This is just an astonishing improvement. The Pixel 3’s camera is already among the very best low-light performers, so when a scene is so dark that it barely registers anything, you know there’s hardly any light. And yet, with Night Sight on, we actually see a scene that looks like a moderately noisy daytime shot.
Though less dramatic than the first pair of images, this comparison still represents a step change from crummy, noisy, and unusable to a perfectly decent shot.
This is easily my favorite comparison because the differences are so obvious that they scarcely need analysis. The default Pixel shot actually does an admirable job — most other phones would smudge the text to smithereens in such challenging conditions — but the night mode completely overhauls the photo. Google says that its machine learning detects what objects are in the frame, and the camera is smart enough to know what color they are supposed to be. That’s part of what makes these reds pop so beautifully.
Say hello to Vlad circa 1990. You’ll notice that all of the photos so far are indoors, and that’s because, well, I’ve just gotten this camera APK running this morning. Some proper nighttime shots have been added near the end of this preview. We’ll follow up with a fuller review of Night Sight when Google officially releases it.
Of course, Night Sight isn’t a perfect solution to every low-light situation. If you have small but focused sources of bright light, this night mode will blow out some highlights. Even so, an actual long exposure would probably blow out the entire screen on the Pixel 3, whereas the night mode shot keeps it readable. The fact that I can casually do this at my desk, without any tripod or specialist equipment (like, say, a dedicated camera), is bewildering.
The first shot is how I usually appear on Verge team conference calls. The second is just Google’s inexplicable camera magic.
I love this old painting that’s hanging above my computer. It looks terrible with the regular Pixel camera, but the night mode version restores it beautifully. All the cracks in the surface of the wood show up, replacing the nasty blotchiness of the other photo, and even the wispy curls of hair on the girl’s forehead are discernible. If I were to take the same shot with the lights turned on, I’d get an undesirable highlight on the painting.
I have to keep saying it because it’s important: the dark photos in these comparisons are probably the best you can get from a smartphone camera today. And yet the Pixel’s night mode makes the default Pixel’s pictures look like they were taken with a phone from a decade ago. The color fidelity and sharpness of these Night Sight images are not things I can understand or explain. There’s simply no reference point for this kind of imaging improvement through software.
Here’s a final comparison to drive home the point. The first photo is a sea of chromatic noise, smudged up by noise reduction working overtime to produce a somewhat reasonable image. All you’d ever get from that is an outline of the headphones. With night mode turned on, you can read the “Beyerdynamic” on the ear cup, you can see the “1” cutout next to the “T” in the yoke, and you can even read the last three digits of the serial number on the inside left of the headband.
Besides turning up the brightness on the entire shot, Google’s night mode improves the sharpness and legibility of everything in sight. On the yellow “Danger of Death” sign, I can read the emergency telephone number in the improved pic, which I can’t do with the blurrier default.
In the basic Pixel shot, the patterned window and the green hedge are both just black shapes. With Night Sight on, they gain color and detail. The sky also sheds a ton of blotchy noise as it brightens up. I can understand if you prefer the mood of the darker photo, but that’s easily achievable by tweaking the exposure and shadows in the Google Photos editor. There’s no photo editor on Earth that will improve a photo the way Google’s upcoming night mode can.
People wanted to know how Night Sight handles moving objects, and the answer is it doesn’t. Although it’s not one single long exposure, Google’s night mode still gathers light over a period of a few seconds, and anything moving through the frame in that time will turn into a blur of motion. I can see that lending itself to some creative uses, as you’re now getting a handheld, fully automatic long exposure in your phone. And the Pixel camera is intelligent enough not to blow out the highlights from the street lamps. That being said, it’s obvious from this shot that the night mode doesn’t really benefit nighttime photos that are nevertheless well exposed. Night Sight tidies up the noise in the dark sky here, but it doesn’t wow the same way that it does in truly dark environments.
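The core idea behind gathering light over several seconds without a single long exposure is multi-frame stacking: shoot many short exposures and average them, which suppresses random sensor noise while preserving the static scene. Here’s a minimal sketch of that principle in Python. All names here (`simulate_frame`, `stack_frames`, the toy one-dimensional “scene”) are my own illustration; Google’s real pipeline adds frame alignment, motion rejection, and learned white balancing on top of the merge, none of which is modeled here.

```python
import random

random.seed(0)  # deterministic demo

def simulate_frame(scene, noise_sigma=10.0):
    """One noisy short exposure of a 1-D 'scene' of pixel values."""
    return [p + random.gauss(0, noise_sigma) for p in scene]

def stack_frames(scene, n_frames=15):
    """Average n_frames noisy exposures, pixel by pixel."""
    frames = [simulate_frame(scene) for _ in range(n_frames)]
    return [sum(px) / n_frames for px in zip(*frames)]

def rms_error(estimate, scene):
    """Root-mean-square error of an estimate against the true scene."""
    return (sum((e - s) ** 2 for e, s in zip(estimate, scene)) / len(scene)) ** 0.5

scene = [20, 20, 180, 20, 20]  # a dim scene with one bright pixel
single = simulate_frame(scene)
stacked = stack_frames(scene)

# Averaging N frames shrinks random noise by roughly sqrt(N),
# so the stacked result sits much closer to the true scene.
print(rms_error(stacked, scene) < rms_error(single, scene))
```

This is also why moving subjects smear: the average assumes each pixel sees the same thing in every frame, and anything that moves during capture violates that assumption.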
You also asked to see how the night mode compares with using the Pixel 3 XL’s flash, so here’s that comparison. Night Sight overcomes the flash’s lack of range and boosts the entire scene. The flash is only really useful on subjects that are very close to you, and even then it casts a much harsher light than the effect achieved by Google’s algorithms in night mode.
Now we’re entering the realm of the ridiculous. You’ll have to trust me when I say that the lighting conditions were identical and I didn’t cover up the lens on my default-camera shots.
I have these two pictures open in Adobe Lightroom, and there’s a little bit of definition in the seemingly all-black shadows of the first, but when I bring that exposure up, I get humongous blotches of red and pink noise. Things are honestly as simple as they look: one picture has a dark outline with a small island of light, and the other has the entire frame filled with information. Again, you can apply a vignette effect and strengthen the shadows of the night mode shot to restore the dark mood, but there’s no amount of post-processing that will get you from the default pic to the one taken with Night Sight.
Same situation here: you could theoretically boost the exposure of the first image in Lightroom, but the result would be a chunky mess of noise and noise-reducing blur. The stuff Google achieves with its extended Night Sight exposure system is effectively impossible to reproduce with post-processing software. You’d need a professional camera with a big sensor, strong high-ISO performance, and a fast lens to achieve results of this caliber.
None of these photos are going to win any beauty contests, but my aim has been to hit on the worst-case scenarios with these examples. I want to show the maximum that can be achieved with Google’s upcoming update. And it’s evidently a lot.
All of this amazingness is coming from prerelease software. You can download it for your Pixel device and play around with it as I have and lose your shit at the unreal results it produces. Google is evidently close to releasing the final Night Sight addition to the Pixel camera app, and when it does, it’ll probably change the mobile photography game.
Update November 14th, 12:20pm ET: This article was originally published on October 25, 2018 and has been updated to include video.
Photography by Vlad Savov / The Verge