How the Pixel's software helped make Google's best camera yet

The verdict is in on Google's impressive new Pixel and Pixel XL phones, and one of the bright spots is the camera. We have an in-depth comparison between the Pixel, the Galaxy S7 Edge, and the iPhone 7 available for your perusal here, and in our full review Dieter Bohn says "if you wanted to agree with Google and call this the best smartphone camera, I wouldn't argue with you."

"The results on the Pixel are very, very good," says Dieter. "I put it in the same ballpark as the iPhone 7 and the Galaxy S7 in most situations, which is not something I expected to say going in."

Clearly, this is by far the most competitive Google has ever been in mobile photography. But the Pixel phones, on paper, don't have cutting-edge hardware, relying on an f/2.0 lens without optical image stabilization. Instead, in typical fashion, Google has turned to complex software smarts to power the Pixel camera. I spoke with Marc Levoy, a renowned computer graphics researcher who now leads a computational photography team at Google Research, about how software helps make the Pixel camera as good as it is.


Levoy's team has worked on projects as diverse as the 360-degree Jump camera rig for VR and burst mode photography for Google Glass. On the Pixel, the most prominent place you'll see its work is in the HDR+ mode that has been deployed on Nexus devices over the past few years. Apple popularized mobile HDR, or high dynamic range photography, back in 2010 with the iPhone 4, but Google's approach differs dramatically in both implementation and technique.

For one thing, you're supposed to leave it on all the time, and it's switched on by default. "I never switch it off," says Levoy. "I can't think of any reason to switch it off." You can, of course, and there's another slightly higher quality mode called HDR On that works similarly to previous Nexus phones, which is to say slowly. But for general photography, Google thinks you should be using HDR+ for each shot.

Read more: Google Pixel and Pixel XL review

This no-compromise approach to HDR photography has partly been made possible by new hardware. The Hexagon digital signal processor in Qualcomm's Snapdragon 821 chip gives Google the bandwidth to capture RAW imagery with zero shutter lag from a continuous stream that starts as soon as you open the app. "The moment you press the shutter it's not actually taking a shot — it already took the shot," says Levoy. "It took lots of shots! What happens when you press the shutter button is it just marks the time when you pressed it, uses the images it's already captured, and combines them together."
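A rough way to picture the zero-shutter-lag behavior Levoy describes is a rolling buffer of viewfinder frames. The Python sketch below is a minimal illustration of that idea; the class, the method names, and the burst size of nine are assumptions made for this example, not Google's actual camera code.

```python
from collections import deque
import time

class ZeroShutterLagBuffer:
    """Keeps a rolling window of the most recent viewfinder frames so a
    shutter press can be served from frames that were already captured."""

    def __init__(self, max_frames=30):
        # Oldest frames drop off automatically once the buffer is full.
        self.frames = deque(maxlen=max_frames)

    def on_new_frame(self, raw_frame):
        # Called continuously from the moment the camera app is opened.
        self.frames.append((time.monotonic(), raw_frame))

    def on_shutter_press(self, burst_size=9):
        # Pressing the shutter only marks a timestamp; the burst is assembled
        # from frames recorded at or before that moment and handed off to be merged.
        pressed_at = time.monotonic()
        already_captured = [frame for stamp, frame in self.frames if stamp <= pressed_at]
        return already_captured[-burst_size:]
```

The key point the sketch captures is that the shutter press does no capturing of its own; it only selects from frames the camera was already streaming.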

It's a major usability improvement on the HDR+ mode in last year's Nexus 6P and 5X. "What used to happen last year is you'd press the shutter button and you'd get this little circle going around while it captured the images you need for the burst; now it's already captured those," says Levoy. "And that's big, because that means that you can capture the moment you want."

Though Google has certainly made massive strides in speed, based on our testing of the Pixel we're not sure we'd agree that we'd never want to turn HDR+ off. It does generally produce great results, but there has been the odd image like this that reminds us a little too much of mid-2000s Flickr shots with overzealous HDR processing — check the unusual colors in the sky around the edge of the building. These examples are rare, however, and Google's atypical approach manages to avoid many of the pitfalls of conventional HDR imagery on phones.

The traditional way to produce an HDR image is to bracket: you shoot the same scene several times at different exposures, each favoring a different part of the scene, then merge the shots into a final photograph where nothing is too blown out or noisy. Google's method is very different — HDR+ also takes multiple images, but they're all deliberately underexposed. This preserves highlights, but what about the noise in the shadows? Just leave it to math.

"Mathematically speaking, take a picture of a shadowed area — it's got the right color, it's just very noisy because not many photons landed in those pixels," says Levoy. "But the way the mathematics works, if I take nine shots, the noise will go down by a factor of three — by the square root of the number of shots that I take. And so just taking more shots will make that shot look fine. Maybe it's still dark, maybe I want to boost it with tone mapping, but it won't be noisy." Why take this approach? It makes it easier to align the shots without leaving artifacts of the merge, according to Levoy. "One of the design principles we wanted to adhere to was no. ghosts. ever." he says, pausing between each word for emphasis. "Every shot looks the same except for object motion. Nothing is blown out in one shot and not in the other, nothing is noisier in one shot and not in the other. That makes alignment really robust."

Google also claims that, counterintuitively, underexposing each HDR shot actually frees the camera up to produce better low-light results. "Because we can denoise very well by taking multiple images and aligning them, we can afford to keep the colors saturated in low light," says Levoy. "Most other manufacturers don't trust their colors in low light, and so they desaturate, and you'll see that very clearly on a lot of phones — the colors will be muted in low light, and our colors will not be as muted." But the aim isn't to get rid of noise entirely at the expense of detail; Levoy says "we like preserving texture, and we're willing to accept a little bit of noise in order to preserve texture."

As Levoy alludes to, mobile image processing is a matter of taste. Some people will like the Pixel's results, others may not. But if you're the kind of person who follows phone announcements and scours spec sheets, you'll probably wonder whether the Pixel's lack of optical image stabilization sets it back. Not so, says Levoy. "HDR+ needs that less than other techniques because we don't have to take a single long exposure, we can take a number of shorter exposures and merge them... it's less important to have optical image stabilization if you're taking shorter exposures. We have had it in some years and not in other years. The decisions are complicated — they have to do with the build materials and other things that we're trying to optimize on the platform."
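To put rough numbers on that reasoning: if hand shake smears an unstabilized frame roughly in proportion to its exposure time, splitting one long exposure into several shorter ones keeps each frame sharp, and the merge step recovers the light. The figures in this little Python sketch are assumptions chosen for illustration, not measurements of the Pixel.

```python
# Illustrative numbers only: blur in an unstabilized frame grows roughly
# in proportion to its exposure time.
shake_px_per_sec = 40        # assumed hand-shake motion projected onto the sensor

long_exposure = 1 / 8        # a single long exposure, the case where OIS helps most
short_exposure = 1 / 30      # each frame of a short-exposure burst
burst_frames = 4             # aligned and merged to recover similar total light

print(f"blur across one long frame:   {shake_px_per_sec * long_exposure:.1f} px")
print(f"blur across each burst frame: {shake_px_per_sec * short_exposure:.1f} px")
# ~5.0 px vs ~1.3 px: each short frame stays sharp on its own, and merging the
# burst brings back the light that every individual frame gave up.
```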

The Pixel phones are the first to be fully designed by Google, which means such decisions can be made with a more holistic view toward the final product. "There's now this hardware org headed by Rick Osterloh, and one of the goals of that was to pivot to a more premium experience for our phones and also to have more vertical integration," says Levoy. "[Our team is] a part of that effort, so we definitely want to take over more of the camera stack."

What might that involve in the future? "The notion of a software-defined camera or computational photography camera is a very promising direction and I think we're just beginning to scratch the surface," says Levoy, citing experimental research he's conducted into extreme low-light photography. "I think the excitement is actually just starting in this area, as we move away from single-shot hardware-dominated photography to this new area of software-defined computational photography."

Read more: Google Pixel takes on the iPhone 7 and Galaxy S7 Edge in a smartphone camera shootout



Comments

I want that SeeInTheDark app

HDR+ sounds a lot like what Nvidia had with Tegra’s one-shot HDR back in 2013. No one really implemented it in the market because the chip lacked an integrated radio solution.

Glad to see someone else picked it up and brought the underlying principle back.

Nvidia did a lot of cool-sounding things early but somewhat fudged the implementations. See the "shadow" core that was later done better as big.LITTLE.

I think for the general public it would be nice to mention the sensor size of the Pixel (which is the same as the 6P and 5X) compared to, say, the iPhone and Galaxy S7, because the lack of OIS and the f/2.0 lens doesn’t tell the full story. I recently installed the new camera app (via XDA) on my 6P and I like it more. The grid and improved HDR+ performance are welcome additions, though I’m not sure if it uses the same algorithm as the Pixel or the 6P one. Either way the images (from the 6P and 5X) are still very comparable to the Pixel, S7, and iPhone 6 thanks to the larger-than-phone-average 1/2.3" sensor.

Did you get unlimited storage too?

I think you get that when purchasing the Pixel, not by using the camera app (which is almost the same as the old version, with a few tweaks of course).

The Pixel sensor is the same size as the 6p/5x sensor.

While many things have remained the same from the IMX377 to the IMX378, including the pixel size (1.55 μm) and sensor size (7.81 mm), there have been a couple key features added. Namely it is now a stacked BSI CMOS design, it has PDAF, it adds Sony’s SME-HDR technology, and it has better support for high frame rate (slow motion) video.

http://www.xda-developers.com/sony-imx378-comprehensive-breakdown-of-the-google-pixels-sensor-and-its-features/

Have you noticed improved performance with HDR set to always-on, or is it just in the ‘auto’ mode? I just installed the updated camera app on my 6P, but I haven’t had a chance to really try it out yet.

I haven’t really tried it out either, but I read that the latest version of the camera app is still technically a beta until the Pixel is fully out. It’s supposed to have a faster HDR mode, and the final version will have an HDR burst mode on 5X and 6P devices.

1/2.3" with 1.55-micron pixels. The IMX378 sensor is an upgraded version of the IMX377 found in the HTC 10.

Speaking not necessarily as a fan of Apple, but rather of good working technology, this is the first time in years that an Android phone is seriously tempting me. When my 5s finally gives up on me, I might just go for a taste of the flagship Google phone.

I had a Nexus 7 back in the day, but gave up on it after almost two years. I switched to the iPad and never looked back.

I’m thinking the opposite might happen this time around.

This is what Google is looking for.

Hey, if it works, it works.

Nexus 7 was what, $150-$200 at the time? iPad was $350+, right?
People do this comparison between $300 Dells and $1500 MacBooks all the time. It’s not just the operating system; you get what you pay for. There were premium Android tablets at that time. You chose to go with a budget unit and got budget performance.

I see your point, but I would rather argue that I went with the flagship 7-to-8-inch form factor tablet from Google, after which I decided to opt for the flagship 7-to-8-inch form factor tablet from Apple.

Google seemed to have decided that Android’s advantages would be enough to keep its users satisfied with that tablet, but at some point, like you mentioned, hardware is what makes a significant difference.

This is why Google is charging so much for Pixel devices. The more budget-friendly Nexus phones and tablets always had some sort of major flaw.

The camera truly rocks on the Pixel. Just came from the Ars review, where the Pixel camera obviously did better than the brand-new iPhone 7+ in multiple situations (saying that as an iPhone user). Great job Google, time to step things up Apple.

Great to see Google take notice and put its chips on the camera – since that is such a critical thing for folks used to forking out the cash for top-of-the-line smartphones (often with families who want those pictures). This is a key item they had to do if they were going to lure users away from iOS, and they did it.

Nat & Lo just did a video shot with the Pixel… shows off some more of the features too.

https://www.youtube.com/watch?v=j5coVIPugBQ

What’s going on with the video digital image stabilization? It always has the abrupt jumps that look like lost frames or something.

Yeah, disappointed I didn’t see an actual comparison of stabilisation. From what I’ve seen, the iPhone 7’s is the most stable and natural-looking but you wouldn’t know it from this article.

Not speaking about the Pixel specifically, the stuff he is talking about certainly sounds really cool. Might rub analog folks the wrong way – and I also still think hardware tech will always give you a slightly better picture – but it’s still really cool.

I just don’t see why you can’t have all the cool software and better hardware on top. Not saying it’s bad hardware, but it could be better, and on top of the software that would give you a hands-down best experience.

How long before Google Camera is creating an image based on your location, filling in details from previously uploaded images on the web?

You’ll be raving about image quality until someone else’s kids show up in your photos…!

Doesn’t this lay the foundation for Google to introduce Live Photos to Android?

GOOGLE'S GOT A HEAVY HITTER
