Can Apple’s iPhone XS camera catch up to Google’s Pixel?

Photo by Dieter Bohn / The Verge

As ever, the camera system is one of the most important features in the new iPhones. Apple’s SVP of product marketing Phil Schiller played the same role he always does at each iPhone announcement, breaking down the camera technology in depth and explaining the improvements. This year he went as far as to pronounce that the iPhone XS will usher in “a new era of photography.”

But the days when Apple held a major lead over every Android manufacturer are long gone. Google’s innovative approach to computational photography with the Pixel line means they’re now the phones to beat in terms of pure image quality, while competitors like Samsung’s Galaxy S9 Plus and Huawei’s P20 Pro have strong claims to supremacy in areas like low-light performance. The iPhone 8 and X have good cameras, to be sure, but it’s hard to make the case that they’re the best. Can the iPhone XS catch up?

Pixels on par with the Pixel 2

The biggest hardware upgrade this year is the new, larger 12-megapixel sensor. Last year Apple said the 8 and X sensor was “larger” than the 7’s, but teardowns revealed that this wasn’t meaningfully true; the field of view and focal length of the lens didn’t change. This time, though, Apple is citing the increased pixel size, which should indeed make a difference. 

The iPhone XS’ main camera has 1.4µm pixels, up from 1.22µm in the iPhone X and on par with the Google Pixel 2. The bigger the pixels, the greater their ability to collect light, which means more information to play with when constructing a photo. This is the first time Apple has increased pixel size since the iPhone 5S, after going down to 1.22µm when it moved to 12 megapixels with the 6S, so it could represent a major upgrade.
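To give a rough sense of why that pixel-pitch bump matters more than it sounds: a pixel’s light-gathering area scales with the square of its pitch, so the move from 1.22µm to 1.4µm is about a third more light per pixel. A quick back-of-the-envelope calculation (assuming similar fill factor between the two sensors):

```python
# Back-of-the-envelope: per-pixel light-gathering area scales with the
# square of the pixel pitch (assuming a similar fill factor).
old_pitch_um = 1.22  # iPhone X main sensor
new_pitch_um = 1.40  # iPhone XS main sensor (same pitch as the Pixel 2)

area_gain = (new_pitch_um / old_pitch_um) ** 2
print(f"~{(area_gain - 1) * 100:.0f}% more light per pixel")  # ~32%
```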

Otherwise, the hardware is much the same as seen on the X. There’s still a six-element f/1.8 lens and a secondary f/2.4 telephoto module on both the XS and XS Max, though the optics will likely have had to be redesigned for the new sensor. (The cheaper iPhone XR has the same primary camera but no telephoto lens.) Apple also says the True Tone flash is improved, without providing details. And the selfie camera is unchanged beyond “all-new video stabilization.”

But as Schiller said on stage, hardware is only part of the story with camera technology today. Computational photography techniques and software design are at least as important when it comes to getting great photos out of tiny phone optics.

The A12 Bionic chip in the iPhone XS is designed for this core use case. This year Apple has directly connected the image signal processor to the Neural Engine, the company’s term for the part of the chip designed for machine learning and AI. This looks to be a big focus with the A12, which will be the world’s first 7-nanometer processor to ship in a smartphone. The smaller process should allow for more efficient performance, but Apple is only citing conventional speed increases of up to 15 percent over the A11; far more dramatic boosts are claimed for machine learning operations, suggesting that Apple has used the opportunity of a 7nm design to focus on ramping up the Neural Engine.

What this means for photography is that the camera will better understand what it’s looking at. Schiller identified use cases like facial landmarking, where the camera can figure out a map of your subject’s face to remove unwanted effects such as redeye, and segmentation, where the camera can parse complex subjects’ relation to the focal plane and more accurately generate shallow depth of field. 

Apple is also introducing a new computational photography feature called Smart HDR. It sounds similar to Google’s excellent HDR+ technique on the Pixel phones, though Google’s relies on combining several underexposed images whereas Apple uses overexposure as well to capture shadow detail. Apple says the feature relies on the faster sensor and new image signal processor found in the iPhone XS. 
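The general idea behind this family of techniques is to capture several frames at different exposures and merge them, weighting each pixel by how well-exposed it is. Here’s a deliberately toy one-dimensional sketch of that bracketed-merge concept — this is an illustration of the principle, not Apple’s or Google’s actual pipeline:

```python
# Toy sketch of exposure-bracket merging, the general principle behind
# techniques like HDR+ and Smart HDR (NOT Apple's actual pipeline).
# Each "frame" is a list of pixel luminances in [0.0, 1.0].

def merge_brackets(frames):
    """Weighted average that favors well-exposed pixels: near-black
    (crushed shadows) and near-white (blown highlights) pixels get
    low weight, so each region is drawn from the frame that exposed
    it best."""
    merged = []
    for pixels in zip(*frames):
        # Triangle weight: peaks at mid-gray (0.5), falls to ~0 at 0 and 1.
        weights = [max(1e-6, 1 - abs(p - 0.5) * 2) for p in pixels]
        total = sum(weights)
        merged.append(sum(p * w for p, w in zip(pixels, weights)) / total)
    return merged

# One underexposed and one overexposed capture of the same scene:
under = [0.02, 0.40, 0.95]  # shadows crushed, highlights preserved
over  = [0.30, 0.80, 1.00]  # shadows recovered, highlights blown
result = merge_brackets([under, over])
print(result)
```

The merged shadow pixel leans toward the overexposed frame and the highlight toward the underexposed one, which is the shadow-detail trade-off the article describes.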

The other main update to the iPhone’s camera software is in how it handles bokeh, the quality of the out-of-focus areas in a photo. Apple says it’s studied high-end cameras and lenses extensively in a bid to make the bokeh in its portrait mode more realistic, and a new feature called Depth Control lets you adjust the degree of blur after a photo has been taken by simulating different aperture settings. A similar feature has been available on phones like the Galaxy Note 8, as well as on existing iPhones through third-party apps that make use of depth information APIs, so it’s not clear how much of an improvement this will be.
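The concept behind after-the-fact aperture simulation is straightforward: keep a depth value per pixel, then blur each pixel by an amount that grows with its distance from the focal plane, scaled by the simulated aperture. A toy one-dimensional sketch of that idea — function names and numbers here are illustrative, not Apple’s implementation:

```python
# Toy 1-D sketch of depth-based synthetic bokeh: pixels far from the
# focal plane get a wider blur. Illustrative only, not Apple's code.

def depth_blur(pixels, depths, focus_depth, aperture):
    """Blur each pixel with a box kernel whose radius grows with
    |depth - focus_depth|, scaled by a simulated aperture value."""
    out = []
    n = len(pixels)
    for i in range(n):
        radius = int(abs(depths[i] - focus_depth) * aperture)
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

scene  = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
depths = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]  # near subject, far background
blurred = depth_blur(scene, depths, focus_depth=1.0, aperture=1.0)
```

Because the blur is recomputed from stored depth data rather than baked in at capture, raising the `aperture` parameter (simulating a lower f-number) widens the background blur after the fact, which is what a Depth Control-style slider exposes to the user.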

Portrait mode in the iPhone XS still makes use of the secondary telephoto camera, which gives a more natural and flattering perspective for pictures of people than the single-lens solution used by the Pixel. The single-camera iPhone XR, however, also has the same bokeh and depth control features.

The Depth Control feature on the iPhone XS

From what Apple’s saying, the iPhone XS could be the biggest update to the iPhone camera in some time, despite the lack of obvious headline features or eye-catching hardware. The sample photos on Apple’s website are certainly impressive. But Apple’s been putting iPhone photos on billboards ever since the iPhone 6 — the real test is going to come when the XS is in the hands of millions of regular users.

Since HDR+ is Google’s primary advantage, Smart HDR’s performance will be the critical factor in whether the iPhone XS can compete against the Pixel. Apple seems to be saying the right things, but Google has quite the head start — and as the company’s computational photography lead Marc Levoy told The Verge, its system gets better through machine learning without anyone even touching it. That’s before considering what improvements may have been made to the heavily leaked Pixel 3, due to land in less than a month.

Still, if the iPhone XS could even match last year’s Pixel 2 in basic image quality, it’d be a significant and immediately noticeable improvement for every iPhone user. This year, then, it’s imperative for Apple to deliver on the camera if it isn’t to fall further behind. And with availability starting from next Friday, we won’t have long to wait to find out exactly what the maker of the world’s most popular cameras has achieved.