Today, Apple introduced a new flagship phone — the iPhone X — with a powerful new login system. Because the phone’s all-glass front leaves no room for a home button, Apple is ditching Touch ID in favor of a facial recognition system powered by a new camera array and a specially modified A11 chip. But alongside the new technology, Face ID raises serious questions about surveillance and user privacy. Until the phone goes on sale in November, some of those questions will be left unanswered — but here’s what we know so far, and what it means for anyone thinking of buying an iPhone X.
Will Face ID make it easier for police to unlock my phone?
Like Touch ID before it, Face ID raises real questions about compelled unlocking. If you’re detained by police or kidnapped by criminals, they won’t be able to guess your password — but they will be able to hold the phone up to your face until you pass a Face ID scan. It’s a major privacy concern, and one many users don’t think about until it’s too late. There’s no indication Face ID is any worse on this front than Touch ID, but it’s still unclear how the system will hold up under duress.
A screenshot from the leaked iOS 11 firmware.
The good news is that Face ID allows users to opt out, just like Touch ID did. Leaked firmware from iOS 11 shows the option to disallow Face ID logins, even if your face is already enrolled. It’s not a perfect solution, but it’s as good as Touch ID was, and should give privacy-conscious users a way to address their concerns without avoiding the iPhone X entirely.
A trickier question is whether you can unlock someone else’s phone with Face ID once the system is enabled. Onstage, Apple’s Phil Schiller claimed the system requires the user’s attention to function properly, saying, “If your eyes are closed, if it’s not lined up, it’s not going to work.” It’s also not easy to line up someone else’s face in a front-facing camera, particularly if the system requires a sustained, eyes-open capture.
Still, the speed of the process does suggest Face ID might shift the balance of power. As you can see in our own hands-on testing, the iPhone X refuses to unlock as long as the subject’s eyes are closed — but almost as soon as he opens his eyes, Face ID makes the match, even though the camera is slightly off-axis to the subject’s face.
The strength of the involuntary-login protections will depend a lot on the details of Face ID’s specific user interface, so this will be another question to watch as the iPhone X hits the market.
Could my face leak or get stolen?
Apple has already said it won’t send faceprint data to the cloud, which means your face data stays on your phone. Every indication is that Apple is treating faces the same way it treated fingerprints with Touch ID, which is good news. In that system, Apple uses the enrolled fingerprint to create a hashed version of the data, which is then stored on the phone’s Secure Enclave security chip. (You can read more about this in Apple’s iOS security white paper, starting around page seven.)
Assuming Apple follows the same playbook for Face ID, it will be extremely difficult to get that data off the phone, and nearly impossible to reconstruct a face from it. The Secure Enclave is the most secure part of the phone, resistant to even circuit-level analysis, and while researchers have started to break some of those protections, the chip is still probably the most secure place on any consumer device you own. More importantly, the hashing process discards so much data that reconstructing a fingerprint or face would be effectively impossible even if the data were somehow extracted.
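To make the intuition concrete, here’s a minimal Python sketch of one-way biometric storage — a toy illustration, not Apple’s actual scheme (real systems use lossy, fuzzy encodings rather than exact hashes, since no two scans of the same face are ever bit-identical):

```python
import hashlib
import hmac
import os

def enroll(template: bytes) -> bytes:
    # Keep only a one-way digest of the biometric template;
    # the raw template itself is never stored.
    return hashlib.sha256(template).digest()

def matches(stored_digest: bytes, candidate: bytes) -> bool:
    # Verify by hashing the fresh scan and comparing digests
    # in constant time — no raw data needed for the check.
    return hmac.compare_digest(stored_digest, hashlib.sha256(candidate).digest())

scan = os.urandom(1024)   # stand-in for an enrolled faceprint
digest = enroll(scan)

print(matches(digest, scan))              # True: same template verifies
print(matches(digest, os.urandom(1024)))  # False: a different template fails
```

The point of the design: the stored digest is only 32 bytes, so almost all of the original data is thrown away, and even a full leak of the digest gives an attacker nothing to reconstruct a face or fingerprint from.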
That’s not an absolute assurance. Apple could always break from that playbook in some way they haven’t discussed, or hackers could make some incredible new breakthrough. But compared to most information on an internet-accessible device, this looks to be pretty safe.
Will Face ID have a racial bias problem?
Facial recognition systems have a long history of racial bias, and it’s attributable mostly to a lack of diversity in training databases. The algorithms used to match faces get better as they see more faces. As you might expect, algorithms trained on mostly white faces aren’t as good at recognizing people who are black, Chinese, or Indian, which translates to higher error rates and a worse product for specific groups of people. Will Face ID have the same problem?
The commercial facial recognition industry caught on to this problem early, and for the most part, companies have incorporated more diverse datasets to address it. We know how to fix this problem, if we want to. The question is whether Apple has done the necessary work.
So far, we don’t have enough information to say. Schiller said onstage that the Face ID team used over a billion images to train the algorithm — but that doesn’t tell us how many people were in the training database or what they looked like. Like most tech companies, Apple doesn’t have a very diverse workforce, particularly at the executive level, and it’s easy to imagine an issue like this slipping through, especially given the tight deadlines and strict secrecy that accompany a new product. The onstage video about Face ID’s testing showed a lot of racial diversity, which at least indicates Apple is aware of the issue — but we won’t know for sure until we can test the system in a rigorous way.
Can you spoof Face ID with a picture of someone’s face?
When the Galaxy S8 came out this March, its facial recognition system was one of its major selling points — until it turned out the scan could be defeated by simply holding a picture of the owner’s face up to the phone. Apple’s system is significantly more sophisticated, relying on dual cameras and an array of projected infrared dots to detect depth. Apple’s marketing video showed off three-dimensional masks used to test Face ID against spoofing attacks, and a depth-sensing camera alone should make it much easier to spot a flat photograph. After all that, it seems unlikely that an iPhone X would fail the photo test — but you never know until you try.
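Why depth sensing defeats the photo trick can be shown with a toy liveness check (hypothetical depth readings in millimeters — Apple hasn’t disclosed how Face ID actually evaluates its depth map):

```python
import statistics

def looks_flat(depth_samples, tolerance_mm=5.0):
    # A printed photo is planar: depth readings across the face
    # cluster tightly around one distance. A real face varies by
    # centimeters (nose vs. cheeks vs. eye sockets).
    return statistics.pstdev(depth_samples) < tolerance_mm

photo = [300.0 + 0.5 * i for i in range(10)]                 # nearly planar surface
face = [280, 300, 265, 310, 295, 250, 305, 290, 270, 300]    # varied depth

print(looks_flat(photo))  # True: rejected as a flat spoof
print(looks_flat(face))   # False: plausible three-dimensional face
```

This is only the simplest possible signal — it’s why the harder spoofing tests involve three-dimensional masks, which have real depth variation and can’t be filtered out this way.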
Will Apple ever use Face ID for anything other than unlocking phones?
This is the most interesting question, and the hardest to answer. Soon, millions of people will be enrolled in Face ID, giving Apple control over a powerful facial recognition tool. In the current system, that data stays on phones, but that could always change. The hashing would make it difficult for anyone other than Apple to use the data, but there’s no real limit on what Apple could use it for, particularly if it starts to store face data outside of individual phones. On Twitter, privacy advocates worried about Face ID data being used for retail surveillance or attention tracking in ads. You could also imagine it as next year’s delightful product breakthrough, integrated into Apple Stores or Apple Cars as a way of carrying over logins no matter who walks in.
For now, the company is very much in the iPhone business, as today’s keynote proved. Apple has pitched its commitment to privacy in the past, and unlike most of its competitors in the tech world, it has seemed genuinely uninterested in the kind of data collection and mass targeting that powers most web companies. But with one of the world’s most ambitious companies showing off a powerful new toy, it would be foolish not to wonder what comes next.