
Why Facebook is beating the FBI at facial recognition


Images from Facebook's DeepFace scan

If you're worried about Big Brother and computerized facial recognition, this summer has given you plenty of reason to be scared. Law enforcement has been toying with facial recognition for a while, but the FBI is getting set to deploy its own system, called Next Generation Identification (NGI for short), which is slated to be fully operational this summer. NGI will bring together millions of photos in a central federal database, reaching all 50 states by the end of the year. After years of relative anonymity, it's easy to think 2014 is the year that law enforcement will finally know you by face.

The FBI's new system isn't very good

But here's an inconvenient fact about the FBI's shiny new system: it isn't very good. Thanks to extensive work by the Electronic Frontier Foundation, we actually know quite a bit about NGI, and the numbers suggest its face-matching falls well short. Given a suspect's face, NGI returns a ranked list of 50 possibilities, and it only promises an 85 percent chance that the suspect's name appears somewhere on that list. To put it another way, even when you give NGI 50 guesses, it still lets one in seven suspects off the hook.
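
For a rough sense of what that 85 percent figure means in practice, the arithmetic behind the "one in seven" claim looks like this (the hit rate is the EFF's number; the rest is just division):

```python
# Rough arithmetic behind the "one in seven" claim.
# NGI promises an 85% chance the true suspect appears somewhere
# in its ranked list of 50 candidates (a rank-50 "hit rate").
rank50_hit_rate = 0.85

miss_rate = 1 - rank50_hit_rate                          # suspect absent from all 50 guesses
print(f"Miss rate: {miss_rate:.0%}")                     # 15%
print(f"Roughly one in {1 / miss_rate:.1f} suspects")    # one in ~6.7
```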

Compare that to Facebook's DeepFace system, presented at the IEEE Conference on Computer Vision and Pattern Recognition earlier this month, and it looks even worse. Give Facebook two pictures, and it can tell you with 97 percent accuracy whether they show the same person, roughly the accuracy a human achieves at the same task. To be fair, Facebook has a whole network's worth of data on its side, so it ends up comparing each face against a much smaller pool of possibilities. It isn't an exact comparison, but the overall impression is hard to deny: the nation's most powerful law enforcement agency is getting outgunned by a social network.
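
DeepFace itself is a deep neural network, and the details are in Facebook's paper. But the task that 97 percent figure measures, verification, boils down to comparing two face representations against a threshold. The sketch below is only meant to show the shape of that problem; the placeholder embedding and the cutoff value are made up, not Facebook's code:

```python
import numpy as np

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a learned face-embedding model.
    DeepFace uses a deep neural network here; this placeholder
    just flattens the pixels and normalizes them to unit length."""
    v = face_pixels.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def same_person(face_a: np.ndarray, face_b: np.ndarray, threshold: float = 0.9) -> bool:
    """Verification: one pair of faces in, one yes/no answer out.
    The threshold trades false accepts against false rejects."""
    similarity = float(np.dot(embed(face_a), embed(face_b)))
    return similarity >= threshold

# Toy usage with random pixel grids standing in for aligned face crops.
rng = np.random.default_rng(0)
print(same_person(rng.random((32, 32)), rng.random((32, 32))))
```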

"The difference between a human brain and a computer brain is huge."

While there are plenty of contractors who are willing to promise "near-human" recognition capabilities, real facial recognition is much harder than the industry lets on. "It's a huge lie that commercial companies that provide facial recognition tell," says Shahar Belkin, CTO of FST Biometrics. "The difference between a human brain and a computer brain is huge." FST's facial recognition systems work with tenants in apartment buildings, cooperative subjects who are happy to look straight into the camera for a match, a much simpler task than what the FBI is proposing. The company also adds other layers of verification like height and gait-tracking to weed out false positives.
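
FST hasn't published how it fuses those signals, but the general idea of layering independent checks to cut down false positives can be sketched like this (every field name and threshold below is made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class AccessAttempt:
    face_score: float    # similarity to the enrolled resident's faceprint
    height_cm: float     # measured height at the entrance
    gait_score: float    # similarity to the enrolled walking pattern

def admit(attempt: AccessAttempt, enrolled_height_cm: float) -> bool:
    """Illustrative fusion rule: every independent check must pass,
    which drives down the combined false-accept rate."""
    return (
        attempt.face_score >= 0.8
        and abs(attempt.height_cm - enrolled_height_cm) <= 5.0
        and attempt.gait_score >= 0.7
    )

print(admit(AccessAttempt(0.91, 176.0, 0.82), enrolled_height_cm=178.0))  # True
```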

The FBI system doesn't have either of those advantages. It's expecting faces to work like fingerprints, where a single unintentional print provides a positive ID that's strong enough to hold up in court. But while a human witness can easily identify a face, automated systems still have a lot of trouble keeping up. "What will kill these systems is the false-accept rate," Belkin says. "I don't believe we'll see a solution for that in the next five to ten years."
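
The tension Belkin is describing is the standard biometric trade-off: tighten the match threshold and the system accepts fewer impostors but rejects more genuine matches. A toy simulation with made-up score distributions shows the shape of it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up score distributions: genuine (same-person) pairs score higher
# on average than impostor (different-person) pairs, but the two overlap.
genuine = rng.normal(0.75, 0.10, 100_000)
impostor = rng.normal(0.45, 0.10, 100_000)

for threshold in (0.55, 0.65, 0.75):
    far = np.mean(impostor >= threshold)   # false-accept rate
    frr = np.mean(genuine < threshold)     # false-reject rate
    print(f"threshold={threshold:.2f}  FAR={far:.3f}  FRR={frr:.3f}")
```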

"What will kill these systems is the false-accept rate."

The problem is particularly bad because of the quality of the pictures the FBI is using. Belkin says facial recognition systems typically need to photograph your face straight-on, no more than 15 degrees off the center axis. That works great for mugshots or systems like FST's, where people are intentionally giving a faceprint. But for more intrusive uses like spotting criminal faces through public cameras, it's a much bigger problem. Surveillance cameras are typically mounted on ceilings or streetlights, so by the time a suspect comes close enough for a faceprint, the angle is often so bad that recognition techniques won't work. Modern-day camera systems aren't built to work with facial recognition, and it's not even clear how you might fix them.
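
In practice that constraint becomes a hard gate in front of the matcher: if the head pose is too far off-axis, the system doesn't even attempt recognition. A minimal sketch, assuming some upstream detector has already estimated the angles (the 15-degree figure is Belkin's; the function and its inputs are illustrative):

```python
def usable_for_faceprint(yaw_deg: float, pitch_deg: float, max_offset_deg: float = 15.0) -> bool:
    """Gate a detected face on head pose before attempting recognition.
    Assumes an upstream detector has already estimated yaw and pitch."""
    return abs(yaw_deg) <= max_offset_deg and abs(pitch_deg) <= max_offset_deg

# A ceiling-mounted camera looking down at a passerby can easily see
# 40 degrees of pitch, which gets rejected before matching even starts.
print(usable_for_faceprint(yaw_deg=5.0, pitch_deg=40.0))   # False
print(usable_for_faceprint(yaw_deg=10.0, pitch_deg=8.0))   # True
```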

Facebook's help could be just a court order away

Facebook can skirt around that problem because it already knows who your friends are and who's likely to show up in your pictures. It also has many more pictures to work with, hosting 250 billion photos to the FBI's 50 million. That gives Facebook’s engineers more chances to find a good picture, and more data to generalize from. Facebook also has more freedom to make mistakes, since a false tag carries much less weight than a mistaken police ID. Facial recognition ends up as a great tool for automatic photo-tagging, but almost useless if you're trying to ID a suspect.
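
The size of the candidate pool is a big part of why. Even a tiny per-comparison false-accept rate adds up when a probe face is checked against tens of millions of mugshots instead of a few hundred friends. The rate below is an arbitrary illustration, but the scaling is the point:

```python
# Illustration of why a small candidate pool matters.
# Assume (arbitrarily) a 1-in-10,000 false-accept rate per comparison.
per_comparison_far = 1e-4

pools = {
    "Facebook: one user's friends": 500,
    "FBI: NGI mugshot gallery": 50_000_000,
}

for name, gallery_size in pools.items():
    expected_false_hits = per_comparison_far * gallery_size
    print(f"{name}: ~{expected_false_hits:,.2f} expected false matches per search")
```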

That sounds like good news for privacy advocates — and it is — but it's nothing to get too comfortable about. The FBI can still build a face-tracking system that works. It just needs more photos, more names, and a smarter network to organize it all: in short, it needs Facebook's help, which could be just a court order away. Facebook is currently in a brutal legal fight with the Manhattan district attorney over how broadly prosecutors can collect user data. If Facebook loses the fight, prosecutors and law enforcement agencies might have enough data to bring their system up to date, singling out faces and building lists of known associates through automatically tagged photos. The end result could look an awful lot like the face-trackers we were warned about.

The FBI may be losing with cameras and computers, but it could still win in the courts.

It's like any race for cutting-edge technology, whether on Macs and PCs or Android and iOS. Getting there first is nice, but it doesn’t guarantee you a monopoly. If Facebook can make facial recognition work, the FBI will figure out a way to make it work too, either by pulling each suspect's data or reverse-engineering the process from the ground up. The biggest question is how far into Facebook's system the government can reach, and it's a question that's still being decided. That means that after decades of testing, the technical aspects of face-tracking may finally have taken a back seat. The FBI may be losing with cameras and computers, but it could still win in the courts.