Yesterday, an unusual app started making news. Called Facezam, the app promised to instantly identify anyone in front of your phone’s camera, using facial recognition to match them against Facebook profile pictures. It was a terrifying promise: anyone walking down the street could be identified with just a wave of your phone.
It was all fake, as it turned out. The app was built by a viral marketing agency called Zacozo, which mocked up visuals for the app, lied about the broader outlines of the project, and generated a wave of alarmed coverage in the process. The outlets that ran with the report have already drawn fire on Twitter, but so far Zacozo seems unconcerned.
Reached by The Verge, the firm said they thought a facial recognition app would be instantly controversial. “Similar invasive apps are a popular topic in the media,” a spokesperson said, “so we thought this would be picked up quickly.”
Clearly Zacozo was right — but it’s worth considering why. The internet is full of overblown promises and outright fakes — but Facezam was particularly easy to accept as real, and felt particularly urgent to share. In part, that’s because it’s so plausible. Facial recognition has become immensely powerful in the past five years, and using it to spot people on the street feels like a natural application. There are real barriers to creating a product like Facezam, but they’re flimsier than they seem. All it takes is a well-made rendering to convince people they’ve been knocked down for good.
The most immediate problem for a product like Facezam is Facebook itself. Facial-recognition algorithms are well within the reach of a well-funded startup, and Facebook really does have the necessary photos to power such a system. (More than 350 million are uploaded each day, which puts the service well on its way to a trillion photos overall.) But getting those photos off Facebook’s servers is harder than it looks. The company has little tolerance for outsiders scraping data, and even with a developer key, it would be hard to pull the millions of pictures necessary before anti-scraping measures kicked in.
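A quick back-of-the-envelope check, using only the figures above, shows how fast that photo pile grows — and how far off the trillion mark still is:

```python
# Rough arithmetic on Facebook's photo volume, assuming a steady
# rate of 350 million uploads per day (the figure cited above).
daily_uploads = 350_000_000
per_year = daily_uploads * 365  # roughly 128 billion photos a year

# At that pace, accumulating a trillion photos takes under a decade.
years_to_trillion = 1_000_000_000_000 / per_year
print(f"{per_year:,} photos per year")
print(f"~{years_to_trillion:.1f} years to reach a trillion")
```

The point is scale, not precision: even treating the daily figure as flat, the archive grows by more than a hundred billion photos a year.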
Marketing your whole product around that process is a sure way to get kicked off Facebook entirely. In fact, that’s just what happened. Within 24 hours of the first press hit, Facezam got an email from Facebook’s legal team, forcing the agency to come clean about the hoax.
There’s a flip side to that argument that’s less reassuring: if Facebook wanted a product like Facezam, it would be easy to build. The company already constructs facial-recognition models for each user for the photo-tagging system, and it’s a process that’s already spurred a few lawsuits. Apple and Google both run similar scans as part of their photo-tagging process, although neither has the same wealth of personally tagged pictures. It would be a serious engineering challenge to match faces at scale in real time, but if anyone can make it work, it’s Facebook.
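To see why matching at scale is the hard part, it helps to know the basic shape of modern face matching: a model reduces each face to a fixed-length embedding vector, and identifying a new face means a nearest-neighbor search over every stored vector. The sketch below is purely illustrative — random vectors stand in for real faceprints, and this is not Facebook’s system:

```python
import numpy as np

# Illustrative only: random unit vectors stand in for the face
# embeddings a FaceNet-style model would produce.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(100_000, 128))  # 100k stored "faceprints"
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# A "new photo" of person 42: their stored embedding plus noise.
query = gallery[42] + rng.normal(scale=0.05, size=128)
query /= np.linalg.norm(query)

# Brute-force cosine similarity against the entire gallery.
scores = gallery @ query
best = int(np.argmax(scores))
print(best, float(scores[best]))
```

Brute force over 100,000 vectors is instant on a laptop; doing it over a billion profiles, for every frame from a phone camera, is not — that gap is the engineering challenge, and it is why production systems lean on approximate nearest-neighbor indexes rather than exhaustive search.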
In Russia, it’s already happening. Facebook never found a foothold in the country, and most Russians still use an early knockoff called VKontakte. The platform is more tolerant toward facial recognition, and a plugin called FindFace is already taking advantage of that license, allowing users to compare static photos against nearly 1 billion stored profile pics. It’s a popular service, with over half a million users. If it makes Russians a little less eager to open a VKontakte account, the service doesn’t seem to mind.
So far, Facebook has chosen not to take that path. The reasons aren’t hard to understand: Facezam is creepy and doesn’t help Facebook with any of its current goals. But it’s Facebook’s decision to make, not ours. If those goals change and users get a little more comfortable with facial scans, it’s easy to imagine something like Facezam onstage at the next F8. Facebook still hasn’t set any policies on how it handles and shares biometric data. Despite the ongoing lawsuits, it doesn’t even ask for user consent to create a faceprint for photo tagging. Why not push a little further?
That puts the dream of face-spotting in an unusual place. It’s not quite here yet — not because we can’t build it, but because we’re not sure we want to. For anyone genuinely worried about these technologies, it has the feeling of a hammer waiting to fall. If someone tells us it’s already falling, we’re inclined to believe them.