This morning, a Guardian article reported a new weakness in WhatsApp’s encryption, described as a backdoor in one of the most widely used encrypted chat apps in the world. WhatsApp was quick to push back against the allegation, saying in a statement, “WhatsApp does not give governments a ‘backdoor’ into its systems and would fight any government request to create a backdoor.” The bug described in the article had long been known to security professionals, and there’s no evidence WhatsApp ever tried to conceal it. But the weakness itself is real, and its persistence shows just how hard it is to balance security with the demands of everyday users.
At its core, the Guardian piece describes an advanced but plausible attack that WhatsApp’s current encryption can’t stop. If an attacker gained access to a WhatsApp server, he could forcibly reset the keys used to encrypt messages and install himself as a relay point, intercepting any future messages sent between the parties. (This is commonly referred to as a man-in-the-middle attack.) The recipient of the message would not be alerted to the change in keys, and the sender would only be alerted if they’d opted in to the app’s “Show security notifications” setting. Because it requires server access, the attack is far beyond the reach of most criminals, but it could still be exploited by an unusually skilled attacker or used by a court to compel WhatsApp to break its own security.
The underlying weakness has to do with alerts rather than cryptography. Although it shares the same underlying encryption, the Signal app isn’t vulnerable to the same attack. If the Signal client detects a new key, it will block the message rather than risk sending it insecurely. WhatsApp will re-encrypt and send that message anyway. And since the key alert isn’t on by default, most users would have no idea.
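The difference between the two policies can be captured in a small model. This is a hedged sketch, not real WhatsApp or Signal code: the `Client` class, its fields, and the `on_key_change` method are all invented for illustration, modeling only the delivery policy each app applies when the server (or an attacker with server access) announces a new key while a message is still queued.

```python
# Illustrative model only — not actual WhatsApp/Signal client code.
from dataclasses import dataclass, field

@dataclass
class Client:
    blocking: bool                 # Signal-style: hold messages on a key change
    notify_by_default: bool        # whether key-change alerts are on by default
    outbox: list = field(default_factory=list)
    alerts: list = field(default_factory=list)
    sent: list = field(default_factory=list)

    def on_key_change(self, notifications_enabled: bool) -> None:
        """The server announces a new key for the recipient."""
        if self.notify_by_default or notifications_enabled:
            self.alerts.append("key changed")
        if self.blocking:
            # Signal's policy: block delivery until the user verifies the key.
            return
        # WhatsApp's policy: re-encrypt under the new key and deliver anyway.
        self.sent.extend(self.outbox)
        self.outbox.clear()

signal = Client(blocking=True, notify_by_default=True, outbox=["hello"])
whatsapp = Client(blocking=False, notify_by_default=False, outbox=["hello"])

# Simulate a key reset with the sender's security notifications left off.
signal.on_key_change(notifications_enabled=False)
whatsapp.on_key_change(notifications_enabled=False)

print(signal.sent)      # [] — message held pending verification
print(whatsapp.sent)    # ['hello'] — delivered under the attacker's key
print(whatsapp.alerts)  # [] — no warning unless the user opted in
```

Run against a simulated malicious key reset, the Signal-style client holds the message and raises an alert, while the WhatsApp-style client silently delivers it under the new key, which is exactly the trade-off the Jacobs tweet below frames as blocking versus non-blocking.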
“Should key verification be a blocking or non-blocking user interaction? Signal chose blocking. WhatsApp chose non-blocking. #UXvsSecurity” — Frederic Jacobs (@FredericJacobs), January 13, 2017
It’s a controversial choice, but WhatsApp has good reasons for wanting a looser policy. Hard security is hard, as anyone who’s forgotten their PGP password can attest. Key irregularities happen, and each app has different policies on how to respond. Reached by The Guardian, WhatsApp pointed to users who change devices or SIM cards, the most common source of key irregularities. If WhatsApp followed the same rules as Signal, any message sent with an unverified key would simply be dropped. Signal users are happy to accept that as the price of stronger security, but with over a billion users across the world, WhatsApp is playing to a much larger crowd. Most of those users aren’t aware of WhatsApp’s encryption at all. Smoothing over those irregularities made the app itself simpler and more reliable, at the cost of one specific security measure. It’s easy to criticize that decision, and many have — but you don’t need to invoke a government conspiracy to explain it.
The Guardian raises the urgency of this choice by pointing to the UK’s recently passed Investigatory Powers Bill, which gives the government significant new legal powers for aggressive data collection. But it would be very hard to use this vulnerability for mass surveillance. A successful attack would allow WhatsApp servers to break a given conversation’s encryption, but to provide data en masse to the government, the servers would have to perform that attack continuously on every conversation in the UK, sending out a cascade of alerts to anyone with security notifications enabled.
Even retrieving logs after the fact — the type of request that triggered WhatsApp’s legal fight with the Brazilian government — would be difficult, absent some kind of systematic change to the way WhatsApp’s servers behave. If WhatsApp were to leverage this bug to fulfill lawful access demands, the company would have to implement the attack continually on every user in the country, which would be extremely noisy and extremely visible. The end result wouldn’t be much different from shipping an update and announcing that the service is no longer encrypted.
That still leaves WhatsApp and its users in a somewhat awkward place. Messages are encrypted by default, but there’s a demonstrated way for WhatsApp’s servers to break a given conversation’s encryption. That could open the company up to some uncomfortable legal demands. It’s easy to imagine a forward-looking wiretap order demanding that WhatsApp perform this attack on a particular user. Unlike the Brazilian order, it would be difficult for the company to claim it couldn’t comply. It’s not a particularly useful technique for law enforcement: the target would be notified, and investigators wouldn’t get as much information as they would from an SMS login hijack or simply mugging the target when her phone is unlocked. But if an ambitious prosecutor wanted to score points in the encryption debate, it could be a very tempting subpoena to file.
In the long term, the fix for all these problems is key transparency, a feature that makes the elements of the underlying encryption visible to the user. Versions of that feature are already present in Signal and other services, and more advanced models are in development now. Google is currently building its own key transparency system, and others are looking to blockchains as the solution. The underlying question has as much to do with user interface as with encryption. How do we make encryption keys visible without making them confusing? We still have very few answers to that question.
In the meantime, we’re left with a strange split between usability and security. For most critics of mass surveillance, encryption is only valuable if it’s on by default; otherwise, most of the world’s communication will continue to be vulnerable. At the same time, most users have little tolerance for complex logins. Default encryption has to be invisible, or else users will ditch your ultra-secure app for a simpler alternative with fewer protections. It’s a different model from the self-reliance of the PGP era, with different risks. As WhatsApp is learning, making encryption too easy can be just as dangerous as making it too hard.