The UK government wants to compromise WhatsApp's encryption. It's a demand we've seen before, in both the US and the UK, but despite the technical ignorance of politicians and the myths used to push this policy, we need to take the debate seriously. Again.
This weekend, UK home secretary Amber Rudd said it was “completely unacceptable” that intelligence services could not read WhatsApp messages sent and received by Khalid Masood, the perpetrator of last week’s terrorist attack in London. Rudd said that messaging apps providing end-to-end encryption (including WhatsApp, iMessage, Signal, and others) were giving terrorists a “secret place to communicate.” Just as with the San Bernardino case, where the FBI tried to legally compel Apple to decrypt a terrorist’s iPhone, governments say tech companies have created a locked box, and they want the key.
But the demand for a back door hasn't made sense in the past, and it doesn't make sense now. There's no reason to believe weakening consumer privacy and safety would have helped stop the Westminster attack. First, although Masood was identified as a potential extremist in 2010, he “dropped off the radar of intelligence officials” sometime after. And if he had been surveilled, the government could have read his WhatsApp messages by compromising his phone some other way. Both US and UK intelligence services have ample tools for doing so.
Second, if Rudd is suggesting that police are unable to read WhatsApp messages on Masood’s phone now because of encryption, that’s also likely false. We don’t know what type of phone Masood had, but even if it was an iPhone, there are ways of breaking in. With San Bernardino, for example, the FBI used a purchased exploit to access the target’s phone.
Third, Rudd says that installing a back door in an encrypted product — be that app or phone — is not the same as asking tech companies to “open up,” and that the situation with WhatsApp is “completely different” from Apple’s fight with the FBI. That doesn’t ring true either. Rudd is framing the argument for a back door as if it’s a minor concession, simply a matter of WhatsApp turning a digital key. But as tech experts have pointed out, breaking encryption is not a small thing to do. The whole point of end-to-end encryption is that companies can’t be compelled to reveal messages even if they want to — they don’t have access. Once you introduce a back door, it can be exploited by anyone, including hackers. As Apple CEO Tim Cook said last year, weakening encryption hurts the public, while terrorists find new ways to communicate.
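The “locked box” Rudd wants opened can be sketched in a few lines of Python. In a Diffie-Hellman-style key exchange, the server only ever relays public values and ciphertext, so the key needed to read the messages never exists on the company's side. This is a toy illustration only — a deliberately tiny, insecure group and XOR in place of real authenticated encryption — not the Signal protocol WhatsApp actually uses:

```python
# Toy sketch of why end-to-end encryption locks the provider out.
# Illustrative only: real apps use X25519 key agreement and
# authenticated encryption, not this insecure toy group.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; far too small for real use
G = 3

def keypair():
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

# Alice and Bob each generate a key pair. Only the PUBLIC halves
# ever travel through the company's server.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side derives the same shared secret from the other's public key.
a_secret = pow(b_pub, a_priv, P)
b_secret = pow(a_pub, b_priv, P)
assert a_secret == b_secret

key = hashlib.sha256(str(a_secret).encode()).digest()

def xor_crypt(key, data):
    # One-shot XOR keystream -- a stand-in for real authenticated
    # encryption. XOR is symmetric, so the same call decrypts.
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(d ^ s for d, s in zip(data, stream))

ciphertext = xor_crypt(key, b"meet at noon")
# The server (and anyone who subpoenas it) holds only a_pub, b_pub,
# and ciphertext -- none of which yields the key without a private half.
print(xor_crypt(key, ciphertext))  # Bob decrypts: b'meet at noon'
```

The point of the sketch is the trust boundary, not the math: handing the government a “key” would mean redesigning the protocol so that a third copy of the key exists somewhere, and that copy becomes a target for everyone.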
Because of all this, some people might believe that the UK government simply won’t break WhatsApp’s encryption. It’s a stupid idea that won’t deliver any benefit and will hurt the general population, so it’ll never happen. This is wishful thinking. As the government has proven with Brexit and the introduction of last year’s new surveillance legislation, just because something is potentially harmful to the country’s well-being, or intrusive and destructive of personal liberties, that doesn’t mean politicians won’t pursue it.
The Investigatory Powers Act gives the UK government the legal footing it needs to compel companies to decrypt users’ data. And although, in the case of WhatsApp, the company could simply stop offering its app in the UK, or mobile carriers could cut access (as happened in Brazil), that would hardly be a win for UK privacy. People would be pushed to communicate on other platforms, some of which won’t offer the same protections.
In the UK — and in the US, with the election of Donald Trump — it seems a new battle over encryption is starting. We need to remember that although creating back doors in encrypted apps and products is stupid and harmful, that doesn’t mean politicians won’t push for it. And even if they don’t manage to break encryption, think of the damage they could do by trying.