In response to fears about the rise of autonomous weapon systems, the UK government has issued guidelines reaffirming its commitment to “human control of cutting-edge weaponry.”
The new doctrine published by the Ministry of Defence argues that remote-controlled weapons like drones are safer for military personnel and civilians. It also says the UK “does not possess fully autonomous weapon systems and has no intention of developing them,” and that military weapons “will always be under control as an absolute guarantee of human oversight and authority and accountability.”
In a press statement, the Minister for the Armed Forces, Mark Lancaster, said, “It’s absolutely right that our weapons are operated by real people capable of making incredibly important decisions; and we are guaranteeing that vital oversight.”
The guidelines come just a few weeks after tech and robotics leaders signed a petition calling on the United Nations to introduce new regulations for the development of so-called “killer robots.” The letter, which included Tesla CEO Elon Musk among its signatories, argues that the development of such systems could usher in a “third revolution in warfare,” following the development of gunpowder and nuclear weapons. Semi-autonomous weapons are in development around the world, with some — like gun turrets on the Korean border — already in active service.
Professor Noel Sharkey, a UK robotics expert and chair of the International Committee for Robot Arms Control, welcomed the Ministry of Defence’s new doctrine. “This looks like same old, same old, but it is stronger wording than before and I hope it implies a change,” he told The Verge over email. “This has been pretty much the standard MoD line [since 2013] but now it’s also presented as the UK government’s position.”
Sharkey points out that the doctrine is far from an ironclad commitment, and that the UK government could do much more to signal its good intentions. It could, for example, support an international moratorium on the development of autonomous weapon systems, or push for new legislation later this year when the United Nations convenes a group of experts to consider the state of killer robots.
The idea that humans will always have “oversight” of autonomous systems is also misleading, said Sharkey. Human intervention could be as limited as confirming a lethal strike suggested by a computer system without fully reviewing the evidence for and against it. “We can only hope this means the UK will support human control of weapons in a meaningful and deliberative way,” he said.