Microsoft denied police facial recognition tech over human rights concerns

The company has sold the technology to at least one US prison, though

Illustration of algorithmic facial recognition. (Illustration by Alex Castro / The Verge)

Microsoft has said it turned down a request from law enforcement in California to use its facial recognition technology in police body cameras and cars, reports Reuters.

Speaking at an event at Stanford University, Microsoft president Brad Smith said the company was concerned that the technology would disproportionately affect women and minorities. Past research has shown that because facial recognition technology is trained primarily on white and male faces, it has higher error rates for other individuals.

“Anytime they pulled anyone over, they wanted to run a face scan,” said Smith of the unnamed law enforcement agency. “We said this technology is not your answer.”

Facial recognition has become a controversial topic for tech companies in recent years, partly because of its biases, but also because of its potential for authoritarian surveillance.

Amazon has been repeatedly criticized for selling the technology to law enforcement, and faced pushback from both employees and shareholders. Google, meanwhile, says it refuses to sell facial recognition services altogether due to their potential for abuse.

Microsoft has been one of the loudest voices in this debate, repeatedly calling for federal regulation. “‘Move fast and break things’ became something of a mantra in Silicon Valley earlier this decade,” Smith wrote in an open letter earlier this year. “But if we move too fast with facial recognition, we may find that people’s fundamental rights are being broken.”

In China, facial recognition has been integrated into sunglasses worn by police.

Speaking at Stanford this week, Smith said the company had also turned down a deal to install facial recognition in cameras in the capital city of an unnamed country. He said doing so would have suppressed freedom of assembly.

Activists worried about the malicious uses of facial recognition often point to China as a worst-case example. The Chinese government has deployed facial recognition on a huge scale as part of its crackdown on the largely Muslim Uighur minority. Activists say the result has been a digital surveillance network of unprecedented reach, which can track individuals across a city and produce automated warnings when Uighurs gather together.

But despite concerns, facial recognition is also becoming more common in the West, even if it’s not part of a centralized system, as in China. The technology is being installed in airports, schools, and retail stores, and retrofitted into existing surveillance systems.

Even Microsoft, which is openly debating the merits of this technology, is still willing to sell it in places some might find troubling.

Reuters notes that Smith, speaking at Stanford, said that while the company had refused to sell facial recognition to police, it had provided it to an American prison “after the company concluded that the environment would be limited and that it would improve safety inside the unnamed institution.”