Microsoft says it doesn’t work on ICE facial recognition and calls for regulation
Illustration by Alex Castro / The Verge

After a confusing series of communications, Microsoft said today that, contrary to claims in a company blog post, the services it provides to Immigration and Customs Enforcement are not being used for facial recognition. The company also called for federal regulation of facial recognition in general.

The controversy over the company’s work for ICE started last month. Amid news of family separations at the border, social media users picked up on a Microsoft blog post that explained how the company worked with ICE and was “proud” of the association. The post also suggested that the agency could be using facial recognition powered by Microsoft cloud services.

The company says ICE contract “isn’t being used for facial recognition at all”

“We’ve since confirmed that the contract in question isn’t being used for facial recognition at all,” Microsoft president Brad Smith said in an in-depth letter about the company’s facial recognition work. CEO Satya Nadella had previously said that the contract applied to services like email and calendar work, but he did not address the facial recognition claim directly.

The note today didn’t explain why the blog post made those claims or whether the services could still, in theory, be used as part of a facial recognition program. The note is also unlikely to fully appease some employees, who have questioned whether Microsoft should be aligned with ICE in any capacity.

The post also discusses Microsoft’s work on facial recognition more broadly, and it calls for the US government to step in and regulate the tech industry’s work in the field. “We believe Congress should create a bipartisan expert commission to assess the best way to regulate the use of facial recognition technology in the United States,” Smith writes, while raising some questions about how the technology should be used.

Smith says the tech industry also needs to take responsibility for how facial recognition is used, and that Microsoft has already turned down customer requests for the technology “where we’ve concluded that there are greater human rights risks.”

“‘Move fast and break things’ became something of a mantra in Silicon Valley earlier this decade,” Smith writes. “But if we move too fast with facial recognition, we may find that people’s fundamental rights are being broken.”