A black man was wrongfully arrested because of facial recognition

‘The computer must have gotten it wrong’

Illustration by Alex Castro / The Verge

The American Civil Liberties Union (ACLU) has filed a formal complaint against Detroit police over what it says is the first known example of a wrongful arrest caused by faulty facial recognition technology.

Robert Julian-Borchak Williams, an African American man, was arrested after a facial recognition system falsely matched his photo with security footage of a shoplifter. The New York Times reports that the ACLU is calling for the dismissal of Williams’ case and for his information to be removed from Detroit’s criminal databases, and prosecutors have since agreed to delete his data.

Facial recognition technology has been criticized for years, with researchers repeatedly showing that it misidentifies people of color at higher rates than white people. But its use by law enforcement has grown even more controversial in recent weeks following nationwide protests against police brutality and racism. Now, Williams’ case shows the reality of what happens when flawed technology collides with poor police work.

Williams’ case was later dismissed “without prejudice”

The NYT reports that the robbery Williams was accused of took place in October 2018, and that in March 2019, a still from the store’s surveillance video was uploaded to Michigan’s state facial recognition database. The search generated a series of photographic matches, which were later provided in a document stating that they were not “probable cause for arrest.” Nevertheless, Williams’ picture was included in a photo lineup shown to the store’s security guard, who, the ACLU says, did not witness the robbery firsthand but positively identified Williams.

The identification led to Williams being arrested in the driveway of his home in January, after which he was held in police custody for a total of 30 hours. Williams has given his account of the arrest in an op-ed for The Washington Post. He says that in one interview with police, he held up a photo of the shoplifter next to his own face, after which one of the detectives said, “the computer must have gotten it wrong.”

Although Williams’ case was dismissed two weeks after he was arrested, it was dismissed “without prejudice,” leaving him open to being charged again, the NYT notes. In addition, the ACLU says that as a result of the arrest, Williams’ DNA sample, mugshot, and fingerprints are on file, and that his arrest is on the record.

A Detroit police spokesperson told the NYT that the department had accepted the prosecutor’s decision to dismiss the case, and that as of July 2019 the department’s policy was to only use facial recognition to investigate violent crimes.

Williams’ story comes as multiple high-profile tech companies, including IBM, Microsoft, and Amazon, have announced that they will be stopping or pausing their facial recognition work for police. “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” IBM CEO Arvind Krishna said in a statement earlier this month.

But many facial recognition experts say there needs to be a longer moratorium on the technology’s use, and that some firms may simply be waiting out the current news cycle before starting to sell to law enforcement once again. As Williams’ case highlights, though, the damage a false arrest can do to someone’s life is harder to forget.

“My daughters can’t unsee me being handcuffed and put into a police car,” Williams writes in his op-ed. “But they can see me use this experience to bring some good into the world. That means helping make sure my daughters don’t grow up in a world where their driver’s license or Facebook photos could be used to target, track or harm them.”

Update, 2:52PM ET: Added that prosecutors have agreed to expunge Williams’ data.