
The NYPD uses altered images in its facial recognition system, new documents show

In one case, a photo of the actor Woody Harrelson was used to locate a suspect

A new report from Georgetown Law’s Center on Privacy and Technology (CPT) has uncovered widespread abuse of the New York Police Department’s facial recognition system, including image alteration and the use of non-suspect images. In one case, officers uploaded a picture of the actor Woody Harrelson, based on a witness description of a suspect who looked like Harrelson. The search produced a match, and the matched suspect was later arrested for petty larceny.

“The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs,” CPT senior associate Clare Garvie writes in the report. “Unfortunately, police departments’ reliance on questionable probe photos appears all too common.”

In more complex cases, image editing software can be used to manipulate a photo to create a higher chance of an affirmative match. One training presentation recommends the “removal of facial expression technique,” in which an open-mouthed subject is edited into a neutral mug shot expression. Crucially, this can mean pasting in stock images of eyes or lips, which can affect the matching algorithm in unpredictable ways.

A slide from NYPD training materials, uncovered as part of the Georgetown report.
Image: CPT

Reached for comment by The Verge, an NYPD representative did not dispute any of the specific claims in the report, but emphasized the investigative value of facial recognition. “The NYPD constantly reassesses our existing procedures and in line with that are in the process of reviewing our existent facial recognition protocols,” Detective Denise Moroney said in a statement. “No one has ever been arrested on the basis of a facial recognition match alone. As with any lead, further investigation is always needed to develop probable cause to arrest. The NYPD has been deliberate and responsible in its use of facial recognition technology.”

According to CPT’s report, many departments also use police sketches as raw material for facial recognition, an unsupported and broadly inaccurate technique. Researchers found at least six departments across the US that permit sketches to be used in facial recognition searches, including the Maricopa County Sheriff’s Office in Arizona and the Maryland Department of Public Safety. There’s no indication that the NYPD uses police sketches in this way.

Facial recognition has become a widely used technique in law enforcement, although it remains controversial and largely unregulated. A companion report from CPT describes how real-time facial recognition systems have been quietly put in place in Detroit and Chicago, largely outside the view of residents.

Amazon came under fire in 2018 for marketing a cloud-based facial recognition service (dubbed Rekognition) to cities and police departments, despite concerns over privacy and racial bias. Amazon continues to offer the service, despite criticism from AI researchers as well as its own shareholders and employees. Microsoft offers a similar service, called Face API, through its Azure cloud platform.

The city of San Francisco voted to ban the use of facial recognition by city agencies earlier this week, largely in response to civil rights concerns.

The practices described in the report underscore how few judicial restraints there are on police use of facial recognition. Courts have produced centuries of rulings on when an officer can search a suspect’s home or take a person into custody, but there are few corresponding rules for how police can use a software tool to match faces. Database searches are typically framed as an investigative technique, which means they rarely need to stand up to the scrutiny of a court. Any leads are confirmed with separate evidence before a case can be brought, so any search that produces a viable lead is seen as successful, no matter how many innocent false positives are drawn into the system along the way.

Many proponents of facial recognition believe that algorithmic improvements will assuage civil rights concerns by reducing error rates, but Garvie is skeptical. “[Technical] improvements won’t matter much if there are no standards governing what police departments can feed into these systems,” she writes. “In the absence of those rules, we believe that a moratorium on local, state, and federal law enforcement use of face recognition is appropriate and necessary.”

Update 9:58AM ET: Updated with the NYPD’s statement.