UK police are testing facial recognition on Christmas shoppers in London this week

Pick up some last-minute presents and have your face scanned by algorithms

London’s High Streets In Full Swing For Christmas
Photo by Jack Taylor/Getty Images

Facial recognition technology continues to be trialled by police forces in the UK despite warnings of high error rates. In the latest test, the technology is being used to scan the faces of Christmas shoppers in London, with police hoping to spot wanted criminals.

It’s the seventh time the Metropolitan Police, the UK capital’s police force, has trialled facial recognition in public. The technology has previously been used at large events, including Notting Hill Carnival in 2016 and 2017, and Remembrance Day services last year. This year, the technology is being used Monday and Tuesday of this week in Soho, Piccadilly Circus, and Leicester Square — all major shopping areas in the heart of the city.

Cameras are fixed to lampposts or deployed on vans, and use software developed by Japanese firm NEC to measure the structure of passing faces. This scan is then compared to a database of police mugshots. The Met says a match via the software will prompt officers to examine the individual and decide whether or not to stop them. Posters will inform the public they’re liable to be scanned while walking in certain areas, and the Met says anyone declining to be scanned “will not be viewed as suspicious.”
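The workflow the Met describes — reduce each passing face to a numeric "faceprint," compare it to mugshot records, and alert an officer only if the similarity clears a threshold — can be sketched roughly as below. The vectors, names, and threshold are purely illustrative assumptions, not details of NEC's actual software.

```python
# Hypothetical sketch of threshold-based watch-list matching. A face scan is
# represented as a feature vector and compared against stored mugshot vectors;
# only a sufficiently high similarity flags the person for officer review.
# All numbers and identifiers here are invented for illustration.
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors, from 0 (unrelated) to 1 (identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(scan, watchlist, threshold=0.9):
    """Return (person_id, score) for the best match above the threshold, else None."""
    best_id, best_score = None, 0.0
    for person_id, mugshot_vector in watchlist.items():
        score = cosine_similarity(scan, mugshot_vector)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else None

watchlist = {"suspect_A": [0.9, 0.1, 0.4], "suspect_B": [0.2, 0.8, 0.5]}
print(match_against_watchlist([0.88, 0.12, 0.41], watchlist))  # close to suspect_A: flagged
print(match_against_watchlist([0.5, 0.5, 0.5], watchlist))     # below threshold: no alert
```

The key design point is the threshold: set it lower and more wanted people are caught but more passers-by are wrongly flagged; set it higher and the reverse holds. That trade-off is central to the error-rate controversy discussed below.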

An example poster of how the public is informed of the trials.
Credit: Metropolitan Police

Privacy advocates have come out strongly against the technology’s use in the UK. Big Brother Watch has described the Met’s justification for using facial recognition as “misleading, incompetent, and authoritarian.” Critics note that the police have not limited themselves to searching for wanted criminals, but also include so-called “fixated individuals” on their watch lists (typically this means people with mental health issues who may be obsessed with certain public figures).

The Met has also had to defend the high error rate of its technology. According to data released under the UK’s Freedom of Information laws, 98 percent of “matches” made by the Met using facial recognition were mistakes. Despite this, police commissioner Cressida Dick said in July she was “completely comfortable” with the trials.
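A figure like "98 percent of matches were mistakes" is less surprising than it sounds once base rates are taken into account: when wanted individuals are a tiny fraction of the scanned crowd, even a reasonably accurate matcher produces mostly false alarms. The numbers below are hypothetical, chosen only to illustrate the arithmetic, and are not the Met's own figures.

```python
# Illustrative base-rate arithmetic (all numbers hypothetical, not the Met's):
# the share of alerts that are wrong depends heavily on how rare the targets are.
def share_of_matches_that_are_wrong(crowd_size, wanted_in_crowd,
                                    true_positive_rate, false_positive_rate):
    true_alerts = wanted_in_crowd * true_positive_rate
    false_alerts = (crowd_size - wanted_in_crowd) * false_positive_rate
    return false_alerts / (true_alerts + false_alerts)

# 100,000 shoppers, 10 of them on the watch list, a matcher that catches 90%
# of them but misfires on 1% of everyone else:
print(round(share_of_matches_that_are_wrong(100_000, 10, 0.9, 0.01), 3))  # → 0.991
```

Under these assumed numbers, roughly 99 percent of alerts would be false, despite a 1 percent misfire rate per face — which is why critics focus on the proportion of wrong matches rather than per-scan accuracy.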

Worries in the UK mirror those in the US, where facial recognition is also being deployed by law enforcement and experts have voiced similar fears about unbridled use of the technology. The debate is now moving on to regulation, with even large tech companies like Microsoft calling for new rules on how and where facial recognition should be used.