Rite Aid used facial recognition in secret across hundreds of its stores

The drugstore chain used the tech predominantly in low-income and minority neighborhoods

Illustration by Alex Castro / The Verge

Drugstore chain Rite Aid secretly deployed facial recognition software across a network of security cameras in hundreds of locations in the US, according to a new investigation from Reuters published on Tuesday. The company had been doing so for more than eight years, and it only recently stopped using the technology, it told Reuters, following a “larger industry conversation” around facial recognition and the grave concern over privacy risks and racial discrimination it presents.

Yet Reuters says Rite Aid initially defended its use of facial recognition as a deterrent against theft and violent crime, insisting it had nothing to do with race. The investigation found that not to be entirely true. “In areas where people of color, including Black or Latino residents, made up the largest racial or ethnic group, Reuters found that stores were more than three times as likely to have the technology,” the report reads.

After presenting its findings to the company, Reuters says Rite Aid issued a new statement and said it had turned off its cameras. “This decision was in part based on a larger industry conversation,” Rite Aid said. “Other large technology companies seem to be scaling back or rethinking their efforts around facial recognition given increasing uncertainty around the technology’s utility.”

‘Reuters’ found that Rite Aid deployed the tech in predominantly minority neighborhoods

Concerns over the unregulated use of facial recognition in the US, both by law enforcement and private companies, have been steadily growing over the last few years, fueled by studies that show the tech in its current form to be inherently flawed and more likely to misclassify the gender and identity of Black individuals. Numerous companies have now publicly renounced the tech in one form or another. IBM says it will no longer invest in or develop the tech at all, while both Amazon and Microsoft say they are pausing facial recognition contracts with law enforcement until Congress passes laws regulating its sale and use. A number of municipal governments, like Oakland, California’s, have also begun banning police use of the tech.

A growing concern among activists, artificial intelligence researchers, and lawmakers is that the tech is being sold and used in secret, without oversight or regulation that might protect against civil rights abuses. Companies like Clearview AI — which was found to have been supplying a powerful facial recognition database and search tool to countless law enforcement agencies and private companies — have emerged as public faces of the threat the tech poses to privacy and other at-risk civil liberties. Now, it’s looking like even run-of-the-mill retail chains, like Rite Aid, might be using facial recognition in secret.

Of particular alarm in Rite Aid’s case is that the company used the tech of a vendor, DeepCam, with links to a Chinese firm, Reuters reports. Prior to that, Rite Aid used a company called FaceFirst, which, until 2017, did not rely on any form of artificial intelligence and as a result routinely misidentified people, often Black individuals, based on blurry photos its cameras captured, Reuters reports. The point of the whole operation, the report states, was to alert security personnel when someone who had previously exhibited criminal activity entered the store, so that the person could be asked to leave, the goal being to prevent theft or crime. But Reuters’ interviews with former employees and managers illustrate how the system was used to racially profile customers.

While Rite Aid would not say which stores were using the cameras, Reuters found them at 33 out of 75 Rite Aid locations in New York and Los Angeles from last October to this month. Rite Aid says it informed customers that the cameras were scanning their faces as they walked through the store, but the investigation found that appropriate signage was missing from at least a third of the locations making use of the facial recognition cameras.