
Clearview’s facial recognition tech is illegal mass surveillance, Canada privacy commissioners say


One commissioner said the software puts people ‘continually in a police lineup’


Illustration by Alex Castro / The Verge

Clearview AI’s facial recognition amounts to mass surveillance and the company should delete the faces of Canadians from its database, Canada’s privacy commissioners said Wednesday.

Commissioner Daniel Therrien said what Clearview does — scraping photos from social media and other public sites for use by law enforcement — is “illegal” and creates a system that “inflicts broad-based harm on all members of society, who find themselves continually in a police lineup.”

The commissioners released a report that follows a yearlong investigation by several Canadian privacy agencies into Clearview’s practices, which found the company had collected highly sensitive biometric information without consent and that it “used and disclosed Canadians’ personal information for inappropriate purposes.”

According to the investigators, the company maintains that Canada’s privacy laws don’t apply to Clearview since it does not have a “real and substantial connection” to the country, and that consent from individuals was not required because the information it collected was publicly available.

Clearview’s database of some 3 billion images was revealed in a New York Times investigation in January 2020. In addition to raising serious concerns about privacy, the company’s practice of taking images from social media violated the platforms’ rules, and tech platforms sent cease-and-desist letters to Clearview in the wake of the report.

Dozens of law enforcement agencies in Canada had used the software

Clearview CEO Hoan Ton-That told the Times that the company had ceased operations in Canada last July due to the investigation. He added that Clearview was not planning to delete Canadians’ images from its database, but said people could request their data be removed using an opt-out form.

The Royal Canadian Mounted Police was among the dozens of law enforcement and other organizations in Canada that had paid for Clearview’s services.

The commissioners don’t have the authority to fine companies or order them to leave Canada, but they sent a letter of intention to Clearview telling the company to stop scraping images of Canadians’ faces, stop offering its facial recognition product in Canada, and to delete previously collected images of Canadians.

It’s not the first time Clearview has been ordered to remove images from its database. The company terminated all its contracts in the state of Illinois last May — the majority of which were with law enforcement agencies — following a lawsuit alleging Clearview had violated the state’s Biometric Information Privacy Act. Clearview also said at the time it was canceling the contracts of any non-law enforcement entities; a report from BuzzFeed News found that Clearview’s list of private clients had included Bank of America, Walmart, and Macy’s.

Clearview AI has said it plans to challenge the decision in court. In a statement emailed to The Verge, Doug Mitchell, an attorney for Clearview AI, likened the company to Google. “Clearview AI is a search engine that collects public data just as much larger companies do, including Google, which is permitted to operate in Canada,” he said.

Mitchell added that while Clearview’s technology is not available in Canada and does not operate in Canada currently, its practice of collecting public information from the internet is “explicitly permitted” under Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA).

“The Federal Court of Appeal has previously ruled in the privacy context that publicly available information means exactly what it says: ‘available or accessible by the citizenry at large.’ There is no reason to apply a different standard here,” Mitchell said.

Use of Clearview AI’s facial recognition among law enforcement spiked 26 percent on January 7th, one day after a mob of rioters attacked the US Capitol, with several police departments using Clearview software to assist the FBI with identifying the rioters. Clearview also has contracts with US Immigration and Customs Enforcement and the Department of Homeland Security.

Update February 4th, 11:16AM ET: Added statement from Clearview AI attorney.