Facebook program reportedly let celebrities avoid moderation

XCheck was meant to improve moderation

Facebook maintains an expansive program that exempts athletes, politicians, and other high-profile users from its typical moderation process, according to The Wall Street Journal. The program is reportedly meant to stop “PR fires,” or bad press caused by pulling down photos, posts, and other content from high-profile users that should have been allowed to stay up. In reality, the program just lets these users break the rules in ways that would have gotten most people into trouble, according to the report.

The program is known as XCheck, or “cross check,” and it’s ostensibly meant to provide additional quality control around moderation when it comes to high-profile users, according to the Journal. Posts from users flagged for XCheck are supposed to be routed to a set of better-trained moderators to ensure Facebook’s rules are properly enforced. But the program reportedly protected 5.8 million people as of 2020, and just 10 percent of posts that hit XCheck actually get reviewed, according to a document seen by the Journal.

Facebook says it’s “working to address” issues with XCheck

High-profile users protected by the program include former President Donald Trump, Donald Trump Jr., Senator Elizabeth Warren, and Candace Owens, according to the report. Users are usually unaware that they’re being given special treatment, the report says.

Facebook told the Journal that criticism of XCheck was warranted and the company is working to fix the program. The system is meant to “accurately enforce policies on content that could require more understanding,” a spokesperson said. They added that “Facebook itself identified the issues with cross check and has been working to address them.”

Andy Stone, Facebook’s policy communications manager, later responded to the Journal’s story on Twitter. Stone said that the system had been disclosed in the past, linking to a 2018 blog post where Facebook explains it has a “cross check” system to offer a “second layer of review” for high-profile accounts.

While the Journal’s added details don’t look great for Facebook, which has promised even enforcement of its rules, there’s a level on which none of this is particularly surprising. Facebook has a long and detailed set of moderation policies. But it’s always been clear that those policies are enforced at Facebook’s discretion, with leeway often granted to major names or questionable content when removal might lead to problems for the company. With the Journal’s report, it’s evident that in some cases, Facebook’s own system, by design or not, is helping keep some of those posts online.

Correction September 13th, 3PM ET: In a response after publication, Facebook pointed to a prior disclosure of its cross-check program offering some details of how it worked. References to the program being a secret have been removed from this story.