Facebook’s willing to reform its controversial cross-check program — but only parts of it


The company has finally responded to the Oversight Board’s recommendations to fix its cross-check program, which shields high-profile figures from Meta’s automated moderation system.

Illustration by Nick Barclay / The Verge

Meta has agreed to modify Facebook and Instagram’s cross-check program, which exempts high-profile users from the company’s automated moderation system. In an updated blog post published Friday, the company shared its response to the Oversight Board’s recommendations, stating it will make the cross-check system “more transparent through regular reporting” as well as tweak the criteria it uses to add people to the program “to better account for human rights interests and equity.”

The Oversight Board, the “independent body” that reviews Meta’s content moderation decisions, made a total of 32 recommendations last December on how Meta can improve its cross-check program. Meta has opted to fully implement 11 of those recommendations, while partially adopting 15.

Facebook and Instagram’s cross-check program came under fire after a 2021 report from The Wall Street Journal revealed that Meta had been using it to shield politicians, celebrities, and popular athletes from its automated moderation system. According to Meta, the system lets the company apply “additional levels of human review” to posts shared by high-profile figures in an attempt to avoid wrongly removing them.

The Oversight Board criticized the program, stating that it “appears more directly structured to satisfy business concerns” than to further the company’s “human rights commitments,” as Meta had previously claimed. As part of its response, Meta agreed to implement recommendations that require it to take immediate action on cross-checked content “identified as potentially severely violating.” It also committed to reducing the cross-check program’s backlog, an issue the Oversight Board found could cause harmful content to stay online longer than it should.

However, Meta is still “assessing the feasibility” of a rule that would allow figures to opt out of the cross-check program, and it isn’t going through with five recommendations, including a suggestion to “publicly mark” some of the figures benefitting from the program. It also rejected the Oversight Board’s recommendation to notify users that it might take longer for Meta to take action when they report a post from someone in the cross-check program. You can read the full list of recommendations and Meta’s response to each here.

While the Oversight Board called Meta’s response a “landmark moment” in a thread on Twitter, it isn’t completely satisfied with the changes the company is willing to make. “Several aspects of Meta’s response haven’t gone as far as we recommended to achieve a more transparent and equitable system,” the Oversight Board writes. “Meta declined the Board’s suggestion that deserving users be able to apply for the protections afforded by cross-check... We will continue to react to Meta’s specific responses in the days and weeks to come.”