The Facebook Oversight Board says it will review Facebook’s “cross-check” program for high-profile users, and it has criticized the company for not disclosing enough information about the program.
In a new transparency report, the Oversight Board — a semi-independent body that reviews Facebook’s moderation process — publicly accepted a Facebook request to review cross-check (sometimes called “XCheck”) and recommend changes. Facebook will also share documents related to reporting from The Wall Street Journal, which described cross-check as a system that “shields millions of VIP users from the company’s normal enforcement process.” The board previously raised concerns about cross-check following the Journal’s report.
“In the board’s view, the team within Facebook tasked to provide information has not been fully forthcoming in its responses on cross-check. On some occasions, Facebook failed to provide relevant information to the board, while in other instances, the information it did provide was incomplete,” the board report says. The board says its “credibility” depends on being able to trust that Facebook’s information is “accurate, comprehensive, and paints a full picture of the topic at hand” — and a lack of disclosure undermines that credibility.
The board calls it “not acceptable” that Facebook didn’t initially mention cross-check was a factor in its moderation of former President Donald Trump, who was banned for violating Facebook’s policies against inciting violence. It also says that Facebook should not have referred to cross-check as covering a “small number of decisions” when its true scope was clearly far wider.
Facebook asked the board for guidance on cross-check in late September, and the board says it will now reach out to academics, researchers, and former Facebook employees who have come forward as whistleblowers. It will also open a public comment period in the coming days, something it also did around the Trump ban.
The transparency report is the first of its kind, although the Oversight Board says it will now release these reports quarterly. It offers a sense of the board’s actual output compared to the massive demand for its services.
Facebook and Instagram users submitted around 524,000 total requests for review between October 2020 and June 2021. The requests were particularly concentrated in the US and Canada, which together accounted for roughly 46 percent of them. This doesn’t reflect the overall demographic makeup of Facebook, and the board says it will conduct more outreach to draw reports from other areas — saying that “if anything, we have reason to believe that users in Asia, Sub-Saharan Africa, and the Middle East experience more, not fewer, problems with Facebook than parts of the world with more appeals.”
Either way, that volume is orders of magnitude greater than the number of cases the Oversight Board actually takes up. It decided 11 cases in the first and second quarters of 2021, and its review process caused Facebook to identify 38 cases where it had made an incorrect moderation decision, apparently leading it to restore around 35 pieces of content.
The Oversight Board recommendations can have a larger indirect effect by leading Facebook to review and modify its policies — in at least one case, it also rediscovered a policy it had lost. And one of the transparency report’s themes is that Facebook should be more transparent with users as well. “In areas where we feel that Facebook is falling short, such as transparency, we will keep challenging the company to do better,” the report says.