Meta’s Oversight Board — an independent panel the social media giant selected to review its content moderation decisions — has published a report that supports the moderation actions Meta took at the height of the covid pandemic. The board is also recommending a number of changes to Meta’s misinformation policy while highlighting the company’s failure to assess the impact its social media platforms had on public health and human rights.
The board is urging Meta to commission an impact assessment focused on how design features like Facebook’s News Feed recommendation algorithms can amplify dangerous health-related misinformation. This includes publicly releasing any prior research the company has conducted into the matter. During the height of the covid pandemic in 2021, the White House criticized Facebook for allowing vaccine misinformation to spread, accusing the platform’s algorithm of boosting false information over accurate content.
The report advises Meta to publish information on government requests to remove covid-related content
The panel also recommends that Meta maintain its existing covid misinformation policy but be more transparent when removing content across its platforms. Within the report, the board advises Meta to publish information on government requests to review content regarding public health emergencies amid concerns that the covid pandemic has been used to “erode the tenets of democracy.”
In addition to calling for greater transparency about removal decisions, the Oversight Board recommends that Meta provide greater support for independent research into its platforms.
The Oversight Board’s report follows an investigation requested by Meta in July 2022 to assess whether it should be less restrictive when removing false covid-related content or cease removing such misinformation entirely to “better align with its values and human rights responsibilities.” Meta’s current misinformation policy outlines that content is subject to removal if it contributes to the “risk of imminent physical harm,” could interfere with political processes, or constitutes “certain highly deceptive manipulated media.” It also includes specific exclusions designed to prevent certain types of covid-related content from being removed, such as jokes that “only Brad Pitt’s blood can cure covid.”
The board recommends leaving Meta’s current policies on covid misinformation in place based on the WHO’s ongoing health emergency declaration
In its findings, the Oversight Board advises that Meta keep its current policy in place as long as the World Health Organization continues to declare covid an international public health emergency. This recommendation was influenced by “Meta’s insistence that it takes a single, global approach” to any policy changes, because Meta claims it lacks the capacity to enforce adjustments localized by country or region.
Meta has since issued a statement to The Verge responding to the Oversight Board’s report: “We thank the Oversight Board for its review and recommendations in this case. As Covid-19 evolves, we will continue consulting extensively with experts on the most effective ways to help people stay safe on our platforms.”
Update: April 20th, 2:05PM ET: Article updated to include Meta’s statement responding to the Oversight Board’s report.