Facebook is starting to share more about what it demotes in News Feed

A high-level look at what gets suppressed by Facebook’s algorithm

Illustration by Alex Castro / The Verge

The way that Facebook controls its News Feed is often controversial and largely opaque to the outside world.

Now the social network is attempting to shine more light on the content it suppresses but doesn’t remove entirely. On Thursday, Facebook published its “Content Distribution Guidelines,” detailing the roughly three dozen types of posts it demotes for various reasons in the News Feed, like clickbait and posts by repeat policy offenders. That process, which relies heavily on machine learning to automatically detect problematic content, effectively throttles the reach of offending posts and comments without the author knowing.

There’s still plenty that the guidelines, which Facebook has mostly confirmed in various reports over the years but is only now publishing in one place, don’t say. They don’t detail how a demotion works or how much it reduces a piece of content’s reach. Nor do they say how severely one kind of post, like a link to spam, is throttled in the News Feed relative to another, such as health misinformation.


“We want to give a clearer sense of what we think is problematic but not worth removing” because it doesn’t explicitly violate platform policy, Jason Hirsch, Facebook’s head of integrity policy, told The Verge. He said the company hopes to add more information to the guidelines over time, including how demotions throttle specific kinds of content relative to others. But he said Facebook likely won’t stack rank the severity of demotions “for adversarial reasons.”

Detailing the guidelines now could help Facebook avoid controversy the next time it stops a high-profile post from going viral, as it did when it suppressed a New York Post story about US President Joe Biden’s son, Hunter. The guidelines spell out that Facebook’s policy is to suppress stories that users have disputed as inaccurate — as was the case with the Post’s dubious reporting — until its network of third-party fact-checkers completes a review. That policy only became widely known a year ago, after critics accused the company of political bias for censoring the Post.

According to the distribution guidelines, other types of content Facebook demotes include links to spam sites, “low quality” comments that are either very lengthy with copied text or contain no words at all, posts in groups from accounts that share at a “very high frequency,” and news articles without a clear byline.

Releasing these guidelines is part of a bigger effort to disclose more about how the News Feed works to the public, according to Hirsch. Media outlets and politicians are increasingly examining Facebook’s negative effects on the world, and lawmakers in the US and elsewhere are looking to regulate how social media companies police their platforms.

Facebook recently put out its first quarterly report on the most-viewed content in the News Feed after journalists used its public-facing analytics tool, CrowdTangle, to show that right-leaning personalities often draw the most engagement on its service. Hirsch said that, while building the most-viewed data into a self-service, real-time tool like CrowdTangle would be a “huge investment of time and resources,” the company wasn’t opposed to doing it eventually.