Facebook is once again tweaking how stories spread through the News Feed, this time going after posts that are “sensationalist and provocative.” The goal isn’t just to reduce clickbait, but also to curb misinformation and problematic posts that don’t quite warrant an outright ban from the site.
In a blog post, Facebook CEO Mark Zuckerberg writes that people naturally engage more with sensationalist content. Engagement with this content, he says, increases the closer it gets to being so problematic that it has to be banned.
“Borderline content” like offensive speech will see less distribution
So instead of moving the line of what’s banned, Facebook is going to alter its distribution algorithms. Posts that Facebook’s AI detects as needlessly provocative will be distributed less and less, preventing them from seeing a spike in engagement.
Zuckerberg believes this will discourage people from creating and posting this kind of content in the first place, ultimately leading to a better experience for users and less polarization.
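The inverted curve Zuckerberg describes can be sketched as a toy scoring function — entirely hypothetical, since Facebook hasn’t published its actual formula. The idea: as a post’s “borderline score” approaches the policy line, a penalty that grows faster than the natural engagement boost pushes its distribution down instead of up.

```python
def adjusted_distribution(engagement, borderline_score):
    """Toy model of the down-ranking Zuckerberg describes.

    `borderline_score` is a hypothetical 0.0-1.0 measure of how close
    a post sits to the policy line (1.0 = just short of bannable).
    Engagement naturally rises near the line, so the penalty is made
    to grow sharply there, flipping the curve downward.
    All names and numbers here are illustrative, not Facebook's.
    """
    penalty = borderline_score ** 3  # grows steeply near the line
    return engagement * (1.0 - penalty)

# A tame post keeps nearly all of its reach,
# while a near-the-line post is demoted hard.
tame = adjusted_distribution(100, 0.1)
edgy = adjusted_distribution(100, 0.9)
print(tame, edgy)
```

The cubic penalty is one arbitrary choice among many; any function that rises faster than engagement does near the line would produce the downward-sloping tail in Facebook’s chart.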
Facebook uses the term “borderline content” to describe which stories will be affected here. The blog post gives two examples: “photos close to the line of nudity, like with revealing clothing or sexually suggestive positions” and “posts that don’t come within our definition of hate speech but are still offensive.” Zuckerberg only explicitly says that the algorithm has been adjusted when it comes to nudity, but the implication is that it applies to news stories and more.
“Divisive groups and pages” will also be suggested to users less often as a result of these changes.
“I believe these efforts on the underlying incentives in our systems are some of the most important work we’re doing across the company,” Zuckerberg writes.
Here is a completely non-scientific chart Facebook made to illustrate how this works:
I particularly appreciate the sketchiness of the arrow.
The changes were announced today alongside a host of other updates on how Facebook handles problematic content. Facebook said it would expand its appeals process for content moderation; the company will also create an independent oversight board meant to act essentially as a Supreme Court for challenging moderation decisions, partially taking the problem out of Facebook’s hands.