Facebook plans to test how people respond to seeing fewer posts about politics in the News Feed. Starting this week, Facebook will “temporarily reduce” political posts for a “small percentage” of people in Canada, Brazil, and Indonesia, with a test in the US following some weeks later. The tests will continue for the next few months.
The experiment comes in response to feedback Facebook has (somehow just now) heard that “people don’t want political content to take over their News Feed,” Aastha Gupta, product management director at Facebook, wrote in a blog post this morning. The goal is to improve the News Feed by “finding a new balance of the content people want to see.”
Gupta says that political content makes up only about 6 percent of the typical News Feed right now in the US. Nonetheless, that small percentage has clearly had an outsized impact, prompting years of debate over the polarizing effects of serving people often-misleading political stories. Facebook has widely been blamed for stoking polarization and boosting far-right voices.
In October, Facebook said it would temporarily stop recommending civic and political groups to users in the US, and last month it said that change would become permanent. Facebook CEO Mark Zuckerberg said on a call with investors that the goal was to “turn down the temperature and discourage divisive conversations and communities.”
The tests around reducing political content overall are part of the same initiative. Zuckerberg said Facebook wants to allow political discussions and grassroots organizing to keep happening, but that users “don’t want politics and fighting to take over their experience on our services.”
Facebook says COVID-19 information from health agencies and authoritative groups like the WHO will be exempt from the reduced political distribution.
The tests come after years of frustrations, from pretty much all directions, around how Facebook handles political content. Facebook shut down an internal effort in 2018 to make its site less divisive, The Wall Street Journal reported last year. At the same time, Facebook was making choices of its own that seemed to up the divisiveness. In 2017, the company pulled back on reductions to political content distribution that would have had an outsized impact on right-wing sites, and instead put forward changes that more significantly limited the reach of left-wing sites, the Journal also reported.