Facebook admits what we all know: that social media can be bad for democracy

‘I wish I could guarantee that the positives are destined to outweigh the negatives.’

Illustration by Alex Castro / The Verge

Facebook’s ongoing attempt to reckon with its impact on civic life continued today with the company acknowledging that its platform is not always good for democracy.

In a set of blog posts published as part of its “Hard Questions” series, Facebook execs and outside experts assess the company’s impact on elections, partisan politics, and fake news. As ever, Facebook tempers its self-criticism, referring, for example, to “the damage that the internet can do to even a well-functioning democracy” (our emphasis) rather than to damage caused by Facebook specifically. But it does admit to a sliver more responsibility, taking the company one step further from CEO Mark Zuckerberg’s comments in 2016 that it was “crazy” to say Facebook influenced the US election.

The 2016 US elections were a wake-up call for Facebook

As Facebook’s global politics and government outreach director Katie Harbath tells it, this was the moment the company began to recognize its influence on democracy, for better or for worse. “From the Arab Spring to robust elections around the globe, social media seemed like a positive,” writes Harbath. “The last US presidential campaign changed that,” from foreign interference that Facebook should have been quicker to identify to the rise of “fake news” and echo chambers.

In another post, Facebook’s product manager for civic engagement, Samidh Chakrabarti, expands on these issues. He points out many positives — that the company helps keep people informed about politics, and that it’s a venue for debate — but cautions that the company will never be able to completely stamp out its problems. On the spread of fake news and misinformation on Facebook, he writes: “Even with all these countermeasures, the battle will never end.”

Since November 2016, Facebook has moved to address these issues in concrete ways. This month, the company started to reengineer the News Feed, demoting content from news outlets in favor of activity from friends. It’s also going to start polling users on which sources they trust. “We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being,” said Zuckerberg.

Could Facebook’s solutions make things worse?

Arguably, though, these moves could also exacerbate existing problems. If users see less news from professional outlets, some reports suggest they will be more likely to share sensationalized stories. And if people are given the task of judging which outlets they find trustworthy, what’s to stop them simply voting in line with sites that support their worldview? This perpetuates the problem of polarization and “echo chamber” politics — which Cass Sunstein, a professor at Harvard Law School, calls “a nightmare” in a blog post published today for Facebook.

It’s also important to note that although much of Facebook’s attention is focused on the US and the influence of Russia on the 2016 election, in other parts of the world the situation is more dire. A recent report from BuzzFeed in Cambodia illustrated Facebook’s problematic role in politics, with the country’s authoritarian prime minister Hun Sen (who banned the main opposition party last year) using the site to push pro-government messages while identifying, and often jailing, critics.

As Facebook’s Chakrabarti writes: “If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent — both good and bad [...] I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t.”