At one point in time, Facebook’s relationship with politicians was relatively uncontroversial.
But after the 2016 US elections, everything changed.
Early in the campaign, then-presidential candidate Donald Trump tested the limits of Facebook’s rules against hateful speech at the same time that the company became a vehicle of political exploitation by foreign actors.
Facebook’s first test: dealing with Trump’s 2015 Facebook post calling for a “total and complete shutdown” of Muslims entering the US. While some inside the company saw a strong argument that Trump’s comments violated Facebook’s rules against religious hate speech, the company decided to keep the post up. Until then, most Facebook employees had never grappled with the possibility that their platform could be used to stoke division by a candidate for the nation’s highest office.
“What do you do when the leading candidate for president posts an attack…on [one of] the biggest religion[s] in the world?” former Facebook employee and Democratic lobbyist Crystal Patterson told us.
And it wasn’t just national politicians that Facebook had to worry about, but foreign adversaries, too. Although Mark Zuckerberg initially dismissed as a “pretty crazy idea” the notion that fake news on the platform could influence elections, it soon became clear that propaganda from Russian Facebook accounts had reached millions of American voters — causing an unprecedented backlash that forced the company to reckon with its culpability in influencing global politics.
Over time, Zuckerberg would acknowledge Facebook’s role as “the Fifth Estate” — an entity as powerful as the government and the media in shaping the public agenda — even as he tried to minimize the company’s role in dictating the acceptable terms of political speech.
To offload the burden of political responsibility going forward, Facebook formed the Oversight Board, a Supreme Court-like body it set up to weigh in on controversial content decisions — including how to deal with Trump’s account. But the board is new, and we’re still learning how much power it has over Facebook. How much responsibility does Facebook still have to dictate the terms of its own platform? And can the board go far enough to change the social media platform’s underlying engine: its recommendation algorithms?
We explore these questions about Facebook’s role in moderating political speech in our fourth episode of Land of the Giants, Vox Media Podcast Network’s award-winning narrative podcast series about the most influential tech companies of our time. This season, Recode and The Verge have teamed up over the course of seven episodes to tell the story of Facebook’s journey to becoming Meta, featuring interviews with current and former executives.