Facebook sees a rise in misinformation as reports warn of incitement

The company plans to demote election misinformation on Facebook and Instagram

Illustration by Alex Castro / The Verge

Facebook is seeing an increase in worrying activity around the US election, according to reports from The New York Times and BuzzFeed News. According to BuzzFeed News, which viewed a post on Facebook’s internal message board, the company has been tracking a rise in “violence and incitement trends” associated with hashtags and keywords.

Facebook did not respond to a request for comment about a potential rise in the organization of violent activity on its platform by the time of publication, but Facebook spokesperson Liz Bourgeois told BuzzFeed News that the company is “staying vigilant in detecting content that could incite violence during this time of heightened uncertainty.” In a statement given to The Verge and others, Facebook said the company is seeing a rise in misinformation about the election. “As vote counting continues, we are seeing more reports of inaccurate claims about the election,” a Facebook spokesperson said in a statement to The Verge. “While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people.”

The company had reportedly planned to deal with the spread of viral misinformation about the election using tools designed for “at-risk” countries such as Sri Lanka and Myanmar. Earlier on Thursday, Facebook took down a group with more than 300,000 members called “Stop the Steal” that was organizing rallies to challenge election results.

Facebook plans to institute “demotions for content on Facebook and Instagram that our systems predict may be misinformation, including debunked claims about voting,” the spokesperson said. “We are also limiting the distribution of Live videos that may relate to the election on Facebook.” The company will also make it harder to share content as soon as today, according to The New York Times. That means, for example, Facebook could require an extra click when sharing content.

A lot of that misinformation is coming directly from President Donald Trump. In one post, Trump claimed that any vote coming in after Election Day can’t be counted; Facebook applied a label to that post directing readers to the company’s voting information center. In another, he demanded that election officials stop counting votes, which Facebook also labeled. It’s unclear whether Facebook has limited the reach of those posts or others that have received labels.

Like Facebook, Twitter has taken action on a number of Trump’s tweets since the election, often placing tweets entirely behind a label and preventing replies or retweets of those tweets. (Quote tweets of the labeled tweets are still allowed, however.)

Major news organizations have not yet called a winner in the presidential election. Results are still expected from Arizona, Alaska, Georgia, Nevada, North Carolina, and Pennsylvania, where votes continue to be counted.