
Facebook partners with fact-checking organizations to begin flagging fake news

Plus new tools for reporting hoaxes

Facebook today began rolling out new tools designed to prevent the spread of misinformation, weeks after the US presidential election results raised new questions about how viral hoaxes may have contributed to Donald Trump’s victory. Facebook is introducing tools designed to make it easier to report links shared in the News Feed as fake news, and it’s working with four independent fact-checking organizations to assess the accuracy of viral stories. Facebook users who try to share a story that has been marked as false will be warned that “independent fact-checkers have disputed its accuracy.”

To start, Facebook is working with Snopes, PolitiFact, and ABC News, among others, and it plans to add more partners over time. All are members of the Poynter International Fact-Checking Network and have agreed to abide by a common set of principles. Together, Facebook and the news organizations will attempt to identify fast-spreading hoaxes and discourage users from sharing them.

Fact-checking groups can mark stories as disputed and link to their own articles explaining why

For its part, Facebook will use a variety of signals to identify stories that are likely to be false. These signals include stories that people post but later delete, and stories that include lots of comments about them being fake. Facebook says it will use these and other signals to populate a dashboard of dubious stories. Its fact-checking partners will get access to the dashboard. After they investigate the article’s claims, they can mark it as disputed and link to their own article debunking it.

If at least two fact-checking organizations mark a story as disputed, users will begin seeing a banner under the article when it appears in their News Feed. The banner reads: “Disputed by 3rd Party Fact Checkers,” with links to the debunking articles below it. Facebook says it will also penalize disputed articles in ranking so they show up lower in the News Feed.

If a user decides to share the link anyway, a pop-up dialog will warn them the article’s content has been disputed by its fact-checking partners. Users can still tap “continue” and share the article if they want to. But disputed articles will not be eligible to be promoted using one of Facebook’s advertising tools.
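The workflow described above — a story flagged only once at least two fact-checkers dispute it, then banner, down-rank, share warning, and ad block — can be sketched in a few lines. Everything here (the data shapes, function names, and treatment labels) is illustrative pseudocode, not Facebook's actual implementation:

```python
# A minimal sketch of the dispute workflow, assuming a simple dict per story.
# Only the two-checker threshold comes from Facebook's description; the rest
# of the names and structures are hypothetical.

DISPUTE_THRESHOLD = 2  # at least two fact-checking partners must flag a story

def is_disputed(story):
    """A story is treated as disputed once two independent checkers flag it."""
    return len(story["disputed_by"]) >= DISPUTE_THRESHOLD

def feed_treatment(story):
    """Return the News Feed treatments applied to a disputed story."""
    if not is_disputed(story):
        return []
    return [
        "show_banner: Disputed by 3rd Party Fact Checkers",
        "link_debunking_articles",   # checkers' own articles appear below
        "rank_lower_in_feed",
        "warn_before_share",         # user may still tap "continue"
        "block_ad_promotion",        # disputed links can't be promoted as ads
    ]

story = {"url": "example.com/hoax", "disputed_by": ["Snopes", "PolitiFact"]}
print(feed_treatment(story))
```

Note that the threshold means a single dissenting fact-check is not enough to trigger the banner, which limits how much weight any one partner's judgment carries.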

The system includes a light layer of intervention by Facebook employees, who will be tasked with separating personal posts from links that present themselves as news. The employees will not make judgments about the content of the articles, Facebook said; their job is limited to checking the domain of a posted link to confirm it points to a news-style site rather than a personal post.

“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully,” said Adam Mosseri, who leads product management for the News Feed, in a blog post. “We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations.”

“We’ve focused our efforts on the worst of the worst.”

Facebook also said it is testing several new ways to more easily report a hoax. Users could previously report false links through a three-step process that required reporting a link, selecting the “I think it shouldn’t be on Facebook” option, and then choosing “It’s a false news story” from among the options. Facebook says getting more reports from users will help it detect more hoaxes, although critics have argued that users often are not in a position to assess the validity of the links they see.

The company said it would do more to eliminate the financial incentives behind fake news, which is often created as an easy way to generate ad revenue. Facebook will now prevent publishers that use spoof domains mimicking legitimate news sites from buying ads on the platform to generate traffic. And the company said it is “analyzing publisher sites to detect where policy enforcement actions might be necessary,” though it did not elaborate.

The changes introduced today won’t eliminate all misinformation from the News Feed. As Mark Zuckerberg has noted, even articles from reputable publications still routinely contain errors. But these changes may begin to put the brakes on links of the “Hillary Clinton is a lizard person” variety. And crucially, from Facebook’s perspective, they will do so without Facebook having to weigh in as a company on the relative truth of a story.

It’s a start. “We know there's more to be done,” Mosseri said in his blog post. “We're going to keep working on this problem for as long as it takes to get it right.”

Update, December 15th, 1:33PM ET: Mark Zuckerberg had some thoughts as well.