The weeks since the 2016 presidential election have put Facebook under the spotlight for its role in the circulation of fake news articles, with President Barack Obama weighing in during a press conference earlier this week. While CEO Mark Zuckerberg has made an effort to sidestep the blame leveled against his company for the rise of these articles, he has said that the company has more work to do in combating misinformation.
In a post to Facebook last night, Zuckerberg outlined the steps that Facebook is taking to limit the spread of false information, but reiterated his belief that the company should not become the "arbiters of truth."
Some of the projects currently underway at Facebook include new systems to help flag false information, better ways for people to report misinformation, warnings shown to users once fake articles are reported, better recommendations when an article is clicked on, and updated ad policies to discourage spam sites that profit from the exposure. He also noted that Facebook would work with journalists and "respected fact checking organizations" to understand how they verify information, so that the company can learn from their efforts and experience.
While Zuckerberg outlined a number of fairly straightforward steps to limit fake articles, he did not indicate any sort of timeline for when, if ever, these changes would roll out to regular users, noting that "some of these ideas will work well, and some will not." He also explained that the company has used feedback from users and other signals to weight flagged content appropriately, so that it doesn't spread as quickly through users' news feeds.
However, Facebook has had numerous problems with these sorts of automated systems in the past: after firing its entire Trending Topics editorial team earlier this summer, the site repeatedly surfaced fabricated stories, giving them considerable visibility. While Zuckerberg tries to keep his company out of a philosophical argument, it's clear that the automated systems Facebook currently uses are flawed, and the spread of false information over the course of the election shows that considerable challenges lie ahead for the company.
We’ve reached out to Facebook for comment, and will update this post once we hear back.
Update, November 19th, 2:52PM ET: A representative from Facebook declined to comment further, noting that "right now we don't have anything more to share beyond the post from Mark."