Facebook may have put new tools in place that allow third-party checkers to flag dubious news stories, but that doesn’t mean that the social network will try to stop people from sharing “fake news.”
In an on-stage interview today during Recode’s Code Media conference in Southern California, Facebook vice president of partnerships Dan Rose said that, “at the end of the day, if people want to share stories that have been flagged with their friends, that’s ultimately their prerogative.”
“We are making a very important point of not putting ourselves in a position of deciding what’s fake and not fake. I don’t think people want us to be the arbiters of truth,” Rose said, echoing what Facebook chief executive Mark Zuckerberg has said himself in shared posts about the “fake news” phenomenon. “There are third parties out there who do this for a living.”
Facebook’s “not it” approach to managing the dissemination of dubious or fabricated news stories is not entirely surprising. It’s not always clear which stories are fake at the moment they’re published, and to block them from the feed entirely would subject Facebook to cries of censorship. Facebook also has a vested interest in keeping its two billion users around the globe sharing as much content with each other as possible, even while it’s trying to figure out best practices around false news.
But Facebook, along with Twitter and Google, drives a disproportionate amount of news consumption. According to Pew Research Center, the majority of US adults now get their news from social media sites, and most of them — 64 percent — get their news from one site only: Facebook.
The issue of “fake news” emerged with a vengeance after last year’s US presidential election, when the spread of viral hoaxes may have contributed to Donald Trump’s victory over Hillary Clinton. (Mark Zuckerberg has rejected this idea, calling it crazy.)
In recent months Facebook has made some attempts to curb the problem: it partnered with fact-checking organizations, like Snopes, Politifact, and FactCheck.org, and created a tool that people can use to flag suspicious news content. If a fact-checker confirms a story is a hoax, Rose said, a prominent label is applied to it, and anyone who goes to share it on Facebook will also see that it has been disputed.
Facebook is also making efforts around media literacy, having recently supported a PSA by The News Literacy Project designed to help people become more skeptical news consumers. Like Apple CEO Tim Cook’s recent remarks on the need to educate the modern media consumer, Facebook’s Rose thinks that media literacy is “a skill that people are going to need to have, and it’s a skill that we’re committed to.”
In other words: it’s not a Facebook problem, at least right now, if fake news is being spread on the site; it’s something consumers are supposed to be able to pick up on.
Not it, indeed.
Comments
Seems legit.
By SpyderMS on 02.14.17 3:34pm
… by inundating you with nonsense until you just stop paying attention.
By superberg on 02.14.17 3:35pm
You can lead an American to media literacy but you can’t make them critical.
By 4ndrew on 02.14.17 5:02pm
I think the problem goes deeper than that.
With all the metrics, data and statistical analysis, news outlets are able to find the best way to provoke a reaction out of their readers. It’s not just readers being uncritical, but rather being manipulated through base human emotions and instincts. Even otherwise rational people can act very irrationally when you push the right buttons.
By OpssYourBad on 02.14.17 5:27pm
I think you just nailed the plot for the movie Zootopia with that last sentence.
By chad.preslar on 02.14.17 5:35pm
I won’t disagree with your point, but the basis of media literacy is the ability to take a step back and question why something is being presented in a certain context. Appeals to emotion shouldn’t work so well in a just democracy.
By 4ndrew on 02.14.17 5:37pm
There isn’t anything inherent in a democracy that says the mob will act rationally (and there’s always the problem of asymmetric information). If you haven’t already, check out the book Predictably Irrational when you have the chance.
I agree with you that more critical thinking is needed in society, but 1) there’s something paradoxical about "teaching" critical thinking and 2) it’s helpful to recognize that the people who believe in mainstream media are not stupid or willfully ignorant, it’s just that modern media is powerful, pervasive and has access to a historically unprecedented amount of data.
By OpssYourBad on 02.14.17 6:17pm
How dare you. You can’t lead an American anywhere. We love FREEDOM.
By superberg on 02.14.17 5:27pm
They could at least tweak their bubble-supporting algorithms.
By woef on 02.14.17 3:53pm
Agree with Facebook. The idea that they’re responsible for enforcing a standard of truth on what their members post is idiotic in the extreme.
I get that there’s a lot of liberal angst over Trump becoming President, and that this kind of lashing out and blaming everything under the sun, rather than accepting that democracy is messy, is the millennial method of coming to terms with reality. But there’ll be new buzzwords and hashtags for people to congregate around as soon as social media finds another cause to trumpet, so Facebook is probably right not to react in a purely knee-jerk manner.
By fueledbygin on 02.14.17 4:21pm
Great post. The Sore Loser Effect hasn’t completely worn off yet. No one has proven that a single person changed his or her vote because of a post on FB (or any other website or app).
And the venerable BBC still shows pictures of 12-year-old Trayvon Martin. I haven’t heard the Left asking that the BBC news links be removed.
One man’s ‘fake news’ is another’s satire.
By TheMarkS on 02.14.17 7:22pm
Despite your condescending attitude towards an entire generation of people, I agree with Facebook too. That’s coming from a so-called entitled liberal millennial. Fake news is bad for everyone. Why do you need to make this a partisan issue?
By Disdain on 02.15.17 7:59am
I totally agree with this: they will flag something that is clearly fake, but they won’t try to censor anyone or decide what people get to share.
By lucas_fster on 02.14.17 4:28pm
"I get my news from facebook" – said every millennial ever.
By chad.preslar on 02.14.17 5:33pm
lol, millennial here; I use Facebook about once a month, and I, at the very least, don’t get my news from Facebook.
By AJduckfan on 02.14.17 11:37pm
Not millennials, but dumb baby boomers and Gen X’ers. Millennials know how to parse through the bullshit, while the older generations share all the bullshit imaginable on Facebook.
By Deckard Cain on 02.15.17 6:43am
It’s funny how the exact same generation that told mine not to believe everything on the internet is now believing everything it reads on the internet, on Facebook no less.
By Echo_One on 02.14.17 5:53pm
Such a horrifying statistic!
By web0rama on 02.14.17 6:14pm
Why is it that a television station is held liable for what it airs, but a website isn’t liable for the content posted to it?
By bradddddddd on 02.14.17 6:42pm
Facebook doesn’t produce content; it provides a platform for users to share their own. To use your analogy, Facebook is more like the cable company than the television station. Would you hold Comcast accountable because ABC aired something untrue?
Also, what liability are you talking about? Is fake news illegal now? Yes, there are libel laws, but those cover only a narrow slice of false information. If I say the US government is comprised solely of toads, what law am I breaking?
By 4ndrew on 02.14.17 8:08pm
Maybe I’m missing the point, but I almost think you are. The market doesn’t punish a website for disseminating false information the way it punishes a news organization. Quite the contrary, actually: in the system we have set up, the more outrageous (albeit still believable) the information a website peddles, the more the market rewards it in ad revenue.
I wonder if this disparity is a consequence of a fledgling versus a developed industry. In a developed industry the players are consolidated and serve a broad audience; an established player that spouted outrageously false information would lose a large share of its viewers (say, 50 percent of its audience). In a fledgling industry, a company can get away with spouting false information and even carve out a niche for it, since it can serve a very focused, specific set of customers. Specialized niches are good in most industries, but I’m not so sure information dissemination is one of them.
By ben.beck.148 on 02.14.17 10:18pm
Taking an Econ 101 class doesn’t mean you should apply those concepts to every topic. This is not about markets. In case you missed my previous argument, Facebook is not producing any content. The "market" doesn’t hold them accountable for your cousin sharing Breitbart articles because they aren’t the ones producing them. Not legally, not ethically, not in any way. It WOULD be an ethical problem if they decided to be the arbiters of Truth and started censoring users’ posts based on what they consider to be True.
I also take issue with your argument that only large news agencies should be trusted. The larger the news agency, the more likely it is to be compromised by status-quo bias. Challenge the authority? You lose access. Challenge your audience’s opinions? You lose part of your audience. Both are bad for business, but good for honest reporting.
By 4ndrew on 02.15.17 1:29pm