
The author of The Filter Bubble on how fake news is eroding trust in journalism

‘Grappling with what it means to look at the world through these lenses is really important to us as a society’

In the aftermath of a US presidential election that seemed to shock at least half the country, many Americans are asking themselves how they missed the popularity of Donald Trump. One answer is a concept known as the filter bubble: the idea that personalization tools from companies like Facebook and Google have isolated us from opposing viewpoints, leading conservatives and liberals to feel like they occupy separate realities.

The concept was popularized by Eli Pariser, co-founder of Upworthy, who wrote a best-selling book about the subject in 2011. In it, he argued that the internet is highly effective at bringing like-minded groups of people together — and terrible at creating space for differently minded people to debate.

On Tuesday, Upworthy organized an "empathy lab" in New York City to show visitors its progressive-minded videos and measure their reactions. As it wound down, we caught up with Pariser to ask him about surging interest in filter bubbles — and what Facebook, in particular, ought to do about them.

"I’m encouraged that people are talking about the filter bubble and the role social media played."

"I’m encouraged that people are talking about the filter bubble and the role social media played," Pariser told me. "Because even if I think that it is not the primary factor in this election, grappling with what it means to look at the world through these lenses is really important to us as a society."

This interview has been edited and condensed for clarity.

In the run-up to this election, were you thinking about filter bubbles and how they might affect the outcome?

It’s something I’ve been thinking about for the last five years. After Brexit, there was a whole conversation in the UK about the filter bubble, for very similar reasons. A lot of us were looking at that whole phenomenon saying, I wonder if that’s going to happen here?

What role do you think the filter bubble played in the election?

There are a couple of contrarian, level-setting things I need to say.

One bubble we all live in is the idea that social media is a primary source of news for most people. In fact it’s still the case in 2016 that most Americans get their news from local TV news, according to Pew. So I actually think it’s very hard to attribute the results of this election to social media generally or the filter bubble in particular.

The filter bubble contributed to liberals and coastal folks being very surprised at the outcome. But that’s different from saying that it drove the results or changed the outcome.

"Extremely slanted news is eroding the authority that news has."

I think on the second piece, however, there is something going on with the way that fake news and extremely slanted news is eroding some of the authority that news has, and the trust people place in it. It’s much easier than it’s ever been to live in an information environment that is several standard deviations from normal. It changes the cultural conversation for everyone. I do see that happening in 2016.

I don’t think the effect is as direct as "Millions of people believe X because they saw this fake article on FB." I think it’s more subtle: thanks to social media, millions of people believe what they hear from many outlets less than they otherwise would, and are more willing to believe some pretty far-out ideas than they otherwise would.

But I think you can over-extrapolate. There are a number of factors I would put as more important than the shape of social media, even though that’s my own preoccupation.

It does seem that Facebook is at risk of being scapegoated here, in a way that obscures the nuances of what’s happening.

I’m totally with the people who say Facebook needs to take its responsibility as arbiter of truth and an editor more seriously. I believe that, too. I just think the argument that Facebook won the election for Trump is overly simplistic.

"The argument Facebook won the election for Trump is overly simplistic."

How might you write The Filter Bubble differently if you wrote it today?

It’s sort of fascinating how little has changed in some ways. We think of technology moving so quickly, but really Google and Facebook continue to be the dominant platforms — and they continue to personalize even more. The big thing that has changed is that Facebook, even more than it was then, is unprecedented in the history of the world as a place where people spend their media time. It’s much bigger now than it was five years ago.

What responsibility do you think Facebook has around the filter bubble — a concept whose existence the company still basically denies? Are there things you’d like to see them do?

If they are aggregating an enormous percentage of the world’s attention, and building systems that decide where it is spent, responsibility naturally comes with that. That responsibility is to make sure that if people come to Facebook to find news, they do find the truth.

"Where there’s a will, there’s a way."

I think there are a whole bunch of ways that people can start to take a whack at that. On the fake news question, the funny thing about that is that [Google’s] PageRank is a pretty good system for assigning webpages authority without having to pick winners and losers. From a technical standpoint, you could just build something like that in and cut down on the Macedonian spam sites pretty quickly.
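To make the PageRank idea concrete, here is a minimal sketch of the kind of link-based authority scoring Pariser is gesturing at: a simple power-iteration PageRank over a toy link graph, written in Python. The page names, damping factor, and iteration count are illustrative assumptions, not anything Google or Facebook actually runs; the point is only to show how link structure alone can assign authority without anyone hand-picking winners and losers.

```python
# Minimal power-iteration PageRank over a toy link graph.
# The graph, damping factor, and iteration count are illustrative
# assumptions only -- not Google's or Facebook's actual ranking code.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    # Collect every page that appears as a source or a target.
    pages = set(links) | {t for targets in links.values() for t in targets}
    links = {p: links.get(p, []) for p in pages}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        # Every page gets a small "teleport" share, then link shares on top.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy example: a site that nothing links to ends up with little authority.
toy_graph = {
    "outlet-a.example": ["outlet-b.example"],
    "outlet-b.example": ["outlet-a.example"],
    "spam.example": ["outlet-a.example"],
}
print(pagerank(toy_graph))
```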

Where there’s a will, there’s a way. It’s not an unsolvable problem. And it doesn’t require Facebook to say "The New York Times is good and another paper is bad." The dodge that "it’s just code, and we’re not really responsible for it" is getting pretty old.

There are still those who say the filter bubble doesn't exist. Facebook published a study of its own data in Science saying the effects are overstated. How do you respond?

People have pointed out some of the challenges with that study. But even if you accept it as valid, the study says it’s a real effect. The algorithm increases people’s chances of coming into contact with information that reinforces their political point of view.

The effect of the algorithm is as strong as an individual’s own choices about which links in the News Feed to click. If you think you’re doing something important by looking at your News Feed and deciding which items to click, the algorithm exerts an equally strong pull. And that pull is toward politically aligned information. So even if you accept that the study is true, that’s pretty striking.

"Who you’re friends with is the most important factor in what you see."

And it’s also a problem that Facebook could solve if it wanted to. Even if you accept, and I mostly do, the premise that who you’re friends with is the most important factor in what you see, the algorithmic piece is the piece that Facebook clearly has control over. And we know, from Facebook’s own research, that it has this narrowing effect.

I always find it unconvincing when a company that is as enormous as Facebook diminishes itself to say, "Oh well, we’re just a tiny piece of this whole thing, we couldn’t possibly have an impact." It doesn’t really scan.

Your own company, Upworthy, arguably serves up a steady stream of content to politically aligned subscribers. Do you play a role in the filter bubble yourself?

We try to combat that in two ways. I do believe that one way to score a feed is simply: is there a conversation around public-interest issues or not? Is there content in your feed about climate change or not? Is there content in your feed about economics or not? If we start those conversations, that’s a valuable thing, even if we come at it with a point of view. One of the premises of Upworthy was that if climate change content is going to be fighting for attention in the News Feed with some guy surfing off his roof, it needs help. That’s what we try to do.

"For democracy to work, you really need to understand other points of view."

The second piece is the empathy piece. And we do this in both a cross-partisan way and in all sorts of life experiences. I deeply believe that for democracy to work, you really need to understand other points of view. And that includes Trump supporters if you’re progressive, and we’ve showcased a bunch of those. But it also includes the folks who are different from us in every other respect. And if you look at our videos, there are lots of different life experiences represented.

The purpose is to help us understand who else is in this country with us. I don’t feel like that has a super political bent, per se. But I do think it’s about bringing people into contact with points of view they wouldn’t naturally find in their feed.