Given the phenomenon it describes, it’s perhaps appropriate that the concept of “filter bubbles” has turned out to be so polarizing. To believers, it’s self-evident that social feeds mostly show people news that confirms users’ prior beliefs, encouraging partisanship and tribalism. To skeptics, the phenomenon describes behavior that has little to do with tech and algorithms — and, they say, there’s evidence that platforms like Facebook and Twitter introduce people to a broader set of views than they might otherwise encounter.
To internet activist Eli Pariser, who coined the term and wrote a book on the subject, questions about how tech platforms are reshaping public life remain as relevant as ever. In a new TED talk, Pariser says social platforms should be rebuilt to serve the greater good, drawing on principles from urban planning. (Civic Signals, a new organization he co-founded with University of Texas at Austin professor Talia Stroud, aims to build new models that would do just that.)
With these ideas all very much in the news, The Interface’s Zoe Schiffer caught up with Pariser to talk about his new project, whether filter bubbles are real, and why banning political ads could have unintended consequences.
The interview has been lightly edited for clarity and length.
Zoe Schiffer: Your new project, Civic Signals, is based on the idea that there’s a lot to learn by thinking about social media platforms as physical spaces. Can you talk about that a little bit?
Eli Pariser: Our starting point was trying to think about what we want platforms to be like — not just what we want them to stop doing. We realized one of the problems in how people think about platforms in digital space is that they think about them as places where rational people exchange information. When we think about them as physical spaces, it brings alive how human beings actually relate to one another. When you think about how people relate in a physical space, you think about nonverbal cues and signals and different places to relate in different ways, which are part of what’s missing in how people conceptualize digital public squares.
How does the design of a space shape people’s behavior, either offline or online?
You can watch the same group of people walk into a library and they quiet their voices and their posture changes, and then watch them walk into a bar and see how their behavior shifts. Creating expectations for how people ought to behave is important, as is putting constraints on how we relate to each other.
William Whyte has this great extended rant about park benches. He hated them, because when you’re sitting with someone you’re always either too close or too far away. He compares that to those little metal chairs that some cities now have in public spaces. When people sit in those, they typically shift them a couple of inches. He took that as this statement of ownership and dignity: ‘I get to adapt this space to myself, and with it the nature of the relationship to whoever I’m talking to.’ I think that’s another piece that a lot of digital platforms lack by being — by necessity — very one size fits all.
How does that translate to the digital world?
Well, spaces not only shape how we act individually, but they shape how groups of people interact. There’s a certain type of conversation you can have in a small cozy room that’s not as possible in a large crowded environment.
I think it’s really important because there’s this naive view of freedom, that freedom means having the most options at your disposal. But what we know is, it’s impossible to choose between a million different options. To actually make choices and have agency, it’s important to have some structure and be able to see what the options are, and that’s not possible when it’s a free-for-all.
Part of what I’m trying to argue for isn’t one structure that serves everyone. There are lots of different types of buildings and rooms that serve different purposes. But these vast open expanses have limited value. People react to them in antisocial ways because of the level of noise and the sense of overwhelm they feel.
There’s been a lot of research on filter bubbles — the term you coined almost 10 years ago — and how algorithms impact us. I know some people have questioned whether they really have as big an effect as you say. Has it changed your thinking on that early work at all?
We know that there is an effect. What we are learning is some people are super bubbled, and some people are not. It’s not a political statement, either — it’s across the spectrum.
Think about the fact that people who have fewer friends on Facebook tend to be older, which perhaps correlates with being more conservative and following the pages of conservative media outlets. They might be getting a more lightly algorithmically filtered feed, because there’s just less stuff to filter than for people who have thousands of friends. But on the other hand, a lot of what they’re seeing are page posts from outlets that reinforce what they believe. Now think about that, versus someone who is a news junkie but has a much more algorithmically filtered feed. The effects vary by where you stand in the system, but they’re almost impossible to assess because people can’t research Facebook. We need to be able to research the biggest and most powerful platform in human history.
This work has led to new legislation — The Filter Bubble Transparency Act — that’s aimed at forcing big tech companies to disclose how their algorithms work (although it’s questionable whether it would actually do that). Do you think that’s going to be effective?
I read [Adi Robertson’s] piece on The Verge, but I support what the senators are doing — in the sense that any effort to get Americans to think about and understand the basics of how algorithms work is really important, at least as a first step.
All of us that are online all the time can forget that most people haven’t gotten their heads around the basic mechanics of these platforms. The law is necessary but not sufficient. As a public education effort, I think it’s a good step.
One of the assumptions I had when I wrote The Filter Bubble was that some of the problems we’re seeing in civic discourse in society are really issues of exposure. But I’ve come to believe that’s not true. As a liberal, when I read Fox News, it confirms my bad opinions of Fox News. The research shows that it’s not just a matter of whether we come into contact — it’s about how we come into contact. That makes all the difference. It’s all about the design.
So how do we build healthier spaces — online and in the real world?
One way that people think about building healthy places is the built environment: what exists where, and what the design of the space is. But then they also think about what people are doing in those spaces: the programming, and who’s leading and taking responsibility for what takes place.
When I think about how platforms are structured, there’s a lot of focus on code and design and what’s physically possible. In the real world, there’s a difference between the law and physics. You can throw a brick through a window, even though it’s illegal to do so. But in digital space, those things converge. If I say you can’t throw a brick through the window, you actually can’t. There’s a lot less focus on soft social infrastructure in digital spaces, and that’s really important. The questions of what people are doing here, who is leading, and what behavior is invited really matter.
I do think Reddit has approached this in a more thoughtful way than many platforms. Subreddits have clear sets of rules and moderation. That makes for some better conversational spaces than a similar-sized Facebook group or Twitter community. It’s not surprising that coders want to code, and don’t want to think about human social organizing. But it’s a really important part of how we move forward from where we are now.
What’s the business model for a healthier digital space?
I think we need both private platforms that are more public friendly, but also platforms that are publicly owned where people feel like they have real ownership. Because people behave really differently when they own something. They take better care of it. Right now, nobody feels like they’re responsible for picking up the trash, so there’s a lot of trash around.
My hope is really just to start a conversation about what our aspirations are for our digital life and how to build spaces that more accurately embody them, and then also to inspire people who are building things to build them a little differently. We’ll see if that happens, but it would be exciting if it does.
If you were going to extend the cities metaphor a little further, how does it relate to the debate that’s been raging on Facebook and Twitter about letting politicians lie in political ads on these platforms?
So if we think about ads like an amplifier on a stage in a public space that you can plug your mic into, then you would want to think about who has access to that mic. If everyone can bring their own amp and turn it up as high as they want, it drowns out the ability to have a thoughtful conversation. Part of that points me toward a personal view: I worry about some of the consequences of turning off political ads entirely. The ability to reach the public is especially important if you don’t already have channels to do that. But that stage is not being managed well. So I’m sympathetic to the notion that if people are running onstage and yelling profanities, we need to deal with that problem before we open it back up again.
Today in news that could affect public perception of the big tech platforms.
🔃 Trending sideways: Facebook and YouTube worked to block the spread of the name of the whistleblower who supposedly filed the complaint against President Trump. Twitter went the opposite route, allowing posts with the person’s (supposed) name and photo.
🔽 Trending down: Instagram is paying for some celebrities’ production costs on IGTV, the app’s hub for longer videos, so long as they don’t discuss politics or “social issues.” The rule would seem to undermine the company’s recent free-speech push.
🔽 Trending down: Twelve anonymous Facebook employees published a damning open letter about ongoing discrimination at the company. The letter circulated at the company late last week, prompting executives to send out an apology.
⭐ Google took action against seven ads purchased by Trump’s 2020 campaign last month, saying they violated the company’s rules. Google has largely escaped the political ad controversy that’s plagued Facebook and Twitter, but that could be changing. Tony Romm and Isaac Stanley-Becker at The Washington Post have the story:
Google unveiled its political ad transparency efforts last year, responding to regulatory threats from Congress in the wake of the 2016 election. During the race, Russian agents took to YouTube and the Web’s other social-networking sites in a bid to stoke political unrest, relying on a mix of ads and organic posts, photos and videos to undermine Democratic contender Hillary Clinton and boost then-candidate Trump, U.S. investigators have found.
But Google’s efforts ultimately stopped far short of what lawmakers had hoped. The company vets organizations that seek to run ads about federal candidates, and it caches many of them in a publicly available archive. But the search-and-advertising giant still discloses far less than its competitors, offering little transparency about the sprawling network of dark-money groups and super PACs that run ads about polarizing issues such as abortion or immigration. Those are the kinds of ads that Russian malefactors deftly exploited four years ago and that continue to vex tech companies today.
A visual guide to how Google edged out its rivals and built the world’s dominant ad machine. The company is currently being investigated for possible antitrust violations. (Keach Hagey and Vivien Ngo / The Wall Street Journal)
First-time candidates need to run ads on Facebook in order to get name recognition and galvanize support, some experts argue. As pressure mounts on the company to eliminate political ads, surprising support for them is coming from new House Democrats. (Isaac Stanley-Becker / The Washington Post)
Facebook’s former chief product officer, Chris Cox, said political ads should be fact-checked, but that it’s difficult to do so in a nonpartisan way. Cox is now advising a left-leaning nonprofit called ACRONYM. (Kif Leswing / CNBC)
The top Facebook executive in charge of Facebook News, former NBC News anchor Campbell Brown, co-founded a media company that’s criticized Elizabeth Warren extremely harshly. It’s called The 74 and mainly covers education. (Judd Legum / Popular Information)
After Legum published a story about Brown’s role at Facebook and her involvement in The 74, the media company hit back, saying much of the Warren coverage cited in the article came from opinion pieces, and adding that Brown is an advisor without editorial oversight.
Twitter announced that it’s developing a deepfake policy, and asked users to weigh in on the new rule. The company is either going to add warnings to deepfakes, add context to the posts that share them, or remove them altogether. You can still vote on what you think the company should do. (Makena Kelly / The Verge)
Regulators in Brussels are warning American authorities about mistakes they made cracking down on Big Tech. Europe has been heralded as the world’s leading tech industry watchdog, but regulators think they still haven’t gone far enough when it comes to Google, Amazon, Apple and Facebook. (Adam Satariano / The New York Times)
The porn industry is a case study in how strict content moderation — and laws that hold publishers accountable — don’t necessarily eradicate innovation. Perhaps the tech industry should take note. (Lux Alptraum / OneZero)
More than a hundred people in India recently learned that their phones were hacked by the Israeli cybersecurity firm NSO Group. NSO has been in the news in recent weeks for hacking the phones of journalists, government officials, and human rights advocates — and the number of people affected keeps expanding. (Vindu Goel and Nicole Perlroth / The New York Times)
⭐ Google is gathering the personal health information of millions of Americans as part of a secret initiative called “Project Nightingale.” The initiative appears to be the largest effort a Silicon Valley tech giant has made to establish a toehold in the massive health-care industry, says Rob Copeland at The Wall Street Journal:
Google in this case is using the data, in part, to design new software, underpinned by advanced artificial intelligence and machine learning, that zeroes in on individual patients to suggest changes to their care. Staffers across Alphabet Inc., Google’s parent, have access to the patient information, documents show, including some employees of Google Brain, a research science division credited with some of the company’s biggest breakthroughs.
Instagram is going to test hiding like counts in the US. The company already ran similar tests in Australia, Brazil, Canada, Ireland, Italy, Japan, and New Zealand. The move is meant to improve the health of the platform by making it feel less like a popularity contest. I’m going to take this opportunity to bring back brunch photography! (Adrienne So / Wired)
An Instagram account called @BallerBusters is wreaking havoc in the influencer community by calling out people who pretend to be wealthier than they are. In many cases, these #FlexOffenders use the veneer of a fancy lifestyle to sell mentorship, membership or online classes. (Taylor Lorenz / The New York Times)
Apple is aiming to release an augmented-reality headset in 2022, and a sleeker pair of AR glasses by 2023. The headset, code-named N301, will offer a hybrid of AR and VR capabilities, and resembles Facebook’s Oculus Quest. (Wayne Ma, Alex Heath and Nick Wingfield / The Information)
How the Sunrise Movement built a viral climate campaign without Twitter ads. Twitter has been an important organizing tool for the group, which uses it to mobilize voters and shame public officials — all without spending ad dollars. (Justine Calma / The Verge)
WhatsApp co-founder Brian Acton still thinks you should delete Facebook. The former Facebook executive doubled down on his tweet from March at Wired’s 25th anniversary summit. (Zoe Schiffer / The Verge)
A slew of new “personal CRM” start-ups want to help you manage your relationships like sales leads. Their services range from reminding you about your loved one’s birthdays to helping you write the perfect “follow up” message. (Kaitlyn Tiffany / The Atlantic)