What Mark Zuckerberg’s big talk about free speech left out

Facebook isn’t a neutral platform, and we shouldn’t talk about it that way

All week, we’ve been talking here about a central debate in our reckoning over big tech platforms and their power: what should stay up on the internet, and what should come down. The discussion has been fueled by two national conversations: one, led by Sen. Elizabeth Warren, about whether Facebook ought to exempt political ads from fact-checking. And the other, about how far the limits of free speech extend in a world where China is moving aggressively to restrict American companies from hosting speech supportive of democratic protesters in Hong Kong.

On Thursday, Mark Zuckerberg had his say. In a 45-minute speech at Georgetown University, Facebook’s CEO made his case for internet services that promote the maximum amount of free speech. (Here’s a transcript and a link to the video.)

The speech began with a major tactical and factual error, in which Zuckerberg attempted to awkwardly retcon the founding of Facebook into a story about giving students a voice during the Iraq war. (“I remember feeling that if more people had a voice to share their experiences, maybe things would have gone differently.”) All previous reporting on the subject suggests that the truth was much, much hornier, and the fact that Zuckerberg’s speech began so disingenuously caused lots of the folks I read to tune out the rest.

But I wanted to see how Zuckerberg would tackle the two big speech debates of the moment: lying in ads, and China.

On the former subject, Zuckerberg made his case for Facebook staying out of it:

We don’t fact-check political ads. We don’t do this to help politicians, but because we think people should be able to see for themselves what politicians are saying. And if content is newsworthy, we also won’t take it down even if it would otherwise conflict with many of our standards.

I know many people disagree, but, in general, I don’t think it’s right for a private company to censor politicians or the news in a democracy. And we’re not an outlier here. The other major internet platforms and the vast majority of media also run these same ads.

Zuckerberg also made his case for Facebook’s pro-speech bias as a necessary counterweight to the advance of Chinese soft power around the world. He said:

China is building its own internet focused on very different values, and is now exporting their vision of the internet to other countries. Until recently, the internet in almost every country outside China has been defined by American platforms with strong free expression values. There’s no guarantee these values will win out. A decade ago, almost all of the major internet platforms were American. Today, six of the top ten are Chinese.

We’re beginning to see this in social media. While our services, like WhatsApp, are used by protesters and activists everywhere due to strong encryption and privacy protections, on TikTok, the Chinese app growing quickly around the world, mentions of these protests are censored, even in the US.

Is that the internet we want?

On the whole, Zuckerberg tried to chart a middle course between his two loudest, angriest constituencies: the voices, mostly on the left, pushing for him to take down much more content than Facebook currently does; and the voices, mostly on the right, that complain Facebook is an engine for censorship that actively suppresses their views. It’s a position that, I think, reflects Zuckerberg’s actual beliefs — and it’s also the only tenable position for someone who is trying to serve the largest number of customers, no matter their political views.

By that standard, I thought, the speech was fine. But in the view of Facebook it presented to the world, there were a few important things it left out.

One, Zuckerberg presents Facebook’s platform as a neutral conduit for the dissemination of speech. But it’s not. We know that historically it has tended to favor the angry and the outrageous over the level-headed and inspiring. We know that the distribution of various formats for speech, such as live video or links to third-party publishers, will rise and fall sharply — and with no warning — depending on Facebook’s business needs. Zuckerberg’s talk focuses exclusively on the right of speech, when the far more consequential question is the right of reach. What spreads, and by what means, and to what effect? These are all questions Facebook avoided today.

Two, Zuckerberg presents Facebook specifically and social media more generally as a leveling force in democracies. “People having the power to express themselves at scale is a new kind of force in the world,” he said. “A Fifth Estate alongside the other power structures of society.” And that’s true — social networks really did help to catalyze any number of vital social movements, including Black Lives Matter and #MeToo.

But as Jen Schradie wrote in this year’s The Revolution That Wasn’t, social networks have more reliably served to reinforce existing social hierarchies. Getting wide reach on platforms typically takes pre-existing celebrity, a coordinated campaign, or both. As Schradie told Vox.com:

The idea of neutrality seems more true of the internet because the costs of distributing information are dramatically lower than with something like television or radio or other communication tools.

However, to make full use of the internet, you still need substantial resources and time and motivation. The people who can afford to do this, who can fund the right digital strategy, create a major imbalance in their favor.

That’s why it’s somewhat disingenuous to paint Facebook in particular as a great equalizer in national politics. Especially when the company has chosen to accept paid political advertising — which naturally tends to benefit the wealthy and the status quo.

Finally, Zuckerberg presents Facebook as somewhat divorced from the real-world consequences of its speech decisions. He acknowledges that the company makes mistakes, but short of plugging its forthcoming independent oversight board, avoids discussion of what ought to happen when the company makes them. So much of the frustration with Facebook — over its size, its power, and its decisions on content moderation — stems from the fact that its decisions can have deadly consequences.

Think of the sectarian violence in Myanmar, or Sri Lanka, or India that has resulted from unfettered speech on the platform. Think of the new mothers who have joined anti-vaccination groups in the United States after Facebook’s algorithm suggested they do so. Facebook has taken action to remedy these problems after the fact. But it was never held accountable for them. (If a single person lost their job over any of those calamities, it has never been made public.)

There is something untenable about a massive corporation / quasi-state that sets global speech policies but never has to answer for them, outside the odd Congressional hearing or public-relations crisis. It’s easy to stand firmly on the side of free speech when the only negative consequence you suffer as a result is more speech.

What happens next? On one side, gradual improvements in detecting fake accounts and incitements to violence. On another, outrageous lies in political ads going viral throughout the 2020 US presidential election campaign, creating increasing pressure for Facebook to change its policies. A Zuckerberg interview on Fox News that fails to change anyone’s mind on any subject. Regulation, maybe? Antitrust enforcement?

What stays up? What comes down? The debate is necessary, complicated, and far from over.

The Ratio

Today in news that could affect public perception of the big tech platforms.

🔼 Trending up: Facebook is donating $25 million to build housing for teachers.

🔽 Trending down: Zuckerberg’s awkward linking of company history to the Iraq war dominated early discussion of the speech.

Governing

Sen. Ron Wyden (D-OR) introduced a new data privacy bill that gives the Federal Trade Commission more power to fine tech companies that violate user privacy. It’s called the Mind Your Own Business Act. Here’s Makena Kelly at The Verge:

If approved, the bill would allow the FTC to establish minimum privacy and cybersecurity standards for tech platforms and give it the authority to issue fines of up to 4 percent of a company’s annual revenue for first-time offenses, similar to provisions in the GDPR. The FTC has resolved a number of privacy investigations into companies like Facebook, YouTube, and Equifax over the past year, but officials have faced significant criticism for not taking stronger action.

The bill also includes strong restrictions on tracking and digital advertising:

Wyden’s bill would institute a federal Do Not Track requirement, giving users the option to opt out of data tracking that’s used to target ads. Platforms like Facebook and Twitter would also be required to offer “privacy-protecting” versions of their products for a fee. Interestingly, Wyden’s measure would extend the Federal Communications Commission’s Lifeline program for low-income people to use to obtain these privacy-focused versions of products so “privacy does not become a luxury good,” as his office put it.

Wyden told Vice the legislation is needed so Mark Zuckerberg starts taking privacy seriously:

“Mark Zuckerberg won’t take Americans’ privacy seriously unless he feels personal consequences,” Wyden said. “A slap on the wrist from the FTC won’t do the job, so under my bill he’d face jail time for lying to the government.”

Facebook’s plan to create an independent Oversight Board to police content is a really good idea, argues Kara Swisher. The board will act as the ultimate judge of what content stays up on the platform, and should be up and running in a year. Swisher says it deserves public support. (Kara Swisher / The New York Times)

Mark Zuckerberg is meeting with lawmakers ahead of next Wednesday’s Congressional hearing to discuss Libra. He’s trying to answer their regulatory concerns and rally more support for Libra in the United States — which appears to be collapsing. (Christopher Stern and Ashley Gold / The Information)

Chris Hughes, the Facebook co-founder who has become a vocal antitrust activist, launched a $10 million ‘anti-monopoly’ fund alongside George Soros and eBay founder Pierre Omidyar. Hughes has been calling for Facebook to be broken up for months. His new fund will support antitrust actions in a number of industries, not just tech. (Tony Romm / The Washington Post)

Amazon is making moves into the business of providing elections software. (Kevin McLaughlin / The Information)

Trump retains a significant advantage over Democratic presidential candidates: his well-funded digital campaign. Under Brad Parscale, the president’s re-election committee has devoted millions to honing a sophisticated digital apparatus that can microtarget voters on Facebook and Google. (Thomas B. Edsall / The New York Times)

An investigation suggests that Daryl Morey, the Houston Rockets general manager whose pro-Hong Kong tweet jeopardized NBA-China relations, was likely the subject of a coordinated harassment campaign from pro-China trolls. Half the accounts that replied to his tweet had fewer than 13 followers. (Ben Cohen, Georgia Wells and Tom McGinty / The Wall Street Journal)

Industry

More school districts are using facial recognition technology to prevent school shootings. But the software is also being used to enforce school rules and monitor students, according to Tom Simonite and Gregory Barber at Wired:

Jason Nance, a law professor at the University of Florida, says facial recognition is part of a trend of increasing surveillance and security in US schools, despite a lack of firm evidence that more technology makes kids safer. Nance’s research has documented how high-profile school shootings drive intensifying surveillance, with the burden falling heaviest on students of color.

Companies selling facial recognition systems see schools as a growing market. Shootings like the murder of 14 students and three staff members at Marjory Stoneman Douglas High School in Parkland, Florida, last year drive interest and sales. Max Constant, AnyVision’s chief commercial officer, won’t disclose how many US schools the company has worked with but says its work “typically centers around areas in which previous tragedies have occurred.” In a statement, AnyVision said its technology is installed at hundreds of sites worldwide. “Our technology never catalogs or retains records of individuals screened, and AnyVision remains committed to operating under the highest level of privacy and ethical standards,” the company said.

Twitch is becoming a go-to destination for people looking to watch Trump rallies and live-stream violence online. Some, like the German synagogue shooter, likely assume their videos will stay up longer on the newer platform. (Drew Harwell and Jay Greene / The Washington Post)

TikTok launched a series of educational videos in India in an attempt to expand its reach and appease local authorities who banned the app back in April. The videos range from explainers on science and math to tips on mental health awareness. (Manish Singh / TechCrunch)

The bizarre story of how the 1996 Halloween song “Spooky Scary Skeletons” became a megaviral meme on TikTok, complete with its very own original dance. (Brian Feldman / Intelligencer)

Snap launched dynamic ads to attract more money from retailers. Advertisers can now create ads in real time by syncing their product catalogs. The ads will automatically adjust as product availability or prices change. (Sarah Perez / TechCrunch)

And finally...

Talk to us

Send us tips, comments, questions, and your Zuckerberg speech edits: casey@theverge.com and zoe@theverge.com.