What YouTube could teach Facebook about conspiracies

Context cards could help Facebook fight misinformation, but they also have limits

Illustration by William Joel / The Verge

Months before critics revisited Facebook’s embrace of Holocaust deniers and other conspiracy peddlers, YouTube faced similar pressures. In February, a Wall Street Journal investigation found that Google’s video-sharing site routinely pushed users to misinformation or hyper-partisan content through its automated recommendations. In a follow-up in The New York Times, Zeynep Tufekci called Google’s video-sharing site “the great radicalizer.”

Like Facebook, Google is loath to declare any topic off-limits to its user base. And so at South By Southwest, YouTube CEO Susan Wojcicki unveiled a potential solution: “information cues,” a companion product for conspiracy videos that offers users additional, non-crazy viewpoints about subjects like the Moon landing and chemtrails. It began rolling out within the past two weeks, and the company does not yet have data to share about how it’s working, a YouTube spokeswoman said.

Could a similar approach work for Facebook? Writing in The Atlantic, Yair Rosenberg suggests that the company try it.

Take the Facebook page of the “Committee for Open Debate on the Holocaust,” a long-standing Holocaust-denial front. For years, the page has operated without any objection from Facebook, just as Zuckerberg acknowledged in his interview. Now, imagine if instead of taking it down, Facebook appended a prominent disclaimer atop the page: “This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead, alongside millions of political dissidents, LGBT people, and others the Nazis considered undesirable. To learn more about this history and not be misled by propaganda, visit these links to our partners at the United States Holocaust Museum and Israel’s Yad Vashem.”

Obviously, this intervention would not deter a hardened Holocaust denier, but it would prevent the vast majority of normal readers who might stumble across the page and its innocuous name from being taken in. A page meant to promote anti-Semitism and misinformation would be turned into an educational tool against both. The same could easily be done for pages and posts promoting conspiracy theories ranging from 9/11 trutherism to Islamophobic obsessions with impending Sharia law, working with partners ranging from the NAACP to the Anti-Defamation League to craft relevant responses and source materials.

It’s an appealing pitch, one that seeks a middle ground between free speech absolutism and passive promotion of calls for violence. But measuring how effective it is will be difficult.

Platforms can’t read users’ minds, and it’s impossible to determine whether truthful context added to conspiracy content limits the spread of noxious ideas. Where platforms link to external sources, as YouTube does with Encyclopedia Britannica and Wikipedia, they can at least measure the percentage of viewers who click those links. An earlier set of information cues that YouTube introduced to highlight when a channel is paid for by a government has seen high clickthrough rates, a YouTube spokeswoman said.

In the meantime, we can simply examine the text shared with viewers in these information cues. Logged out of YouTube, I ran a search for “moon landing faked.” The first video was from — disappointingly — BuzzFeed’s Blue channel, which features an otherwise unidentified guy named Matt confessing his “unpopular opinion” that the Moon landing never happened. The video was terrible, and after it finished, auto-play led me directly to a video alleging that NASA “admitted” the landing was faked. (NASA has never done any such thing.)

Both the search results and individual videos came with prominent information cues: in the former case, above the results; and in the latter, directly under the video player. Here’s the text that appears:

Apollo: Moon-landing project conducted by the U.S. National Aeronautics and Space Administration in the 1960s and ’70s. The Apollo program was announced in May 1961, but the choice among competing techniques for achieving a Moon landing and return was not resolved until considerable further study. In the method ultimately employed, a powerful launch vehicle (Saturn V rocket) placed a 50-ton spacecraft in a lunar trajectory. Several Saturn launch vehicles and accompanying spacecraft were built. The Apollo spacecraft were supplied …

Notably, the relevant information here — that the Moon landing actually happened — is not contained in the information cue. The gray box offers no hint that it is designed to serve as a counterweight to the content of the video. It feels tentative — halfhearted, even. The only thing you learn from reading the text in the box is that there was a research program aimed at landing on the Moon, something even the conspiracists don’t deny.

Clicking anywhere on the text takes you to a full page about the Apollo program on Encyclopedia Britannica, which briefly describes the Apollo 11 Moon landing at the end of its second paragraph. Conspiracy theories about the landing are not addressed. The presentation highlights a limitation of using third parties to rebut conspiracy theories: encyclopedia articles generally are not written, first and foremost, to rebut other writing. Their plainspoken style can often bury the lede.

In his piece on Facebook and Holocaust denial, Rosenberg calls for more explicit editorializing. (“This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead.”) That would likely represent an uncomfortable degree of editorial intervention at Facebook. Certainly, it would be unprecedented. But if Facebook wants to both host misinformation and prevent it from spreading, the company may have no other choice.

The Interface /

An evening newsletter about Facebook, social networks, and democracy.