YouTube’s vague conspiracy theory policies present issues for the platform

The line between entertainment and harm is still unclear

Shane Dawson/YouTube

YouTube removed ads from a feature-length conspiracy theory documentary within hours of it being posted by one of the site’s most popular creators last week. The next day, all of those ads were back — with YouTube’s approval.

It was a surprising message from YouTube at a time when the platform is focused on reducing the number of conspiracy theory videos that get recommended to viewers. Just last month, YouTube said it would promote fewer videos containing “harmful” misinformation, like claims that the Earth is flat, as part of its mission to be a safer platform and to clean up the site for advertisers that are worried about objectionable content. Those guidelines are just beginning to be enforced, but this incident speaks to the unclear ways YouTube draws the line between what’s okay and what isn’t.

The video, uploaded by Shane Dawson, who has 20 million subscribers, covers a series of popular conspiracy theories, including one claiming that the recent California wildfires were set on purpose. Dawson states that everything he covers is just a theory, but the video shows him and his friends questioning whether those theories could be true. “How does every house on the street catch fire except for one?” Dawson asks while driving past destroyed homes. “What does that mean?” So far, the video has 26 million views.

YouTube initially stripped ads from Dawson’s video after an automated review spotted a portion containing footage of a dangerous prank, something the site outlawed last month, a spokesperson told The Verge. After a manual review, YouTube reinstated Dawson’s ability to run ads on the video, saying it wasn’t in violation of the site’s advertising guidelines. The video was also allowed to appear on the front page and be recommended to other users.

Dawson declined to comment for this story. A YouTube spokesperson pointed to older explanations about how its guidelines work.

Despite the recent changes, YouTube largely doesn’t penalize conspiracy videos. The company gives advertisers the option not to appear on them, but those videos are never banned from running ads altogether. While YouTube may limit a conspiracy video’s reach if it’s considered dangerous, it likely won’t be removed from the site.

Dawson’s video covers a number of false theories, like the claim that iPhones perpetually spy on their owners, but YouTube doesn’t consider any of them to violate even the rules that would warrant reducing the video’s distribution. A spokesperson said the company is focused on reducing recommendations of “borderline content or videos that could misinform users in harmful ways,” such as anti-vaccination messages or 9/11 conspiracies, and nothing in Dawson’s video qualifies.

YouTube is attempting to draw a line between series like Dawson’s, which don’t explicitly present conspiracies as fact, and shows like Infowars, which promoted the PizzaGate conspiracy theory as truth. But in doing so, it allows for a broad gray area where popular videos can present conspiracy theories as possibilities without being penalized.

Dawson knew he was walking the line with both YouTube and advertisers when he posted his video. He told commentator Philip DeFranco that he expected the video to be demonetized because it “wasn’t gonna be brand friendly” for advertisers. In the video, he also includes a brief disclaimer, moments after saying the iPhone spying theory is real, that “these are all just theories, none of them are facts, and they’re not meant to hurt anyone or any company.”

Conspiracy theorists, including Infowars contributor Paul Joseph Watson, have called YouTube’s decision to not promote some creators whose videos focus on conspiracy theories “censorship.” Watson highlighted the impact on Dawson in a recent Infowars article, and the two engaged with each other on Twitter about YouTube demonetizing conspiracy theory videos. It’s an exchange that Dawson has since been criticized for, with one activist using it as an example of how radicalization on YouTube works.

YouTube plans to spend more time figuring out how to minimize the harmful impact of conspiracy theory videos on the platform by better teaching its machine learning algorithms what to watch for, according to its blog post. What is apparent, however, is that the vague language of its updated plan to tackle conspiracy theories means creators like Dawson are unlikely to be driven from the site.