YouTube is the latest Silicon Valley company to update its moderation policies around the fringe conspiracy theory QAnon, announcing that content that targets or harasses individuals or groups on the basis of such theories will be removed. YouTube is not issuing a blanket ban on QAnon content, though.
The company is trying to curb harassment and hate by “removing more conspiracy theory content used to justify real-world violence,” according to its new blog post. That means videos about QAnon that allege anything that could lead to real-world harm or harassment for a specific person or group will be removed. The blog post did not say whether the accounts behind such videos would be removed as well, although YouTube generally operates on a three-strike policy before taking a channel down.
“As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up,” the blog post reads. “We will begin enforcing this updated policy today, and will ramp up in the weeks to come.”
YouTube has spent the last couple of years updating its policies to target hateful videos, some of which include conspiracy theory videos, according to the blog post. Those policies are designed to limit algorithmic recommendations of this type of content, and the company says views of QAnon content driven by recommendations to non-subscribers have dropped by more than 80 percent since January 2019.
YouTube is only the latest company to take a firmer stance against QAnon conspiracy theories. Facebook banned content related to QAnon just last week, although posts from individual accounts are still allowed. It was the biggest step Facebook has taken in its ongoing fight against misinformation spreading on the platform. Pinterest also reiterated its ban on QAnon content, which has been in place since 2018, a spokesperson told The Verge. Peloton, too, removed hashtags related to the conspiracy theory.
YouTube’s blog post adds that the company has “removed tens of thousands of QAnon-videos and terminated hundreds of channels” since the updated policy took effect. The company calls the work “pivotal in curbing the reach of harmful conspiracies,” but acknowledged there’s more to be done.
“There’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the post continues.
Update October 16th, 12:15pm ET: The story has been updated to include additional context from Pinterest.