YouTube is facing a lawsuit from a group of channel owners who say their rights were violated by the platform’s recent moderation actions against QAnon accounts. The users, many of whom had hundreds of thousands of subscribers on the platform, are seeking a temporary restraining order to restore their accounts.
“YouTube’s massive de-platforming, which occurred just three weeks before the 2020 Presidential election, worked to the severe detriment of both conservative content creators and American voters who seek out their content,” the complaint alleges. “YouTube took this draconian action so swiftly that the Plaintiffs... received no advance notice and were not able to download their own content.”
Section 230 of the Communications Decency Act typically protects platforms from lawsuits over their moderation decisions, and the law is likely to be a pillar of YouTube’s legal defense. Republicans have proposed adding a “good faith” requirement to Section 230, which would make it easier for lawsuits of this kind to succeed, but none of those efforts have become law. As a result, the lawsuit’s legal merit remains uncertain.
Reached for comment, Google emphasized its commitment to even-handed moderation. “We cannot comment on pending litigation, but our policies are updated regularly to meet new challenges, like harmful conspiracies that have been used to justify real-world violence,” a company representative said. “We have a dedicated policy team that works to review our policies and adjust them as needed. We apply our policies consistently regardless of who owns the channel.”
At least one of the channels involved in the lawsuit had run into trouble with moderators before the October 15th policy shift. The SGT Report channel, run by the lead plaintiff in the case, was suspended in 2018 after promoting unfounded allegations about Hillary Clinton and Huma Abedin terrorizing a small child — although the channel was reinstated after subscribers complained. Another of the accounts, TRU Reporting, has promoted a number of Pizzagate-adjacent conspiracy theories on Twitter, including allegations that children in the Biden family are victims of sex trafficking.
YouTube has long struggled with conspiracy content, but it has only recently taken concrete steps to reduce its spread. In 2018, the platform began adding “authoritative” links to videos about conspiracy-adjacent topics like the Moon landing or the Oklahoma City bombing, in the hopes that factual information would steer users away from wilder theories. The following year, YouTube changed its algorithm to downrank conspiracy content, and platform moderators took a harder line against conspiracy videos, even banning a Shane Dawson video that explored popular conspiracies.
On October 15th, 2020, YouTube took that campaign a step further, expanding its hate and harassment policies “to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” The policies were particularly focused on the QAnon conspiracy theory, resulting in the immediate removal of tens of thousands of QAnon videos and hundreds of channels.
In the lawsuit, however, the plaintiffs frame the moderation move as targeting conservative YouTube channels more generally, playing into long-standing Republican concerns about anti-conservative bias on platforms. The complaint cites the plaintiffs’ free speech rights under the First Amendment, arguing that the removal of the channels in the weeks leading up to the election will cause irreparable harm to the public.
“Because Plaintiffs’ channels address issues of public concern that are highly relevant to the November 3 election and its anticipated aftermath,” the complaint reads, “both Plaintiffs and the public will suffer irreparable harm in the absence of an immediate and affirmative injunction.”
Update 10/27 9:14am: Updated with statement from Google.