

TikTok removing accounts of users who share QAnon-related content

The platform says content and accounts that promote QAnon violate its disinformation policy

Illustration by Alex Castro / The Verge

TikTok has been removing the accounts of users who share QAnon-related content on the platform, NPR reported, part of a policy the video-sharing platform says has been in effect since August. While TikTok initially focused on reducing discoverability of such content — banning QAnon-related hashtags, for instance — its policy now includes removing the content, banning accounts, and redirecting searches and hashtags to display its community guidelines.

“Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform,” a spokesperson said in an email to The Verge. The company has taken steps to make QAnon-related content harder to find in search. “We continually update our safeguards with misspellings and new phrases as we work to keep TikTok a safe and authentic place for our community.”

A search for QAnon on TikTok brings up its community guidelines
Image: TikTok

In July, TikTok started blocking several hashtags related to the QAnon conspiracy theory, including “QAnon,” “QAnonTruth,” and the related phrase “Out of Shadows.” But the videos themselves were still visible and could have appeared among the For You suggestions or in users’ feeds, the BBC reported.

Other social platforms have cracked down on QAnon content as well. Facebook said earlier this month it would ban content related to QAnon, which it termed a “militarized social movement,” from Facebook and Instagram. Facebook had removed some groups and pages promoting QAnon back in April, saying they were engaged in “coordinated inauthentic behavior.” Facebook users can still post QAnon content to their individual profiles, however.

Twitter banned thousands of QAnon-related accounts and links over the summer, and Reddit banned the QAnon subreddit r/GreatAwakening for violating its rules against “inciting violence, harassment, and the dissemination of personal information.” Even exercise platform Peloton has removed hashtags related to the conspiracy theory from its online classes.

And while YouTube isn’t banning QAnon content outright, the platform said last week it would remove “conspiracy theory content used to justify real-world violence.”

QAnon is a false conspiracy theory that claims, among other things, that President Trump is secretly planning to arrest high-profile Democratic politicians and celebrities for pedophilia or cannibalism. The FBI has labeled the conspiracy theory a potential domestic terrorist threat.