
TikTok adds a strikes policy for bans



The company says it’s trying to make its moderation more transparent to the creators it affects.


Illustration by Nick Barclay / The Verge

TikTok has announced an update to its moderation system, which will implement account strikes, similar to YouTube’s community guidelines strikes. The company says it’s doing this because its previous system could be confusing to creators and exploited by bad actors.

If TikTok removes content, such as a video or comment, for violating a community guideline, the account behind it will get a strike, which expires after 90 days. There are multiple types of strikes — you can get them for specific product features like Comments or Live or for sections of TikTok’s policies (so strikes for doing a dangerous challenge won’t necessarily be counted with strikes for leaving a spammy comment).

Getting enough strikes in any category will result in a permanent ban, though the threshold is different depending on “a violation’s potential to cause harm to our community members.” TikTok doesn’t say exactly what those limits are, potentially to keep people from toeing the line. The company also says that it’ll ban people who get “a high number of cumulative strikes across policies and features.”
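TikTok hasn’t published its actual thresholds or any implementation details, but the mechanics it describes — strikes tagged by category, a 90-day expiry, a per-category limit, and a separate cumulative limit across categories — can be sketched as a simple model. The threshold values below are purely hypothetical placeholders:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

STRIKE_TTL = timedelta(days=90)   # strikes expire after 90 days (per TikTok)
CATEGORY_THRESHOLD = 3            # hypothetical per-category ban limit
CUMULATIVE_THRESHOLD = 5          # hypothetical cross-category ban limit


@dataclass
class Account:
    # each strike is a (category, timestamp) pair
    strikes: list = field(default_factory=list)

    def add_strike(self, category: str, when: datetime) -> None:
        self.strikes.append((category, when))

    def active_strikes(self, now: datetime) -> list:
        # only strikes issued within the last 90 days count
        return [(c, t) for c, t in self.strikes if now - t < STRIKE_TTL]

    def is_banned(self, now: datetime) -> bool:
        active = self.active_strikes(now)
        per_category: dict = {}
        for category, _ in active:
            per_category[category] = per_category.get(category, 0) + 1
        # ban if any single category hits its limit...
        if any(n >= CATEGORY_THRESHOLD for n in per_category.values()):
            return True
        # ...or if cumulative strikes across all categories hit theirs
        return len(active) >= CUMULATIVE_THRESHOLD
```

In this model, a strike older than 90 days simply stops counting toward either limit, which matches the expiry behavior TikTok describes; the real system presumably also weights thresholds by a violation’s potential for harm, which this sketch does not attempt to capture.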

The company says that the strike system won’t apply for “severe violations” — anyone caught posting child sexual abuse material, threats of real-world violence, or other extreme content may still be immediately banned.

Two screenshots, one showing a screen saying “your account is in good standing,” and the other showing a list of reported videos.
The section of TikTok’s safety center that will let you see the strikes on your account.
Image: TikTok

TikTok says it’s rolling out an update to the app’s Safety Center that will let creators see and appeal strikes and that it’ll give users a warning if they’re getting close to that permanent ban. It’s also testing a feature that will tell you if your video won’t be algorithmically placed onto people’s For You pages, along with an explanation for why it’s been marked as ineligible.

The company says that all this is a move to increase transparency around its moderation decisions and to clarify things for creators who only occasionally violate rules by accident while clamping down on repeat offenders. It also admits that its previous account enforcement system, which involved temporary bans and restrictions, could be confusing to some creators.

TikTok has faced a lot of scrutiny over transparency and accountability, especially when it comes to how it recommends content. The company is also facing the prospect of being banned outright in the US, as a growing number of lawmakers move to block the app on certain government-owned devices.