TikTok has added a new safety feature that helps users prevent predatory, hateful, and obscene comments from appearing on their videos.
A custom comment filter is now available to all users, letting people decide what types of comments they don’t want to see on their videos, including specific words or phrases they find upsetting. They can also choose to allow only approved people to leave comments.
TikTok has released a couple of new tips videos to help explain the safety and security features users have at their disposal, such as controlling who can send them messages, follow their account, or even collaborate with them on content.
Other social platforms, like Instagram, already offer similar comment moderation filters. YouTube has a comparable approval feature that lets creators give specific people the ability to post comments while requiring everyone else’s comments to be screened first.
Comment moderation has been a topic of conversation since it came out last week that some YouTube videos showing children had predatory messages in the comment section. The company responded by deleting tens of millions of comments, a spokesperson previously told The Verge, and closing comment sections on videos that might attract predators.