
YouTube says it will now remove ‘violent’ and ‘mature’ videos pretending to be kid-friendly


Sounds like a good idea


Illustration by Alex Castro / The Verge

Earlier this week, YouTube quietly announced a policy change to its treatment of videos targeted toward minors and young children. The video platform says it will now remove all content that contains “violent” or “mature” themes if it is targeted toward kids, either through the title of the video, its description, or the accompanying tags.

Going forward, YouTube says this type of content “will no longer be allowed on the platform.” Prior to this change, YouTube was age-restricting such videos, but now it’s going a step further to help clean up the platform and make it a safer place for children amid intense regulatory scrutiny and nonstop criticism of its executive leadership.

The policy change was announced two days ago, but only via a post on a YouTube Help community forum, and it appears to have gone largely unnoticed, amassing just 20 replies and attracting little news coverage. YouTube says it will begin ramping up enforcement of the new policy over the next 30 days to give creators a chance to become familiar with the new rules.

YouTube used to age-restrict these videos, but now it will remove them

As part of that process, YouTube says it will remove videos that violate the policy, but it won’t issue strikes to channels until the 30-day period is up. YouTube also says it won’t hand out strikes for videos uploaded prior to the policy change, though it still reserves the right to remove those videos. The company advises creators to check the YouTube Kids guidelines if they specifically want to reach children with their videos, and to make sure their descriptions and tags target the right audience to avoid getting caught up in the ban. YouTube also says it will age-restrict more content that could be mistaken for kid-friendly fare, like adult cartoons.

YouTube gives some examples of offending content, like videos tagged as “for children” that feature family-friendly cartoons engaging in otherwise violent or disturbing activity, like “injecting needles.” YouTube also warns against content featuring nursery rhymes that engage with mature themes like sex, violence, and death.

YouTube’s struggles to moderate its platform, and in particular videos targeted toward children, have been ongoing for years, and they have noticeably intensified of late. In just the past six months, regulatory scrutiny over tech platform moderation has reached a fever pitch, with YouTube embroiled in an ongoing Federal Trade Commission investigation into its inability to combat videos designed to exploit, manipulate, or harm young kids, as well as potential privacy violations related to its handling of underage viewers.

At the center of the controversy is YouTube’s recommendation algorithm, which critics claim is fundamentally flawed because it does not take into account the nature of the content it recommends and, as a result, can send people, including young children, down dangerous paths toward extremist, violent, and exploitative content. Another issue is that YouTube videos featuring children tend to perform extremely well on the platform, with the company’s algorithms rewarding creators who use kid-friendly tags and descriptions and put children on-screen with massive viewing metrics and more advertising dollars.

In June, YouTube said it would not stop recommending videos featuring children, even after the company became aware that pedophiles were flocking to such content to leave lewd comments and engage in other exploitative behavior. YouTube’s decision-making in that regard has become central to the FTC’s investigation, which reached a settlement in July that is reportedly still under review by the Justice Department. The company has been vigorously criticized for taking a lax approach to enforcement and moderation on such videos because of how lucrative they are.

Despite YouTube building a kid-friendly app, YouTube Kids, with the sole purpose of giving parents a safe place to send their children on the internet, the company has come under fire for failing to address the myriad algorithm-gaming activities born of the kid-friendly content boom. Before this year’s pedophile issue, YouTube faced down controversies like Elsagate, in which anonymous and hard-to-track content creators overseas made disturbing, copyright-infringing videos featuring distorted versions of Disney and Marvel characters.

YouTube’s child exploitation problem has been ongoing for years

In the past, YouTube took a half-hearted approach to moderation, dealing with batches of offending videos as it became aware of them, usually because news organizations alerted the company’s communications team while asking for comment. Although it had rules around content featuring kids and content targeted toward kids, YouTube failed to enforce them uniformly, and its platform became overrun with complex edge cases its guidelines did not take into account, like Elsagate.

Over the past year or so, YouTube CEO Susan Wojcicki has been more proactive, in response to a massive swell of backlash from lawmakers and the public. YouTube said in February it was “aggressively approaching” its child exploitation issue. YouTube parent company Google has also discussed the possibility of moving all content featuring and directed at children over to YouTube Kids. (YouTube Kids has faced its own share of issues, as well.)

YouTube has also taken a harder line against letting channels live stream when children are involved, and the company is now disabling comment sections on certain videos featuring children. YouTube is still undecided, however, on whether it should turn off the recommendation algorithm for such content, for fear it could reduce engagement. Earlier this week, Bloomberg reported that YouTube is finalizing plans to remove targeted advertising on videos featuring children on its main site to help prevent potential fines under the Children’s Online Privacy Protection Act.