As a federal investigation into YouTube’s problems with children enters its late stages, the platform needs a solution. But as YouTube struggles to find a fix that addresses policymakers’ concerns, it’s left with a major roadblock: children have become integral to the overarching creator community.
Whether it’s six-year-olds reviewing toys and landing on YouTube’s Trending page, kids’ reaction videos attracting millions of views, or top-tier vloggers teaming up with four-year-olds to create more family-centric content, the platform has become a destination for videos starring kids. In 2017, Time magazine reported being told by YouTube that “time spent watching family vloggers is up 90 percent in the past year.” Melissa Hunter, a longtime family vlogger and founder of multichannel network Family Video Network, told The Verge that people are devouring kid videos on YouTube.
“Some of the smartest creators understood there are moms and dads also watching these vlogs,” Hunter said. “It’s not just kids.”
But videos focused on kids have posed an increasing problem. Pedophiles have left predatory comments under videos of children, and YouTube’s recommendation algorithm seems to have helped them find new targets. The New York Times discovered that the algorithm drove hundreds of thousands of views to one particular video of two 10-year-old girls playing “to users who watched other videos of prepubescent, partially clothed children.” Parents were removing videos of their children because of predatory behavior as early as 2015, and in 2016, the New Statesman found that parents were discovering videos of their kids embedded on pedophile forums.
Innocuous family videos uploaded by parents, which aren’t intended to pull in hundreds of thousands of views like videos published by professional YouTubers, are a core part of the concern, but popular creators have played into the trend, too. Creators started gravitating toward kid-oriented content about two years ago, as advertisers fled the platform amid stories about hateful content and controversies involving top creators. Concerned advertisers wanted assurance that their ads wouldn’t run alongside disturbing, harmful, or hateful videos, and some creators saw content geared toward kids as the solution.
For some, that meant kid-focused content like slime videos, but many others took it to mean including kids in their videos. Gaming personality Kevin Chapman all but abandoned his original channel to focus on family vlogging. His new venture quickly surpassed his gaming channel, and it’s where he now puts his effort into expanding.
“It’s grown to be my main channel now,” Chapman said, having even officially rebranded the channel as The Chapman Family Vlogs. “It’s where I spend the majority of my time now, and it’s what brings in the most views.”
Others saw similar success featuring kids. The Ace Family went from nothing to amassing more than 16 million subscribers in less than three years. Jake Paul, a vlogger known for his pranks and wild antics, started working with family vloggers directly and even invited a family to live in his house so he could vlog with their four-year-old son Tydus. In one video from earlier this year, Paul spoke about needing to create more kid-friendly content, and he often uses Tydus in his videos as a way of guaranteeing ad revenue.
“I post one of these ‘kids videos’ and it does super well and it blows up and it makes a ton of money, so it’s like what do we do?” Paul said. “The reason it’s so hard is because creatively I want to make whatever I want, but when there’s business involved, it’s like this clashing. You can’t figure out what to do necessarily.”
YouTube is now reported to be the target of a Federal Trade Commission investigation into its protection of children and its handling of children’s data. The platform has already taken several steps to address the problem: closing comments on most videos featuring kids, requiring minors to be accompanied by an adult when live-streaming, and removing thousands of accounts each week that belong to kids under the age of 13. YouTube is also considering more aggressive moves, including moving all videos featuring children into the company’s standalone YouTube Kids app, according to The Wall Street Journal. Employees are also said to be asking YouTube to turn off autoplay recommendations entirely on videos featuring kids.
All of these solutions pose problems for YouTube. It could turn off autoplay and stop recommending videos with children in them, as many have suggested, but the company has warned that doing so would greatly hurt the creator community. And the amount of content uploaded to YouTube each day is so great that filtering a substantial portion of videos into another app like YouTube Kids would be a daunting moderation task. A YouTube spokesperson told The Verge earlier this week that the company considers “lots of ideas for improving YouTube and some remain just that — ideas.” YouTube declined to comment further and pointed to a blog post from earlier this month about steps it’s taking to protect kids.
“YouTube was never meant to be a place for kids and families. But it happened, and now we all need to try and figure out the best way to keep everyone safe,” said Hunter, who advises YouTube and family vloggers on how to protect kids.
Despite suggestions from critics to ban all content with kids on YouTube, Hunter says that’s not a viable plan. If it’s not YouTube, she said, it’ll just make “someone else the go-to spot for kids content.”