    2017 was YouTube’s best year ever. It was also its worst.

    Will YouTube’s mounting scandals ever slow the business down?

    Illustration by Garret Beard

    2017 was a wild year for YouTube. It continued to extend its dominance as the world’s biggest video platform: in June it announced that 1.5 billion people now log in each month, a user base second only to Facebook’s and one that can earn successful creators a substantial windfall. According to recent analysis by Forbes, the top ten channels on YouTube earned $127 million in 2017, an increase of 80 percent from the year before.

    But in terms of its public image, 2017 was also the worst year YouTube has ever had. It began with the downfall of the platform’s biggest star, PewDiePie. After a Wall Street Journal report about his use of Nazi imagery and anti-Semitic humor, the Swedish vlogger lost his deal with Disney and YouTube canceled his original series. Just one month later, big brands threatened a full-scale boycott of YouTube after learning that their advertising was being played alongside racist and offensive videos.

    And yet the service’s growth has been largely unaffected by these problems. The string of scandals hasn’t impacted investors’ enthusiasm for Google’s stock, says Brian Wieser, an analyst with Pivotal Research. There was no impact on the company’s earnings. Big brands returned quickly. And in fact, the company plans to raise the rates it charges advertisers in the coming year.

    Marketers’ hunger for YouTube is understandable. It’s the place for music videos from traditional acts, of course, but it’s also minting a new kind of icon. Take Jake Paul, one of the fastest-growing creators this year. The tweens want to date him, be him, or die trying, and plenty of parents are willing to pay big bucks to keep their kids happy. Paul brought in over 10 million subscribers and more than 2 billion views during the course of 2017. His channel is a mashup of daily vlogging, Jackass-style pranks and stunts, inspirational advice, and raps that would make Vanilla Ice cringe. He annoyed his neighbors and the local police by starting a bonfire of furniture in his empty pool. But his transgressive antics are a tried-and-true piece of the pop star life.

    The voraciousness of its young audience is the root of YouTube’s success, and one of its biggest problems. YouTube is tremendously popular not just with Jake Paul-watching tweens, but with a cohort that hasn’t yet learned to read. Children’s content is now one of the most popular genres on the entire platform. Since it was introduced in 2015, YouTube Kids has been downloaded by tens of millions of users and collected over 70 billion views. Recent data from SocialBlade, which tracks online video, showed that five of the top 15 channels worldwide were dedicated to kids’ content. Driving this trend is the fact that YouTube’s audience is increasingly global, with 80 percent of video views coming from outside the United States. Many of the most popular kids’ channels rely on pre-verbal cues like bright colors, funny sounds, and simple cartoons, giving them broad appeal.

    Nazis and racists were a bad start to the year, but unbelievably, things got worse from there. In early November, YouTube was forced to apologize for surfacing inappropriate and disturbing videos on its kids’ app. The company vowed to crack down on bad actors, but the scandal kept mutating and growing. A week later, the focus turned to videos of children plagued by sexual comments, and a week after that, a second advertising boycott began around this issue. That same week, new reports raised questions about child endangerment and exploitation in videos ostensibly aimed at family-friendly audiences.

    2017 was a year of reckoning with the power and scale of online platforms. For Facebook and Google, both hauled before Congress for questioning, the main problem was disinformation. YouTube, owned by Google, has its own vibrant misinformation ecosystem, but the most difficult truth that came to light about the world’s largest video service was not its role in our politics, but in our parenting.

    Very young children are not just a big audience; they are an especially lucrative one. “From what I’ve seen, the reason they are getting these massive views is because kids, especially very young kids, have a tendency to want to watch one thing over and over,” says Phil Ranta, who at the time represented one of the top superhero channels, Webs & Tiaras. “Some of these are probably seen by the same child 50 times. It really helps to juice those numbers.”

    As a parent and a journalist, I feel a little embarrassed that I didn’t understand just how bad this kind of video could be. In the spring of this year, before any scandal had broken, I wrote about the popularity of toy review channels featuring performers under the age of five, and about the increasing popularity of superhero-themed accounts that sometimes featured bizarre, sexual, or scatological content.

    I raised questions about some potentially disturbing elements of this trend. With accounts like Ryan ToysReview making an estimated $11 million in a single year, who was deciding how often Ryan worked, and who would speak up if he decided he wanted to quit? He began making videos at age 3, and his little sisters have been part of the family’s YouTube efforts since birth.

    With the superhero channels, I noted that many of them focused on gross or violent content. But I saw this as something strange and hilarious, not deeply troubling. Kids love talking about poop, blood, death, and where babies come from. They play-act traumatic scenarios, like getting a shot at the doctor’s office, in preparation for handling stressful situations in real life. It didn’t surprise me that the most popular videos keyed into those same themes.

    But I should’ve seen where the trend was headed. Months later, it became clear that lots of channels jumping on the kids’ content bandwagon had gone too far: videos of children tied up and gagged by adults, of cartoon characters committing suicide or having sex. At the peak, accounts were using the trappings of children’s videos — adding colorful banners of cartoon superheroes and names like Kids Toys UTube Brazil — on channels that consisted exclusively of women in lingerie smoking cigarettes and French kissing one another.

    ElsaGate, a Reddit community devoted to finding and reporting disturbing or inappropriate kids’ videos, has many members who believe some darker, larger conspiracy must be at work. Why else would you combine children’s characters with what amounts to soft-core pornography? But many others are aware that the market is simply responding to demand as interpreted by an algorithm. “They’re using what children find attractive,” said one of ElsaGate’s moderators.

    The same dynamic was happening at other platforms, as elegantly expressed in a recent story about e-commerce ads on Facebook. Wish, an online retailer, uploaded 170 million products to Facebook as potential ads. Users gravitated toward the sex toys, animal torture, and hard drugs. “It’s a consequence of Facebook’s ad system,” the company’s CEO told The Daily Beast. “It’s basically rewarding high shock value items that people will click on.” The same principle holds true for YouTube videos aimed at children. The ones that frighten and excite them will always perform the best. When we let demand-driven algorithms guide our consumption, we end up with fake news farms, USB pregnancy sticks, and ElsaGate.
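
    The mechanism is simple enough to sketch in a few lines of code. What follows is a toy illustration of the principle, not YouTube’s or Facebook’s actual systems; the video titles and “shock appeal” numbers are invented for the example. A ranker that optimizes only for clicks will put the most shocking item on top as soon as that item out-clicks everything else:

        # Toy model (hypothetical data): a ranker that orders videos purely
        # by observed click-through rate will surface the shock content.
        from dataclasses import dataclass
        import random

        @dataclass
        class Video:
            title: str
            clicks: int = 0
            impressions: int = 0

            @property
            def ctr(self) -> float:
                # Click-through rate: the only signal this toy ranker optimizes.
                return self.clicks / self.impressions if self.impressions else 0.0

        # "shock_appeal" stands in for how likely a viewer is to click,
        # independent of whether the video is appropriate.
        catalog = [
            (Video("Alphabet song"), 0.05),
            (Video("Counting with trains"), 0.06),
            (Video("Superhero injection prank"), 0.30),
        ]

        # Simulate impressions: clicks accrue in proportion to shock appeal.
        random.seed(42)
        for _ in range(10_000):
            video, shock_appeal = random.choice(catalog)
            video.impressions += 1
            if random.random() < shock_appeal:
                video.clicks += 1

        # Rank purely by engagement: the shock video wins every time.
        for video, _ in sorted(catalog, key=lambda pair: pair[0].ctr, reverse=True):
            print(f"{video.title}: CTR {video.ctr:.2%}")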

    Thinking about the distance between YouTube’s business and its public image, I was reminded of the early days of television. In 1961, Newton Minow, the new chairman of the Federal Communications Commission, gave a historic speech to the National Association of Broadcasters. He warned that television was becoming a “vast wasteland,” a medium rotten with “blood and thunder, mayhem, violence, sadism, murder.” Over the next decade, Minow was instrumental in carving out space for public television and educational programming that was intended to be safe and edifying for children, including iconic series like Sesame Street. The past year on YouTube shows what happens when the structure is the opposite: accessible monetization at massive scale, combined with a very young audience and largely automated editorial oversight.

    YouTube is adding staff and tightening rules

    Videos of iconic children’s characters engaged in violent or sexualized behavior are still trivially easy to find. To YouTube’s credit, none of the clips I reviewed this morning contained advertising. YouTube might not be able to stamp out this content entirely, but if it can remove the economic incentive, the supply will dry up.

    YouTube has vowed to ramp up its protections, increasing its moderation team by 25 percent to more than 10,000 employees and turning the machine learning techniques it uses to identify extremist content toward hate speech and kids’ videos. It has promised to strengthen the review process that determines which videos can make money from advertising, and to help small creators whose income has been hurt by waves of demonetization.

    2018 will test whether these new efforts can effectively regulate the more than 400 hours of video uploaded to the service each minute. If scandalous content keeps slipping through the cracks, will advertisers hold YouTube accountable in a meaningful way? And if they don’t, will regulators follow Minow’s example and demand that more be done to police what is arguably the world’s most popular platform for children’s entertainment?