In three months, YouTube received nearly 110,000 appeals from creators who were frustrated that their videos were taken down — but fewer than a quarter of those videos were later reinstated.
The data comes from YouTube’s new community guidelines report, which marks the first time the company has shared information on appeals. One of the more frustrating aspects of working as a YouTube creator is having videos erroneously taken down and then navigating YouTube’s appeal process. Creators have long asked for more transparency into that process.
YouTube says that it removed more than 5 million videos between October 2019 and December 2019. Of those videos, around 109,000 removals were appealed. YouTube reinstated around 23,000 videos, according to the report. The vast majority of those videos were removed automatically (meaning a human did not oversee the removal), and more than 60 percent were removed before the video collected any views. YouTube’s report also states that just over 2 million channels were removed. More than 80 percent of these channels were considered spam, according to the report.
“Our team is focused on accurately and consistently enforcing our policies, and one of the ways we hold ourselves accountable and measure our success is by making sure that users can easily appeal our decisions and monitoring the rate at which they do,” a YouTube spokesperson told The Verge.
Creators filed appeals on “less than two percent of the videos we removed last quarter,” according to the YouTube representative. The company “overturned less than half of one percent of videos we removed last quarter.” The report doesn’t specify how many videos were removed because of copyright infringement, one of the biggest issues within the creator space.
Of the videos that were removed, more than 50 percent were taken down for spam or deceptive practices, 15 percent for child safety, and 13 percent for nudity or sexually explicit content. Hateful and abusive content made up 2.9 percent of all videos removed. Just under 33,000 videos (0.6 percent) were removed for cyberbullying and harassment — an area YouTube has had to answer for over the last few years. YouTube updated its policy to prohibit creator-on-creator harassment, which became a big talking point last summer.
“This is just one more step towards providing transparency into the work we do to quickly and consistently enforce our policies,” the YouTube representative said. “We’re working to add more exhibits to this report over the course of 2020.”