YouTube has been cracking down on extremist content over the past couple of months, and today the company is announcing some details on its progress along with a few additional policy changes. One of the biggest: YouTube will now begin limiting the reach of videos that contain “controversial religious or supremacist content” but don’t strictly violate the website’s hate speech guidelines.
Videos that get flagged by users but, on review, aren’t deemed to be in violation of the site’s rules may now be subject to several new restrictions. That includes being placed behind an interstitial (showing some kind of warning), being unable to run ads, and the loss of community features such as comments, likes, and suggested videos.
The changes are an improvement over the status quo, but it’s not entirely clear why YouTube isn’t choosing to ban these videos outright if they’re deemed to be controversial enough to hide. Asked for clarification, a YouTube spokesperson pointed us to a line in an op-ed written by Google’s general counsel, Kent Walker, back in June, when plans for this policy were first announced. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” Walker wrote. He said the videos will “have less engagement and be harder to find” as a result of the changes.
What YouTube will consider to be nearly but not quite in violation of its rules remains unclear. The site’s existing policies say any video that “promotes violence or hatred against individuals or groups based on certain attributes,” such as gender or ethnicity, isn’t allowed. These changes will come to YouTube’s desktop site in the “coming weeks” and to mobile sometime “soon thereafter.”
YouTube also says it’s begun working with 15 additional expert groups, including the Anti-Defamation League, to help flag extremist content. YouTube announced earlier this year that it plans to expand its number of trusted review partners to over 100 — up from around 60 — and this appears to be part of that expansion.
But the thing YouTube says has been most helpful so far is its increased reliance on machine learning to flag extremist content. Since deploying those systems in June, YouTube says it has doubled the number of videos it removes for “violent extremism,” and that 75 percent of those videos were removed before any human flagged them. “Our machine learning systems are faster and more effective than ever before,” the company writes in a blog post.
Along with other social media sites, YouTube has been under pressure to better police the content users publish on its platform. The EU is close to requiring companies like YouTube to block videos that incite terrorism, and Germany recently passed a law doing much the same thing. YouTube has been addressing the issue in multiple ways, including both ramping up its efforts to remove violent and extremist content and directing people seeking out such videos to playlists that debunk their hateful rhetoric.