YouTube is taking its next step in countering extremism and terrorist content on its platform. Today the company announced that effective immediately, YouTube will respond to certain English-language keyword searches by displaying playlists of pre-existing videos on the site that debunk and discredit "violent extremist recruiting narratives" from the Islamic State and other groups.
This strategy is called the Redirect Method, and the goal is to reach people who might be feeling isolated and who are at risk of being radicalized through hours of absorbing violent extremist messaging and propaganda online. Jigsaw, a subsidiary of Alphabet, collaborated with Moonshot CVE to develop the counter-messaging approach as a means of applying technology to pressing global threats. Now, whenever YouTube recognizes keywords and phrases that are common among people in this vulnerable position, it will display a list of curated videos meant to directly confront and debunk extremist mythology.
But how will YouTube know whether it's actually working and changing minds? The company says it will "measure success by how much this content is engaged." An earlier pilot of the Redirect Method led to 320,000 individuals viewing "over half a million minutes of the 116 videos we selected to refute ISIS's recruiting themes."
YouTube is working closely alongside non-governmental organizations (NGOs) to create additional video content — some separate from the Redirect Method — that could get through to potential recruits "at different parts of the radicalization funnel."
YouTube’s effort will be expanded to include other languages over the coming weeks, and machine learning will be used to dynamically update the list of keywords that trigger these anti-terrorist playlists. The company is also planning to collaborate with Jigsaw to expand the Redirect Method and increase its effectiveness inside Europe.
Last month, YouTube announced four steps it will undertake to combat extremist content online. Those were:
- Increase use of technology to help identify extremist and terrorism-related videos.
- Increase the number of independent experts in YouTube’s Trusted Flagger program.
- Take a tougher stance on videos that don’t clearly violate YouTube policies (e.g. videos that contain inflammatory religious or supremacist content).
- Expand role in counter-radicalization efforts.