
YouTube announces four new steps that it’s taking to combat extremist content

Automated systems, additional flaggers, and more

In recent months, social media platforms have come under pressure to do more about hosting content that promotes violent extremism and terrorist propaganda. In an op-ed published today in the Financial Times, YouTube has outlined four new steps that it’s taking to confront extremist activity on its platform.

In the op-ed, Kent Walker, senior vice president and general counsel of Google, writes that YouTube has been working with governments and law enforcement agencies to identify and remove this content, and has invested in systems that help with that task. Despite those efforts, he acknowledges that more has to be done across the industry, and quickly.

The first of the four steps is expanding the use of the company’s automated systems to better identify terror-related videos, using machine learning to “train new ‘content classifiers’ to help us more quickly identify and remove such content.”
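Walker’s op-ed doesn’t describe how these classifiers are built, but a minimal sketch of the general technique, a supervised text classifier trained on human-labeled examples, might look like the following. Everything here is hypothetical: the training data, the features, and the choice of scikit-learn as a stand-in for whatever Google uses internally.

```python
# A minimal sketch of a "content classifier": a supervised model trained
# on human-labeled examples, then used to score new uploads.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples, e.g. video transcripts or metadata that
# human reviewers have marked as violating (1) or acceptable (0).
texts = [
    "recruitment message praising violent attacks",
    "call to join a terrorist organization and commit violence",
    "news report analyzing a recent attack",
    "documentary on the history of extremist movements",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new upload. In practice, high scores would be routed to human
# reviewers rather than triggering automatic removal.
score = model.predict_proba(["message urging viewers to join the group"])[0][1]
print(f"violation score: {score:.2f}")
```

The point of a system like this is triage: the classifier surfaces likely violations quickly so that human reviewers can make the final call, which is where the expanded flagger program described next comes in.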

The company is also expanding its pool of Trusted Flagger users, a group of experts with special privileges to review content flagged as violating the site’s community guidelines. Walker notes that the company is almost doubling the size of the program “by adding 50 expert NGOs to the 63 organisations who are already part of the programme,” and Google will provide those groups with additional grant money. The expanded effort will allow the company to draw on specialist organizations to target specific types of videos, such as those dealing with self-harm and terrorism.

The third step is to take a harder line on videos that don’t quite violate community standards, such as “videos that contain inflammatory religious or supremacist content.” These videos won’t be removed, but they will be hidden behind a warning and won’t be allowed to draw ad revenue, a change that follows a revamp of the company’s ad policy earlier this year.

Finally, the company will do more to support counter-radicalization efforts, building on its Creators for Change program to redirect users targeted by extremist groups such as ISIS toward counter-extremist content. “This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining.”

The move comes following increased pressure from European regulators

Walker goes on to explain that YouTube is working alongside companies such as Facebook and Twitter to develop tools and techniques that support broad anti-terror efforts online. These new steps come weeks after a deadly terror attack in London, which prompted British Prime Minister Theresa May to call for new regulations on internet companies. European regulators have also been weighing tougher options: Germany is considering a law that would levy huge fines against social media companies that don’t quickly remove extremist content, while the European Union recently approved a new set of proposals that would require companies to block such content.