YouTube announces four new steps that it’s taking to combat extremist content

In recent months, social media platforms have been under pressure to do more about hosting content that promotes violent extremism and terrorist propaganda. In an op-ed published today in the Financial Times, YouTube has outlined four new steps that it’s taking to confront extremist activity on its platform.

In the op-ed, Kent Walker, the senior vice president and general counsel of Google, writes that YouTube has been working with various governments and law enforcement agencies to identify and remove this content, and has invested in systems that help with that task. Despite those efforts, he acknowledged that more had to be done in the industry, and quickly.

The first of the four steps is expanding the use of its automated systems to better identify terror-related videos, using machine learning to “train new ‘content classifiers’ to help us more quickly identify and remove such content.”
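Walker’s op-ed doesn’t describe how these classifiers work, but in broad terms a “content classifier” of this kind is a supervised model trained on examples that human reviewers have already labeled. The sketch below is purely illustrative: the training data, labels, and scikit-learn pipeline are assumptions made for the example, not a description of YouTube’s actual systems.

# Illustrative sketch only: a toy text classifier in the spirit of the
# "content classifiers" Walker describes. The data and features here are
# invented for the example; YouTube's real systems are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: video descriptions labeled by human reviewers
# as violating (1) or not violating (0) the policy.
texts = [
    "documentary about conflict reporting and its aftermath",
    "propaganda video calling for violent attacks",
    "cooking tutorial with family recipes",
    "recruitment message urging viewers to join a terror group",
]
labels = [0, 1, 0, 1]

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score a new, unseen description; a high probability would typically route
# the video to human review rather than trigger automatic removal.
print(model.predict_proba(["video urging viewers to carry out attacks"])[0][1])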

The company is also expanding its pool of Trusted Flagger users, a group of experts with special privileges to review flagged content that violates the site’s community guidelines. Walker notes that the company is almost doubling the size of the program “by adding 50 expert NGOs to the 63 organisations who are already part of the programme,” which Google will provide with additional grant money. The expanded effort will allow the company to draw on specialty groups to target specific types of videos, such as self-harm and terrorism.

The third step is to take a harder line on videos that don’t quite violate community standards. This includes “videos that contain inflammatory religious or supremacist content,” for example. These videos won’t be removed, but will be hidden behind a warning, and won’t be allowed to draw ad revenue, which follows a revamp of the company’s ad policy earlier this year.

Finally, the company will do more with counter-radicalization efforts by building off of its Creators for Change program, which will redirect users targeted by extremist groups such as ISIS to counter-extremist content. “This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining.”

Walker goes on to explain that YouTube is working alongside companies such as Facebook and Twitter to develop tools and techniques to support board anti-terror efforts online. These new steps come weeks after a deadly terror attack in London, which prompted British Prime Minister Theresa May to call for new regulations on internet companies. European regulators have also been weighing tougher options: Germany is considering a law that would levy huge fines against social media companies that don’t remove extremist content quickly, while the European Union recently approved a new set of proposals to require companies to block such content.

Comments

Good.

I don’t know about this. I believe that what might push someone on the edge of joining extremist groups to actually join is seeing videos of drone strikes on CNN, not some dude talking in front of a flag on YouTube.

Large media outlets on TV have as much if not more responsibility to "filter" their content because their impact and authority are much wider than a video with 34 views on YouTube that gets taken down after 72 hours.
Also, I really don’t like this approach:

The third step is to take a harder line on videos that don’t quite violate community standards. This includes "videos that contain inflammatory religious or supremacist content," for example. These videos won’t be removed, but will be hidden behind a warning, and won’t be allowed to draw ad revenue

You can’t take a hard line against something that doesn’t quite break the rules. A hard line means black and white: either it does or it doesn’t. Here, they are just drawing a very blurry border between what is and isn’t allowed, and that’s not good for the content creators.
I love YouTube and consume a lot of content on it, but I think more and more that they’re lucky they don’t have any real competition because I feel like they are making some bad business decisions.

It’s just Youtube doing its usual. Keeping its policies as vague as possible to be able to implement them at will.

There are a few of us that believe in freedom of speech. How many of us remain?

You’re free to make any comments you like; they’re free to not let you do that on their platform.

true free speech only exists in the mind

Problem is they keep the rules purposefully and misleadingly flexible to allow for their narrative to seep through.

Wrong, domthomas, as soon as they cross the line of working with police and authorities it becomes censorship, period. You cannot call yourself a private platform while working with government agencies to promote a specific viewpoint and claim "free speech"

While I generally agree that a privately owned service can moderate as it sees fit, once that service allows government agencies to be involved in said moderation at an official level (meaning the Trusted Flaggers program), a line is being crossed, or skirted.

While Google has claimed that Trusted Flaggers can’t remove videos themselves, it’s clear that Google knows said people/groups carry additional weight when making a judgement.

The lack of transparency in the program itself, beyond Google’s claims, just adds fuel to the free speech fire.

The long and short of it is that once the government becomes involved the "It’s our platform" argument fails, and this is pretty damned close.

There’s a difference between freedom of speech and where you’re able to freely express your opinion. In the case of YouTube, your freedom of speech is, generally speaking, limited by its effect on their profitability.

There are few people who understand what freedom of speech actually means. You’re not one of them.

look at the Vsauce livestream, they just disabled live chat but it was soooooooooooooooooo awful, filled with racist nazi comments and so on, i reported 10 comments in 30 seconds

In other news: PewDiePie, currently one of the most influential men among young children, still apes neo-Nazi slogans. Good work, Youtube.

Young children? Are you fucking stupid? Clearly you don’t follow him. Ahaha "Young children". 1st people should keep an eye on their children and not expect a company to do so for them. 2nd have you ever fucking heard of humor? 3rd you definitely consider the WSJ a reputable news source.

Ok, in the spirit of discussion, I’ll ignore the insult.

0) If you have substantial data, I’d love to read about PDP’s audience. The impression I have is that his viewers are mostly children and young teens. And no, I don’t watch his videos.

1) Definitely, and nowhere did I say that looking after children should be Youtube’s task.

2) Saying "Death to all jews" and then saying "no I meant it ironically" isn’t humor.

3) Yes, a room full of journalists is a more reputable news source than a twenty-something who yells at video games for a living. Even if I disagree with most of the WSJ’s staff opinions/ideology.

You are welcome to stop your children from watching him, or anything else on YouTube. That is the price of free speech, allowing opinions of those you disagree with to still be heard. Leave the country and renounce your citizenship if you’d like to see what the alternative is. No? Didn’t think so.

Alright, I did not say, nor imply "Let’s stifle free speech". Youtube wants to combat extremism, PDP more than once slipped nazi talking points into his videos. Ergo: Youtube should do something about that, in my opinion.

I think there’s a typo.

"support board anti-terror efforts online." Did you mean broad?
