YouTube’s CEO explains why it leaves up ‘controversial or even offensive’ videos

Illustration by Alex Castro / The Verge

YouTube must leave up some videos that are “controversial or even offensive” in order to remain an open platform, YouTube CEO Susan Wojcicki said today.

In her quarterly letter to creators, Wojcicki addressed YouTube’s perpetual struggle with troubling content and how to moderate it, saying that it’s worthwhile for the platform to allow videos the company disagrees with. “A commitment to openness is not easy,” Wojcicki wrote. She says that “hearing a broad range of perspectives ultimately makes us a stronger and more informed society.”

YouTube has long struggled with how to police and limit the spread of troubling videos, from containing conspiracy theories to stopping radicalization to limiting harassment and bullying. Most recently, YouTube was widely criticized for its handling of a situation in which a conservative YouTube commentator repeatedly made homophobic comments about a Vox host. YouTube ultimately decided that the homophobic language was acceptable because it was framed as commentary, and it faced considerable backlash in response from the LGBTQ community, both on the platform and within the company. (Disclosure: Vox is a publication of Vox Media, which also owns The Verge.)

Wojcicki says that problematic videos make up “a fraction of one percent” of the content on YouTube, but they have an outsized impact in terms of potential harm and trust. That’s led to people believing that YouTube has no incentive to remove troubling videos because they lead to more views. “This is simply not true,” Wojcicki wrote. In reality, she says, the lack of trust hurts YouTube’s relationship with advertisers.

The blog post doesn’t include any changes to YouTube’s policies. Instead, Wojcicki outlined a new way that YouTube is framing its existing set of goals to keep the platform a positive, healthy space. She calls them the four “R”s: removing prohibited content quickly, raising up authoritative voices, reducing the spread of problematic content, and rewarding trusted creators. Together, those are supposed to help YouTube earn trust from creators and advertisers who have grown concerned by its actions (and, at times, inaction).

In order to keep the site vibrant, YouTube has to “strike the right balance between openness and responsibility,” Wojcicki concluded.

The concerns around YouTube moderation aren’t going away anytime soon. YouTube is still developing and revising policies to prevent major issues — its updated creator-on-creator harassment policy is still in the works, for instance — and bad actors will continue to push against the limits of those rules. This quarter’s letter shows that Wojcicki, at least, knows what’s on creators’ minds, even if she doesn’t have any changes to announce that’ll quickly make things better.


YouTube is on a roll lately with the reiteration that the inmates aren’t running the asylum.

All this controversy would be avoided if YouTube simply enforced its community guidelines consistently and without granting exceptions. (And banned channels in response to violations, instead of demonetizing them.) Hopefully this indicates that YouTube has learned its lesson, but I sincerely doubt it.

The most important thing to keep in mind is that quality, human-nuanced moderation is simply too expensive to maintain at scale while remaining profitable.

That’s not to say it shouldn’t be done. But that’s probably the biggest reason why YouTube (and other social media) relies so heavily on automated moderation. And unfortunately, AI decisions are a black box and can easily conflict with stated policy.

Until AI can be as nuanced as knowledgeable and wise human moderators (or find a way to have humans do it in a humane way)… we will continually have this problem.

Yeah, but when I upload a PRIVATE video of my toddler dancing in front of the TV, it gets flagged for copyright violations because there was a song being played on the TV.

Translation – "It’s easier for us to decide to do nothing."

Thought Experiment: Imagine that you put 1 billion people with radically different identities and lifestyles into a gigantic warehouse and call it a ‘Super Hangout’. Then, you tell everyone to ‘Behave’, and then just leave. Maybe you’ll leave behind some robots that punish people who don’t ‘Behave’, but you’re still completely out of the picture. What do you think will happen?

This is exactly what is happening to YouTube and, to be frank, every other major social network. It’s completely unsustainable, and it’s why Susan seems to be backpedaling here. YouTube is inherently unprofitable and unmoderatable, and honestly I don’t see any of these social networks surviving far into the future, at least not in their current form.

I think it’s all solvable via ‘filters’. Yes, that means people get more disconnected, but people already start from a disconnected place (there are many tribes on Earth), and you’re right, it’s impossible to put everyone together in one place and expect peace.

Reminds me of a Robin Wright-starring dystopian film called ‘The Congress’. I highly recommend it.

I’m unsure what you mean by "filters" in this context, but I’m curious. Would you be willing to elaborate?

Filters for what types of content you do or don’t want to see. Users add tags to any piece of content, like ‘hate speech’, ‘bigotry’, ‘homophobic’ or ‘fake’, and others vote on whether they agree, giving a community consensus on how accurate the tag is.

The tags you personally specify you don’t want to see on the platform, along with a sensitivity setting (on a scale of 1-10) you choose for each tag, determine how strongly such content is filtered from you.

Smart AI / algorithms could help to mitigate vote brigading, and so on.
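The mechanism described above could be sketched roughly like this. Everything here is hypothetical: the class names, the consensus formula, and the way sensitivity maps to a hiding threshold are all just one way the idea might work, not anything YouTube or any real platform implements.

```python
from dataclasses import dataclass, field

@dataclass
class TagVotes:
    """Agree/disagree votes on whether a tag applies to a piece of content."""
    agree: int = 0
    disagree: int = 0

    def consensus(self) -> float:
        """Fraction of voters who agree the tag applies (0.0 to 1.0)."""
        total = self.agree + self.disagree
        return self.agree / total if total else 0.0

@dataclass
class Content:
    title: str
    tags: dict = field(default_factory=dict)  # tag name -> TagVotes

def is_hidden(content: Content, user_filters: dict) -> bool:
    """user_filters maps tag name -> sensitivity (1 = lax, 10 = strict).
    Higher sensitivity lowers the consensus needed to hide the content."""
    for tag, sensitivity in user_filters.items():
        votes = content.tags.get(tag)
        if votes is None:
            continue
        # Sensitivity 10 hides at ~5% consensus; sensitivity 1 at ~95%.
        threshold = 1.0 - sensitivity / 10.0 + 0.05
        if votes.consensus() >= threshold:
            return True
    return False

video = Content("example clip", {"hate speech": TagVotes(agree=80, disagree=20)})
print(is_hidden(video, {"hate speech": 7}))  # 0.80 consensus >= 0.35 threshold -> True
print(is_hidden(video, {"hate speech": 1}))  # 0.80 consensus < 0.95 threshold -> False
```

The brigading problem mentioned above is the hard part: a raw agree/disagree ratio like this is trivially gamed, which is presumably where voter-reputation weighting or anomaly detection would have to come in.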

I hope someone does it. I’ve thought of an entire radical free speech platform based around this, but will probably never work on it.

As far as I know, the ‘Filters’ you are talking about existed back when the internet was still a cluster of forums and IRC chatrooms. Back then, it was rare to see someone who used two or three forums at once, so ideas didn’t spread as quickly. Today, these ‘dual users’ are so high in number that one piece of content can spread like wildfire, which is one of the main problems I see with modern social networks.

As a broadcaster, YouTube should be beholden to the same laws and regulations broadcast TV is. Then again, broadcast TV in the States includes Fox News, so maybe we should default to the same rules the BBC has to follow instead.

Yes we should all follow the example of state run media. Brilliant!

YouTube is under the same regulations as every other broadcaster that doesn’t use public airwaves: none.

The only thing regulating cable is commercial viability of content.

That’s led to people believing that YouTube has no incentive to remove troubling videos because they lead to more views. "This is simply not true," Wojcicki wrote. In reality, she says, the lack of trust hurts YouTube’s relationship with advertisers.

Sorry Susan, but this simply IS true. It doesn’t hurt the relationship with advertisers when YouTube has already heavily curtailed who can even run ad rolls and monetize in the first place. It doesn’t hurt ad revenue when ads aren’t on the troubling content in the first place. They literally run a different set of rules depending on whether A) you are corporately owned or B) you are a "partner" creator.

If it’s A. you can do whatever the hell you want without exception. Rules do not apply.
If it’s B. you can do almost whatever the hell you want. Most rules do not apply.

If neither, you get the book thrown at you for any issue.

Don’t get me started on the ads which break all sorts of rules constantly but there’s no way to report them and YouTube doesn’t care at all.

Not too long ago, posting a video in support of gay marriage would have been considered "controversial" in America. I remember when people used to say "I disapprove of what you say, but I will defend to the death your right to say it".

Apples to oranges – you can’t wield the same stick for every issue. Sorry. Everything doesn’t exist in parity. Some things are far more unacceptable (particularly when they involve threatening the safety or well-being of specific individuals being targeted).

What Susan is trying to communicate is that the community is diverse. So your opinion and beliefs aren’t always going to be shared by the rest. You need to be comfortable with that and show some tolerance.

Maybe for a second stop focusing on how the platform is screwing you and change it by making content you believe reflects the best of it. The best ideas will always rise to the top.

Careful with that common sense you might get banned!

YouTube leaves those videos up because they get clicks, and clicks lead to ad revenue. No other reason.

The stupid thing is they demonetise the videos for the authors for wrongthink, but then they still run advertising that only they get money from. The whole censorship by the left in Silicon Valley is out of control. If it’s demonetised, why are ads still allowed? E.g. Mark Dice, Paul Joseph Watson, etc.

Happens to lefty channels, too.

But nice victim complex there.

YouTube was widely criticized for its handling of a situation in which a conservative YouTube commenter repeatedly made homophobic comments about a Vox host

As usual, Vox/The Verge chooses not to mention that the conservative was demonetized, whereas nothing happened to the Vox host who has urged violence against people he disagrees with.

the Vox host who has urged violence against people he disagrees with.

Try saying that again, but with context.

But they’re demonetizing videos for arbitrary language/content.
