
The YouTube shooting makes hard community management questions even harder




Illustration by Alex Castro / The Verge

Yesterday, YouTube experienced its first workplace shooting. Four days after leaving her home in San Diego, a 39-year-old woman named Nasim Aghdam snuck into YouTube’s headquarters with a handgun, shooting three people before taking her own life. We don’t know exactly what drove Aghdam to the attack — whether she was mentally unstable, had a history of violence, or was set off by some tangential event — but it’s clear she had strong feelings about YouTube. According to her family, she hated the service after her channel was demonetized, and police are citing that grudge as a primary motivation for the shooting.

A day after the incident, most of what we know about Aghdam comes from her YouTube videos. They were pulled down in the immediate aftermath of the attack, but scraped copies found their way out; most were stilted music videos or exercise tutorials. The most widely shared video shows her laying out her grievances with YouTube: moderators had age-restricted one of her ab exercise videos as too provocative, yet plenty of more provocative videos she pointed to hadn’t been restricted. She was furious. It didn’t seem fair.


It’s disturbing to watch now, in part because the content itself is so familiar. If you spend time on YouTube, you’ll see lots of videos like this. For as long as the company has been moderating videos, YouTubers have been complaining about it. Strike-based moderation means creators are usually restricted or demonetized rather than kicked off the platform entirely, which makes it easy for them to appeal to their audience and explain how their livelihoods have been cast into doubt. The next step is usually a video announcing that they’re leaving the platform for good, driven off by heavy-handed and arbitrary moderation. In the wake of the shooting, the most extreme broadcasters are now passing around Aghdam’s video as a sign that they were right all along.

In some ways, this kind of chafing is inevitable. YouTube is a massive platform, and its moderation system relies on a combination of user flags, algorithms, and snap judgments by workers who don’t always have time to consider nuance. Balance that against an army of fledgling celebrities, often making small fortunes off their videos, and some conflict is unavoidable. It’s become a staple of YouTube culture, and until this week, there was no indication it could ever turn violent.


In many ways, moderation is the most important thing YouTube does. Two weeks before the shooting, the platform updated its policy on gun videos, prompting a backlash from channels like TheGunCollective that had built hundreds of thousands of subscribers from firing-range demos and build videos. When YouTube was criticized for disturbing children’s content last year, it responded with more aggressive age-restricting of videos deemed inappropriate — the same mechanism that so enraged Aghdam. All of these were positive moves for YouTube, signs that the platform is starting to take responsibility for its impact on users and the world at large. But those moves come at the cost of angry channel owners, who are never happy to see their videos restricted or banned outright. As YouTube takes on more responsibility for more problems, reining in the community becomes a bigger and bigger task.

To be clear, Aghdam bears sole responsibility for what she did, and it’s monstrous to blame the shooting on YouTube moderation — but that hasn’t stopped everyone. Certain corners of the internet are already painting Aghdam as a victim of YouTube’s censorship regime, trying to exploit the shooting to settle old grudges. The hashtag #censorshipkills has taken off in alt-right circles, fitting the shooting into existing gripes about moderation policies. The National Rifle Association has also weighed in, fresh from its fight over YouTube’s gun policies. In a broadcast yesterday, one of the group’s correspondents explained the shooting by saying the company’s moderation policies had “open[ed] them up to a lot of hatred.”

The problem also runs much deeper than political complaints. Like most platforms, YouTube depends on its users: they create the videos and the culture that make it such an interesting place. YouTube’s employees need to keep that community happy, but they also need to shape it, guiding it away from misinformation, exploitation, and hate. When this week started, it was hard enough just to shape that community without driving it away. Now — horribly, unexpectedly — those employees have to be protected from it, too.