
YouTube reportedly discouraged employees from reporting fake, toxic videos


In pursuit of more video views


Illustration by Alex Castro / The Verge

For years, YouTube ignored its employees’ pleas to address and take down toxic videos in a quest to boost engagement, reports Bloomberg. According to more than 20 former and current YouTube staffers, employees offered proposals to curb the spread of videos containing disturbing or extremist content and/or conspiracy theories, but leadership was reportedly more interested in boosting engagement than in heeding those warnings.

One proposal offered a way to keep content that was “close to the line” of violating policies on the platform but remove it from the recommended tab. YouTube rejected that suggestion in 2016, a former engineer said, and instead continued to recommend videos regardless of how controversial they were. According to employees, the internal goal was to reach 1 billion hours of views a day.

Employees were discouraged from even acknowledging the existence of toxic content

“I can say with a lot of confidence that they were deeply wrong,” the engineer told Bloomberg. (YouTube implemented a policy in January 2019 that’s similar to what he initially suggested.)

Employees outside of the moderation team were also reportedly discouraged from searching YouTube for toxic videos, because lawyers said the company would face greater liability if there was proof that staffers knew and acknowledged those videos existed.

At least five senior employees have left YouTube over its unwillingness to tackle the issue. As described by another former staffer, YouTube CEO Susan Wojcicki would “never put her fingers on the scale,” saying that her view was to just “run the company” instead of dealing with the onslaught of misinformation and dangerous content. A YouTube spokesperson said the company began taking action in late 2016 and started to demonetize channels that promoted harmful content in 2017. However, as recently as the end of that year, fewer than 20 staffers were employed on its “trust and safety” team. You can read the full Bloomberg report for more anecdotes of how staffers struggled to prevent controversial videos from going viral.

In 2018, YouTube attempted to curb fake news and conspiracies from spreading on its platform with an information box, and this year, it began to pull ads from potentially harmful content. Still, even if YouTube can prevent controversial videos from spreading, it will eventually have to wrestle with the core issue of content moderation as toxic content remains rampant on the site.