YouTube has terminated more than 400 channels and deleted tens of millions of comments in response to concerns from creators, users, and advertisers over videos being used to exploit children.
The details come from YouTube’s creator outreach team in response to a video from commentator Philip DeFranco published yesterday evening. The team’s statement said that “all of us at YouTube” are working on the problem, and that “we are continuing to grow our team in order to keep people safe.” YouTube has also been reporting comments and accounts to law enforcement, as it is required to do under federal law.
Advertisers and creators have responded over the past couple of days, after a video highlighting the issue gained widespread attention. That video, from Matt Watson, demonstrated how searching for terms like “bikini haul,” a genre in which women model swimsuits they’ve purchased, can lead to videos of children with predatory messages in the comment sections. Since the video came out, companies including Epic Games, Nestlé, and Disney have pulled ads from the platform. Others, like Grammarly and Peloton, have asked YouTube to investigate further.
A YouTube spokesperson previously told The Verge that the team “took immediate action by deleting accounts and channels, reporting illegal activity to authorities, and disabling violative comments” when the video first caught attention.
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson said at the time. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
While many people are criticizing YouTube, other creators have come out in defense of the team. DeFranco said in his video that “this is something that YouTube has been consistently fighting” over the years. In 2017, the company instituted community guidelines specifically addressing predatory behavior on videos featuring children, and it has since given advertisers more control over where their ads are placed. This isn’t necessarily a YouTube problem, DeFranco said, so much as an issue with the current online landscape.
“Once they were made aware of the offending content, they handled the situation,” DeFranco said. “Which, again, is why it’s important that instead of saying, ‘YouTube allows this and they’re happy about it’ — because once again that is an insane argument — the best thing we can do is report disgusting monsters like we would anywhere else on the internet.”