Inside Elsagate, the conspiracy-fueled war on creepy YouTube kids videos

Illustration by Alex Castro

In June, a moderator from a little-known subreddit who goes by TheLocalGamer decided to sound the alarm on something strange he was seeing across YouTube. The title of the post was simple: “I think it’s time more people knew about Elsagate.”

“Elsagate is a conspiracy in the works, being hosted on YouTube,” TheLocalGamer wrote. At the time, r/ElsaGate had just a handful of subscribers and even fewer true believers. But TheLocalGamer proceeded to lay out the problem, hiding in plain sight on some of YouTube’s most popular children’s channels: creators were drawing children in with familiar characters — most notably Elsa from Frozen, but also Spider-Man and the Joker — then arranging them in bizarre situations involving cheating spouses or public urination. Digging deeper, the mod said he had seen videos with innocent thumbnails that clicked through to “videos of children giving handjobs to old men” and other depravity. He didn’t know who was responsible, but he knew something was wrong.

“This is a conspiracy that’s gonna be plastered on every news site before you know it.”

Just a few months after the Pizzagate debacle, Elsagate emerged with a very similar playbook: a group of amateur sleuths using open-source intelligence to fight a hazily defined evil threatening helpless children. “This is a conspiracy that’s gonna be plastered on every news site before you know it,” TheLocalGamer wrote. “The evidence is right there.”

Five months later, that prediction has come true. For months now, YouTube has been dealing with the fallout from inappropriate children’s videos. The problems went mainstream thanks to dueling reports from The New York Times and artist James Bridle, with Bridle describing the videos as “a kind of infrastructural violence.” Just as that furor was dying down, new reporting showed a pattern of sexual comments being left on YouTube videos of children. YouTube has scrambled to delist videos and institute new restrictions, but the episode has raised difficult questions about how safe children are on YouTube. With advertisers pulling ads, just this week YouTube CEO Susan Wojcicki announced a new moderation push, singling out inappropriate children’s content and bringing the total number of moderators up to 10,000.

For members of Elsagate, it’s both an unexpected victory and a surprising seal of approval from the mainstream. “It’s almost a sense of validation,” says LFodder, a moderator chosen by the forum leaders to speak to The Verge. LFodder joined the community this summer after seeing a post reach his Reddit front page, part of a rush of more than 25,000 subscribers who have arrived since TheLocalGamer’s call to action. LFodder sees the subreddit as an ideal way to get the word out. “We want to spread this to as many parents as possible,” he says. “Parents should not have kids sitting on YouTube all day.”

“It’s almost a sense of validation.”

The Elsagate subreddit wasn’t the first community to raise the alarm around inappropriate children’s videos on YouTube — outlets like Tubefilter have been reporting on versions of the problem since February. And in response, this past June YouTube changed its guidelines to specifically ban children’s characters in inappropriate situations. Still, that ban has had little effect on the new format, and in the months since, r/elsagate has become a reliable place for amateur investigators to raise the alarm on videos that slipped through YouTube’s moderation system.

“They’re surfacing stuff that people wouldn’t otherwise find,” says Tubefilter editor Joshua Cohen. “A lot of people are getting a lot of leads from there.”

A YouTube video flagged by r/elsagate, which has since been age-restricted.

According to r/elsagate, the many bizarre kids’ videos found on YouTube — videos that depict characters being buried alive, kidnapped, or stuck with needles — traumatize children on a massive scale. “They’re using what children find attractive and what YouTube will promote more to children,” LFodder says. “You have kids, maybe four years old, bombarded all day long in some cases, with these videos of chopping people’s fingers off and burning people and defecating on people.”

It’s still unclear how traumatic the videos really are, but the impression has been driven home by some members’ testimonials. In one post, a parent says her autistic four-year-old is still affected by Elsagate videos more than two months after she cut off access to YouTube. “He still obsessively repeats the strange, distorted screeching and screaming sounds heard in many videos, in deep and growly voices,” she wrote. “He still cries for YouTube and has actual meltdowns over not being allowed to watch it.”

Many r/elsagate members firmly believe that there is some darker strategy at work within these videos that has yet to be revealed. “I believe there is something much more sinister going on with these videos, like mind control or behavior modification,” one r/elsagate member told me. “No evidence to back this up, but those sorts of videos have existed in the past and been used by governments for torture and brainwashing.”

“Those sorts of videos have existed in the past and been used by governments for torture and brainwashing.”

When asked for comment on Elsagate, YouTube emphasized its general commitment to moderating its community. “Content that misleads children is unacceptable to us,” the company said in a statement. “Over the last few months we’ve taken deliberate steps to tackle many of the emerging challenges around family content on YouTube. We’re committed to getting this right and are increasing both human and engineering resources to tackle this ever-evolving landscape.”

As in any conspiracy-focused community, moderators say keeping members on topic is a challenge. In response, the subreddit has developed a set of nine rules. Posts must be directly related to Elsagate, according to Rule 3, with no sidetracks into adjacent theories about pedophile rings or MK Ultra. “Speculating is okay,” says Rule 4, “inciting witch-hunts and knowingly slandering someone is not.” Rule 8 forbids vigilantism, or “attempting to combat Elsagate in real life.” Rules can’t stop people from posting, but when members try to call out individuals, the posts are usually shuttered by mods before they can do too much damage.

One theory currently making its way through r/elsagate has to do with how some of these videos use colors. Naming colors is a classic kids’ game, but some users have found the same pattern of colors used over and over: red, blue, yellow, pink, green. Is it some kind of code? One member made the connection to the use of color associations in cult abuse: were Elsagate videos triggering some kind of pre-programmed response in children? Often, gibberish comments appear at the bottom of these videos — either bots trying to game the algorithm or, if you believe 4chan, some kind of coded message.

“I saw the video where some guy decrypted the code on a hidden YouTube video so that each color had a corresponding command/theme assigned to it,” one poster wrote. “Really creepy shit.”

Most of the forum dismissed the theory as too far-fetched — but it’s hard to dismiss it entirely. With the basic premise of Elsagate’s concerns validated in the mainstream, why not push farther?

Another video surfaced on r/elsagate, with the caption “This should be called Ding Dong Hell!”

“It’s very difficult to draw that line,” LFodder says, making an explicit comparison to Pizzagate. “The reason we’re being so cautious is just because of past experiences with pedophilia, I guess you could say. We don’t want to make any association with wild speculations that might discredit the movement.” But while the subreddit community is governed by moderators and rules, similar theories sprawl out on 4chan and other online spaces without the same moderating influence. The more compelling ones often end up posted back on the subreddit for critique.

LFodder says it’s a constant balancing act: spreading the message to as many parents as possible, while trying to avoid the dangerous impulse to exaggerate the threat and spark physical action. “We’re not official investigators. We’re not official law enforcement. Getting a lot of people getting upset about a set of videos can lead to a lot of damage,” he says. “So we’re very, very careful keeping that in check.”