
Come into my igloo: online dating as an 8-year-old


Kids are exploring dating in virtual worlds, but everyone is keen to stop them



A Club Penguin heart screencap.

When her 10-year-old daughter announced that she had gone on a date to the park with a boy and he’d asked her to the prom, Rebecca Levey was astounded. "Going to the prom is about seven years away," she wrote in a widely circulated essay about the online dating life of tweens last week.

Fortunately, all of it — the park, the boy, the prom — was merely virtual. The date took place in Fantage, short for "fantastic age," a virtual world made for kids that lets users chat, attend parties, host fashion shows, play mini-games, and even go to school.

Virtual worlds like Fantage are fun, innocent, brightly colored versions of the massively multiplayer online games that teenagers and adults play. They also unintentionally function as online dating sites for the elementary and middle school set.

Online dating for middle schoolers is pretty primitive

Online dating for middle schoolers is pretty primitive. "First, you find a good looking one," one user explained in a YouTube video entitled "How to get a girlfriend on Fantage." "Then you ask her to be ur girlfriend. Tip! Wear good clothes so she’ll say yes!"

Kids pair off by asking "say 123 if u want me" and break up just as abruptly — like, five minutes later — by dropping a quick "brb" and then disappearing to another server. Boyfriends and girlfriends can do little besides chat, go to the virtual pizza parlor, play games against each other, and send heart emoticons back and forth.

Fantage is one of many free and paid virtual worlds that have attracted 66.4 million active users ages seven to 13, according to virtual world research firm KZero. There’s Webkinz, a world based around the virtual alter egos of the plush toys found in stores. There’s Poptropica, a fantasy archipelago where kids ages six to 15 roleplay and complete quests. There’s the Finnish Habbo Hotel, the National Geographic-owned Animal Jam, the Disney-owned Club Penguin, and more.

A user-created instructional video.


Collin Wisniewski, a 15-year-old who lives in New York, was a very active Club Penguin user when he was 10. He and his friends liked it for the games, but he remembers there was quite a bit of flirting going on.

"People would say 'oh, are you a boy or a girl,’ and ‘oh, do you want to come to my igloo,'" he recalled. "I didn’t really put it together at the time."

What did they do in the igloos?

"Do you want to come to my igloo?"

"They would just hang out and play with each other's puffles," Wisniewski said, referring to the small, furry amoebas that Club Penguin users keep as pets. "There really isn't that much to do."

This kind of kiddie flirtation is pretty harmless, but most sites prohibit any type of dating. Club Penguin does not consider dating to be age-appropriate for its seven-to-14-year-old audience, and Fantage goes so far as to replace the words "boyfriend" and "girlfriend" with "friend" in chat.

There are two reasons for the prohibition. The first is that some parents are uncomfortable with the idea of their young children engaging in any type of sexual role playing, no matter how innocent. The second is that these sites do attract pedophiles.

"It's like an online playground, but instead of being on the swings and climbing on the jungle gym, they’re pretending to date," said Sharon Duke Estroff, a writer and mother of four. "Look, it’s sweet on a certain level, but on another level it’s problematic."

Estroff has dived into popular virtual communities including Club Penguin, Poptropica, and Star Doll, uncovering things like cyberbullying, classism, and over-aggressive advertising. But of all the things that parents might disapprove of, she found the crude dating life to be the most unsettling. The worst was Barbie Girls, which "was really, really filled with it, and it seemed like there were a lot of creepy characters there," she told The Verge. (The worst thing we discovered on our own was probably this hardcore Animal Jam wolf sex tape.)

"It’s sweet on a certain level, but on another level it’s problematic."

Hence the ongoing censorship war between the kids and the sites' moderators. Kids can be pretty creative when it comes to "dictionary dancing" to get around word filters. This is how "valentine" becomes "val," then "vuv," then "buv," as each new term gets banned, a rapid maneuvering reminiscent of Chinese netizens circumventing the Great Firewall.

For moderators, it's a balancing act. If the site gets too draconian — for example, by banning the "heart" emoticon — users will rebel. If the site gets too lenient, parents will forbid their kids from playing. Most sites use a combination of semantic processing, user reporting, and human moderators in order to catch flirtation and sexual overtures, as well as things like cyberbullying and self-harm.

Crisp Thinking is a third party moderation company that handles WeeWorld, Moshi Monsters, Bin Weevils, and other sites. "The tricky bit comes with kids being kids and role playing," chief technology officer Peter Maude told The Verge. "If you are going to allow it you are kind of slipping into a very gray area as to what is allowable, especially in the eyes of a guardian. But the key is to spot the real threats... especially on sexual predation."

Cybersex on Animal Jam.

The automated part of Crisp Thinking’s system watches live chats and builds "profiles" of each user. The system has a lot of false positives, so it takes a few alerts to bump up a user to the level where a moderator will be asked to intercede. Oftentimes the violating user will be silenced or kicked off the site, and the knowledge that they’re being watched is enough to make them go away, he said.
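The escalation logic Maude describes — tolerating a few false positives per user before involving a human — can be sketched roughly as follows. This is a hypothetical illustration, not Crisp Thinking's actual system: the threshold value, the flagged phrases, and the `check_message` function are all invented for the example.

```python
from collections import defaultdict

# Assumed threshold: Crisp Thinking's real numbers aren't public.
ALERT_THRESHOLD = 3

# A toy stand-in for the real system's semantic red-flag detection.
FLAGGED_PHRASES = {"meet me", "what school", "home alone"}

# Per-user "profile": a running count of alerts, as in the article.
alerts = defaultdict(int)

def check_message(user_id: str, message: str) -> str:
    """Return an action for this message: 'allow', 'log', or 'escalate'."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in FLAGGED_PHRASES):
        alerts[user_id] += 1
        if alerts[user_id] >= ALERT_THRESHOLD:
            return "escalate"  # hand this user's profile to a human moderator
        return "log"  # record the hit, but don't interrupt play yet
    return "allow"
```

The point of the running count is exactly the one Maude makes: a single hit is usually noise, so no one message triggers a moderator, but a pattern of hits does.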

Crisp Thinking’s systems vary by client. It used to be that a site needed a moderator for every 2,500 concurrent users, Maude said, but with technology, that number can be reduced to one moderator for every 12,500 users. (A competitor, Inversoft, says it’s better to look at the volume of messages, recommending a site have at least one moderator online 24/7 for up to 10 million chat messages per month.)
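As a back-of-the-envelope comparison of the two staffing rules Maude cites, here is the arithmetic for a hypothetical world with 50,000 concurrent users (the function name and the 50,000 figure are invented for illustration):

```python
def moderators_needed(concurrent_users: int, users_per_moderator: int) -> int:
    # Round up: even a handful of leftover users still needs a moderator.
    return -(-concurrent_users // users_per_moderator)

# Old rule of thumb: one moderator per 2,500 concurrent users.
print(moderators_needed(50_000, 2_500))   # 20 moderators

# With automated filtering: one per 12,500 concurrent users.
print(moderators_needed(50_000, 12_500))  # 4 moderators
```

That five-fold reduction is the business case for automated systems like Crisp Thinking's.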

Word filtering forces users into using code that’s more difficult to crack

Crisp Thinking also uses blacklists and whitelists of forbidden and permissible words, a practice employed by almost all virtual worlds aimed at young users. Whitelisting helps keep things safe and also helps sites comply with the Children's Online Privacy Protection Act, which forbids sites from collecting personal information from children, whether deliberately or by accident.

However, whitelisting has limited power. "What you will find is that kids are very inventive and quite quickly they'll start to come up with their own codes," Maude said. "‘Tree’ becomes a replacement for ‘three.’ ‘Hive’ becomes ‘five.’ It becomes a game. If I try and block the ‘f’ word, ‘truck’ becomes a replacement for ‘fuck.’ If I block the word ‘bitch,’ ‘bench’ becomes a replacement for the word ‘bitch.’ And it just goes on and on and on."
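A minimal sketch makes the leak Maude describes concrete. Both filtering modes below are toy versions (the word lists and function names are invented for the example), and both happily pass the coded substitution "truck" — the word itself is innocent, so the meaning kids assign to it slips through either way:

```python
# Toy word lists for illustration only.
BLACKLIST = {"fuck", "bitch"}
WHITELIST = {"hi", "want", "to", "play", "tree", "hive", "truck", "bench"}

def blacklist_filter(message: str) -> bool:
    """Allow the message unless it contains a banned word."""
    return not any(word in BLACKLIST for word in message.lower().split())

def whitelist_filter(message: str) -> bool:
    """Allow the message only if every word is pre-approved."""
    return all(word in WHITELIST for word in message.lower().split())
```

Whitelisting is the stricter mode — anything unrecognized is blocked — which is why it also protects against accidental exchanges of personal information. But neither mode can block a code word that doubles as an ordinary, approved word.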

The danger with word filtering is that you force users into using code that’s more difficult to crack, he said. "The only game is not to play. If you overfilter, you push people off-piste and now you're into languages you don’t understand and cultural complexities for how people invent words," he explained. "It becomes futile."

It’s more important to focus on the really dangerous scenarios

When you’re dealing with sites that have tens of thousands of concurrent users, it’s necessary to prioritize, Maude said. Crisp Thinking focuses on the really dangerous scenarios in which kids are exchanging personally identifiable information, arranging a place to meet, or being "groomed" by a predator, he said. Questions like "when do your mom and dad get home" are typically at the top of the list of red flags. Crisp Thinking even gets "grooming scripts" from the UK police in order to stay up on pedophile slang.

An instructional video from a Club Penguin user.

Club Penguin is one of the most popular kiddie worlds, with around 200 million registered users and an average age between 11 and 12, according to KZero. Bought by Disney in 2007, Club Penguin does its moderation in-house. Its system uses whitelists, word filters, and the other known tricks, but it goes a bit further.

Club Penguin players can’t tell when their messages are being blocked, for one thing, online safety manager Gerard Poitras told The Verge. Moderators also immerse themselves in the game, going undercover to monitor slang trends and look for new phrases.

"Our moderators are working around the clock," Poitras said. "They're in-game all the time. We're able to get a very good sense of what's going on in the environment and in the audience."

There’s still something about virtual dating that just bugs parents

Poitras said Club Penguin has "hundreds" of moderators worldwide, which he believes is one of the largest, if not the largest, moderation staffs among comparable sites.

"This site is for younger children, and we do a lot of things to stop that sort of behavior for the audience," he said, when asked about dating within Club Penguin.

Ultimately, stopping predators from grooming children is much more important than preventing kids from pretend dating. But virtual dating still bugs parents, Estroff said, and it’s disturbing that the activity seems to flourish in online worlds made for children. By contrast, there’s little e-dating culture on Instagram, the virtual community she’s currently studying.

"It’s just the idea of these little kids," she said. "They’re girl scouts by day and hanging out with a boy in an igloo at night."