Microsoft has released a new tool for identifying child predators who groom children for abuse in online chats. Project Artemis, based on a technique Microsoft has been using on the Xbox, will now be made available to other online companies with chat functions. It comes at a time when multiple platforms are dealing with child predators targeting kids for sexual abuse by striking up conversations in chat windows.
Artemis works by recognizing specific words and speech patterns and flagging suspicious messages for review by a human moderator. The moderator then determines whether to escalate the situation by contacting police or other law enforcement officials. If a moderator finds a request for child sexual exploitation or images of child abuse, the National Center for Missing and Exploited Children will be notified for further action.
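The flag-then-escalate workflow described above can be illustrated with a toy sketch. Everything here is hypothetical: Microsoft has not published Artemis's internals, so the patterns, class names, and scoring below are illustrative stand-ins, not the actual technique.

```python
import re
from dataclasses import dataclass, field

# Hypothetical risk patterns -- illustrative only; the real system's
# features and models are not public.
GROOMING_PATTERNS = [
    r"\bare you (home )?alone\b",
    r"\bdon'?t tell (your )?(mom|dad|parents|anyone)\b",
    r"\bsend (me )?a (photo|pic|picture)\b",
]

@dataclass
class ChatFlagger:
    """Flags suspicious messages for review by a human moderator."""
    review_queue: list = field(default_factory=list)

    def score(self, message: str) -> int:
        # Count how many risk patterns the message matches.
        return sum(bool(re.search(p, message, re.IGNORECASE))
                   for p in GROOMING_PATTERNS)

    def check(self, user_id: str, message: str) -> bool:
        # Any match routes the message to a human reviewer; the tool
        # itself never decides to contact law enforcement.
        if self.score(message) > 0:
            self.review_queue.append((user_id, message))
            return True
        return False

flagger = ChatFlagger()
flagger.check("user42", "What's your favorite game?")         # not flagged
flagger.check("user42", "Are you alone? Don't tell anyone.")  # flagged
print(len(flagger.review_queue))  # → 1
```

The key design point mirrored here is the human in the loop: automated matching only queues messages, and a moderator makes the judgment call about escalation.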
“Sometimes we yell at the platforms — and there is abuse on every platform that has online chat — but we should applaud them for putting mechanisms in place,” says Julie Cordua, CEO of nonprofit tech organization Thorn, which works to prevent online sexual abuse of children. “If someone says, ‘oh we don’t have abuse’ I’ll say to them, ‘well, are you looking?’”
In December, The New York Times found that online chat platforms were fertile “hunting grounds” for child predators who groom their victims by first befriending them and then insinuating themselves into a child’s life, both online and off. Most major platforms are dealing with some measure of abuse by child predators, including Microsoft’s Xbox Live. In 2017, as the Times noted, a man was sentenced to 15 years in prison for threatening children with rape and murder over Xbox Live chat.
Detection of online child sexual abuse and policies for handling it can vary greatly from company to company, with many of the companies involved wary of potential privacy violations, the Times reported. In 2018, Facebook announced a system to catch predators that looks at whether someone quickly contacts many children and how often they’re blocked. But Facebook also has access to much more data about its users than other platforms might.
Microsoft’s tool is important, according to Thorn, because it’s available to any company using chat and helps to set an industry standard for what detection and monitoring of predators should look like, helping with the development of future prevention tools. Chats are difficult to monitor for potential child abuse because there can be so much nuance in a conversation, Cordua says.
Child predators can lurk in online chat rooms to find victims much like they would offline, but with much more immediate access, says Elizabeth Jeglic, a professor of psychology at John Jay College of Criminal Justice in New York who has written extensively about protecting children from online sexual abuse, in particular, the often subtle practice of grooming. “Within 30 minutes they may be talking sexually with a child,” she says. “In person it’s harder to get access to a child, but online a predator is able to go in, test the waters and if it doesn’t work, go ahead and move on to the next victim.”
It doesn’t stop with one platform, Cordua adds. “They’ll try to isolate the child and will follow them across multiple platforms, so they can have multiple exploitation points,” she says. A predator may ask a child for a photo, then ratchet up the demands to videos, increasing the level of sexual content. “The child is racked with guilt and fear, and this is why the predator goes across platforms: he can say ‘oh I know all your friends on Facebook, if you don’t send me a video I’ll send that first photo to everyone at your junior high.’”
Artemis has been in development for more than 14 months, Microsoft says, beginning in November 2018 at a Microsoft “360 Cross-Industry Hackathon,” which was co-sponsored by two children’s protection groups, the WePROTECT Global Alliance and the Child Dignity Alliance. A team including Roblox, Kik, Thorn, and The Meet Group worked with Microsoft on the project. It was led by Hany Farid, who developed the PhotoDNA tool for detecting and reporting images of child sexual exploitation online.
Some of the details about how the Artemis tool will work in practice are unclear, however, and are likely to vary depending on which platform is using it. Microsoft has not said whether Artemis will work with chat programs that use end-to-end encryption, or what steps will be taken to prevent potential PTSD among the moderators who review flagged content.
Thorn will be administering the program and handling licensing and support to get participating companies onboarded, Microsoft says.
Cordua says that while Artemis has some initial limitations — it currently only works in English — the tool is a huge step in the right direction. Since each company that uses the tool can customize it for its own audience (chats on gaming platforms will obviously differ from those on social apps), there will be ample opportunity to adapt and refine how the tool works. And, she says, it’s about time platforms moved away from the failed practice of self-policing and toward proactive prevention of child grooming and abuse.
In its blog post, Microsoft adds that the Artemis tool is “by no means a panacea” but is a first step toward addressing what it calls the “weighty” problem of online grooming of children by sexual predators.
“The first step is we need to get better at identifying where this is happening,” Cordua says. “But all companies that host any chat or video should be doing this or they are complicit in allowing the abuse of children on their platforms.”