
Deepfake bots on Telegram make the work of creating fake nudes dangerously easy

Bots have been used to create more than 100,000 images


Illustration by Alex Castro / The Verge

Researchers have discovered a “deepfake ecosystem” on the messaging app Telegram centered around bots that generate fake nudes on request. Users interacting with these bots say they’re mainly creating nudes of women they know using images taken from social media, which they then share and trade with one another in various Telegram channels.

The investigation comes from security firm Sensity, which focuses on what it calls “visual threat intelligence,” particularly the spread of deepfakes. Sensity’s researchers found more than 100,000 images have been generated and shared in public Telegram channels up to July 2020 (meaning the total number of generated images, including those never shared and those made since July, is much higher). Most of the users in these channels, roughly 70 percent, come from Russia and neighboring countries, says Sensity. The Verge was able to confirm that many of the channels investigated by the company are still active.

The bots’ targets apparently include underage children

The bots are free to use, but they generate fake nudes with watermarks or only partial nudity. Users can then pay a small fee, equivalent to just a few cents per image, to “uncover” the pictures completely. One “beginner rate” charges users 100 rubles (around $1.28) to generate 100 fake nudes without watermarks over a seven-day period. Sensity says “a limited number” of the bot-generated images feature targets “who appeared to be underage.”

Both The Verge and Sensity have contacted Telegram to ask why it permits this content on its app but have yet to receive replies. Sensity says it has also contacted the relevant law enforcement authorities.

In a poll in one of the main channels for sharing deepfake nudes (originally posted in both Russian and English), most users said they wanted to generate images of women they knew in “real life.”
Image: Sensity

The software being used to generate these images is known as DeepNude. It first appeared on the web last June, but its creator took down its website hours after it received mainstream press coverage, saying “the probability that people will misuse it is too high.” However, the software has continued to spread over backchannels, and Sensity says DeepNude “has since been reverse engineered and can be found in enhanced forms on open source repositories and torrenting websites.” It’s now being used to power Telegram bots, which handle payments automatically to generate revenue for their creators.

DeepNude uses an AI technique known as generative adversarial networks, or GANs, to generate fake nudes, with the resulting images varying in quality. Most are obviously fake, with smeared or pixelated flesh, but some can easily be mistaken for real pictures.
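For readers unfamiliar with the technique, the sketch below shows the core idea of a GAN: a generator tries to produce convincing fakes while a discriminator tries to tell real samples from generated ones, and the two are trained against each other. It is a minimal, illustrative example in PyTorch using toy two-dimensional data rather than images; the network sizes, learning rates, and data are assumptions for demonstration only and are not DeepNude’s actual code.

# Minimal GAN sketch (illustrative only): a generator and discriminator
# trained adversarially on toy 2-D data. All choices here are assumptions.
import torch
import torch.nn as nn

latent_dim = 8

# Generator maps random noise to a fake "sample"; discriminator scores real vs. fake.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(1000):
    real = torch.randn(64, 2) * 0.5 + torch.tensor([2.0, 2.0])  # toy "real" data
    noise = torch.randn(64, latent_dim)
    fake = G(noise)

    # Train the discriminator: label real samples 1, generated samples 0.
    opt_d.zero_grad()
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + loss_fn(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Train the generator: try to make the discriminator label its fakes as real.
    opt_g.zero_grad()
    g_loss = loss_fn(D(G(noise)), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

In an image-to-image system like the one reportedly behind DeepNude, the generator and discriminator would be convolutional networks operating on pictures, but the adversarial training loop follows this same pattern.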

The Telegram bots make generating fake nudes as easy as sending a picture

People have been creating nonconsensual fake nudes of women since before the arrival of Photoshop. Many forums and websites are currently dedicated to this activity using non-AI tools, with users sharing nudes of both celebrities and people they know. But deepfakes have made it possible to generate more realistic images more quickly, and automating the process via Telegram bots makes producing fake nudes as easy as sending and receiving pictures.

“The key difference is accessibility of this technology,” Sensity’s CEO and co-author of the report, Giorgio Patrini, told The Verge. “It’s important to notice that other versions of the AI core of this bot, the image processing and synthesis, are freely available on code repositories online. But you need to be a programmer and have some understanding of computer vision to get them to work, other than powerful hardware. Right now, all of this is irrelevant as it is taken care of by the bot embedded into a messaging app.”

Sensity’s report says it’s “reasonable to assume” that most of the people using these bots “are primarily interested in consuming deepfake pornography” (which remains a popular category on porn sites). But these images and videos can also be used for extortion, blackmail, harassment, and more. There have been a number of documented cases of women being targeted using AI-generated nudes, and it’s possible some of those creating nudes using the bots on Telegram are doing so with these motives in mind.

“demeaning fake videos and photos of each of us can ruin our reputation”

Patrini told The Verge that Sensity’s researchers had not seen direct evidence of the bots’ creations being used for these purposes, but said the company believed this was happening. He added that while the political threat of deepfakes had been “miscalculated” (“from the point of view of perpetrators, it is easier and cheaper to resort to photoshopping images and obtain a similar impact for spreading disinformation, with less effort”), it’s clear the technology poses “a serious threat for personal reputation and security.”