The FBI has issued an advisory warning of an “uptick” in extortion schemes involving fake nudes created with the help of AI editing tools.
The agency says that as of April this year, it has received an increasing number of reports of such “sextortion” schemes. Malicious actors find benign images of a victim on social media, then edit them using AI to create realistic, sexually explicit content.
“The photos are then sent directly to the victims by malicious actors for sextortion or harassment,” writes the agency. “Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or removal from the internet.”
The FBI says blackmailers typically use such material to demand real nude images from a victim or payments of some sort. Says the agency: “The key motivators for this are a desire for more illicit content, financial gain, or to bully and harass others.”
The agency recommends that the public “exercise caution” when sharing images of themselves online, but this is difficult advice to follow. Only a handful of images or videos are needed to create a deepfake, and no one can be completely safe from such extortion schemes unless they remove every image of themselves from the web. Even then, an attacker who knows their target personally could covertly photograph them in real life.
Nude deepfakes first began to spread online in 2017, when users on forums like Reddit began using new AI research methods to create sexually explicit content of female celebrities. Although there have been some attempts to counter the spread of this content online, tools and sites to create deepfake nudes are easily accessible.
The FBI notes that such extortion schemes “may violate several federal criminal statutes.” A limited number of laws around the world also criminalize the creation of such non-consensual fake images. The US state of Virginia, for example, outlaws deepfakes as a form of “revenge porn,” while the UK plans to make sharing such images illegal in its upcoming Online Safety Bill.