
Is it legal to swap someone’s face into porn without consent?

Yes, no, maybe


For victims of revenge porn and other explicit material shared without consent, legal remedies have arrived only within the last decade. But thanks to AI-assisted technology, anyone with an online presence could now end up starring in pornography against their will — and there’s little that the law can do about it.

For the past several weeks, a subreddit called “deepfakes” has been saturated with doctored images that depict famous figures, mostly women, engaging in sexual acts, where their faces are believably mapped onto pornographic pictures, GIFs, or videos. “ScarJo, take three” appears to show Scarlett Johansson masturbating in a shower. “Taylor Swift” is a blurry-faced shot of the singer being penetrated. “Emma Watson sex tape” seems to feature the actor stripping.

In December 2017, Motherboard broke the news that a Redditor by the name of “deepfakes” had figured out how to create this kind of face-swapped fake porn, and the AI-assisted tech advanced quickly. By January, not only was there a subreddit dedicated to “deepfakes,” but there was an app designed to make creating them as easy as possible.

As the community around it has grown, from the subreddit to a now-banned Discord channel, so have the number and quality of deepfakes. Although there are benign applications of this technology — it’s harmless to swap in actor Nicolas Cage for a bunch of goofy cameos — it’s a lot less cute in the hands of someone with more malicious goals, like placing unwilling participants in explicit sex videos. Photoshopped pornography is already a common harassment tool deployed against women on the internet; a video makes the violation far more active, and harder to identify as forged.

A deepfake of Daisy Ridley from the subreddit

As deepfakes become more refined and easier to create, they also highlight the inadequacy of the law to protect would-be victims of this new technology. What, if anything, can you do if you’re inserted into pornographic images or videos against your will? Is it against the law to create, share, and spread falsified pornography with someone else’s face?

The answer is complicated. The best way to get a pornographic face-swapped photo or video taken down is for the victim to claim either defamation or copyright infringement, but neither provides a guaranteed path to success, says Eric Goldman, a law professor at Santa Clara University School of Law and director of the school’s High Tech Law Institute. Although there are many laws that could apply, there is no single law that covers the creation of fake pornographic videos — and there are no legal remedies that fully ameliorate the damage that deepfakes can cause.

“It’s almost impossible to erase a video once it’s been published to the internet,” Goldman says. “... If you’re looking for the magic wand that can erase that video permanently, it probably doesn’t exist.” 

A defamation claim could potentially be effective because the person depicted in the video isn’t actually in it, Goldman explains. It’s a false statement of fact about the victim’s presence, so they could theoretically get a judgment against the perpetrator that orders the removal of the video or images. However, a defamation claim is hard to win. “[Defamation claims] can be expensive, and if you’re dealing with overseas or anonymous content publishers, they’re not even all that helpful,” Goldman says.

As Wired points out in a piece on the legality of deepfakes, the fact that it isn’t a celebrity’s body makes it difficult to pursue as a privacy violation: “You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.”

Getting the content removed could also run up against the First Amendment. “All content is presumptively protected by the First Amendment,” Goldman says. The exceptions to free speech are narrowly defined, such as obscenity, some forms of incitement to violence, and child pornography. (Most deepfakes are careful to use images of people 18 and older.) “Other incursions into the First Amendment, such as defamation or publicity/privacy rights, are structured to balance First Amendment considerations with general tort or crime principles,” he says. “So the burden will be on the plaintiff to find a doctrine outside the First Amendment or to explain how the claim avoids any First Amendment protections.”

“It’s almost impossible to erase a video once it’s been published to the internet.”

If deepfakes victims are hoping to get help from platforms themselves, they’re also facing a hard road. Platforms could ban the images or communities for violating their terms of service, as Discord did. But section 230 of the Communications Decency Act (often shortened to CDA 230) says that websites aren’t liable for third-party content. “So if a bad guy creates a fake video and posts it on a third-party site, that third-party site isn’t going to be liable for that video and cannot be forced to remove it,” Goldman says. Any injunction that a victim received would only apply to the person who shared the content, and not the platform.

It could also be possible to get a video removed with a copyright claim. The person or persons who own the copyright to the original video — that is, the untampered pornographic footage deepfakes build upon — could claim infringement based on the modification and republication.

“[The copyright owner] would have the right to assert that the re-publication of the video is copyright infringement,” Goldman says. “A couple advantages of that. One is that injunctions are a standard remedy for copyright infringement, unlike defamation, where it’s a little bit more murky. And two is that section 230 doesn’t apply to copyright claims.”

In other words, while a website has no obligation to remove a video for defamation, it would need to pull a video that infringes on copyright — or face the same liability as the person who posted it. However, this isn’t much help to the victim whose face appears in the video, since it’s unlikely they own that copyright.

The deepfakes community has already begun to move some of its content away from Reddit. While some of the videos have been shifted to PornHub, another user started a site dedicated specifically to celeb deepfakes. The site defines its content as “satirical art” and claims, “We respect each and every celebrity featured. The OBVIOUS fake face swap porn is in no way meant to be demeaning. It’s art that celebrates the human body and sexuality.”

The site also notes that it makes no claims to own the rights to the images or videos on the site. In theory, this could help lessen confusion about the veracity of the content, Goldman says, thereby addressing would-be claims of defamation. However, it won’t help with copyright. “Furthermore, for videos that ‘leak’ from the site to the rest of the Internet, the disclaimers likely will not help with any legal defense,” he adds.

But again, each video depicts at minimum two people: the person whose body is truthfully being represented, and the person whose face has falsely been added. Unfortunately, Goldman says, the former likely doesn’t have a good legal claim either. There is no falsifying of that person’s body, and it’s likely the actor portrayed does not have a copyright claim to the film.

Laws surrounding revenge porn are one possible avenue for victims seeking justice

“If the body were recognizable, then it might be possible that they would either have defamation or some privacy claims for the false depiction of another person’s face,” Goldman says. “So for example, if someone has really distinctive tattoos that everyone knows, it’s possible that we’ll know then that the body is associated with a particular person and that might create some possible claims. But that’s an unlikely circumstance.”

Private citizens are likely to have more of a legal advantage in these situations than celebrities because they aren’t considered public figures. “[Celebrities are] going to have possibly fewer privacy rights,” Goldman says, “and defamation law will actually adjust and scale back the protection because of the fact that they’re famous.”

Goldman also points to laws surrounding revenge porn as one possible avenue for victims seeking justice, especially as that particular field of legislation continues to develop. He wrote a paper on the dissemination of non-consensual pornography, which discusses the common law tort of intentional infliction of emotional distress.

“It’s causing somebody emotional distress intentionally,” Goldman says. “You’re looking for a way to give them a bad day. A law like that normally has quite significant limits. We don’t want everyone suing each other for ordinary anti-social behavior. But it’s very powerful in a non-consensual pornography case, because usually the release of non-consensual pornography is in fact designed exactly for that purpose: to intentionally inflict emotional distress.”

On the deepfakes subreddit, however, many users have pushed back against the idea that these images and videos are harmful, despite their non-consensual and pornographic nature. In a lengthy Reddit post, a user by the name of Gravity_Horse says that “the work that we create here in this community is not with malicious intent. Quite the opposite. We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design.”

“We have to prepare for a world where we are routinely exposed to a mix of truthful and fake photos and videos.”

Not everyone on the subreddit thinks that faked, non-consensual porn is so benign, however, particularly for those pictured in it. Another post from harmenj argues, “This must feel as digital rape for the women involved.” In a post titled “This is fucking insane,” Reddit user here_for_the_schloc added, “The quality of these forgeries is incredible and almost indistinguishable from reality... [they can] make it seem like celebrities and political figures say and do whatever you want in a recorded way or blackmail people with videos that don’t really exist. And you guys are just whacking it.”

Deepfakes could also expand into problematic areas beyond pornography, with the same technology used to create “fake news” involving politicians and other public figures, or just about anyone. Although legislators could attempt to craft new laws that address face-swapped porn in the context of the First Amendment, Goldman thinks the solution will need to go beyond just a legal one. “I think we have to prepare for a world where we are routinely exposed to a mix of truthful and fake photos and videos,” he says.

“We have to look for better ways of technologically verifying content. We also have to train people to become better consumers of content so that they start with the premise that this could be true or this could be fake, and I have to go and figure out that before I take any actions or make any judgments,” Goldman adds. That’s a much harder idea to enforce, he says, one that requires a thorough education in digital literacy — especially for kids.

“It absolutely bears repeating that so much of our brains’ cognitive capacities are predicated on believing what we see,” Goldman says. “The proliferation of tools to make fake photos and fake videos that are indistinguishable from real photos and videos is going to test that basic, human capacity.”