The people making fake AI porn have been temporarily distracted by Nicolas Cage

Nicolas Cage appears in Raiders of the Lost Ark, thanks to AI.
Image: Derpfakes / Gfycat

Over the past few months, the internet has found a creepy new hobby: using artificial intelligence to paste the faces of celebrities onto porn videos. The technology has some disturbing implications, especially with regards to consent, harassment, and fake revenge porn. But on Reddit, where most of this content is being shared, users have found at least one wholesome (sort of) use for it: inserting Nicolas Cage into movies he never acted in.

Cage has long been an internet favorite, thanks in part to his willingness to appear in just about anything. Now, with the help of AI, his likeness can be seen in an even wider range of roles: replacing Sean Connery as James Bond in Dr. No; standing in as Indiana Jones in Raiders of the Lost Ark; and taking the place of Amy Adams in Batman v Superman. In a stroke of meta-genius, the user responsible for these videos, “derpfakes,” even pasted Cage’s face onto Andy Samberg’s during a Saturday Night Live skit featuring Nicolas Cage himself. How’s that for a face-off?

Despite the popularity of these clips, derpfakes is giving up on Cage. “He’s had a good run and hopefully made some people take an interest in this sub and the tech behind it, but it is time to move on,” he said in a comment. Users quickly began suggesting replacements for a new “torchbearer” for AI fakery. “I would prefer seeing John Malkovich’s face plastered on everyone in the movie, Being John Malkovich,” was one suggestion.

What’s notable, though, is derpfakes’ assumption that the Cage clips might prompt people to “take an interest” in this technology. Using AI to paste people’s faces onto video clips has fantastic potential for fun and weird content, but we can’t ignore the fact that the push to make this technology accessible is coming from a group of internet users primarily interested in using it to create fake celebrity porn.

Right now, this content is still niche, but what happens when it hits the mainstream? When it’s targeting not only celebrities but private citizens? When it’s used to harass women (because it certainly will be), or when it finds its way into high schools? We’re seeing the first signs of this already. A recent Wired article noted that there are hardly any legal options for controlling this sort of media. “It falls through the cracks because it’s all very betwixt and between,” one law professor said.

Right now, Nicolas Cage is — quite literally — the acceptable face of this technology. But things are only going to get weirder and worse from here.


Comments

This is why aliens won’t visit us.
Or asteroids won’t crash into us. Because, what’s the point?

Talking about the legal precedents: if I take a video and put someone’s face in it to make it seem like that person did something they never did – aren’t there already existing laws to cover that?

This specific tech may be new, but the intended goal isn’t. Surely there are laws that would cover me fabricating a newspaper article about a person I didn’t like then spreading it around my neighborhood? I don’t see how distributing a doctored photo is fundamentally different than distributing a doctored video in terms of the laws that would apply.

I assume you’re talking about slander/libel, which AFAIK only covers written/spoken word. Don’t think they imagined a future like this when the laws were made.

If you’re trying to prosecute someone with any sort of nuanced argument, it will have a giant hole the defense can take advantage of, so the laws will have to be changed.

So slander/libel doesn’t account for me doctoring photos of some person having an affair or accepting bribes, etc.?

Not in this manner, no. However, the second those items are brought forward as evidence in a court of law, someone is going to prison.

Sorry, I don’t understand your answer.

You’re saying slander laws don’t account for me doctoring photos, but if someone showed evidence of me doctoring photos in a slander case I would be found guilty? Doesn’t that essentially mean that slander laws do take into account me doctoring photos?

It’s not slander that will get you in trouble at that point, it’s falsifying evidence.

No, that has nothing to do with falsifying evidence.

Falsified evidence is false evidence that you created explicitly for the purpose of swaying the trial in question. It doesn’t apply to something I made up previously for some other reason. If I created fake images specifically to present during a trial that would be one thing, but if it was pictures that I created and sent to newspapers or other avenues of distribution for the purpose of slandering someone then that has nothing to do with falsifying evidence.

I think you have a valid point. The concept of visual slander or libel may not have occurred to the original framers of the laws against false textual or verbal representations (though I’m surprised that retouched still photos haven’t prompted this already). This seems fully analogous to those offenses: mapping someone’s face onto a body in an inappropriate act, while mapping their voice and lips into misrepresentative utterances, is the trifecta of that abuse.

There will probably be some legislation offered very soon to address this, which will be gleefully signed by this president, since he already wants slander and libel to expand to cover statements when he doesn’t like them.

You can absolutely be sued for slander for spreading rudely photoshopped pictures of people. The laws are not targeted exclusively at slander performed by human tongue and lips or the Latin alphabet. That would be an oddly specific law.

You would likely be sued for defamation rather than slander; slander is oral defamation, libel written defamation.
We’re talking about doctored videos, so it’s neither slander nor libel, but it could definitely be argued as defamation.
The law is picky about words; use the wrong one and you’d lose.

I don’t think so in the affair scenario, but since taking a bribe is illegal, you would be falsifying evidence, which is illegal.

I think falsified evidence is usually very clearly defined as material created explicitly to be used in a court case. I guess it’s debatable, but I don’t think false documents that were created and put out there anonymously for the purpose of slandering someone could be considered false evidence since I didn’t explicitly create them for a case and likely never expected any legal proceedings to result from them.

At that point it’s not falsified evidence; it’s just evidence lol. If the case is about using doctored media to defame someone, then the doctored media are simply evidence.

Yeah, that’s my point.

You can make up any newspaper article you feel like about anyone, however scurrilous the content, as long as you acknowledge it’s fake. I don’t think anyone making this stuff is trying to allege that A list actresses suddenly put on 20 pounds of weight, started wearing school uniforms/chambermaid outfits/etc and making out on camera. They’re quite open that what they’re doing is fake.

The part I was more focusing on is the potential use in harassment and bullying. The implication the article makes is that current laws aren’t equipped to handle these new advances in technology. But I don’t see how these new advances in technology wouldn’t be affected by current laws unless current laws are written so specifically that they cover existing things like doctored photos while excluding doctored video.

That was where I went with this as well. Grab sufficient video of another kid at school, apply face to porn. Tap into homophobia and make it gay porn. Schools can come down on it as hard as they like, but the damage will have been done.

We can already do the same with photoshop that looks even more realistic/less fake than the current deepfake AI. Whatever law covers that would cover this as well.

IANAL I am a comic book lawyer only

Rec for last sentence.

That was my initial point. This tech may be new, but the idea of putting out doctored information to insult/harass someone else is not new and has been technically possible for decades.

Like you said, if there are laws that restrict me from doing it in Photoshop, then those same laws would apply to this.

I don’t think it’s a matter of acknowledging that it’s fake per se. Tabloids don’t explicitly label themselves as satire.

Rather, celebrities and public figures have a higher bar for defamation compared to regular people. It would have to be seriously malicious and they would have to prove that the publisher knew it was false — the writer saying they heard a rumor would probably be sufficient to cover their ass.

On the other hand, publishing the same kinds of things about an average joe would land them in a lot of trouble.

How about porn with Nicolas Cage

I wish someone would deepfake this meme and replace Jeff Goldblum with Nicolas Cage.
