California has banned political deepfakes during election season

The bill has raised questions about speech protections

Illustration by James Bareham / The Verge

California has passed a law meant to prevent altered “deepfake” videos from influencing elections, a measure that has raised free speech concerns.

Law sunsets in 2023

Last week, Gov. Gavin Newsom signed into law AB 730, which makes it a crime to distribute audio or video that gives a false, damaging impression of a politician’s words or actions. The law applies to any candidate within 60 days of an election, but it includes some exceptions. News media are exempt, as are videos made for satire or parody. Potentially deceptive video or audio is also allowed if it includes a disclaimer noting that it’s fake. The law will sunset in 2023.

While the word “deepfake” doesn’t appear in the legislation, the bill clearly takes aim at doctored works. Lawmakers have recently raised concerns that distorted videos, like the slowed clip of House Speaker Nancy Pelosi that circulated over the summer, could be used to influence future elections.

At the same time, Newsom signed a separate law banning pornographic deepfakes made without the subject’s consent. While political deepfakes have generated headlines, at least one recent study found that the majority of deepfakes are pornographic.

The election law has also raised free speech concerns, and groups like the American Civil Liberties Union of California have questioned its value. “Despite the author’s good intentions, this bill will not solve the problem of deceptive political videos,” the group said in a statement noted by the Associated Press. “It will only result in voter confusion, malicious litigation, and repression of free speech.”