As the creation of deepfakes becomes more accessible, more people may be harmed by manipulated videos. In response, some regions are beginning to pass laws to mitigate these risks. That is the case in California, which has approved two bills on the matter.
One of them makes it illegal to publish deepfakes that disparage candidates in the 60 days leading up to an election. The other allows California residents to sue anyone who uses their image in pornographic videos manipulated without their consent.
The new laws respond to two of the areas most affected by deepfakes: politics and pornography. Recently, Nancy Pelosi, the current Speaker of the United States House of Representatives, had her image used in a video that made her appear drunk during a speech.
The technology has also led to the emergence of adult videos manipulated by software using the images of women, both famous and unknown. Even Mark Zuckerberg had his image used to make a controversial statement.
The video made the executive “say” that he was able to control stolen data, secrets and users’ lives. The material was created as an experiment and served as a provocation to Facebook, which had refused to remove Nancy Pelosi’s deepfake.
California State Assembly member Marc Berman, who introduced the two approved bills, says voters have a right to know when the photos, videos and audio shown to them are not real.
For the American Civil Liberties Union (ACLU), the restriction on political deepfakes will not solve any problems. “This will only result in voter confusion, malicious litigation and suppression of free speech,” the organization said.