Governor Gavin Newsom Takes Aim At Deepfakes

Imagine if you could create an ultra-realistic video of your least favourite politician saying, in their own words, a string of racist, sexist and homophobic remarks.

Now, imagine the impact you could have if you released such a video in the midst of said politician’s re-election campaign. With one very notable exception, it would probably spell the end of their political career.

That technology is already upon us and has already caused a slew of damage. Deepfakes, the technology used to astounding effect by Ctrl Shift Face to seamlessly merge comedian Bill Hader’s face with those of actors Tom Cruise and Seth Rogen, have taken an expected yet horrific turn towards the dark side. Actress Scarlett Johansson has been the victim of deepfaked pornographic videos, while videos of Nancy Pelosi, Speaker of the US House of Representatives, were manipulated to make it look as if she were drunkenly slurring her words during a speech.

In fact, according to the BBC, the number of deepfake videos online has almost doubled in the past nine months—mostly, it turns out, for pornography.

Image: A deepfake mix between actors Jack Nicholson and Jim Carrey. Photo: El Confidencial

All of this has been the impetus for two new bills signed into law by California Governor Gavin Newsom. Engadget reports that one makes it illegal to post deepfaked videos of politicians intended to discredit them within 60 days of an election, while the other allows people in the state to sue anyone who puts their image into a pornographic video using deepfake technology.

The site quotes California Assemblymember Marc Berman, who said that voters “have a right to know” when video, audio, and images have been manipulated “to try to influence their vote in an upcoming election.”

“That makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters,” he added.

However, comments made to the Guardian by media ethics professor Jane Kirtley suggest that deepfake laws may be difficult to enforce and less effective than existing copyright legislation.

“Political speech enjoys the highest level of protection under US law,” she said. “The desire to protect people from deceptive content in the run-up to an election is very strong and very understandable, but I am skeptical about whether they are going to be able to enforce this law.”
