Illinois bill would crack down on political deepfakes

SPRINGFIELD, Ill. (WCIA) – In an election year, voters see all kinds of information about the candidates and issues they’ll be voting on.

But some of that information isn’t always true – it could even be a political deepfake, a digitally altered image, video or audio clip that makes it look like a candidate said or did something they never did.

“The AI-generated material is often indistinguishable from reality, where maybe several years ago, if you saw a video or saw an image that was meant to portray a candidate, you can kind of tell most people would be able to say, ‘I know that’s fake, or that’s photoshopped,’” State Rep. Abdelnasser Rashid (D-Bridgeview) said.

Some voters in New Hampshire experienced it firsthand. People received a robocall that mimicked President Joe Biden’s voice and discouraged them from heading to the polls to cast their ballots.

“The messages really matter,” Michelle Nelson, a professor in the Department of Advertising and Institute of Communications Research at the University of Illinois, said. “People are using information we see on the Internet, we hear from our friends and family to make a really important decision, or lots of important political decisions in November, and so the concern there is for deception, for the consequences and the risks involved.”

Rashid is pushing a bill that would stop people from knowingly distributing, or working with someone else to knowingly distribute, deceptive political media with the intent to harm a candidate’s electoral chances or to influence voting behavior.

“It can do very serious damage to the integrity of our elections, because people believe it to be true when it is not,” Rashid said.

The bill does carve out an exception for political media that includes a disclaimer informing people that the content was manipulated with the help of technology, or disclosing that what was said or what occurred in the content didn’t actually happen.

Nelson said disclosure laws can help, pointing to an AI-generated ad the Republican National Committee made attacking Biden. The committee labeled the video to inform viewers that it was made entirely with AI.

“A lot of our disclosure laws want to convey a clear and conspicuous kind of label to educate people about the content,” Nelson said.

The First Amendment to the U.S. Constitution protects political speech and political advertising.

Nelson said finding a balance between regulations and allowing the free flow of political information can be a challenge.

“We don’t want to have a chilling effect by regulating too much, but here we are in a different time with a different technology, and I firmly believe that disclosure is good,” Nelson said. “We need to know the source of the message, we need to know who sent it and why.”

If the bill becomes law, anyone who violates it could be charged with a Class C misdemeanor. A repeat violation would be a Class 3 felony.