Wisconsin lawmakers could require that political ads disclose whether they include AI video or audio, as experts worry about nefarious uses of the new technology.

Wisconsin lawmakers are considering a bill to force political campaigns to tell the audience when they used artificial intelligence to create an ad, hoping to stop the use of "deepfakes" that give false impressions of candidates.

The bill, experts say, would be a good step but wouldn't eliminate the need for federal action to address the broader problem of AI in politics.

While many are bullish on the possibilities of ChatGPT and other AI tools, the ability to quickly and convincingly create manufactured video, audio or photos has led to concerns that the technology could be used by bad actors to spread disinformation and influence the outcome of an election.

AI has already begun creeping into political ads, both in the United States and globally.

A political action committee supporting Florida Gov. Ron DeSantis’ bid for president drew national attention for including AI-generated photos of former President Donald Trump embracing Anthony Fauci, his COVID-19 czar. Another ad used an AI version of Trump’s voice.

In Slovakia, an AI-generated audio clip of a party leader, Michal Šimečka, purportedly discussing election rigging with a newspaper reporter came out two days before an election, during a period when news media and politicians are banned from discussing the race. Šimečka’s party ultimately lost the election.

Dietram Scheufele, a University of Wisconsin-Madison professor whose work includes AI and political communication, noted that the new technology arrives amid a rise in disinformation spread by foreign adversaries seeking to disrupt the U.S. electoral system.

He raised the specter of an AI-generated video of President Joe Biden slurring his speech, released within days of the election before it could be adequately fact-checked, and shared by his potential political rival, former President Donald Trump.

“What does that do to the electoral system?” Scheufele said. “And none of the laws that we're thinking about, the legal tools that we’re developing, would help us with that at all.”

The power of AI is such that state Rep. Clinton Anderson, D-Beloit, the bill's co-author, had ChatGPT draft his testimony on the issue before appearing before the Assembly Committee on Campaigns and Elections on Jan. 9.

"I think it is very important coming into an election cycle that we have some certainty and get this into law," Rep. Adam Neylon, R-Pewaukee, the bill's other co-author, told the committee. "I think that it's the wild west out there when we don't have some level of protections or at least show that we're considering these issues and that Wisconsin is not a good state to try to spread disinformation through artificial intelligence."

Some states have tried to take steps to regulate or even ban AI “deepfakes” in campaign ads. 

A Minnesota law bans artificially generated images or video within 90 days of an election if “a reasonable person would believe it depicts speech or conduct of an individual who did not in fact engage in such speech or conduct” and the content was intended to influence the outcome of an election.

A bipartisan group of U.S. senators has introduced legislation to ban false or misleading AI content in ads designed to influence federal elections, and the Federal Election Commission has sought public comment on whether it should regulate the issue.

Facebook parent company Meta, meanwhile, has said it will attach a label to political ads that use AI. And even the American Association of Political Consultants has denounced the use of deepfakes, though it believes generative AI could be beneficial in other ways, particularly for smaller campaigns with fewer resources.

Wisconsin bill would require disclosure

Wisconsin legislators aren’t seeking to ban the practice but rather require a disclaimer for any ad — including those distributed by a campaign, political action committee or party — that uses audio or video that is “substantially produced” with generative AI.

Intentionally violating the disclaimer requirements would carry a $1,000 fine for each offense, though some lawmakers have expressed interest in stiffening the penalty.

"I just think that if we don't make a statement really upfront about how serious this is, it might not be taken as seriously as we want it to be taken," said Rep. Donna Rozar, R-Marshfield.

The bill has drawn concerns from the Wisconsin Ethics Commission, which would have to enforce the requirement and believes it would receive as many as 50 complaints a year.

More staff may need to be hired to investigate whether an ad includes AI, the agency said in a memo estimating the bill's fiscal impact. The total cost of the bill is uncertain, but the authors said they would support looking at additional funding if needed after it takes effect.

But the potential effects of AI on the state's political system are much broader than what is covered by the bill, Scheufele said, although this isn't because of poor bill drafting. It's because we don't yet know all the ways artificial intelligence might shape elections.

University of Wisconsin-Madison professor Dietram Scheufele, director of graduate studies in the Department of Life Sciences Communication, said a Wisconsin bill on AI and political campaigns could be a "stopgap" until broader federal action is taken.

“I don't think anybody ... has a really good handle yet of where (AI in politics) is going to go," he said.

Gov. Tony Evers has already set up a task force on AI comprising state agency heads, higher education leaders and private-sector representatives. Assembly Speaker Robin Vos, R-Rochester, formed his own panel of legislators, which has held hearings to gather input from business and tech leaders, as well as think tanks.

Free speech concerns create limits

There are other concerns with regulating AI in politics. A more sweeping bill could violate free speech protections in the Constitution. Legal and advocacy groups note that manipulated content can sometimes have significant satirical or artistic value. 

Anderson said the Wisconsin bill was crafted in a way that preserves the free speech rights of campaigns — and avoids a costly court battle.

"We want to recognize free speech and we don’t want this to get struck down in the middle of an election cycle," he said. "I can only imagine if we had this struck down in September or October of 2024."

Plus, there's already a state statute that bars making false statements to influence an election, regardless of whether it involves AI. That could be used if someone were to make an AI-generated video spreading disinformation about when or how to vote.

Meantime, there is a general acknowledgement that the state alone cannot tackle the issue.

A clear federal framework for what uses of AI are acceptable would be valuable, Scheufele said, as would tools to help safeguard American democracy, such as a way of quickly detecting and noting the use of artificial intelligence in an effort to influence a close election.

That isn’t to say that Wisconsin legislators shouldn’t consider their own regulations, he said, just that the proposed law would have a limited impact.

“At best they’re stopgap measures,” Scheufele said. “And ultimately they’ll help us get to, or maybe show the urgency of, a federal set of rules and processes.”

Andrew Bahl joined the Cap Times in September 2023, covering Wisconsin politics and government. He is a University of Wisconsin-Madison alum and has covered state government in Pennsylvania and Kansas.
