Michigan to join state-level effort to regulate AI political ads as federal legislation is pending

FILE - Michigan Gov. Gretchen Whitmer speaks at the SelectUSA Investment Summit, May 4, 2023, in Oxon Hill, Md. Campaigns will be required to clearly state that political advertisements airing in Michigan were created with the use of artificial intelligence under legislation expected to be signed in the coming days by Whitmer. The use of AI-generated deepfakes within 90 days of an election will be prohibited without a disclosure identifying the media as manipulated. (AP Photo/Alex Brandon, File) (Alex Brandon, Copyright 2023 The Associated Press. All rights reserved.)

LANSING, Mich. – Michigan is joining an effort to curb deceptive uses of artificial intelligence and manipulated media through state-level policies as Congress and the Federal Election Commission continue to debate more sweeping regulations ahead of the 2024 elections.

Campaigns on the state and federal level will be required to clearly disclose which political advertisements airing in Michigan were created using artificial intelligence under legislation expected to be signed in the coming days by Gov. Gretchen Whitmer, a Democrat. The legislation also would prohibit the use of AI-generated deepfakes within 90 days of an election without a separate disclosure identifying the media as manipulated.

Deepfakes are fake media that misrepresent someone as doing or saying something they didn't. They're created using generative artificial intelligence, a type of AI that can create convincing images, videos or audio clips in seconds.

There are increasing concerns that generative AI will be used in the 2024 presidential race to mislead voters, impersonate candidates and undermine elections on a scale and at a speed not yet seen.

Candidates and committees in the race already are experimenting with the rapidly advancing technology, which in recent years has become cheaper, faster and easier for the public to use.

The Republican National Committee in April released an entirely AI-generated ad meant to show the future of the United States if President Joe Biden is reelected. Disclosing in small print that it was made with AI, it featured fake but realistic photos showing boarded-up storefronts, armored military patrols in the streets, and huge increases in immigration creating panic.

In July, Never Back Down, a super PAC supporting Republican Florida Gov. Ron DeSantis, used an AI voice-cloning tool to imitate former President Donald Trump’s voice, making it seem as though he had narrated one of his own social media posts, even though he never said the statement aloud.

Experts say these are just glimpses of what could ensue if campaigns or outside actors decide to use AI deepfakes in more malicious ways.

So far, states including California, Minnesota, Texas and Washington have passed laws regulating deepfakes in political advertising. Similar legislation has been introduced in Illinois, New Jersey and New York, according to the nonprofit advocacy group Public Citizen.

Under Michigan's legislation, any person, committee or other entity that distributes an advertisement for a candidate would be required to clearly state if it uses generative AI. The disclosure would need to be in the same font size as the majority of the text in print ads, and would need to appear “for at least four seconds in letters that are as large as the majority of any text" in television ads, according to a legislative analysis from the state House Fiscal Agency.

Deepfakes used within 90 days of the election would require a separate disclaimer informing the viewer that the content is manipulated to depict speech or conduct that did not occur. If the media is a video, the disclaimer would need to be clearly visible and appear throughout the video's entirety.

Campaigns could face a misdemeanor punishable by up to 93 days in prison, a fine of up to $1,000, or both for the first violation of the proposed laws. The attorney general or the candidate harmed by the deceptive media could apply to the appropriate circuit court for relief.

Federal lawmakers on both sides of the aisle have stressed the importance of regulating deepfakes in political advertising and have held meetings to discuss it, but Congress has not yet passed any legislation.

A recent bipartisan Senate bill, co-sponsored by Democratic Sen. Amy Klobuchar of Minnesota, Republican Sen. Josh Hawley of Missouri and others, would ban “materially deceptive” deepfakes relating to federal candidates, with exceptions for parody and satire.

Michigan Secretary of State Jocelyn Benson flew to Washington, D.C. in early November to participate in a bipartisan discussion on AI and elections and called on senators to pass Klobuchar and Hawley's federal Deceptive AI Act. Benson said she also encouraged senators to return home and lobby their state lawmakers to pass similar legislation that makes sense for their states.

Federal law is limited in its ability to regulate AI at the state and local levels, Benson said in an interview, adding that states also need federal funds to tackle the challenges posed by AI.

“All of this is made real if the federal government gave us money to hire someone to just handle AI in our states, and similarly educate voters about how to spot deepfakes and what to do when you find them,” Benson said. “That solves a lot of the problems. We can’t do it on our own.”

In August, the Federal Election Commission took a procedural step toward potentially regulating AI-generated deepfakes in political ads under its existing rules against “fraudulent misrepresentation.” Though the commission held a public comment period on the petition, brought by Public Citizen, it hasn’t yet made any ruling.

Social media companies also have announced some guidelines meant to mitigate the spread of harmful deepfakes. Meta, which owns Facebook and Instagram, announced earlier this month that it will require political ads running on the platforms to disclose if they were created using AI. Google unveiled a similar AI labeling policy in September for political ads that play on YouTube or other Google platforms.

___

The story has been updated to remove Kentucky from states where similar legislation has been introduced. Kentucky’s legislation is a bill request and has not been officially introduced yet.

___

Swenson reported from New York. Associated Press writer Christina A. Cassidy contributed from Washington.

___

The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP’s democracy initiative here. The AP is solely responsible for all content.
