
New Minnesota law regulates "deepfakes" to curb influence on elections

New election law designed to safeguard process from AI

MINNEAPOLIS — A new law in Minnesota prohibits the misuse of manipulated video, images and audio — "deepfakes" — that seek to influence elections. 

The provisions, which took effect this summer, are among the first state regulations of their kind nationwide, said Minnesota Secretary of State Steve Simon. They address growing fears among officials that artificial intelligence poses a threat to elections.

The statute prohibits using this AI-generated content within 90 days of Election Day if it is created without the consent of the person depicted and with the intent of hurting a candidate or influencing an election.

A deepfake can look and sound identical to the person whose likeness it intends to represent, but it is created without the individual's knowledge through artificial intelligence.

"It's really the same old poison in a different bottle, meaning we're looking at disinformation and dishonesty and lies about the election system, and how that can be spread," Simon said in an interview. "And this is a new way to spread it."

He described next Tuesday's local elections in Minnesota as a "dress rehearsal" for what will be a high-stakes and intense 2024 presidential election contest. Four years ago, people cast doubt on the results, fueling lies that the election was stolen and rigged for President Joe Biden.

Minnesota's new law, he believes, helps to bolster confidence that elections are run fairly.

"It's not a different threat, but it's a new way to amplify the old threats, and we want people going into the 2024 election to have that basic level of trust and confidence," Simon said. 

Minnesota joins just seven other states with laws regulating deepfakes, according to a Bloomberg report. The change passed nearly unanimously in the state legislature this past spring. 

A person can face prison time and fines for violating the new rules, which also extend to distributing AI-generated content that depicts sexual acts without the consent of the person whose likeness appears in the video or photo.

President Biden on Monday signed an executive order setting new AI privacy and safety standards. The order instructs the U.S. Department of Commerce to develop guidance for authenticating and clearly labeling AI-generated content.

Dr. Manjeet Rege, director of the Center for Applied Artificial Intelligence at the University of St. Thomas, said safeguards like these are crucial because the technology is now widely available.

"To preserve our democracy, I think having AI laws is extremely important, so that both the good and the bad guys know the repercussions of misusing AI," Rege said. "AI can do a lot of good, but we don't want AI in the wrong hands where it does a lot of damage."
