Governor Walz signs first-of-its-kind law to stop AI from being used for CSAM

Minnesota is now the first state in the country to enact a law that prohibits access to what's known as "nudification" technology.  

On Thursday, Gov. Tim Walz announced he signed HF 1606, which received bipartisan support. 

"Technology is moving fast, but our responsibility to protect Minnesotans, especially kids and families, moves faster," Governor Walz said in a statement, "This bill makes clear that using technology to create fake, non-consensual intimate images is unacceptable and puts Minnesota at the forefront of addressing the harms of AI while protecting people from exploitation and abuse." 

The bill "prohibits the access, download, or use of nudification technology, except when the website, app, or software requires the substantial application of technological or artistic skill by a human creator directing and controlling the output," according to the House. It targets programs that use AI to fabricate a nude photo or pornographic video from someone's image.

Rep. Jess Hanson, who authored the bill that moved through the House, said the legislation is critical to getting at the source of a problem that is growing as the technology improves. The Internet Watch Foundation tracks the use of AI to create child sexual abuse imagery; in 2025, the organization identified 8,029 AI-generated images and videos "showing realistic child sexual abuse." 

"Once those images are out there, there's really no bringing them back," Rep. Hanson said, "the creation of this content has just exploded in the last couple of years. A lot of people don't know it exists." 

Rep. Hanson said that when the law takes effect in August of this year, Minnesotans will be prohibited from accessing nudification technology. The law also creates a pathway for victims and the Minnesota Attorney General's Office to go after the companies behind this kind of AI. 

While the civil penalties won't be retroactive, Rep. Hanson said that she wishes this kind of law had gone on the books sooner. On Thursday, the U.S. Attorney's Office for the District of Minnesota announced that 30-year-old Michael Haslach, a former school district employee, pleaded guilty to attempted production of child pornography and production of an obscene visual representation of child sexual abuse. 

In that case, investigators say the cloud storage company Dropbox submitted a tip in January of 2025 after detecting that a user in Maplewood, MN, was uploading child pornography. Local police then located and arrested Haslach, finding more than 690 "morphed" images of at least 91 children. He had obtained photos of these victims on the job as a lunch monitor and traffic guard at ISD 622 in North St. Paul and as a youth summer programs assistant at ISD 834 in Stillwater. He then used AI to take these images and turn them into pornographic ones.

Megan Hurley, who spoke about her experience at the Capitol in support of the bill now signed into law, said that a man she once considered a dear friend did something similar to her and dozens of other women.

"It was really disturbing. It looked real," Hurley said, "I had never taken nudes of myself I had never exchanged nudes with people." 

She said that she continues to suffer from trauma, noting that she needed to change her hours at work to avoid being alone. Hurley is concerned that these images, depicting her doing things she's never done in real life, are still out there.   

"There is no way to protect against it unless you never leave your home," Hurley said.   

For Hurley and Rep. Hanson, that's the key part of the new law. It gets to the source of the problem, which they say is the technology.   
