Colorado legislation would make it a crime to create a fake nude of a real person

With advancements in artificial intelligence, a growing number of websites can turn real pictures of people into realistic-looking nude images. The digitally altered images are known as deepfake porn, and you don't have to be a tech genius to create them; dozens of websites will generate the fake nudes for you from photos of real people.

"It's so realistic that it's hard to tell if it's real, if it's fake and it's also hard to tell if that's actually the person's face with their body," said state Rep. Matt Soper, who is co-sponsoring a bill with state Sen. Robert Rodriguez to make deepfake porn a crime.

"If this happens to somebody, law enforcement doesn't know what to do. They're like 'We have nothing for this.' Teachers that have this happen in school with kids (wonder) 'What do we do? Who do we tell?'" said Rodriguez, who notes 37 other states have passed similar measures.

Under their bill, posting an intimate deepfake would be a class 1 misdemeanor unless the image is used to influence a court proceeding or election, or if it poses a serious threat to the person depicted or their family. Then it would be a class 6 felony.

"If there's any recognizable feature at all and it's AI generated, it's computer generated, we're making that a crime," said Soper.

As introduced, the bill also criminalized digital nudes of fictional people, but Soper says the sponsors are removing that provision.

"We got into the sticky wicket of having to explain where does art end and pornography begin," he said.

The Colorado District Attorneys Council -- which supports the bill -- says over the last two years the National Center for Missing & Exploited Children has received more than 7,000 reports of intimate deepfakes involving kids.

"We're in an evolution that this stuff's moving faster than we can legislate. This is the first step of getting it in Colorado," said Rodriguez.

The bill also allows victims to sue for up to $150,000 in damages and attorney fees.

While Congress just passed a bill criminalizing intimate deepfakes, district attorneys can't bring charges without a state law. Instead, the U.S. Attorney's Office would have to prosecute cases, and it has a significant backlog right now.
