Amid increasing reports from young girls and their parents about deepfake AI-generated sexual images landing on the phones of classmates, neighbors, and child predators, some state legislators are rushing to enact legislation to protect minors.
In response to incidents such as the victimization of a 14-year-old girl at a New Jersey school, whose AI-generated nude images circulated on her classmates’ phones, several legislative bills have been introduced.
“We started dealing with this in the 90s when we advocated the passage of legislation to regulate the internet and how to extend laws governing illegal content and activity.
“Back then we called it morphed child porn where a photographer would create an image using the face of a child and body of another,” Enough is Enough’s president and chairman Donna Rice Hughes told The Epoch Times.
“Now you have AI-morphed computer-generated content that’s impossible to tell if it’s a real child or not.”
Deepfake AI images have been surfacing in schools across the United States, including in Alabama, where Republican state legislator Matt Woods, a father and small business owner, is taking action after hearing about another 14-year-old girl being victimized at school.
“My friend’s daughter actually had to sit in class with the young man who did this to her. I have a 14-year-old daughter too and can’t imagine sending her to school and having to face that every day. They have not moved him out as of yet,” he told The Epoch Times.
Alabama law currently protects children under 17 from child pornography. After hearing his friend’s story, Rep. Woods plans to introduce legislation called the Alabama Child Protection Act, which, among other things, would raise the age of protection to 18.
He told The Epoch Times the legislation has drawn widespread support, including from the Alabama attorney general and legislators from both parties. Still, the issue has been a personal wake-up call for Mr. Woods. “It’s kind of scary. We have a good child pornography law here that is pretty broad, but it doesn’t address AI. I hope I never understand why they want to do this, and my mind will never work that way.”
The Alabama Child Protection Act would also give victims of child pornography a civil remedy, with violators liable for actual damages, legal fees, and punitive damages of up to $10,000 per image for all forms of child pornography involving real children.
Several states have passed legislation in the past few years to fight the AI pornography problem, including Texas, Minnesota, Wisconsin, South Dakota, and New York.
National Legislation Stalled
In Washington, the story is different. Ms. Rice Hughes says there are efforts from Democrats and Republicans to battle AI with new federal legislation, but they have stalled for reasons she doesn’t understand.

“States are ramping up on child protection because the federal government is slower and they can act more quickly. This is a good thing, but a lot of what’s happening in the states is mimicking what’s happening federally. We have bipartisan support for five to 10 major pieces of legislation in the Senate, but we can’t get (Senate Majority Leader) Senator Schumer to get these to a floor vote. We don’t know why,” she said.
A prime example is bipartisan legislation introduced by U.S. Senator Lindsey Graham (R-South Carolina) and U.S. Senator Richard Blumenthal (D-Connecticut) to push the tech industry to take online child sexual exploitation seriously. The bill, the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, would remove blanket immunity for violations of laws related to online child sexual abuse material.
The bill was initially introduced in 2020 and again in 2022 and 2023 but has never reached a floor vote amid concerns from some that a federal commission and state governments would gain too much power to regulate the internet.
Just last week, U.S. Reps. María Elvira Salazar (R-FL) and Madeleine Dean (D-PA) introduced the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act. It would establish a federal framework to safeguard Americans’ rights to their likeness and voice, protecting them against AI-generated fakes and forgeries.
“Not only does our bill protect artists and performers, but it gives all Americans the tools to protect their digital personas,” said Rep. Dean in a written statement. “By shielding individuals’ images and voices from manipulation, the No AI FRAUD Act prevents artificial intelligence from being used for harassment, bullying, or abuse. I am encouraged to see collaboration across the aisle to get these crucial protections passed.”
The emergence of deepfake AI has also prompted a change in terminology. While advocates like Enough is Enough’s Ms. Rice Hughes have been fighting since the early ’90s to make the internet safer for children by combating child pornography, that battle has taken on a broader scope.
“We have now started, along with lawmakers, to talk about child sexual abuse material (CSAM),” she said, but warned that even though legislation is being introduced nationwide, it’s still up to parents to know what’s going on. “They have to get educated and equipped and empowered and know their children are vulnerable to all this. There is little legal recourse right now, especially with AI.”