Bill That Would Criminalize AI-Generated Child Porn Passes California Assembly

The bill's author said AB 1831 closes a loophole that lets those creating obscene material with AI escape charges. The measure has faced no opposition.
The California Capitol on March 13, 2024, in Sacramento. (Arturo Holmes/Getty Images for National Urban League)
Travis Gillmore

A bill that would outlaw child sexual abuse material created with artificial intelligence passed the Assembly on May 22 in a 71–0 vote.

Assembly Bill 1831—introduced by Assemblyman Marc Berman to prohibit the creation, possession, and distribution of child pornography generated by artificial intelligence technologies—cleared three committees earlier this year without a single vote against the measure.

Mr. Berman said the bill will protect children by closing a loophole in existing law that prevents prosecutors from bringing child pornography charges against those who use AI to create obscene material depicting the sexual abuse of children.

“AB 1831 ... says that anybody who possesses, creates, or distributes that type of porn material, you are going to be found, you are going to be caught, you are going to be prosecuted, and you are going to get the penalties you deserve,” Mr. Berman told The Epoch Times. “It’s really important to note that the images are created using thousands of images of actual children, so it isn’t just that there’s one victim, there’s thousands of victims.”

He said that offenders are currently using pictures found on the internet, including on social media profiles, school websites, and other platforms, to produce illicit content.

“This is important for parents to understand,” Mr. Berman said.

He pointed to evidence—including a report published in 2023 by the Internet Watch Foundation—that shows that people with access to child pornography are much more likely to act on their urges and physically victimize children.

“And that’s why it’s so important that we identify this and nip it in the bud as soon as possible,” Mr. Berman said. “That shouldn’t happen to any teen. I don’t care if they’re an actress that’s famous or just a child at the playground, nobody period, especially no children, should have to go through [this].”

Kaylin Hayman, a child actress known for her role in the Disney Channel comedy series "Just Roll With It" and a victim of AI-generated child pornography, was notified by the FBI in 2023 that a man had been caught in possession of obscene material: a picture of her taken when she was 12 years old, manipulated to make it appear she was involved in sexual activity.

“I felt violated and disgusted to think about the fact that grown men see me in such a horrendous manner,” Ms. Hayman said while testifying in support of the measure at the Assembly’s Public Safety Committee hearing on April 9. “While speaking about this topic is daunting, I know deep down I need to share my voice.”

She said she was traumatized emotionally and mentally and urged lawmakers to help protect other children from experiencing the same.

“Since my victimization, it is a constant thought that every man has malicious intent against me because I feel a lack of protection in my everyday life,” Ms. Hayman said. “These circumstances have made me feel uneasy and angry.”

She said the bill would prove immensely beneficial for children around the world because of California’s leadership role and the example the law would set.

“Not only is this going to help children all over the world, but it will protect their inner peace and innocence,” Ms. Hayman said. “This law would protect minors in the industry from being sexually exploited like they have been for decades. No more kids would have to be susceptible to the feeling that they were not protected.”

With the rapid advancement of AI technologies, computer-generated child pornography is practically indistinguishable from real photos, according to law enforcement officials, who say the technology allows perpetrators to produce seemingly real nude photographs of young children by modifying pictures scraped from the internet.

“The images are often so realistic that the human eye cannot tell they are fake,” the Ventura County District Attorney’s office said in legislative analyses.

Supporters say the bill will help prevent the sexual exploitation of children by clarifying that no legal difference exists between AI-generated and other child pornography.

“AB 1831 sends a clear message that our society will not tolerate the malicious use of AI to produce harmful sexual content involving minors,” the district attorney’s office said. “[The bill] will ensure that the worst of these morphed images are unlawful even if the child cannot be identified.”

Critics argue the bill would potentially violate the right to free speech guaranteed by the U.S. Constitution’s First Amendment.

“While we appreciate the potential harms caused by many forms of new technology, we fear that the current version of AB 1831 improperly restricts lawful speech and runs afoul of the First Amendment,” the American Civil Liberties Union’s California Action group said in legislative analyses.

The group contests the language that makes the material illegal even if a real child is not depicted or if the person depicted appears to be younger than 18 but is not.

“AB 1831 suffers from such overbreadth,” the group said. “The author may wish to consider whether this nuance should be clarified as the bill moves through the Assembly.”

Critics cite the U.S. Supreme Court's 1973 decision in Miller v. California, in which the court defined obscene material as that lacking "serious literary, artistic, political, or scientific value," as evidence the bill is too broad and would run afoul of constitutional free speech protections.

In the ruling, justices urged caution when regulating free speech.

“There are inherent dangers in undertaking to regulate any form of expression,” justices wrote in the judgment. “State statutes designed to regulate obscene materials must be carefully limited.”

Supporters of the bill point to the court's subsequent 1982 ruling in New York v. Ferber, which defined the circumstances in which states can regulate speech related to child pornography and held that states have a compelling interest in protecting children.

After passing the Assembly, the bill next heads to Senate policy committees, where hearings are expected in the coming weeks.

Travis Gillmore is an avid reader and journalism connoisseur based in California covering finance, politics, the State Capitol, and breaking news for The Epoch Times.