Commentary
On Aug. 22, Wired Magazine published an article titled “Could AI-generated porn help protect children?” Resting on the wrongheaded assumption that pedophilia is biologically determined and therefore warrants obscene “solutions,” the piece suggests that AI-generated child pornography could be “one way of diminishing risks.”
While such headlines may shock most Americans (who are still instinctively repulsed by child pornography, computer-generated or otherwise), this piece is only the most recent example of political elites on both the left and the right defending AI porn. In a debate earlier this year, for example, conservative radio talk show host Dennis Prager asked, “Who is being hurt?” For AI child porn to be considered wrong, he insisted, “You have to have a victim.”
Other conservatives have been quick to criticize Prager’s view; to date, however, no one has outright refuted his claim that AI child porn is a victimless crime. Given the elite consensus on the question and the rapidly increasing sophistication and prevalence of AI pornography, that claim desperately needs refuting.
Simulations of child pornography aren’t victimless. Even setting aside the obvious degradation of both the person who watches AI-generated child porn and the person who makes it, real children are being exploited behind each computer-generated image.
Yes, predators use real videos and images of child exploitation to “train” AI programs to create more content. Moreover, they often feed social media photos of children into these programs to create “deepfake” look-alikes, which should give any parent pause before posting images of their children online. So while a given video may not depict a real child suffering abuse, the AI program “learned” from real abuse footage and draws on the faces and likenesses of real children.
Though despicable, this isn’t rare. Avi Jager, the head of child safety and human exploitation at ActiveFence, said that 80 percent of respondents in a 3,000-person dark web pedophile forum admitted that they had used or intended “to use AI tools to create child sexual abuse images.” AI porn is becoming so prevalent, in fact, that it could soon overwhelm investigators’ attempts to identify real victims from clues left in video footage.
Earlier this year, the movie “The Sound of Freedom” alerted lawmakers to the prevalence of such victims in the United States, which is estimated to host over 50 percent of child porn websites. Saving these children should indeed be a top priority for policymakers. But if we only worry about children once they’ve been abused, we will be fighting a losing battle. To stop abuse before it happens, we must start paying attention to the harm that AI child porn (and porn of all kinds, for that matter) does to users.
Pornography rewires viewers’ brains based on the videos they see. As Norman Doidge of Columbia University puts it, “Pornography, by offering an endless harem of sexual objects, hyperactivates the appetitive system.” It changes how a person sees other people, an effect that is especially pronounced when young viewers encounter this material.
AI-generated child pornography is sophisticated and has the same effect on a viewer’s brain as “real” porn. It can stimulate in users the same addictive descent into more and more graphic material.
Indeed, studies show a link between watching child porn and sexually abusing children in real life. Michael C. Seto, a leading Canadian forensic psychologist, and his colleagues found that 50 to 60 percent of those who view child pornography admit to having abused children.
In other words, even if an image of child pornography doesn’t show a real child, it still puts future children in harm’s way. The National Center for Missing and Exploited Children condemns AI-generated child pornography as a “horrible societal harm” because of its links to real child abuse. Such content normalizes the sexualization of children and their abuse.
And simulations of child sex abuse are just the beginning of what unregulated, predatory use of AI programs will create.
Amazon, Instagram, and Wish have come under fire in recent years for allowing sellers to advertise sex dolls that approximate the size and features of children. If AI can create simulations of child pornography, it’s not a stretch for such technology to animate sex dolls or robots with a child’s voice or vocabulary.
With these horrifying possibilities on the horizon and mounting evidence that AI porn has real-world consequences, one might think that Mr. Prager’s position would prove untenable. But ever since the Supreme Court’s 2002 decision in Ashcroft v. Free Speech Coalition struck down part of a congressional ban on “virtual child pornography,” that has been precisely the position enshrined in Court precedent. And for over two decades it has deterred Congress from passing sufficient regulations on the porn industry.
It’s time for that to end. AI-generated child porn isn’t a solution; it’s the latest reminder of how much more advanced technology is today than it was in 2002, and of how much more we now know about the way pornography, real or AI-generated, hurts both its victims and its users. Given these changes, it’s time for Congress to once again consider measures to regulate the porn industry, starting with AI-generated child porn.
While some have suggested giving simulations “personhood” status as if they were real children, a better approach would be simply to restore obscenity laws and, if challenged, let the Court reconsider its current position. A well-crafted bill would almost certainly draw bipartisan support today. But if Congress doesn’t act soon, the public could become desensitized even to this extreme content, leaving AI-generated child porn to continue devouring the minds and bodies of our children.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.