A Missouri congresswoman has introduced bipartisan legislation enhancing online platforms’ reporting requirements on the sex trafficking of children.
The legislation is supported by Reps. Don Bacon (R-Nebraska), Zach Nunn (R-Iowa), and Sylvia Garcia (D-Texas).
The Child Online Safety Modernization Act proposes to increase reporting of online child exploitation by requiring social media platforms to collaborate with law enforcement to locate children depicted in images identified as Child Sexual Abuse Material (CSAM). The National Center for Missing and Exploited Children (NCMEC) prefers that term to the former description of “child pornography” because, according to NCMEC, it more appropriately describes what is taking place in the images and videos: the sexual abuse and exploitation of children.
The NCMEC, a nonprofit instituted by Congress, operates the CyberTipline, the centralized reporting system where sites can submit incidents of online exploitation.
In 2022, the tip line received more than 32 million reports of CSAM, an 89 percent increase since 2019 and roughly 87,600 reports per day.
According to the legislation, 50 percent of the 32 million reports submitted to NCMEC lacked enough information to help locate the children involved.
“Additionally, online platforms are currently not required to report instances of child sex trafficking or the sexual enticement of a child to the CyberTipline,” the legislation says. “Tragically, since 2018, NCMEC has seen a 567% increase in sexual enticement of a child.”
The legislation proposes to enhance the CyberTipline requirements, which would include allowing NCMEC to share “technical identifiers” from CSAM with other nonprofits to assist in the cause.
The legislation would also extend the storage of the reports in CyberTipline from 90 days to one year.
“Due to the immense volume of reports and the deliberative and time-consuming process that an investigation of these cases requires, extending this preservation gives law enforcement more time to properly and comprehensively investigate crimes against children,” the legislation says.
The legislation also would replace the term “child pornography” with CSAM in federal statutes.
“‘Child pornography’ is an inaccurate and misleading term to describe an image or video of a child being raped or sexually abused,” the legislation states. “Whereas ‘pornography’ refers to imagery of consenting adults engaged in sexual acts, the child victims depicted in such imagery have no consent and no control over their sexual exploitation. U.S. federal law should accurately reflect this abuse.”
‘Troubling and Heartbreaking’
According to NCMEC, CSAM depicts crimes committed against children, and those children are revictimized every time the files are shared and viewed on the internet.

“It’s important to remember CSAM consists of much more than just images and video files,” NCMEC said. “While CSAM is seen and transmitted on computers and through other technology, these images and videos depict actual crimes being committed against children. The human element, children at risk, must always be considered when talking about this offense that is based in a high-tech world.”
In a press release on the bill, Rep. Garcia called the legislation “a step toward preventing online sexual abuse from occurring in our society today.”
“In today’s modern society, it has become increasingly important to hold accountable those individuals that would sexually coerce and extort our children,” Garcia said. “As elected officials, there is no greater responsibility than ensuring we are keeping our children safe.”
Rep. Bacon called it “troubling and heartbreaking” to see the rise of online exploitation.
“By requiring reports from online platforms to provide more information to help law enforcement identify and locate the child victim and the individual who posted the image, we can save more children from a life of sexual trauma,” Rep. Bacon said.