Short videos depicting sexual content involving children are being spread on the Amazon-owned streaming platform Twitch, with such videos receiving thousands of views.
Such exhibitionism was found to be frequently triggered by encouragement from livestream viewers. Those clips had been watched 2,700 times. Some of the remaining 49 clips involved children being subjected to sexual grooming; those videos racked up 7,300 views.
When a livestream viewer captures such material, it “becomes an almost permanent record of that sexual abuse,” according to Canadian Centre Director Stephen Sauer. “There’s a broader victimization that occurs once the initial livestream and grooming incident has happened because of the possibility of further distribution of this material.”
Mr. Sauer insisted that social media firms can't be relied upon to police child abuse content themselves and called for government intervention.
“We’ve been on the sidelines watching the industry do voluntary regulation for 25 years now. ... We know it’s just not working. We see far too many kids being exploited on these platforms. And we want to see government step in and say, ‘These are the safeguards you have to put in place.’”
In a statement to Bloomberg, Twitch CEO Dan Clancy said that "youth harm, anywhere online, is deeply disturbing." After being alerted by the outlet, the company deleted the child sexual content.
“Even one instance is too many, and we take this issue extremely seriously,” Mr. Clancy said.
During this period, Twitch issued 13,801 enforcements for violating the firm’s Youth Safety Policy.
However, Twitch submitted fewer tips to the U.S. National Center for Missing and Exploited Children (NCMEC). Between the second half of 2022 and the first half of 2023, the number of tips fell to 3,300 from 7,600.
The company insisted that the decrease “reflects a change in our categorization to ensure we are accurately reporting illegal content.”
“It does not represent a change in our enforcement for content that may endanger youth,” it stated.
“Twitch represents a clandestine, threatening digital environment where minors are interacting with adult strangers without parental supervision,” the study reads.
“Young users clearly feel a false sense of safety on the platform; a significant proportion were willing to reveal personal information despite having no knowledge of who might be listening.”
Internet Child Exploitation Material
The proliferation of child sexual content isn't limited to Twitch. Many tech firms, including Twitter, TikTok, Google, and Facebook, face similar accusations.
"The creation, dissemination, and viewing of online child sexual abuse inflicts incalculable trauma and ruins lives. It is also illegal," she said at the time. "It is vital that tech companies take all the steps they reasonably can to remove this material from their platforms and services."
The companies argued that the First Amendment protected them from liability for the content they published. However, District Judge Yvonne Gonzalez Rogers pointed out that many violations alleged in the lawsuit do “not constitute speech or expression, or publication of same.”
For instance, plaintiffs accused the social media firms of failing to provide effective parental controls, to offer options for users to limit their own time on a platform, to use robust age verification, and to implement reporting protocols that would allow users to report CSAM and similar material.
“Addressing these defects would not require that defendants change how or what speech they disseminate,” the judge wrote.
“This bill will make it clear that images and videos of children being raped is not ‘pornography,’ it is sexual abuse of a child. America cannot, and should not, accept a reality where innocent children are sexually exploited for financial gain,” she said.
In a Dec. 7, 2023, statement, the Canadian Centre for Child Protection stated that Meta’s decision means that “millions of child sexual abuse and exploitation cases will cease to be reported.”
Since 2020, Meta has forwarded 74.4 million reports of suspected child sexual abuse and exploitation to the NCMEC as per legal requirements, it stated. These reports have triggered numerous investigations by law enforcement.
Meta’s decision means that law enforcement “will lose its ability to effectively monitor these crimes unfolding across large swaths of their platforms, including Facebook and Instagram,” according to the Canadian Centre’s statement.
“NCMEC, which processes Meta’s child exploitation reports, has estimated these actions could cause as much as 70 percent of all reportable cases on its services to go undetected,” it reads.