WASHINGTON—Sen. Josh Hawley (R-Mo.) told a Senate Judiciary Committee hearing on July 9 that he is “sickened” by a report of digital giant YouTube’s “refusal” to change its algorithm to protect children from pedophiles using the site to find and “groom” potential victims.
“Why not? Because their [business] model is that 70 percent of their business, 70 percent of their traffic, comes from these auto-recommended videos,” Hawley said. “In other words, ad revenues would be lost if they actually took some enforcement steps to stop this exploitation of children.”
“But YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, though the platform can identify such videos automatically,” the report states.
“The company said that because recommendations are the biggest traffic driver, removing them would hurt ‘creators’ who rely on those clicks. It did say it would limit recommendations on videos that it deems as putting children at risk.”
Hawley told the hearing, “This report was sickening.”
The measure Hawley has introduced would end digital firms’ immunity from publisher liability under Section 230 of the Communications Decency Act unless they submit “clear and convincing evidence” that their algorithms and content-removal policies are politically neutral.
During the hearing, Hawley asked a panel of witnesses if “for some of these companies, aspects of their business model actually conflict with protecting the safety of children.”
McKenna’s firm provides an app and other resources designed to help parents protect their children from online dangers, particularly sexual predators and traffickers.
McKenna said sites push engagement—the process of connecting users with other users—because it is the most effective way to generate ad revenue.
“In March 2019, CNN reported that Instagram was the leading social media platform for child grooming by sexual predators,” McKenna told the committee.
“Our own test accounts quickly discovered that young people, particularly young girls, can be hunted like prey. We started an Instagram account with two stock photos and tried to mimic the behavior of an average teen girl. We posted two selfies with three hashtags each, searched a few hashtags, and liked a few photos.
“Within a week, we had dozens of men sending us images of their penises, telling us they were horny, and sending us pornography through direct messages—even after we told all of them that we were ‘only 12.’ They were relentless.”
Professor Angela Campbell of the Georgetown University Law Center said in response to Hawley that “YouTube actually has a product intended for children, called ‘YouTube Kids’ and it’s got some good policies.
“The problem is, again, they’re not really enforcing those policies. There is a lot of content, even on YouTube Kids, that is inappropriate, and we have complained to the FTC about this.”
- Would prohibit video-hosting websites from recommending videos that feature minors, though such videos could still appear in search results.
- Would apply only to videos that primarily feature minors, not videos that simply have minors in the background.
- Would exempt professionally produced videos, such as prime-time talent-show competitions.
- Would require the Federal Trade Commission to impose criminal penalties and stiff fines for violations.