Instagram’s algorithms connect a vast network of pedophiles, an investigation by The Wall Street Journal, Stanford University, and the University of Massachusetts–Amherst has discovered.
An investigation by Stanford Internet Observatory chief technologist David Thiel, research manager Renée DiResta, and director Alex Stamos discovered that a massive network of English-speaking social media accounts buys, sells, and shares child sexual abuse material across several social media networks.
Some child pornography is created by children or teenagers trying to earn money, impress a date, or appease a blackmailer. The report calls this content self-generated child sexual abuse material (SG-CSAM).
Pedophile accounts find the sellers through hashtags referencing pedophilia, according to the Stanford report. Some of these references are obvious, such as hashtags using variations of the word “pedo.”
Others rely on emojis, pedophile code words, or subtler signals, the report added.
Social media algorithms effectively serve as the pimps that connect pedophiles to children, the report’s findings suggest, with the algorithms’ recommendations pushing pedophilic content into the hands of those who desire it.
Although Instagram is the most important platform for pedophile networks, other platforms, including Twitter, Telegram, Discord, Snapchat, TikTok, and Mastodon, also connect pedophiles to their victims and to each other, the report states.
Social media algorithms have created a system through which thousands of pedophiles buy pornographic images, pornographic videos, online video meetings, and even in-person meetings with children and teens, the report concludes.
How It Works
According to the report, the social media accounts on which children produce porn resemble sites such as OnlyFans. Victims create “menus” of sex acts that customers can request. Customers pay for the sex acts through gift-card swapping, exchange platforms such as G2G, and other websites that allow anonymous payment, the report concludes.

On social media, many of the “seller” accounts claim ages between 13 and 17, the report notes. But these accounts often sell sexual pictures taken when the children were far younger, it adds.
“Menus” can include sexually explicit videos or photos, videos of self-harm, videos of sex acts with animals, video from when children were younger, and paid sexual acts, according to the report.
Sellers use simple tricks to outsmart social media algorithms, the report notes. They reverse the digits of ages (e.g., “16” to “61”), partially blot out words about sexual content, and use code words such as “menu,” the report adds.
Helpful Algorithms
Pedophiles and sellers didn’t have to look hard to find each other because algorithms did the work for them, the report suggests.

Most keywords that involve child porn return results on Instagram’s website, according to the report.
For some terms, Instagram displays a warning message, the report reads.
“These results may contain images of child sexual abuse,” the message reads. “Child sexual abuse or viewing sexual imagery of children can lead to imprisonment and other severe consequences. This abuse causes extreme harm to children and searching and viewing such materials adds to that harm. To get confidential help or learn how to report any content as inappropriate, visit our Health Center.”
At the bottom, Instagram’s message offers users two options: “Get resources” and “See results anyway.”
“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers,” the report states.
Companies Respond
A Meta representative told The Epoch Times that the company works “aggressively” to fight child porn on all its platforms.

“Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps, and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks,” the representative said.
From 2020 to 2022, Meta’s teams dismantled 27 abusive networks, and in January 2023, Meta disabled more than 490,000 accounts that violated child safety policies, the representative added.
“We fixed a technical issue that unexpectedly prevented certain user reports from reaching content reviewers, we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts, and we restricted thousands of additional search terms and hashtags on Instagram. We’re committed to continuing our work to protect teens, obstruct criminals, and support law enforcement in bringing them to justice,” the representative said.
Children also create and sell child porn on Twitter, the report adds. Although Twitter does a better job of taking down such material, users who view some seller accounts still receive recommendations for other seller accounts, the report notes.
Furthermore, Twitter allows nudity, which makes it easier for child porn to appear on the platform, according to the report. Sometimes, accounts posted multiple images known to be child porn before Twitter removed them, the report notes.
Since learning about this problem, Twitter has largely fixed the issue, the report says.
When contacted, Twitter’s press line auto-replied with a poop emoji.
Commercialized child porn groups on communications apps Discord and Telegram had thousands of members, the report noted.
According to the report, Telegram’s official policies don’t address child porn in private chats, the sexualization of children, or grooming. Discord’s policies don’t address the sexualization of children, the report adds.
Telegram spokesman Remi Vaughn told The Epoch Times that the site has worked hard to moderate child abuse content on the public parts of the app.
“In the case of child abuse content, we publish daily reports about our efforts at t.me/stopca. At time of writing, nearly 10,000 groups, channels, and bots have been removed this month,” Vaughn said.
“These Telegram and Discord groups had hundreds or thousands of users; some appeared to be managed by individual sellers, though there were also multi-seller groups,” the report added.
The Epoch Times reached out to Discord but received no comment.
On Snapchat, pedophiles can connect directly to children through peer-to-peer communication features, the report notes.
The Epoch Times reached out to Snapchat but received no comment.
TikTok did a better job of limiting pedophile content, the report noted. But even there, problems existed.
“The fact that TikTok is far more oriented around content recommendations instead of hashtag-based discovery or friend recommendations also makes it harder for users to discover specific types of material intentionally,” the report reads.
Stop Abusers or Stop Business
According to Jon Uhler, a psychologist with 15 years of experience working with more than 4,000 sexual predators, this story shows how creatively predators pursue children.

“Their dark creativity knows no bounds,” he said.
He added that society ignores how sexually explicit content leads to sexual predation by lowering children’s defenses.
“Anybody dealing with child development understands if you introduce highly sexualized content that is above their developmental level to process and understand, then you set them up to be easy prey,” Uhler said. “Because they don’t have the intuitive sense of evil intent.”
Furthermore, men become predatory by watching large amounts of porn, Uhler said.
“Deviance starts with lust and then objectification. And then it gets into power and control and degradation, and eventually the desire to have a negative effect,” Uhler said.
He added that sexual deviance works differently in men and women—men are far more vulnerable to going down this path.
“The female sex offenders and the male sex offenders are different by nature, in terms of the nature of their offense,” said Uhler. “You will never see any female who has been arrested for a sex offense that used objects on her victims.”
Uhler said that social media companies should make stopping predators their priority, and that they are capable of doing so.
“You guys are capable, really capable. If you are not dedicated to the protection of children, then close your site down,” he said, referring to social media companies. “If you’re going to build one of these things, you know predators are coming.”