Teenage Boys Driving ‘Sadistic’ Online Harm Networks, Warns NCA

Young members of online groups known as ‘com networks’ are coercing peers into self-harm and spreading misogynistic content, according to law enforcement.
A teenage child looks at the screen of a mobile phone in London on Jan. 17, 2023. Leon Neal/Getty Images
Evgenia Filimianova

Teenage boys are at the heart of “sadistic” online networks involved in harrowing crimes including child sexual abuse and the sharing of graphic, violent content, the National Crime Agency (NCA) has warned.

The so-called “com networks”—online groups operating primarily on mainstream social media and messaging apps—target victims both online and offline. The offences range from fraud to extremism, serious violence, and child sexual abuse.

Between 2022 and 2024, reports of this emerging threat in the UK increased six-fold. According to the NCA, thousands of users in Britain and overseas have exchanged millions of messages online about sexual and physical abuse.

Teenage boys dominate these networks, which often share “sadistic and misogynistic material” targeting users their own age or younger.

NCA Director General Graeme Biggar has described the networks as a “complex and deeply concerning phenomenon.”

“These groups are not lurking on the dark web, they exist in the same online world and platforms young people use on a daily basis. It is especially concerning to see the impact this is having on young girls who are often groomed into hurting themselves and in some cases, even encouraged to attempt suicide,” he said.

Biggar warned that operating online gives offenders a false sense of safety.

“There have already been convictions, we and partners have made arrests in the UK and overseas, and further investigations are ongoing.”

Recent cases linked to these online harm groups include that of 17-year-old Richard Ehiemere, from Hackney, who was convicted in February of fraud and possession of indecent images of children.

Earlier this year, 19-year-old Cameron Finnigan of West Sussex was jailed for assisting suicide, possession of child abuse material, and a terror offence, after a separate probe by Counter Terrorism Policing.

The “com networks” use manipulation tactics to coerce victims, often children, into harming themselves, others, or animals. The abuse is often filmed and shared online.

The NCA said that offenders engage in “doxing,” or the malicious publication of personal details of victims.

Biggar said that young girls are often manipulated into self-harm and, in some cases, encouraged towards suicide.

“I’d encourage parents and carers to have regular conversations with their child about what they do online, and ensure they know they have your support should they need it,” he said.

Online Safety Law

Responding to the report findings, safeguarding minister Jess Phillips described the scale of online child sexual abuse as “absolutely horrific.”

“My message to tech companies is simple: this is your responsibility too. You must ensure your platforms are safe for children, so that we can protect the most vulnerable and put predators behind bars,” she said.

The government’s Online Safety Act, which came into force on March 17, imposes new legal duties on Big Tech companies and service providers, overseen by the regulator Ofcom.

If social media platforms do not comply with these rules, Ofcom can fine them up to £18 million or 10 percent of their global annual revenue, whichever is greater, meaning fines handed down to the biggest platforms could reach “billions of pounds.”

Andy Burrows, chief executive of the Molly Rose Foundation, criticised the government and regulator Ofcom for failing to act decisively on “com networks.”

“These horrendous groups pose a deeply disturbing and sharply growing risk to children, especially teenage girls who are being sadistically groomed into acts of self-harm and even suicide online.

“Despite being repeatedly warned of the threat posed by these groups, Ofcom has failed to introduce a single targeted measure to tackle disturbing suicide and self-harm offences. This glaring gap in its regulatory regime must be closed,” he said.

Ofcom’s new anti-abuse and anti-exploitation guidelines are part of the Online Safety Act. These include mandatory risk assessments for online platforms and stricter age checks to prevent underage access.

Platforms will be required to assess the risks of harm to children from spring, and the child safety regime will be fully in effect by summer this year.

Last month, the watchdog warned tech companies that those failing to protect women and girls online will be publicly named, so users can decide whether to keep using them.

This was followed in March by Ofcom launching an investigation into TikTok, Reddit, and Imgur over their use of children’s personal information.

The investigation was prompted by concerns that online platform algorithms may expose young users to inappropriate or harmful content.

Evgenia Filimianova
Author
Evgenia Filimianova is a UK-based journalist covering a wide range of national stories, with a particular interest in UK politics, parliamentary proceedings and socioeconomic issues.