Telegram has agreed to deploy tools from an internet watchdog to prevent child sexual abuse imagery from being spread on its platform.
Telegram will now use a range of IWF services, including the unique digital fingerprints of millions of known images and videos of child sexual abuse material (CSAM), to instantly spot when this content is being shared.
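In broad terms, hash-matching of this kind compares a fingerprint of each uploaded file against a list of fingerprints of known material. The sketch below is a minimal illustration in Python, assuming a simple cryptographic hash and a placeholder hash list; real deployments typically also use perceptual hashes that survive resizing and re-encoding, and this is not Telegram's or the IWF's actual code.

```python
import hashlib

# Hypothetical set of "digital fingerprints" (hashes) of known abuse imagery.
# In practice a platform would load these from a vetted hash list supplied by
# a body such as the IWF; the value below is a placeholder, not real data.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the file contents."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Check an uploaded file's fingerprint against the known-hash list."""
    return fingerprint(data) in KNOWN_HASHES

# Example: block an upload before it can be distributed.
upload = b"...file bytes received from a client..."
if is_known_match(upload):
    print("Upload blocked and reported.")
else:
    print("Upload allowed.")
```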
The IWF has previously stated that it had confirmed thousands of reports of CSAM on Telegram since 2022, including Category A material, the most severe category. When the IWF reported this content to Telegram, the platform removed it.
Telegram is an app that allows for one-on-one conversations, group chats, and “channels” that can involve hundreds of people.
Pavel Durov, CEO and co-founder of the messaging app Telegram, was arrested in France last August and charged with a series of criminal offenses. Among the accusations were that he allowed Telegram to be used for CSAM and drug trafficking, and that the company had refused to cooperate with criminal investigators.
Russia-born Durov, writing on his Telegram account on Sept. 5, said: “The claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day.
“We have urgent hotlines with NGOs to process urgent moderation requests faster.”
Telegram has to comply with UK online safety legislation that came into force this year.
The UK’s Online Safety Act makes social media platforms responsible for the content they host, including CSAM.
The legislation imposes legal duties on big tech companies and service providers, with compliance overseen by the regulator Ofcom.
‘Horror’
Earlier this year, the IWF and other advocacy groups claimed that their outreach to Telegram about CSAM on the platform was largely “ignored.” However, on Dec. 4, IWF interim CEO Derek Ray-Hill said that Telegram’s involvement was “a transformational first step on a much longer journey.”
Social media companies Meta and X are IWF members.
“Child sexual abuse imagery is a horror that blights our world wherever it exists,” Ray-Hill said.
“The children in these images and videos matter. I want to be able to say to every single victim that we will stop at nothing to prevent the images and videos of their suffering being spread online.”
Remi Vaughn, head of press and media relations at Telegram, said: “Telegram removes hundreds of thousands of child abuse materials each month, relying on reports and proactive moderation which includes AI, machine learning, and hash-matching.
“The IWF’s datasets and tools will strengthen the mechanisms Telegram has in place to protect its public platform—and further ensure that Telegram can continue to effectively delete child abuse materials before they can reach any users.”