Telegram had a reputation for disregarding child abuse advocacy organizations even before its CEO was detained in France.
Three of those organizations told NBC News that they have reached out to Telegram about child sexual abuse material, often abbreviated as CSAM, on the platform but have largely been ignored: the Canadian Centre for Child Protection, the U.K.-based Internet Watch Foundation and the U.S.-based National Center for Missing & Exploited Children (NCMEC).
Pavel Durov, the CEO and co-founder of Telegram, a messaging and news app popular in the former Soviet Union that has gained traction with far-right groups in the U.S. and with users barred from other platforms, was detained Saturday and is currently being held by French police.
Durov was taken into custody as part of an investigation into an unidentified individual, according to the Paris prosecutor, who has not yet made any accusations public. A statement from the prosecutor said the potential charges include "complicity" in unlawful activities as well as possessing and disseminating images of child sexual abuse.
In a statement on X, Telegram said it complies with EU law. It said Durov is "not hiding anything" and that it is "laughable to assert that the owner of a platform or the platform itself is accountable for misuse of that platform."
Since its founding, Telegram has positioned itself as largely unmoderated and uninterested in cooperating with law enforcement. In April, Durov said the app had 900 million active users.
John Shehan, senior vice president of NCMEC's Exploited Children Division & International Engagement, said he was encouraged by France's move to detain Durov, given how much of a haven Telegram has become for CSAM.
He said, “In terms of their lack of content moderation or even interest in preventing child sexual exploitation activity on their platform, Telegram is truly in a league of their own.”
Shehan stated, “It is encouraging to see the French police and government taking some action to potentially rectify this type of activity.”
According to Telegram's website, the company never responds to reports of any kind of unlawful conduct in private or group chats, "even if reported by a user." It also states that "we have disclosed 0 bytes of user data to third parties, including governments," in contrast to other major tech companies that regularly comply with court orders and warrants for user data.
NBC News contacted Telegram for a response to the groups' allegations that their attempts to report CSAM had gone unanswered. In a statement, Telegram spokesperson Remi Vaughan said the company "actively moderates harmful content on its platform, including child abuse materials," but did not address the specific criticisms.
"To delete information that violates Telegram's terms of service, moderators employ a combination of proactive monitoring of the platform's public areas, AI technologies and user reports," Vaughan said. Telegram says it bans hundreds of public groups every day and maintains a channel that posts daily updates on how many groups and channels have been reported for child abuse.
In a report last year on how platforms police child sexual exploitation, the Stanford Internet Observatory found that although Telegram says sharing CSAM in public channels is against its rules, it is the only major tech platform whose privacy policy does not explicitly forbid CSAM or the grooming of minors in private chats.
By law, U.S.-based platforms must work with NCMEC, which runs the largest global coordination center for law enforcement, social media companies and tipsters, to identify and quickly remove verified abuse content. Durov, who was born in the former Soviet Union, has said Telegram is based in Dubai, United Arab Emirates, which he describes as a neutral country that keeps his platform outside the jurisdiction of any one government.
But Shehan said major tech companies based outside the United States, including Aylo, the Canadian conglomerate that owns Pornhub; Fenix, the U.K.-based owner of OnlyFans; and TikTok, owned by the Chinese company ByteDance, all remove CSAM that NCMEC identifies.
Telegram says it gives users the ability to encrypt their private conversations end-to-end, meaning only the participants, and not the platform, can read them. But unlike other end-to-end encrypted messaging platforms such as WhatsApp, Telegram does not give users a way to report and forward unlawful content to the company.
According to Shehan, NCMEC has received 570,000 reports of CSAM on Telegram since the app launched in 2013.
"They have made it quite evident that they have no interest in working with the team. We still sometimes get in touch, but not as frequently as we used to," he said. "They never reply at all."
A spokesperson for the Internet Watch Foundation, an independent U.K. nonprofit working to stop the spread of child sexual abuse imagery, said the organization has made several attempts to collaborate with Telegram over the past year, but that Telegram has declined to "take up any of its services to block, prevent and disrupt the sharing of child sexual abuse imagery."
"There's no excuse," said Heidi Kempster, the group's deputy CEO. "Every platform has the responsibility to take immediate action to stop the dissemination of images of child sexual abuse. The means and the information exist, and any decision not to halt the spread of this known material is one made consciously and actively."

In an email statement, Stephen Sauer, the head of the Canadian Centre for Child Protection's national CSAM tip line, said that Telegram has not only disregarded its CSAM reports, but that the amount of abuse content on the platform has grown.
"According to our observations, offenders are increasingly using Telegram's platform to access CSAM," Sauer said. He said Telegram accounts and links are frequently promoted on online forums and even on popular American social media sites, funneling traffic to illicit content hosted on Telegram.
"We truly have no idea how Telegram's moderation procedures work, since they are so opaque. Similarly, when we submit content to the company for moderation, we never hear back about the outcome," Sauer said. "More crucially, despite its known use for facilitating the exchange of this type of material, it does not appear the platform itself is taking adequate proactive steps to curb the spread of CSAM on their service."