Telegram, the popular messaging app with over 950 million users, is under intense scrutiny following the detention of its founder, Pavel Durov, in France. Durov, a 39-year-old billionaire, is accused of failing to cooperate with law enforcement over issues including drug trafficking, child sexual content, and fraud. This comes amidst growing criticism that Telegram is not doing enough to combat child abuse material on its platform.
Unlike other major social networks such as Facebook, Google, Instagram, TikTok, Twitter (X), Snapchat, and WhatsApp, Telegram has repeatedly declined to join key international programs designed to detect and remove child sexual abuse material (CSAM). Notably, it is not a member of the U.S.-based National Center for Missing & Exploited Children (NCMEC) or the UK's Internet Watch Foundation (IWF). These organizations work with hundreds of internet companies to proactively find, report, and remove illegal content.
Durov’s Arrest and the Case Against Telegram
Pavel Durov was detained in France on charges of failing to moderate harmful content on his platform. According to officials, Telegram’s refusal to collaborate with law enforcement agencies over serious crimes such as drug trafficking, child sexual content, and fraud has made it a target of investigation.
While Telegram insists that its moderation is "within industry standards and constantly improving," critics argue that its practices fall far short of those of other major platforms. Unlike its competitors, Telegram is not registered with NCMEC's CyberTipline, a crucial resource that helps companies identify and report CSAM. More than 1,600 internet companies, including the 16% based outside the U.S., are registered with this service, a stark contrast to Telegram's stance.
Lack of Cooperation with Child Protection Organizations
Reports suggest that both NCMEC and the IWF have repeatedly asked Telegram to join their programs, but the messaging app has consistently ignored these requests. An IWF spokesperson recently highlighted the lack of cooperation, stating, “Despite attempts to proactively engage with Telegram over the last year, they are not members of the IWF and do not take any of our services to block, prevent, and disrupt the sharing of child sexual abuse imagery.”
Telegram's refusal to collaborate means that it cannot proactively find, remove, or block confirmed CSAM identified by these organizations. While the platform does remove CSAM once it is confirmed, the IWF has pointed out that this process is slower and less responsive than on other platforms, making it less effective at addressing the issue.
Transparency and Moderation Concerns
In addition to its controversial stance on child protection, Telegram has also been criticized for its lack of transparency in content moderation. Unlike other social networks that publish regular transparency reports listing content removed due to police requests, Telegram’s reporting is sporadic and lacks historical data. The company’s transparency reports are only available via a channel on the app, with no publicly accessible archive.
Furthermore, Telegram's handling of media inquiries is seen as inadequate. Journalists have reported difficulty reaching the platform for comment, often being directed to an automated bot on the app that rarely responds. The BBC has reached out to Telegram for comment on its refusal to join child protection schemes but has yet to receive a reply.
Challenges Ahead for Telegram
Telegram’s approach to child protection and content moderation raises significant concerns, especially considering its widespread use in countries like Russia, Ukraine, Iran, and other former Soviet states. The platform’s popularity in these regions, combined with its refusal to adhere to international child protection norms, puts it at the center of a global debate on online safety.
As Durov remains in custody, the future of Telegram and its policies towards child protection and cooperation with law enforcement remain uncertain. Critics argue that without substantial changes, the platform could face increasing pressure from governments and international bodies to improve its moderation practices and join key child protection programs.
For now, Telegram’s stance reflects a broader challenge in balancing user privacy with the need for increased online safety and accountability.