Telegram, the controversial messaging app, has announced that it will finally cooperate with the Internet Watch Foundation (IWF), a globally recognized organization dedicated to tackling the spread of child sexual abuse material (CSAM) online. The move marks a departure from the app's previous stance: for years it resisted joining child protection schemes despite mounting pressure from authorities and advocacy groups.
Telegram, with nearly 950 million users worldwide, has long positioned itself as a platform that prioritizes user privacy over moderation. However, its failure to effectively monitor and remove harmful content led to concerns about the app being used for illegal activity, including the distribution of CSAM, drug dealing, and cybercrime. The app's security features, particularly its end-to-end encryption, were designed to shield user communications from third-party surveillance, but they also made it difficult for law enforcement to intervene in criminal activity conducted on the platform. Some critics went as far as calling Telegram "the dark web in your pocket."
Telegram’s decision to collaborate with the IWF comes after years of resistance. For a long time, Telegram’s founder, Pavel Durov, and his company refused to engage with the IWF or similar organizations. This changed in the wake of a significant event: Durov’s arrest in Paris in August 2024, stemming from accusations of failing to cooperate with law enforcement over illegal content on the platform. The arrest, which involved allegations related to CSAM, drug trafficking, and fraud, underscored the growing international pressure on Telegram to change its approach to content moderation.
Following his arrest, Durov vowed to repair Telegram's reputation and overhaul its moderation policies. In a statement, he committed to transforming the platform's content moderation system "from an area of criticism into one of praise." This includes providing more transparency in how the app handles illicit content and cooperating more actively with authorities.
The partnership with the IWF is seen as a critical first step in this reform process. The IWF, known for identifying and removing CSAM through its continually updated list of known abuse imagery, will now work with Telegram to detect and block such content on the app. While Telegram had previously removed hundreds of thousands of pieces of CSAM each month using its own internal systems, the IWF's tools will help strengthen these efforts, providing more effective detection and prevention.
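To make the mechanism concrete, the sketch below shows the general idea of checking uploads against a list of fingerprints of previously confirmed abuse imagery. It is not Telegram's or the IWF's actual implementation: production systems typically rely on perceptual hashes that survive resizing and re-encoding, whereas this toy example uses a plain SHA-256 digest, and every name and value in it is invented for illustration.

```python
import hashlib

# Hypothetical set of fingerprints of previously confirmed abuse imagery.
# In a real deployment this would be a large, regularly refreshed list
# supplied by a body such as the IWF, and the fingerprints would be
# perceptual hashes so that trivially altered copies still match.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest for an uploaded file (toy stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Block the upload if its fingerprint appears on the known-abuse list."""
    return fingerprint(upload) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    # Example: an upload whose digest is not on the list passes through.
    sample = b"harmless example payload"
    print("blocked" if should_block(sample) else "allowed")
```

The design point this illustrates is that matching against a shared list of fingerprints lets a platform block known material without anyone redistributing the imagery itself; only the hashes change hands.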
The IWF has described Telegram’s decision as “transformational” but also cautioned that it represents only the beginning of a much longer journey for the platform. Derek Ray-Hill, Interim CEO of the IWF, emphasized that Telegram’s engagement with the foundation would enable the use of advanced tools to ensure that CSAM cannot be shared on the service. However, he also noted that Telegram must continue to make substantial improvements to its content moderation practices.
Despite this progress, many observers remain skeptical about Telegram's commitment to reform. The platform's reputation for prioritizing privacy over accountability still raises questions about how effective its moderation will be. Notably, Telegram applies end-to-end encryption only in certain private chats; the majority of messages are protected by standard client-server encryption, which leaves them more susceptible to interception and hacking but also means the company has the technical ability to scan them, making moderation largely a question of will rather than capability.
Telegram's decision to join the IWF comes as part of a broader set of changes to its operations. These include handing over the IP addresses and phone numbers of rule violators to law enforcement in response to valid legal requests, disabling features like "people nearby" that were exploited by scammers and bots, and publishing regular transparency reports about content removals. The changes reflect a shift towards greater accountability, an area where the platform had previously drawn heavy criticism.
The pressure on Telegram to improve its content moderation policies is far from over. While the partnership with the IWF is a positive development, the platform’s long-standing refusal to engage with child safety initiatives and its minimal cooperation with law enforcement mean that much more work lies ahead. It remains to be seen how Telegram’s new approach will balance user privacy with the need to protect vulnerable individuals from exploitation on its platform.