Telegram has announced that it will provide user data, including IP addresses and phone numbers, to authorities that present valid legal requests such as search warrants.
This significant change to its terms of service and privacy policy aims to “discourage criminals,” according to CEO Pavel Durov.
“While 99.999% of Telegram users have nothing to do with crime, the 0.001% involved in illicit activities create a bad image for the entire platform, putting the interests of our almost billion users at risk,” Durov stated in a Telegram post.
This decision marks a notable reversal for Durov, the platform’s Russian-born co-founder, who was detained by French authorities last month at an airport near Paris.
He faces charges of allegedly facilitating criminal activity on the platform, including the distribution of child abuse images and drug trafficking, as well as failing to comply with law enforcement requests. Durov has denied the charges, objecting to being held accountable for crimes committed by third parties using the platform.
Critics argue that Telegram has become a haven for misinformation, child sexual abuse material, and terrorism-related content, partly because it allows groups of up to 200,000 members. By contrast, Meta-owned WhatsApp caps group sizes at 1,000 members.
Recently, Telegram has been scrutinized for hosting far-right channels that have been linked to violence in English cities. Earlier this week, Ukraine banned the app on state-issued devices to minimize potential threats from Russia.
Durov’s arrest has ignited discussions about the future of free speech protections online. Following his detention, concerns arose regarding whether Telegram remains a safe haven for political dissidents. John Scott-Railton, a senior researcher at the University of Toronto’s Citizen Lab, noted that this latest policy shift has heightened alarm among various communities.
“Telegram’s marketing as a platform that would resist government demands attracted people that wanted to feel safe sharing their political views in places like Russia, Belarus, and the Middle East,” Scott-Railton explained. “Many are now scrutinizing Telegram’s announcement with a basic question in mind: does this mean the platform will start cooperating with authorities in repressive regimes?”
The company has not provided much clarity on how it will handle demands from such regimes moving forward. Cybersecurity experts have pointed out that Telegram, while having removed some problematic groups in the past, maintains a far weaker moderation system for extremist and illegal content compared to other social media platforms.
Before this policy change, Telegram shared information only about terror suspects, according to 404 Media. On Monday, Durov said the app now employs “a dedicated team of moderators” who use artificial intelligence to hide problematic content from search results. However, experts suggest that this may fall short of the legal requirements set out by French or European law.
“Anything that Telegram employees look at and can recognize with reasonable certainty is illegal, they should be removing entirely,” said Daphne Keller, from Stanford University’s Center for Internet and Society.
She emphasized that in certain jurisdictions, companies must also notify authorities about specific types of illegal content, such as child sexual abuse material.
Keller questioned whether the recent changes would sufficiently address law enforcement’s needs for information about investigation targets, including their communications and message content. “It sounds like a commitment that is likely less than what law enforcement wants,” she said.