The shift in policy marks a significant change for Telegram, known for its strong privacy protections
Critics have raised concerns about the platform's role in hosting harmful content, including misinformation and extremist material
Telegram has announced it will begin handing over users' IP addresses and phone numbers to authorities who present search warrants or other valid legal requests.
The change, part of an update to its privacy policy, was revealed on Monday by CEO Pavel Durov, who said the new measures "should discourage criminals" from using the app.
Durov emphasized that the change would only affect a small percentage of users, claiming that "99.999%" of Telegram’s nearly billion-strong user base has no involvement in criminal activity.
However, he warned that the "0.001%" of users engaging in illicit activities were tarnishing the platform’s reputation and putting its broader user base at risk.
This announcement represents a significant shift for Durov, a Russian-born entrepreneur who co-founded Telegram and has long promoted the platform’s strong privacy protections.
Durov himself was recently arrested by French authorities at an airport north of Paris. Following his arrest, French prosecutors charged him with enabling criminal activity on Telegram, including the spread of child abuse images and drug trafficking. He was also accused of failing to comply with law enforcement.
Durov has denied the charges, calling them "surprising" and "misguided". In a statement following his detention, he lashed out at authorities, arguing that holding him responsible for crimes committed by third parties on the platform was unreasonable.
Critics have long argued that Telegram’s privacy policies and lax moderation have allowed it to become a hub for misinformation, child exploitation, and extremist content. The app’s support for groups of up to 200,000 members has also been singled out as a factor enabling the spread of harmful material. By comparison, Meta-owned WhatsApp limits group sizes to 1,000 users.
(Photo: Telegram founder and CEO Pavel Durov delivers a keynote speech at the Mobile World Congress in Barcelona, Spain, February 23, 2016. Reuters)
Last month, Telegram faced scrutiny after it was found to be hosting far-right channels that contributed to violent incidents in cities across England.
In a related development, Ukraine recently banned the app from state-issued devices, citing national security concerns over Russian influence.
The arrest of Durov, 39, has reignited debates over free speech and privacy online. John Scott-Railton, a senior researcher at the University of Toronto’s Citizen Lab, said Durov’s detention and the subsequent policy shift have raised new concerns about the safety of Telegram as a platform for political dissidents, particularly in repressive regimes.
"Telegram’s marketing as a platform that would resist government demands attracted people that wanted to feel safe sharing their political views in places like Russia, Belarus, and the Middle East," said Scott-Railton. "Many are now scrutinizing Telegram's announcement with a basic question in mind: Does this mean the platform will start cooperating with authorities in repressive regimes?"
Telegram has not provided specifics on how it will handle legal demands from governments with poor human rights records, further fueling anxiety among users in these regions.
Prior to the change in its privacy policy, Telegram only provided user data in cases involving terrorism suspects, according to 404 Media.
On Monday, Durov said that Telegram now uses "a dedicated team of moderators" aided by artificial intelligence to conceal problematic content from search results. However, experts have questioned whether these measures will be enough to satisfy European authorities.
Daphne Keller, an expert at Stanford University's Center for Internet and Society, said that while Telegram’s use of AI to make harmful content harder to find is a positive step, it likely won't meet the full legal requirements in jurisdictions such as France and the broader European Union.
“Anything that Telegram employees look at and can recognize with reasonable certainty is illegal, they should be removing entirely,” Keller said. She also pointed out that in some countries, companies are required to notify authorities about certain types of seriously illegal content, such as child sexual abuse material.
Keller added that the updated privacy policy may not go far enough to meet the demands of law enforcement, particularly regarding investigations that require access to user communications.
"It sounds like a commitment that is likely less than what law enforcement wants," Keller said, referring to the policy’s provisions for sharing user data like IP addresses and phone numbers.