- Telegram has expanded its moderation policy to include private and group chats, allowing users to flag illegal content.
- CEO Pavel Durov emphasized his commitment to preventing abuse on the platform, especially following legal pressures and charges related to Telegram’s handling of illegal activities.
Telegram has quietly expanded its moderation policies to cover private and group chats, allowing users to flag illegal content in areas that were previously off-limits to moderation. The change, a significant shift in Telegram’s approach to managing harmful activity, was announced on Thursday alongside a statement from CEO Pavel Durov, who reaffirmed his commitment to preventing abuse on the platform:
I'm still trying to understand what happened in France. But we hear the concerns. I made it my personal goal to prevent abusers of Telegram's platform from interfering with the future of our 950+ million users.
My full post below. https://t.co/cDvRSodjst
— Pavel Durov (@durov) September 5, 2024
The change comes amid increasing scrutiny and legal pressure. Just weeks ago, French authorities detained Durov near Paris for questioning over Telegram’s role in facilitating illegal activities. Although he was later released, Durov was charged with complicity in crimes allegedly carried out on the platform, a serious accusation carrying penalties of up to 10 years in prison and a €500,000 fine. Durov criticized the charges, arguing that using outdated laws to hold platform CEOs accountable for the actions of their users is fundamentally flawed.
Private and Group Chats No Longer Untouchable
Telegram’s reputation as a bastion of privacy has been built on its encrypted messaging system and the assurance that private chats and group discussions remain outside the platform’s moderation scope. With over 950 million users globally, Telegram has become a crucial communication platform, including among financial services firms. However, this popularity has also attracted scammers, impersonators, and other malicious actors who exploit the platform’s privacy features to operate with impunity.
Historically, Telegram’s response to user reports of illegal activity has been inconsistent, drawing criticism for the platform’s opacity in handling such cases. Recently, FXStreet’s Co-CEO, Pere Monguió, described his difficulties reporting impersonators posing as his company’s staff on Telegram, highlighting the platform’s inadequate response mechanisms.
A survey conducted jointly by Finance Magnates and FXStreet found that 60.09% of traders targeted by scams on Telegram lost money, the highest rate among the platforms surveyed. This data underscores the urgency of strengthening Telegram’s content moderation to protect its users from abuse.
In response to these growing concerns, Telegram has updated its policies, removing the stipulation that private and group chats were beyond the reach of moderation. The shift is part of Durov’s broader effort to combat the rise of criminal activity on the platform. He noted:
“We take down millions of harmful posts and channels every day. We publish daily transparency reports and have direct hotlines with NGOs to process urgent moderation requests faster.”
However, Durov acknowledged that Telegram’s rapid growth to more than 950 million users has created new openings that criminals have exploited, intensifying calls for the platform to act more decisively against abuse. He said the internal work of improving Telegram’s moderation capabilities has already begun, with further updates on the platform’s progress to be shared soon.