In the weeks since Trump-sympathizing rioters laid siege to the U.S. Capitol, platforms have scrambled to account for the ways right-wing zealots used their communication networks to strategize, coordinate, or otherwise inflame tensions related to the deadly insurrection. Apparently, Telegram has been no exception: On Monday, the app’s CEO and founder, Pavel Durov, said that the platform shut down “hundreds of public calls for violence” on U.S. channels last week.
In a blog post published on Monday, Durov said that although US users represent less than 2% of Telegram’s base, the platform’s moderation team had received “an increased number of reports about US-related public activity” in the days leading up to and after the attack on the Capitol. In response, he said, the team “acted decisively by clamping down on US channels that advocated violence.”
“Telegram welcomes peaceful debate and protest, but our Terms of Service explicitly prohibit distributing public calls to violence,” Durov wrote. “Civil movements all over the world rely on Telegram in order to stand up for human rights without resorting to inflicting harm.”
The attack on the Capitol has forced platforms with long histories of failures in moderating extremist content to finally confront the violent threats and discourse they have allowed to proliferate. On January 13, the walkie-talkie app Zello announced in a press release that it was with “deep sadness and anger” that its leadership team had discovered “evidence of Zello being misused by some individuals while storming the United States Capitol building last week.” And on January 10, Parler, the social platform that bills itself as a safe haven for free speech and radical discourse, was unceremoniously booted off its web-hosting provider, Amazon Web Services, after repeatedly declining to enact stricter content moderation policies.
Although Durov acknowledged that Telegram had removed hundreds of public calls for violence on the grounds that they violated the platform’s terms of service, his blog post notably failed to mention how content on an app built around encryption technology, designed to shield communications from prying eyes, would be vetted by moderators in the future.