Russia-born tech tycoon Pavel Durov, the founder of Telegram, was arrested in Paris on August 24, 2024. French authorities announced that Mr. Durov was under investigation for a litany of serious crimes, including enabling the distribution of child sexual abuse material on the app, facilitating drug trafficking, and refusing to cooperate with law enforcement. Should digital platform owners be held liable for user-generated content? Pranesh Prakash and Rohit Kumar discuss the question in a conversation moderated by Aaratrika Bhaumik. Edited excerpts:
Do Telegram’s lax content moderation policy and purported reluctance to cooperate with law enforcement agencies justify the indictment of its founder?
Rohit Kumar: While it is understandable that Telegram aims to foster free speech, it is crucial to acknowledge the real-world harms associated with unregulated messaging platforms. Ideally, directors and founders should not be held personally liable. However, if there is clear evidence of direct complicity or knowledge of wrongdoing, criminal liability may be imposed. That said, the threshold for such liability is generally set very high and demands substantial evidence.
From a policy standpoint, to what extent should social media intermediaries be held accountable for the content they host?
Pranesh Prakash: Fully end-to-end encrypted platforms are inherently limited in their ability to view reported messages and act on them. Similarly, platforms designed to record minimal metadata, or none at all, face significant constraints in cooperating with law enforcement requests for user data. Under European Union (EU) law, there is a clear prohibition against requiring platforms to monitor or spy on their users. Telegram, for its part, upholds the confidentiality of private one-on-one and group chats and does not take enforcement action against these communications, but it does permit scrutiny of content on public channels.
Could even liberal democracies increasingly push for stricter content moderation from these platforms? Does the EU’s Digital Services Act (DSA), the bloc’s latest attempt to rein in big-tech excesses, which became fully applicable in 2024, signal a broader shift in this direction?
Rohit Kumar: The key difference between the past and present lies in the accelerated pace at which disinformation spreads. This is not merely a conflict between the desire to protect free speech and the need to manage disinformation; it transcends simple political narratives. As instances of misuse and real-world harm escalate, the argument for stricter oversight becomes more compelling. For instance, the decision of Twitter (now X) to de-platform Donald Trump in the aftermath of the 2020 U.S. presidential election was made by the platform itself. But should platforms have the power to determine who has a voice and who doesn’t? We need greater procedural clarity on how these decisions are made, who makes them, where liability lies, and when government intervention is appropriate.
Could Telegram’s laissez-faire approach to content moderation jeopardise its safe harbour protection under the Information Technology (IT) Act, 2000, in India?
Rohit Kumar: The Ministry of Electronics and Information Technology has announced that it is investigating Telegram over concerns that it is being used for illegal activities such as extortion and gambling. Additionally, some of the requirements under the IT Rules, 2021 (as amended in 2023), such as submitting transparency reports and designating a compliance officer, are quite extensive. Although the Indian government has maintained that Telegram is compliant with these regulations, I agree with Pranesh that there is always a risk of selective prosecution.
Could the threat of personal liability push tech executives to reassess the risks of unregulated content?
Pranesh Prakash: It definitely will. However, it should also prompt countries to reconsider their approach. One potential consequence is that more messaging platforms might adopt end-to-end encryption and minimise metadata storage so that they are simply unable to assist law enforcement. This kind of wilful blindness is likely to spread more rapidly if founders face personal liability for user-generated content.
Do you think this is likely to be an isolated incident or become the norm?
Rohit Kumar: Social media intermediaries will likely reassess their systems and procedures more carefully. This could lead to greater adoption of encryption, which platforms are already promoting as a marketing tactic. Additionally, major platforms may rush to negotiate safeguards with various governments to guard against the misuse of power on either side. This issue has evolved beyond free speech alone to encompass questions of sovereignty.
Pranesh Prakash is co-founder and former policy director at the Centre for Internet and Society; Rohit Kumar is founding partner of The Quantum Hub
