Russia-born tech tycoon Pavel Durov, the founder of Telegram, was arrested in Paris on August 24, 2024. French authorities announced that Mr. Durov is under investigation for a litany of serious crimes, including enabling the distribution of child sexual abuse material on the app, facilitating drug trafficking, and refusing to cooperate with law enforcement. Should digital platform owners be held liable for user-generated content? Pranesh Prakash and Rohit Kumar discuss the question in a conversation moderated by Aaratrika Bhaumik. Edited excerpts:

Does Telegram’s lax content moderation policy and purported reluctance to cooperate with law enforcement agencies justify the indictment of its founder?

Pranesh Prakash: It remains unclear to what extent Telegram’s content moderation policies are lax or whether there was any genuine reluctance to cooperate with law enforcement agencies. In 2022, Germany reported that Telegram had complied with a request to remove 64 channels that potentially breached German hate speech laws. In this particular case, beyond the list of charges, the specific facts and circumstances remain ambiguous. However, I will say that except in instances where there is personal complicity or direct involvement, the founder of a messaging platform should not incur any criminal liability for the acts of the platform’s users.

Rohit Kumar: While it is understandable that Telegram aims to foster free speech, it is crucial to acknowledge the real-world harms associated with unregulated messaging platforms. Ideally, directors and founders should not be held personally liable. However, if there is clear evidence of direct complicity or knowledge, criminal liability may be imposed. Nonetheless, the threshold for such liability is generally set very high, necessitating substantial evidence.

From a policy standpoint, to what extent should social media intermediaries be held accountable for the content they host?

Rohit Kumar: A well-established principle is that of safe harbour, which stipulates that a platform should not be held liable for user-generated content as it merely acts as an intermediary. Privacy must also be preserved, meaning platforms should avoid excessive monitoring or interception of user communications. For instance, when misinformation spread on WhatsApp during elections in India, the platform limited the ability to forward messages to multiple groups simultaneously and reduced group sizes. Additionally, platforms should have compliance officers or designated representatives to cooperate with law enforcement, provided that due process is followed. Ensuring such measures and establishing clear procedural protocols should be a key focus for messaging platforms.

Pranesh Prakash: In the case of fully end-to-end encrypted platforms, their ability to view reported messages and take action is inherently limited. Additionally, platforms that are designed to record minimal metadata, or none at all, face significant constraints in cooperating with law enforcement agencies regarding user data. Under EU (European Union) law, there is a clear prohibition against requiring platforms to monitor or spy on their users. When it comes to Telegram, while it upholds the confidentiality of private one-on-one and group chats and does not allow enforcement actions on these communications, it does permit scrutiny of content on public channels.

Could even liberal democracies increasingly push for stricter content moderation from these platforms? Does the Digital Services Act (DSA), the EU’s latest attempt to regulate big-tech excesses, which became fully applicable in 2024, signal a broader shift in this direction?

Pranesh Prakash: I don’t believe it does. For instance, in 2000, a French court ordered Yahoo! Inc. to block French users from accessing Nazi memorabilia auctions on its U.S.-based website — an instance of direct content regulation by a court in a liberal democracy. This demonstrates that content regulation is not a recent development. What has changed, however, is that many who once staunchly defended free speech now seem to prioritise the perceived harms of ‘disinformation’ over the need for freedom of expression. This shift represents complicity in the over-regulation of free speech, which to an extent the DSA also embodies.

Rohit Kumar: The key difference between the past and present lies in the accelerated pace at which disinformation spreads. This is not merely a conflict between the desire to protect free speech and the need to manage disinformation; it transcends simple political narratives. As instances of misuse and real-world harm escalate, the argument for stricter oversight becomes more compelling. For instance, the decision by Twitter (now X) to de-platform Donald Trump after the 2020 U.S. presidential election was made by the platform itself. But should platforms have the power to determine who has a voice and who doesn’t? We need greater procedural clarity on how these decisions are made, who makes them, where liability lies, and when government intervention is appropriate.

Could Telegram’s laissez-faire approach to content moderation jeopardise its safe harbour protection under the Information Technology (IT) Act, 2000, in India?

Pranesh Prakash: Telegram does not comply with certain provisions of the 2023 IT Rules, which mandate specific terms of service for entities operating in India. In fact, only a handful of global companies adhere to these regulations, primarily those with a physical presence in India. A parallel can also be drawn with the situation in France. One of the key charges levelled by French authorities against Telegram is that it provided cryptology services aimed at ensuring confidentiality without a licence. This regulation, however, is outdated and applied inconsistently. For instance, anonymity networks like Tor have not come under scrutiny in France. A similar pattern of selective enforcement could potentially arise in India should the government decide to target Telegram under the IT Act.

Rohit Kumar: The Ministry of Electronics and Information Technology has announced that it is investigating Telegram over concerns that it is being used for illegal activities such as extortion and gambling. Additionally, some of the requirements under the 2023 IT Rules, such as submitting transparency reports and designating a compliance officer, are quite extensive. Although the Indian government has maintained that Telegram is compliant with these regulations, I agree with Pranesh that there is always a risk of selective prosecution.

Could the threat of personal liability push tech executives to reassess the risks of unregulated content?

Rohit Kumar: In the Indian context, the threat of personal liability has been wielded multiple times. For example, the former IT Minister had issued warnings to X for non-compliance with the new IT Rules. Such high-profile arrests of senior executives inevitably provoke anxiety among social media companies, particularly if they occur in liberal democracies. Nonetheless, there is broad consensus among stakeholders in India that personal liability for regulatory violations should not be imposed. Instead, it may be more effective to impose higher penalties for repeated offences or consider banning persistently non-compliant entities.

Pranesh Prakash: It definitely will. However, it should also prompt countries to reconsider their approach. One potential consequence is that more messaging platforms might adopt end-to-end encryption and minimise metadata storage, leaving themselves with little to hand over to law enforcement. This kind of wilful blindness is likely to spread more rapidly if founders face personal liability for user-generated content.

Do you think this is likely to be an isolated incident or become the norm?

Pranesh Prakash: As moral panic over disinformation grows, we can expect to see more such arrests. However, in Durov’s case, French prosecutors have cited specific crimes rather than speech-related offences. We will also witness increased censorship and more bans on apps like Telegram and WhatsApp. Telegram is already banned in over a dozen countries.

Rohit Kumar: Social media intermediaries will likely reassess their systems and procedures more carefully. This could lead to greater adoption of encryption, which platforms are already promoting as a marketing tactic. Additionally, major platforms may rush to negotiate safeguards with various governments to prevent misuse of power by both parties. This issue has evolved beyond merely free speech to encompass questions of sovereignty.

Listen to the conversation in The Hindu Parley podcast

Pranesh Prakash is co-founder and former policy director at the Centre for Internet and Society; Rohit Kumar is founding partner of The Quantum Hub