“Your Users, Your Liability” - How the ECtHR Established Liability for Hate Speech on Online News Platforms 

By: Paul Weber, Junior Research Associate, PILPG-NL

Online platforms have become one of the most important ways in which we communicate today.  The European Court of Human Rights (the Court, ECtHR) has recognized that the internet “provides an unprecedented platform for the exercise of freedom of expression”.  Yet, hatred and incitement to violence are common on these fora.  In many cases, the authors of such comments hide behind the anonymity the internet provides them.  Victims therefore often find themselves unable to hold the authors directly accountable for their comments.  The ECtHR has addressed this issue by finding that online news platforms can be held liable if they fail to properly manage hate speech in their users’ comments.  This blog post outlines how the Court established the liability of online platforms for the comments of their users.

The ECtHR’s case law on the liability of online platforms for unlawful user comments began with Delfi AS v. Estonia.  The case concerned the company Delfi AS, which operates the news platform Delfi.  Delfi AS was held liable for failing to remove personally insulting and threatening user comments posted under one of its news articles.  The article concerned the business practices of a local company.  Delfi AS removed the comments only after the affected party demanded their removal and compensation, several weeks after publication.  In the domestic proceedings, the Estonian courts held that the comments violated the affected party’s personality rights and were thus not protected by freedom of speech.  They imposed legal liability on Delfi AS because it had failed to provide a system for the quick removal of hateful comments.  Delfi AS subsequently complained before the ECtHR that this ruling violated its freedom of expression under Article 10 of the European Convention on Human Rights (ECHR).

The “Delfi Criteria”

In its judgment, the ECtHR developed four criteria to evaluate a platform’s liability for user comments (paras. 142-143): the context of the comments, the steps the company took to prevent or remove unlawful comments, the alternative of holding the actual authors of the comments accountable instead, and, lastly, the consequences of the domestic ruling for the company.  A closer look at how the Court applied these criteria in Delfi AS helps to understand what it meant by each of them.

First, in examining the context of the comments, the ECtHR found that the platform was not a “passive, purely technical service provider” (para. 146).  The actual authors could neither edit nor delete their comments once published.  Delfi AS thus had ultimate control over the comments and profited economically from them.

Second, the Court found that Delfi AS employed an automatic word-based filter to delete comments containing certain hateful terms and that, on several occasions, administrators had deleted hateful comments on their own initiative.  In the present case, however, these measures had failed to ensure that comments amounting to hate speech were removed without delay and without the victim having to notify the platform.  The ECtHR therefore found that Delfi AS’ response to the comments was insufficient.

Third, the Court evaluated whether holding the actual authors liable for their comments might be an alternative approach.  The Delfi news platform allowed its users to comment anonymously.  In the eyes of the ECtHR, this anonymity stood in the way of redress for the victims, so holding the authors liable could not serve as an alternative in this case.

Fourth, the Court did not consider the consequences of the domestic proceedings overly harsh for Delfi AS.  The company only had to pay a modest sum in damages and was not substantially hindered in its operation, according to the ECtHR.  For these reasons, the Court held that the Estonian ruling did not violate Delfi AS’ freedom of expression under Article 10.

With these four criteria, the ECtHR laid the groundwork for its jurisprudence on online news platforms.  Their liability arises because they “provide for economic purposes a platform for user-generated comments on previously published content” (para. 116).  The Court made clear that this case law does not extend to social media platforms and blogs, which do not provide content of their own and where the individuals providing the actual content may do so as a hobby.  In the eyes of the ECtHR, such platforms therefore do not have the same responsibilities as news platforms.

The Legacy of Delfi AS

Less than a year after the Delfi AS judgment, the ECtHR affirmed and refined its criteria for online platform liability in MTE and Index.hu v. Hungary.  Here, the Court additionally evaluated the effect that the comments in question had on the persons they targeted.  With these refined criteria, the Court has since evaluated several more cases.  This new case law, however, is far from uncontroversial.

The first criticism came with the Delfi AS judgment itself, in which Judges Sajó and Tsotsoria issued a joint dissenting opinion.  The two judges particularly questioned the focus on professional news platforms, remarking that “Freedom of expression cannot be a matter of a hobby” (Diss. Op. para. 9).  Other commentators, such as Lorna Woods, criticized the implicit rejection of a notice-and-take-down system as sufficient, since the Court had found that platforms may have to act on their own initiative.  However, as Neville Cox observed, Delfi and the case law that followed also provided “a bulwark against the enhanced possibilities for the exercise of freedom of expression de facto provided by the internet”.

Conclusion

The Delfi criteria, for the first time, allowed victims of online hate speech to hold those who profit economically from it, namely the platform providers, responsible for the wrongs they endured.  In doing so, the Court placed greater limits on freedom of speech online.  At the same time, however, it addressed the problem that victims are often left without redress because of the anonymity the internet affords authors of hate speech.  Several Council of Europe member states now place greater responsibility on online service providers.  A notable example is Germany’s Network Enforcement Act (NetzDG), which increases the duties of social media platforms.  The Delfi case law of the ECtHR may thus be only the first of further interesting developments in the jurisprudence on hate speech online.