authors: van der Donk, Berdien B E
title: Should Critique on Governmental Policy Regarding Covid-19 Be Tolerated on Online Platforms? An Analysis of Recent Case-Law in the Netherlands
date: 2022-02-15
journal: J Hum Rights Pract
DOI: 10.1093/jhuman/huab025

This policy and practice note describes and discusses two recent decisions by the District Court in Amsterdam regarding the applicability of YouTube's and Facebook's Community Guidelines on Covid-19 misinformation. The decisions (Café Weltschmerz/YouTube and Smart Exit/Facebook) illustrate the tense intersection between, on the one hand, the freedom to express criticism of the government's policy for fighting the outbreak of Covid-19 in the Netherlands and, on the other hand, the prevention of (dis)information with the potential to harm public health. The author will point out that the two decisions, although covering the same subject matter, differ significantly in their argumentation regarding the (scope of the) application of the freedom of expression. Analysing this divergence in argumentation will show that its roots can be traced back to a different valuation of the role of the online platforms in the dissemination of speech. A debate on this divergence is needed to prevent inconsistency in future decisions and to contribute to the broader discussion on content regulation in the European Union.

In 2015, the European Union initiated its plan to decrease the online dissemination of (harmful) disinformation. In line with this plan, on 10 June 2020, the Commission issued a joint communication on tackling Covid-19 disinformation (European Commission 2020). One of the aspects addressed in this communication is the need to understand the various aspects of disinformation, and specifically 'the need to clearly differentiate between the various forms of false or misleading content' (European Commission 2020). This differentiation is the main focus of the two preliminary relief decisions discussed in this note. On the one hand, there is criticism of the government's policy to fight the outbreak of Covid-19 in the Netherlands; on the other hand, there is a need to stop the spreading of disinformation with the potential to harm public health. The cases brought before the District Court in Amsterdam in September and October 2020 illustrate the fine line between criticism of governmental policy regarding Covid-19 and disinformation. The central question is who should decide whether content is in fact (harmful) disinformation or a welcome contribution to an ongoing debate.

First, the facts of the two decisions will be discussed. Although they cover the same subject matter and reach an identical conclusion, namely that the deletion of content was justified, the reasoning of the District Court in Amsterdam differs significantly. The author will therefore analyse this divergence in argumentation and will show that its root can be traced back to a different valuation of the role of the online platforms involved in the dissemination of the content. Lastly, the conclusion on the divergence in reasoning in these two decisions will be placed in the broader current discussion on disinformation during the Covid-19 pandemic.

The first ruling by the District Court in Amsterdam covers a dispute between two individuals and YouTube.
The case revolves around the deletion of two videos that were posted to the YouTube channel of the 'civil journalism platform' Café Weltschmerz. More specifically, the plaintiffs demanded that YouTube reinstate two interviews with a general practitioner, in which he describes the effectiveness of hydroxychloroquine (HCQ) as a treatment for Covid-19. According to YouTube, however, these videos are to be regarded as disinformation and breach its Community Guidelines policy on COVID-19 medical misinformation. The claim that HCQ is effective in treating Covid-19 goes against the statements of the World Health Organization and the RIVM (the Dutch local health authority). YouTube's policy explicitly states: 'YouTube doesn't allow content that spreads medical misinformation that contradicts local health authorities' or the World Health Organization's (WHO) medical information about COVID-19. This is limited to content that contradicts WHO or local health authorities' guidance on treatment, prevention, diagnostic, transmission' (YouTube Help 2020). In line with this policy, YouTube deleted the interviews.

The plaintiffs contest the validity of this action. In their view, the interviews are a contribution to a public debate on the treatment of Covid-19. To support this claim, they point to the fact that both the WHO and the RIVM regularly amend their guidelines on the matter, as there is no global consensus on the most effective treatment of Covid-19. By removing their videos, the plaintiffs argue, YouTube has breached its duty of care to ensure the freedom of expression of its users and third parties.

The Court follows this reasoning. It agrees with the plaintiffs that a strict application of YouTube's Community Guidelines on Covid-19, removing all content that is not in line with the WHO's or local health authorities' guidance, is too restrictive and would not be compatible with the horizontal application of the freedom of expression (Café Weltschmerz/YouTube, §4.10). The Court does not explain further how exactly the freedom of expression would apply horizontally in this dispute between two private parties. It reaches a conclusion that corresponds to the plaintiffs' argumentation set out above: for the benefit of the public debate in a democratic society, and given the role YouTube plays as one of the important online video platforms, it is not compatible with the right to freedom of expression to allow content that corresponds to the views of the WHO and the RIVM and not, on the contrary, to allow content criticising these views. In this way, the YouTube user, who may expect a broad range of diverse content, would only be able to hear the opinion of the group of experts advising the WHO and the RIVM, whilst the method of combating and treating Covid-19 is still under investigation worldwide and is far from certain. The WHO and the RIVM are also still updating their advice (Café Weltschmerz/YouTube, §4.10).

Thus, restricting all information that does not correspond to the view of the WHO or the local health authority is too restrictive. However, the Court continues and states that YouTube's guideline is based on the European Commission's guidance to prevent the spreading of disinformation regarding Covid-19. On that basis, YouTube is allowed to restrict access to incorrect, harmful and dangerous information. The Court therefore continues with an assessment of whether the contested videos can be qualified as such.
It concludes that 'a general practitioner who claims, without conclusive evidence and scientifically-based tests, that HCQ or an alternative drug (which can be obtained without a prescription) works, misinforms the public. Such information can be harmful and dangerous' (Café Weltschmerz/YouTube, §4.15). The Court underlines that public debate is necessary in a democratic society. However, the unnuanced way in which the general practitioner formulated his point of view in the interviews does not contribute to such a debate. Therefore, the removal of the two interviews promoting the use of HCQ for the treatment of Covid-19 is justifiable. As a last note, the Court clarifies that Café Weltschmerz's criticism of the government's policy regarding self-isolation and the 1.5-metre distancing requirement, as opposed to the claims regarding HCQ, was of added value to the debate and that such speech should therefore not be restricted, even though it might not be in line with the point of view of the WHO or the RIVM. This seems to hint that, according to the Court, the plaintiffs have a right to have these comments reinstated. Unfortunately, as the comments on self-isolation and the 1.5-metre distancing requirement were expressed in the same video as the HCQ comments, the Court merely concludes that the videos were legitimately removed (due to the HCQ comments).

A month later, on 13 October 2020, the District Court was asked to decide on a similar matter between Smart Exit, a self-proclaimed interest group for the hospitality industry, Viruswaarheid ('Virus truth'), a self-proclaimed interest group for democratic values, and Facebook. This case revolves around the deletion of two Facebook pages called 'No to 1.5-metre' and 'Viruswaanzin' ('Virus madness'), which were used to voice criticism of the Dutch government's policy on Covid-19. The plaintiffs demand the reinstatement of these two pages, through which they brought their message to the attention of the general public. Facebook's strict interpretation of its Covid-19 guidelines frustrates this possibility, which the plaintiffs claim is a limitation of their right to freedom of expression.

Facebook contends that the content was deleted justly. To limit the dissemination of incorrect information about Covid-19, Facebook cooperates with the WHO and with (so-called) independent fact-checkers. Facebook adds that it does not delete content solely because the information is not in line with the WHO or other local health authorities. Rather, information is removed when it could cause harm to its users. Whether the disputed pages were causing harm to users remains unclear, as the parties do not agree on the exact pages that have been removed (Smart Exit/Facebook, §4.1-§4.3). The Court therefore decides the case at a general level.

According to the Court, the general question to be answered in this dispute is whether the right to freedom of expression limits Facebook's application of its Covid-19 policy, and more specifically, whether the right to freedom of expression should be interpreted as guaranteeing everyone the opportunity to express his or her opinion without restriction, through any medium or platform. Following the reasoning in Appleby & Others v. United Kingdom, the Court answers this question in the negative (Smart Exit/Facebook, §4.20). It concludes that the freedom of expression does not create any horizontal obligations between private parties.
It is the State's responsibility to intervene in cases where the essence of the freedom of expression has been destroyed and further effective exercise of that freedom has been made impossible. The Court concludes that this is not the case here, as the plaintiffs had the opportunity to express their opinions elsewhere. The Court adds that, despite Facebook's global scale and influence, it does not have any obligation to host specific statements from users without a specific legal basis. Nor is State intervention required, as the essence of the plaintiffs' right has not been fully destroyed. In the eyes of the Court, the plaintiffs could have turned to newspapers or created their own website to disseminate their information. By contrast, the Court finds that Facebook has 'a social duty to comply with government guidelines, unless these are clearly inaccurate' (Smart Exit/Facebook, §4.23). According to the Court, as the public debate on Covid-19 is still ongoing, it cannot be concluded that the government's guidelines are clearly inaccurate. The Court does not substantiate the grounds for the existence of such a social duty to comply with government guidelines.

The Court concludes with a reflection on a possible indirect horizontal application of the freedom of expression. It finds that freedom of expression is not an absolute right and can be limited to protect the rights of other people and/or to protect public health. The Court values Facebook's fundamental right to property highly and, in order to protect this right, concludes that it is up to Facebook to set and enforce the rules that apply to its platform. This includes a policy on information regarding Covid-19 (Smart Exit/Facebook, §4.23). Moreover, because Facebook's guidelines are an extension of governmental policy to protect public health, the Court concludes that the limitation of the right to freedom of expression through the strict application of Facebook's Covid-19 policy was justified.

The reasoning of the Court in these two cases illustrates two opposing points of view. On the one hand, in Café Weltschmerz v. YouTube, the Court concludes that blindly following the guidelines of the WHO and local health authorities would not contribute to a public debate and should be avoided. As long as there is no consensus on the treatment of Covid-19, a nuanced stream of information from both sides is necessary to discuss potential measures and limitations on society. The Court specifically highlights the global scale of the platform as an argument to broaden the scope of protection of the right to freedom of expression. Only information that has been proven to be wrong and/or harmful should be restricted. In Smart Exit v. Facebook, the Court takes the exact opposite approach: as long as the situation is unclear, the government's policy on Covid-19 shall be followed, unless that policy is evidently wrong. The size of the platform is not of importance. Rather, these platforms have a social duty to follow government guidelines, unless these guidelines have been proven wrong.

Accelerated by a global pandemic, the question of who should set the rules on online platforms has gained increased importance. In times of unprecedented crises, access to information is of great importance in order to move forward towards a potential solution. Restricting access to information, especially when targeting only 'one side' of the debate, should therefore be carried out cautiously.
As pointed out by Carver, much of the public discussion covers the necessity of restrictions in order to try to curb the pandemic (Carver 2020). However, do the platforms' new guidelines on Covid-19 allow for such a public discussion? The analysis of these cases highlights an uncertainty as to what public discussion is allowed to take place online. The line set out in Smart Exit v. Facebook points in the direction of a de facto zero-tolerance approach to criticism of governmental policy. In the current climate of increased dissatisfaction with Covid-19 measures, removing the opportunity to express critique and restricting the chance to publicly call the government's policy into question could make matters worse, as it may reinforce the existing perception that human rights protection is only available to certain groups in society (Seyhan 2020). In addition, in the hands of oppressive governments, such a duty to follow government guidelines can easily be twisted into an instrument that substantially decreases the protection of free speech.

From a legal point of view, the two decisions also indicate an interesting fork in the road regarding the balancing of the different fundamental rights at stake. In short, these two disputes can be boiled down to a single dilemma: can an online platform be forced to carry (legal) information, even though such information goes against its own user terms? According to the conclusion in Smart Exit v. Facebook, the question should certainly be answered in the negative. Without a legal obligation to carry certain content, the platform has no obligation to provide for a two-sided debate. The fact that the platform carries out government instructions was treated as a subordinate argument: already on the sole basis of its fundamental right to property, Facebook is supposedly free to design its user terms, regardless of the consequences for the public debate. This means that, despite the fact that there is no worldwide consensus on the treatment of Covid-19, an online social media platform can push the development of the public debate in a certain direction by prohibiting information that diverges from current governmental policy.

In line with the Court's reasoning in Smart Exit v. Facebook, the remedies of human rights law would only apply if all other possibilities to express an opinion were restricted. According to this reasoning, if one has the possibility to express an opinion in any other way (the Court refers to the ability to 'set up a website' or 'contact the traditional media' as alternative ways to express speech), there is no problem. However, Facebook and other social media platforms are increasingly used to access news. In 2020, a worldwide average of 31 per cent of adults used these sources for local news and information, and in the Netherlands, 28 per cent of adults reported using Facebook as a source for news (Reuters Institute 2020). Simply referring to the ability to 'set up a website' or 'contact the traditional media' as alternative ways to express speech therefore disregards the significant role these online platforms now play in people's lives.

When access to opinions that differ from the majority view in society is restricted, progress in controversial debates quickly stagnates. A change in society often takes off because a small number of people disagree with the status quo and challenge it. Perhaps a return to the reasoning in Café Weltschmerz v.
YouTube, where a more nuanced conclusion was reached, would be desirable. Whereas harmful disinformation was restricted on the basis of the legislative instruction of the European Union, strict enforcement that restricts any content opposing the WHO's or local health authorities' view, as YouTube's Covid-19 guidelines prescribed, is unreasonable in the Court's eyes. By so finding, the Court opened up a debate on the freedom of online platforms to draft and enforce their house rules as they please, especially in times of a pandemic. Perhaps, as Benesch recently proposed, the time has come to apply human rights norms to online platforms in order to give users a basis to hold online companies accountable for unreasonable user terms (Benesch 2020). In any event, in the absence of consensus regarding the scope of online platforms' freedom to draft and enforce user terms, it remains unclear to what extent these platforms are free to decide how much critique to tolerate.

Berdien van der Donk, LL.M, is a PhD scholar at the University of Copenhagen. In her thesis, she explores the freedom of online platforms to draft their user terms and to restrict access to content under European law.

References

Benesch. 2020. But Facebook's Not a Country: How to Interpret Human Rights Law for Social Media Companies.

Carver. 2020. Forum: Human Rights Practice in the Age of Pandemic.

European Commission. 2020. Joint Communication to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions: Tackling COVID-19 disinformation – Getting the facts right.

Reuters Institute. 2020. Executive Summary and Key Findings of the Report.

Seyhan. 2020. Pandemic Powers: Why Human Rights Organizations Should Not Lose Focus on Civil and Political Rights.

YouTube Help. 2020. COVID-19 Medical Misinformation Policy.