Quantifying the Reach and Belief in COVID-19 Misinformation
Sophie Nightingale, Marc Faddoul, Hany Farid
2020-06-15

The global COVID-19 pandemic has led to a startling rise in social-media-fueled misinformation and conspiracies, with dangerous outcomes for our society and health. We quantify the reach of and belief in COVID-19 related misinformation, revealing a troubling breadth and depth of misinformation that is highly partisan.

The COVID-19 global pandemic has been an ideal breeding ground for online misinformation: social-media traffic has reached an all-time record [1] as people are forced to remain at home, often idle, anxious, and hungry for information [2], while at the same time social-media services are unable to rely on human moderators to enforce their rules [3]. The resulting spike in COVID-19 related misinformation is of grave concern to health professionals [4]. The World Health Organization has listed the need for surveys and qualitative research about the infodemic among its top priorities for containing the pandemic [5]. A recent survey confirmed that belief in COVID-19 conspiracy theories is associated with lower compliance with public health directives [6]. Another recent study found that political affiliation is a strong predictor of knowledge of COVID-19 related information [7]. Building on this earlier work, we launched a large-scale U.S.-based study to examine belief in 20 prevalent COVID-19 related false statements and 20 corresponding true statements. We evaluate the reach of and belief in these statements and correlate the results with political leaning and primary source of media consumption.

A total of 611 participants were recruited from U.S.-based Mechanical Turk workers. Participants were instructed that they would take part in a study to evaluate the reach of and belief in COVID-19 related misinformation. They were asked to read, one at a time, 40 statements (Table 1), half of which were true and half false, and to specify: (1) whether they had seen or heard the statement before; (2) whether they believed the statement to be true; and (3) whether they knew someone who believes, or is likely to believe, the statement. The 40 statements were sourced from reputable fact-checking websites (e.g., snopes.com/fact-check and reuters.com/fact-check). To ensure a balanced design, each false statement was matched with a similarly themed true statement. The 40 statements plus three attention-check questions (Table 1) were presented in a random order. At the end of the survey, participants were asked how they consume news, their political leaning, and basic demographics: education level, age, gender, and race. All responses were collected between April 11 and April 21, 2020, amidst the global COVID-19 crisis.

The three obviously false attention-check questions were used to ensure that participants were paying attention to the survey. A participant's data were discarded if they failed to correctly answer any of these attention-check questions: 111 of the 611 responses were discarded, yielding 500 usable responses. Participants were paid $2.00 for their participation. At the end of the study, participants were again informed that half of the statements they had read were not true, asked to confirm that they understood this, and were directed to several websites with accurate health information.
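For readers who want to reproduce this kind of exclusion and tabulation on their own survey data, the following is a minimal sketch, not the authors' analysis code. It assumes a hypothetical long-format file responses.csv with one row per participant-statement pair and hypothetical columns participant_id, is_attention_check, passed, is_true, seen, and believed.

```python
# Hypothetical sketch of the attention-check exclusion and summary statistics
# described above; file name and column names are assumptions, not the
# authors' actual data layout.
import pandas as pd

df = pd.read_csv("responses.csv")  # one row per participant-statement pair

# Discard every participant who failed any of the three attention checks.
checks = df[df["is_attention_check"]]
failed = checks.loc[~checks["passed"], "participant_id"].unique()
usable = df[~df["participant_id"].isin(failed) & ~df["is_attention_check"]]

# Percent of true/false statements that reached participants, and the percent
# believed among those that reached them.
reach = usable.groupby("is_true")["seen"].mean() * 100
belief_given_reach = usable[usable["seen"]].groupby("is_true")["believed"].mean() * 100
print(reach, belief_given_reach, sep="\n")
```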
On average, 55.7%/29.8% of true/false statements reached participants, of which 57.8%/10.9% are believed (Table 1). When participants are asked whether they know someone who believes or is likely to believe a statement, 71.4%/42.7% of the true/false statements are believed by others known to the participant. The median number of true/false statements that reached a participant is 11/6 (Figure 1(a)); the median number of true/false statements believed by a participant is 12/2 (Figure 1(b)); the median number of true/false statements believed by others known to the participant is 15/8 (Figure 1(c)); and 31% of participants claimed to believe at least one false conspiracy (cf. [8]). It is generally encouraging that true statements have a wider reach and wider belief than false statements. The reach of and belief in false statements, however, are still troubling, particularly given the potentially deadly consequences that might arise from misinformation. Even more troubling is the partisan divide that emerges upon closer examination of the data.

We fit six negative binomial regression models with six outcome variables corresponding to the reach of, belief in, and belief by others in the true and false statements. The predictor variables included participant demographics: gender, age, education, political leaning, and main source of news. We briefly review the largest demographic effects. Political leaning and main news source both affected the number of false statements believed. The number of false statements believed by those on the right of the political spectrum is 2.15 times greater than by those on the left (95% CI [1.84, 2.53]). Although a smaller effect, those on the right were also less likely to believe true statements than those on the left (0.89, 95% CI [0.83, 0.95]). The number of false statements believed by those with social media as their main source of news is 1.41 times greater than by those who cited another main news source (95% CI [1.19, 1.66]).

We next performed a binary logistic regression to evaluate how political leaning and main news source influenced belief in each false statement (Table 1). Political leaning influenced the likelihood of believing 12/20 false statements, and main news source influenced the likelihood of believing 7/20 false statements. For 11 of the 12 false statements with an effect of political leaning, those on the right were more likely to believe the false information. For 6 of the 7 false statements with an effect of main news source, those with social media as a main source were more likely to believe the false information. The five largest effects involved political leaning, where, compared to those on the left, those on the right are:

• 15.44 times more likely to believe that "asymptomatic carriers of COVID-19 who die of other medical problems are added to the coronavirus death toll to get the numbers up to justify this pandemic response", 95% CI

The one false statement that those on the left were more likely to believe than those on the right was that "Sales of Corona beer dropped sharply in early 2020 because consumers mistakenly associated the brand name with the new coronavirus." The effects of main news source on the likelihood of believing false statements are smaller than those of political leaning. Those with social media as their primary source are 6.45 times more likely to believe that "Silver solution kills COVID-19" (95% CI [1.59, 26.14]), and 5.71 times more likely to believe that "Drinking sodium bicarbonate and lemon juice reduces the acidity of the body and the risk of getting infected with COVID-19" (95% CI [1.70, 19.15]).
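As a concrete illustration of the two model families described above, the sketch below shows how such models could be fit with the Python statsmodels library. This is illustrative only and not the authors' analysis code; the data frame, file name, and column names (n_false_believed, believes_stmt_7, and the demographic columns) are assumptions. Exponentiating the coefficients and confidence-interval bounds yields rate ratios and odds ratios with 95% CIs of the kind reported above.

```python
# Hypothetical sketch of the negative binomial and binary logistic regressions
# described in the text, using statsmodels; column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per participant: a count of false statements believed (0-20),
# plus the demographic predictors.
df = pd.read_csv("responses_by_participant.csv")  # hypothetical file

# Negative binomial regression: demographics vs. the count of false
# statements believed. exp(coef) gives rate ratios.
nb_model = smf.glm(
    "n_false_believed ~ C(gender) + age + C(education) + C(politics) + C(news_source)",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(np.exp(nb_model.params))      # rate ratios
print(np.exp(nb_model.conf_int()))  # 95% confidence intervals

# Binary logistic regression: belief in a single false statement (a 0/1
# column, here the hypothetical 'believes_stmt_7'). exp(coef) gives odds ratios.
logit_model = smf.logit(
    "believes_stmt_7 ~ C(politics) + C(news_source)",
    data=df,
).fit()
print(np.exp(logit_model.params))
print(np.exp(logit_model.conf_int()))
```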
There is a troublingly wide reach of and belief in COVID-19 related misinformation that is highly partisan and more prevalent among those who consume news primarily on social media. As with previous work [7], our study was conducted online, so an average belief in false information of 4.8% may not be representative of the general public [9]. To address this limitation, participants were also asked about the beliefs of people known to them, which revealed what is likely an upper bound of 40% on belief in misinformation in the general public. The real-world impact of such beliefs has already been demonstrated with devastating consequences. For example, false claims on social media that drinking high-proof alcohol will kill the virus have been linked to the deaths of more than 300 people in Iran [10].

It remains unclear to what extent COVID-19 misinformation is the result of coordinated attacks or has arisen organically through misunderstanding and fear. It also remains unclear whether the spread of, and belief in, this misinformation is rising or declining, and how it has affected other parts of the world. We are actively pursuing answers to each of these questions. Falsehoods spread faster than the truth [11] and are often resistant to correction [12, 13]. Media, and social media in particular, must do a better job of preventing these falsehoods from reaching their platforms and of limiting their spread.

References
[1] Keeping our services stable and reliable during the COVID-19 outbreak.
[2] The fear of COVID-19 and its role in preventive behaviors.
[3] Facebook sent home thousands of human moderators due to the coronavirus. Now the algorithms are in charge.
[4] Open letter with Avaaz: Health professionals sound alarm over social media infodemic.
[5] World Health Organization. A coordinated global research roadmap.
[6] The relationship between conspiracy beliefs and compliance with public health guidance with regard to COVID-19. Centre for Countering Digital Hate.
[7] Navigating the 'infodemic': How people in six countries access and rate news and information about coronavirus.
[8] Coronavirus conspiracy beliefs, mistrust, and compliance with government guidelines in England.
[9] Does the Sun revolve around the Earth? A comparison between the general public and online survey respondents in basic scientific knowledge.
[10] In Iran, false belief a poison fights virus kills hundreds.
[11] The spread of true and false news online.
[12] When corrections fail: The persistence of political misperceptions.
[13] Misinformation and its correction: Continued influence and successful debiasing.

This work was supported by funding from Facebook, the Defense Advanced Research Projects Agency (DARPA FA8750-16-C-0166), and a Seed Fund Award from CITRIS and the Banatao Institute at the University of California.