key: cord-1044155-nk86hvsn
authors: Dow, Benjamin J.; Johnson, Amber L.; Wang, Cynthia S.; Whitson, Jennifer; Menon, Tanya
title: The COVID‐19 pandemic and the search for structure: Social media and conspiracy theories
date: 2021-08-04
journal: Soc Personal Psychol Compass
DOI: 10.1111/spc3.12636
sha: 73dd43570e6949f8c42d24ff78d6abd3673c68dd
doc_id: 1044155
cord_uid: nk86hvsn

The study outlines a model for how the COVID‐19 pandemic has uniquely exacerbated the propagation of conspiracy beliefs and subsequent harmful behaviors. The pandemic has led to widespread disruption of cognitive and social structures. As people face these disruptions they turn online seeking alternative cognitive and social structures. Once there, social media radicalizes beliefs, increasing contagion (rapid spread) and stickiness (resistance to change) of conspiracy theories. As conspiracy theories are reinforced in online communities, social norms develop, translating conspiracy beliefs into real‐world action. These real‐world exchanges are then posted back on social media, where they are further reinforced and amplified, and the cycle continues. In the broader population, this process draws attention to conspiracy theories and those who confidently espouse them. This attention can drive perceptions that conspiracy beliefs are less fringe and more popular, potentially normalizing such beliefs for the mainstream. We conclude by considering interventions and future research to address this seemingly intractable problem.

Conspiracy theories about COVID-19 pose a public health risk. For example, they trigger suspicion about well-established scientific recommendations (Prichard & Christman, 2020; Romer & Jamieson, 2020), hamper response efforts to the pandemic (Buranyi, 2020), and even lead to the burning of 5G cellphone towers (Heilweil, 2020).
Because of their powerful influence in shaping both our narratives about and responses to the pandemic, these conspiracy beliefs receive a great deal of public attention. A frequent contention in these discussions is that the internet, and social media in particular, contributes to the apparent prevalence of these problematic beliefs (Bomey, 2020; Easton, 2020; Lee, 2020). For example, US President Joe Biden recently asserted that, by allowing misinformation and conspiracy theories to proliferate, social media platforms were reducing vaccination rates and "killing people" (Cathey, 2021). Similarly, in a recent study linking social media use to the rate of COVID-19's spread, the authors speculated that one driver of the effect could be social media's tendency to prop up conspiracy theories about the disease (Kong et al., 2021). In this study, we assess claims such as these by examining the rapidly emerging literature around COVID-19 conspiracy theories.

Prior to the COVID-19 pandemic, researchers examined the key antecedents of conspiracy theories' spread (i.e., the rate at which conspiracy theories are communicated from person to person; Franks et al., 2013) and stickiness (i.e., the extent to which belief in conspiracy theories "takes root" and becomes difficult to change; Jolley & Douglas, 2017). While some work argues that the internet does not affect the propagation of conspiracy theories (Uscinski & Parent, 2014) or can even impede it (Clarke, 2007; Uscinski et al., 2018), other work suggests that aspects of the internet critically enabled conspiracy theories to spread and stick (Stano, 2020). We join the latter group and conclude that, particularly in the context of the COVID-19 pandemic, social media, i.e., internet spaces where people share information, ideas, and personal messages and form communities (Merriam-Webster, 2021), has played a central role in conspiracy beliefs.
We offer a theoretical framework (see Figure 1) that outlines how the pandemic affects social media consumption, and how social media contributes to COVID-19 conspiracy theories and their negative effects. We first suggest that the effects of the pandemic (and the attempts to contain it) on our daily lives fray our cognitive structures (i.e., our understanding of the world and how to act agentically upon it) and our social structures (i.e., our means of maintaining social bonds and connections with others), and are thus part of the reason the internet has become so central in our lives during the pandemic. Next, we review evidence that features of the alternative cognitive and social structures that social media provides contributed to the spread of COVID-19 conspiracy theories. We also discuss how characteristics of the social structures offered by social media help spread conspiracy theories and make them particularly sticky and resistant to disconfirming information. We go on to explore how these alternative social and cognitive structures translate into online communities with real-world implications. Finally, we propose interventions and future research that might help address these pernicious online dynamics.

Conspiracy theories are explanations for events that posit powerful actors working together in secret to achieve self-serving or malicious goals (Bale, 2007; Kramer & Gavrieli, 2005; Zonis & Joseph, 1994). While powerful actors certainly perpetrate actual conspiracies, our interest lies in the over-perception of conspiracy-based explanations. Such theories are by no means new; however, the way the internet influences their spread is a quickly evolving consideration.
This is all the more relevant during the COVID-19 pandemic, as individuals are forced out of public spaces due to social distancing measures and the internet becomes more central to how they acquire information about and respond to the pandemic (Koeze & Popper, 2020; Naeem, 2020; Samet, 2020).

There is a great deal of debate about the role of the internet in the spread of conspiracy theories. On one hand, scholars argue that the critical atmosphere of the internet impedes the development of conspiracy theories because disagreement and disconfirming evidence are easily available (Clarke, 2007). Others point to the lack of evidence that the internet increases their spread beyond people already interested in them (Uscinski et al., 2018). Taken together, these scholars suggest there is little evidence that the internet has drastically altered the landscape for conspiracy theories. Conversely, other scholars argue the internet provides fertile ground for conspiracy theories to spread. They assert that social media's pervasiveness lowers barriers to access, exposing potential believers directly to conspiracy theories via sites like Facebook, Twitter, Reddit, and YouTube (Douglas, Ang, et al., 2017; Stano, 2020). We engage in this debate, proposing a framework to describe how the COVID-19 pandemic and attempts to contain it created nearly ideal conditions for social media to uniquely contribute to the spread and stickiness of conspiracy theories.

In general, the pandemic increased social media usage: general internet activity rose 25% in the days after lockdown (Rizzo & Click, 2020). More specifically, the year 2020 saw global social media use accelerate by 13%, with 330 million new users, for a total of 4.7 billion users in April 2021 (Kemp, 2021a, 2021b). We argue that part of the reason people consumed more social media was because of pandemic-related disruptions to their cognitive and social structures.
Although this may not have been the only reason for the increase, it is a significant contributing factor.

People have a fundamental desire to feel that they understand the world and have agency in it (Douglas, Sutton, et al., 2017; Lewin, 1938; Mitchell, 1974). However, as measures intended to impede the spread of COVID-19 were imposed, individuals experienced less control over their activities (e.g., via lockdowns, business closures, and mask mandates) and health outcomes (e.g., limited ability to prevent or treat illness). This loss of control had implications for many aspects central to one's life, for example, health (e.g., "Will I get the disease?"), career (e.g., "Will I have a job?"), loved ones (e.g., "What about my family's health, and when will I see them next?"), and government response (e.g., "Will it solve the problem or make it worse?"), to name just a few concerns (Douglas, 2021; Wang et al., 2020). Information was also limited, driving uncertainty about how the disease spread, what its effects were, how political entities would apply and enforce social measures intended to limit its spread, and the ultimate impact of the disease on individuals' economic and social worlds. Many were in an untenable holding pattern, perhaps hoping for the "old" normal to return, while not knowing what would come next in this "new" normal (Brooks et al., 2020). In other words, the pandemic disrupted people's sense that the world was predictable and navigable by making their understanding of their environment and how to act in it (i.e., their cognitive structures) appear inaccurate and outdated.

This loss of control and uncertainty led many people to increase their consumption of social media. As the pandemic eroded people's sense of control, they turned to social media as a place where they could still exert some control over their environment and how they interacted with it (Brailovskaia & Margraf, 2021).
In addition, when individuals experience greater uncertainty they tend to seek diagnostic information (Weary & Jacobson, 1997), and during the pandemic many people reported turning to social media to seek information (48%) or to share experiences and opinions about COVID-19 (41%; Ritter, 2020). Indeed, approximately 66% of Instagram users used COVID-19 hashtags to discuss information in the period between February 20 and May 6, 2020 (Rovetta & Bhagavathula, 2020).

The COVID-19 pandemic also disrupted people's social structures, by which we mean the patterns of interactions with others through which we form and maintain social bonds. For example, many employees were exiled from their normal office environments. Gatherings were restricted, precluding even the basic milestones that organize social life (e.g., weddings, funerals, and holidays). Stuck in highly limited communications with their immediate core groups, people sorely missed the weak ties that provided diversity and broader informational content (Sandstrom & Whillans, 2020). In other words, the pandemic and associated social distancing measures disrupted the normal social interactions we rely on to fulfill our basic needs for affiliation and belonging (Young et al., 2021).

To fulfill these needs, many turned to online tools that help maintain relationships with others and ameliorate pandemic-induced feelings of loneliness (Cauberghe et al., 2021). Given that feeling isolated has been linked with increased social media use (Primack et al., 2017a, 2017b), it should be no surprise that social media consumption has risen since the onset of social distancing measures (Fischer, 2020; Samet, 2020). In addition, many people reported using social media to help maintain social connections (74%; Ritter, 2020) and to feel less lonely (57%; Trifonova, 2020) during the pandemic. In sum, with their cognitive and social structures disrupted, people looked online for alternative structures.
However, as we will discuss next, the price of entry into these alternative structures may be immersion in a world of conspiracy theories. As people search online for alternative structures, features of the social media environment can increase the likelihood that the structures they find will promote conspiracy beliefs. Here we argue that certain aspects of social media tend to increase the spread and stickiness of conspiracy theories among those seeking structure.

Lacking the dependable solidity of their usual cognitive structures, people will seek out alternative structures (Landau et al., 2015; Whitson & Galinsky, 2008). Although these alternative structures can take many different forms, conspiracy theories are a common instantiation (Whitson et al., 2019). Conspiracy theories are attractive because they offer non-specific epistemic structure. In other words, they provide simple, clear, and consistent ways to interpret the key phenomena or events in one's environment (Landau et al., 2015). However, adopting conspiracy theories sometimes means sacrificing a more accurate understanding of the world (Douglas, Sutton, et al., 2017; Douglas et al., 2019). Though conspiracy theories may be briefly epistemically satisfying, like junk food, they do not resolve the underlying hunger, as they may not accurately reflect the real world. Indeed, recent cross-lagged work by Kofta and colleagues (2020) shows that endorsement of a conspiracy theory at Time 1 ironically predicted an increased feeling of lack of control at Time 2. Nevertheless, people are attracted to conspiracy theories as a potential replacement for the cognitive structure they have lost. In the context of the COVID-19 pandemic, the sudden lack of control and increased uncertainty may have made people particularly vulnerable to conspiracy theories as a form of alternative structure (Douglas, 2021).
While there is some evidence that uncertainty may lead to increased belief in conspiracies (van Prooijen & Jostmann, 2013; Whitson et al., 2015), there is a much broader swathe of empirical support that lacking control drives investment in conspiracy beliefs (Kofta et al., 2020; Landau et al., 2015; Whitson & Galinsky, 2008). In other words, the very same lack of control, and to a lesser extent uncertainty, that drove people onto social media also made them more open to conspiracy theories. We believe that this vulnerability to conspiracy beliefs, combined with key characteristics of social media, helped to promote the rapid spread of COVID-19 conspiracy theories. In particular, we argue that the prevalence of unreliable posts, bots, and algorithms increased exposure to conspiratorial misinformation in a vulnerable population of users. By readily exposing those in need of cognitive structure to conspiracy theories, social media helped spread these theories to more people.

In recent years, social media has played a growing role in the spread of news content, both reliable (i.e., vetted and accurate information) and unreliable (i.e., questionable or inaccurate information). For example, on sites such as Facebook and Reddit, unreliable posts (such as the conspiracy theory that pharmaceutical companies created COVID-19 as a new source of profit; Romer & Jamieson, 2020) spread as quickly as reliably sourced COVID-related posts (Papakyriakopoulos et al., 2020). This suggests that on social media, unlike in more traditional outlets, the spread of information is less tied to content accuracy. On Twitter, research has shown that false rumors not only move six times faster than the truth but also penetrate further into the social network: while the top 1% of false posts diffuse to as many as 100,000 users, the truth rarely reaches more than 1,000 (Vosoughi et al., 2018).
Bots, automated accounts that post only certain types of information, often in collaboration with other bots, to boost the visibility of certain hashtags, news sources, or storylines (Ferrara, 2020), are another source of false information on social media. While the identity of those who control bot networks is rarely public, several have been traced back to nation-state actors (Prier, 2017). These bot networks are a unique feature of the social media landscape, able to create the illusion of popularity and mass endorsement, or to exaggerate the importance of topics, in a way impossible in any other information-sharing context. A sample of over 43 million English-language tweets about COVID-19 revealed that bots were more present in some areas of discourse than in others; namely, bots tweeted about political conspiracies, whereas humans tweeted about public health concerns (Ferrara, 2020). Bots thus not only help spread conspiracy theories; they also create a false impression that those theories are more widely endorsed than they are.

There is also emerging evidence that design decisions meant to increase user engagement on social media may have the unintended effect of promoting the spread of misinformation, including conspiracy theories. Prior work has shown that the algorithms social media companies use to recommend content can lead people toward misinformation (Alfano et al., 2020; Elkin et al., 2020; Faddoul et al., 2020; Tang et al., 2021). While some people may be actively seeking conspiracy theories (Uscinski et al., 2018), algorithms expose them to a broader audience. As mere exposure to conspiracy theories tends to increase belief in them, algorithms can create vulnerabilities in people who have no initial inclination toward those theories (Jolley & Douglas, 2014; Jolley et al., 2020; Kim & Cao, 2016; van der Linden, 2015). In sum, social media has increased the spread of conspiracy theories through a variety of means.
The spread of these conspiratorial alternative cognitive structures is exacerbated on social media by the freedom with which unreliable information travels, by bots inflating impressions of their popularity, and by algorithms putting them in front of an unwitting and vulnerable public.

As the COVID-19 pandemic deprived people of their normal office environments, traditional life milestones, and much of the rich tapestry of social contact they were used to, they turned to social media to try to fulfill their needs for affiliation and belonging. Social media fulfills these needs via alternative social structures with distinguishing features that were not present in the structures they replace. We propose that certain features of the alternative social structures that social media provides, namely the presence of social media influencers, information echo chambers, and constant social reinforcement, tend to make conspiracy theories spread further and become stickier.

The role of so-called "influencers" on social media (i.e., high-status social media users with many followers; Enke & Borchers, 2019) may contribute to the spread of false beliefs. For example, a handful of conservative politicians and far-right political activists used Twitter to spread the conspiracy theory that the pandemic itself was a hoax (#FilmYourHospital; Gruzd & Mai, 2020). People follow these influencers for a variety of reasons unrelated to conspiracy theories and often feel as if they are trusted friends (Kirkpatrick, 2016; Woods, 2016). This is troubling because people are less likely to critically evaluate information from people they trust and are more prone to believe misinformation when they are not evaluating it deeply (Kirkpatrick, 2016; Pennycook & Rand, 2019). As a result, influencers who engage with conspiracy theories are another means by which these theories spread to a larger audience that would not otherwise have sought them out.
Furthermore, unlike traditional journalistic outlets that have more formalized accountability to federal laws and ethics standards, social media allows high-status individuals unmitigated, direct access to the general public and the ability to promote ideas largely unchallenged (Finkel et al., 2020). Indeed, Brennen and colleagues (2020) found that high-status individuals are key contributors to the spread of misinformation; while they produced only 20% of the misinformation in the study sample, this misinformation accounted for 69% of the observed social media engagement from followers. Another study found that after Donald Trump was removed from Twitter, election-related misinformation on the site dropped by 73% (Dwoskin & Timberg, 2021). Finally, a study of 812,000 social media posts containing vaccine misinformation between February 1 and March 16, 2021 found that 65% of those posts could be traced back to just 12 influencers (Center for Countering Digital Hate, 2021).

Some aspects of social media make it harder for self-correcting information to enter the system. Just as with the influencers who reside on them, social media platforms provide direct access to an unprecedented amount of unfiltered information, including a great deal of misinformation (Zarocostas, 2020).

The echo chamber. Left to sort through the overwhelming amount of content, social media users, guided by their cognitive biases and pre-existing beliefs (Bessi et al., 2015; Cinelli et al., 2021), tend to seek out information that is consistent with their existing views and to ignore or reject contrary information (Brugnoli et al., 2019). In other words, people become trapped in information echo chambers where individuals with similar interests find each other and form increasingly homogeneous groups, driving amplification of particular beliefs (Choi et al., 2020; Cinelli et al., 2021).
Embedded in these groups, they find people with more extreme views dominating the conversation (Barberá & Rivero, 2014; Boutyline & Willer, 2017). Two studies of Facebook users' information consumption found that those who circulate conspiracy content also tend to ignore science content, and vice versa (Del Vicario et al., 2016). Moreover, users in conspiracy communities tend to be more active within their community and interact less with information outside of it (Bessi et al., 2015). In other words, information echo chambers form online around conspiracy beliefs, making it more difficult for information that counters these beliefs to reach people once they have adopted conspiracy theories.

Entrance into the echo chamber. Evidence suggests that those joining COVID-19 conspiracy echo chambers vary in the extent to which they endorsed conspiracy beliefs before entry. On one hand, analysis of Reddit communities showed that the founders of COVID-19 conspiracy theory subreddits had previously been involved in other conspiracy theory communities (Zhang et al., 2021). Similarly, an analysis of Twitter data found that many of those sharing COVID-19 misinformation had previously expressed anti-vaccine sentiments (Memon & Carley, 2020). This suggests that people who were already involved with other conspiracy theories were attracted to COVID-19 conspiracies. On the other hand, previous endorsement of conspiracy theories is not a prerequisite. The eventual popularity of COVID-19 conspiracy theories indicates that belief in them spilled over into the mainstream, attracting people who were not previously part of other conspiracy theory communities. Studies have suggested that as much as a third of the population has endorsed at least one COVID-19 related conspiracy theory, and one in ten watched at least some part of the "Plandemic" conspiracy theory video (Allington et al., 2020; Kuhn et al., 2021; Mitchell et al., 2020; Romer & Jamieson, 2020).
These percentages may be even higher among Republicans, people who watched conservative media, and people who relied on Donald Trump for COVID-19 information, suggesting that COVID-19 conspiracy theories gain traction in groups other than those dedicated to conspiracy theories, in part due to their pre-existing biases and beliefs about the world (Annenberg Public Policy Center of the University of Pennsylvania, 2021; Mitchell et al., 2020; Romer & Jamieson, 2020). This is consistent with an earlier analysis of Reddit posts, which found that individuals who go on to post in conspiracy theory communities tend to be more active in certain communities unrelated to conspiracy theories, such as those discussing politics, before joining the conspiracy theory communities (Klein et al., 2019). In other words, conspiracy theories can spread in communities (and their associated echo chambers) formed around non-conspiracy interests and in this way reach individuals who have no preexisting interest in conspiracy theories.

On social media, people are rewarded for producing messages that align with their communities' views. This is gamified in a reward system that allocates shared posts, "likes," and followers to users who produce or promote such content, thus increasing the self-esteem and happiness of those users (Marengo et al., 2021; Stillman & Baumeister, 2009). This occurs even if the content being produced contains misinformation. One recent study found that this positive social reinforcement may be withdrawn from Twitter users who do not share the same fake news stories as others in their community (Lawson et al., 2021). These effects can create a feedback loop in which constant social reinforcement increases engagement with like-minded people, ultimately leading to the reinforcement of social identities associated with shared beliefs.
In other words, social media encourages the formation of "identity bubbles": online communities that not only act as echo chambers but also reinforce shared identities and encourage homophily (Kaakinen et al., 2020). In addition, these online bubbles do not simply reinforce existing beliefs; rather, they tend to encourage the adoption of even more extreme beliefs (Ohme, 2021). As might be expected, analysis of large-scale social media data has shown that such bubbles often form around conspiracy beliefs. This is problematic because, once inside a conspiratorial identity bubble, people experience constant reinforcement of their conspiracy beliefs, are encouraged to adopt more extreme beliefs, and are likely to associate core aspects of their self-concept with those beliefs. The constant social reinforcement and entanglement of conspiracy beliefs with social identities are likely to make those beliefs extremely difficult to dislodge. In sum, social media provides alternative social structures that play on individual needs for belonging and affiliation to draw attention to social influencers and to ensnare believers in echo chambers, which positively reinforce endorsement of their beliefs and entangle those beliefs with their social identities, thereby increasing the stickiness of conspiracy theories.

Correlational evidence suggests that conspiracy beliefs are associated with a weaker social network (Freeman & Bentall, 2017) and that publicly endorsing conspiracy theories may lead to social exclusion (Lantian et al., 2018). Anecdotal evidence also suggests that believers who become invested in conspiracy theories can find their relationships with friends and family strained or even severed (Lord & Naik, 2020; Roose, 2020a). Conversely, there is also evidence that conspiracy believers tend to experience, and even intentionally build, a sense of community with other believers (Franks et al., 2017; Grant et al., 2015).
Thus, adopting conspiracy beliefs can reshape one's social networks, fraying existing in-person social ties while strengthening virtual networks through message boards and online communities. Once embedded within these virtual networks of fellow believers and increasingly cut off from non-believers, individuals will tend to find their conspiracy beliefs reinforced and, as discussed above, may even make them a central feature of their social identities. Below we discuss some characteristics of the online communities that formed around COVID-19 conspiracies.

A key ingredient of online communities focused on conspiracy theories is the nature of conspiracy theories themselves. By positing a cadre of shadowy actors working in concert with ill intent, conspiracy theories imply an explicit threat that invokes an intense intergroup process (Shahsavari et al., 2020). These narratives identify a core group (often those who believe the conspiracy theory), a threat to that group, and strategies intended to meet the threat (Drinkwater et al., 2021; Shahsavari et al., 2020). Conspiracy theories often reflect a "sense of anomie (normlessness) and societal estrangement" that enhances the sense of an uncaring world in which only those in the group care (Drinkwater et al., 2021, p. 2). In an attempt to unite against external threats, people strengthen their in-group norms and exhibit outgroup derogation (Harambam, 2020; Knight, 2001). We provide a few brief examples of these processes as they pertain to online communication and behavior during the pandemic.

Social media profiles reveal how conspiracy beliefs become part of an individual's social identity. For instance, QAnon conspiracy believers take oaths to become "digital soldiers," and frequently repeat mantras such as "where we go one, we go all" in their profiles (Cohen, 2020).
During the COVID-19 pandemic, wearing a mask became a similarly important identity signal (Boykin et al., 2021; Powdthavee et al., 2021). Like dress or costumes, these public displays signal membership in the community both to insiders and to the outside world.

Conspiracy theories also inspire linguistic innovations in the group. For instance, the word "plandemic" gained popularity as a hashtag and shorthand for a variety of COVID-19 conspiracies before the release of the viral video of the same name (Kearney et al., 2020). Such language highlights the unique knowledge that separates believers from the uninformed. The easily searchable nature of social media means that these groups are easy to find for newcomers and for those seeking a receptive audience for their related conspiracy theories. Some of this language also involved stigmatizing outgroups. Terms like "China flu," "Kung flu," and "Wuhan virus" became markers of group identity (Macleod, 2020; Restuccia, 2020) that simultaneously stigmatized a foreign outgroup (China) and signaled membership to others in the same ingroup. Once popular media outlets such as CNN and the New York Times disavowed terms like "Wuhan virus," which they had previously used (Kaur, 2020), usage of these terms began to demarcate a stance against the dominant culture and served to distinguish COVID-19 conspiracy communities from the general public.

A critical moment comes when the conspiracy theory, which has been festering primarily online, is exposed to reality. Namely, the specific norms established in the online community concerning appropriate belief and behavior may spill into the "outside world." These might include easily visible patterns of dress, like rejecting face mask use, or flouting mainstream norms and government rules about social distancing. COVID-19 conspiracy beliefs translate directly into reduced engagement in various health-protective behaviors (Allington et al., 2020; Sternisko et al., 2020).
Conspiracy beliefs reduce healthy behaviors and promote harmful practices (Tasnim et al., 2020), interfere with the dissemination of clear information from reliable sources (Wong et al., 2020), and compromise efforts to contain the virus and aid recovery from it (Shaw et al., 2020), leaving conspiracy believers with worsened health outcomes, both mental and physical (Tasnim et al., 2020). For example, those who believe that pharmaceutical companies created COVID-19 as a new source of profit, or that the danger of the disease has been exaggerated by the CDC for political reasons, are less likely to wear masks and to seek out a vaccine (Romer & Jamieson, 2020). Importantly, the content of the COVID-19 conspiracy theory matters. When people believe more general pandemic conspiracy theories (e.g., that vague, shadowy groups stand to benefit from COVID-19), they endorse institutional control and surveillance of ethnic minorities associated with the pandemic; however, when they believe government-focused conspiracy theories (e.g., that the government is using the COVID-19 pandemic to remove freedoms from its citizens), they are less likely to wear masks or socially distance (Oleksy et al., 2021).

When people behave this way, they are not dispassionately seeking to test their hypotheses about reality; rather, they are motivated to seek confirmation of their theories (Kunda, 1990). Subsequently, even feedback that on its face seems negative, such as being confronted in a store for not wearing a mask, can ultimately be positively reinforcing, as it provides both attention and confirmation that the confronted individual is having an impact on the social world around them. The people who have these experiences then catalog them online, bringing them back as grist for the mill of their online culture. The process repeats as the community interprets, reinforces, and continues to push the line, updating to fit the facts at hand and escalating to perhaps more extreme norms.
Even when individuals encounter truly negative feedback (e.g., actually contracting or dying of COVID-19), the overarching online community often remains intact, because only some individuals experience the negative feedback. Even worse, since many conspiracy theories rest on a broader foundation of science denial and distrust, disconfirming evidence about a single belief does not topple the entire edifice. This implies that factual disconfirmation of these beliefs is likely insufficient and that we must consider other levers in the battle against the pernicious effects of conspiracy beliefs. When people are embedded in online conspiracy communities and grow accustomed to the norms therein, they may begin to lose touch with the norms of the wider society. However, when people with these conspiracy beliefs turn outward from their online communities, whether by appearing in other online contexts or by taking their beliefs into the real world, they can lead others to question whether these conspiracy theories are fringe or are in fact widely shared. Often, individuals with conspiracy beliefs intentionally hide the true nature of their beliefs in order to appeal to a wider audience (Roose, 2020b). In addition, these individuals can sometimes gain media attention or acknowledgment from public figures for their unusual beliefs, making those beliefs appear more widespread than they are (Phillips, 2018; Woods & Hahner, 2018). In other words, as these extreme, confident, attention-grabbing voices and behaviors gain attention and publicity, they risk inflating observers' perceptions of their prevalence. What results is a form of pluralistic ignorance, or the idea that "no one believes, but everyone thinks that everyone believes" (Prentice & Miller, 1996). The more prevalent a conspiracy theory is seen to be, the more normalized it becomes (van den Broucke, 2020).
Rather than reject these conspiracy theories outright, observers outside the community may become more open to evaluating them, at which point these theories can burst free from smaller reservoirs and spread more widely throughout the broader population. Equally concerning is that these same processes can make conspiracy theories seem more mainstream to people within online conspiracy communities, and may even lead them to believe that the public would support them taking action to stop the supposed conspirators. Such beliefs may have contributed to real-world events such as the January 6th incursion into the United States Capitol (Barry et al., 2021). Social media has played a central role in the spread and stickiness of COVID-19 conspiracy theories. The question, of course, is how to interrupt this cycle of ever more elaborate and entrenched false beliefs. There are several options for reducing the negative influence of online conspiracy theories: content restriction (limiting exposure and breaking up echo chambers); pre-bunking (attempting to inoculate people before they are exposed); critical consumption (encouraging cognitive engagement during exposure); and, if all else fails, debunking (trying to alter conspiracy beliefs already held and to address the problematic downstream attitudes and behaviors directly). Two further levers are encouraging a common identity (breaking down intergroup demarcations) and fostering social connection (seeking to re-establish the social connections lost or weakened during the pandemic). Content restriction, a somewhat controversial approach, has the potential to address the more problematic aspects of the alternative cognitive and social structures that social media provides. It is predominantly used by large social media organizations that can enforce sweeping control over their platforms.
Social media companies such as Twitter, YouTube, and Facebook have all implemented policies to remove conspiracy theory content, as well as accounts or groups linked to conspiracy theories and violence (Ortutay, 2020). For example, in an attempt to address "militarized social movements", Facebook banned all QAnon conspiracy accounts from its social media websites (Collins & Zadrozny, 2020), as well as all posts that distort or deny the Holocaust (O'Brien, 2020). Removing conspiracy content along with its most prominent purveyors can reduce people's exposure to conspiracy theories and help slow their spread (Ahmed et al., 2020). In addition, banning conspiracy groups and hashtags may help break up echo chambers and dampen the constant social reinforcement they provide, preventing conspiracy beliefs from taking root and making believers more open to changing their views. However, the social media landscape is constantly evolving and growing, providing conspiracy believers with new opportunities to find evidence to support their theories and ways around content restrictions. Thus, limiting postings and banning accounts and groups might slow the spread of the theories themselves, but it is unlikely to entirely eliminate exposure to them (Papakyriakopoulos et al., 2020). In addition, content restriction is controversial, with some asserting that targeting groups with unpopular opinions is unacceptable censorship, while others argue that social media companies have a moral responsibility to prevent damaging information from spreading on their platforms (Shields & Brody, 2020). Social media companies may be hesitant to wade into this controversy any more than necessary for fear of inviting political backlash and unwanted regulation. Another option is to inoculate people before they encounter a conspiracy theory, either via information or via mindset.
One study found that participants who read about the effectiveness of vaccination before encountering an anti-vaccination conspiracy theory showed greater vaccination intentions than those who read the same information after (Jolley & Douglas, 2017). Greater science literacy may also reduce the sharing of misleading or unverified COVID-19 information on social media (Pennycook et al., 2020), suggesting that improvements in science education may equip people with better cognitive defenses when they encounter conspiracy theories online. There is also evidence that encouraging people to think about their level of resistance to persuasion is effective in reducing conspiracy beliefs (Bonetto et al., 2018). Similarly, when people are given a chance to play a game that highlights the manipulative persuasion tactics sometimes used to spread conspiracy theories, they later show reduced vulnerability to those same tactics (Roozenbeek & van der Linden, 2019). Taken together, these findings suggest that addressing conspiracy theories before they become beliefs may be an effective strategy for preventing the adoption of conspiratorial cognitive structures. Other interventions can occur while individuals are in the midst of being exposed to a conspiracy theory. When people exposed to a conspiracy theory are encouraged to think more critically about how convincing it is, they show lower downstream beliefs than those who are simply exposed to it (Einstein & Glick, 2013; Ståhl & van Prooijen, 2018). Research has found that the cognitive engagement of actually considering how convincing a conspiracy theory is unlocks greater critical thinking, whereas mere exposure allows conspiracy beliefs to "slip in under the radar" (Einstein & Glick, 2013; see also Einstein & Glick, 2015; Salovich & Rapp, 2021).
In addition, on social media, participants who were first asked to judge the accuracy of a neutral headline (i.e., one unrelated to misinformation) were subsequently less likely to share information that was misleading or unverified, regardless of whether or not it aligned with their political ideology (Pennycook et al., 2020). Thus, invoking cognitive reflection may both decrease the likelihood that exposure will lead to the adoption of conspiracy beliefs and reduce the sharing of conspiracy theories within online communities. However, it is worth noting that simply labeling something a "conspiracy theory" did not make people find it any less believable (Wood, 2016), suggesting that labels or disclaimers may not be enough to invoke the necessary critical thought. Pre-bunking and critical consumption target individuals either before or as they encounter a conspiracy theory. Findings concerning whether beliefs can be reduced after someone already endorses a conspiracy theory are more mixed. On one hand, exposing conspiracy believers to disconfirming facts has produced some hopeful findings: rational counter-arguments reduce conspiracy beliefs in some studies (Lyons et al., 2019; Orosz et al., 2016; Swami et al., 2013). On the other hand, there is evidence that the challenge of dismantling conspiracy beliefs cannot be met with disconfirming facts alone (Bricker, 2013; Clarke, 2002), and that the group identity of conspiracy believers stands as a substantive roadblock to these interventions. A "backfire effect" (conspiracy beliefs increasing rather than decreasing) can occur when disconfirming evidence challenges beliefs entangled with one's political ideology, an effect that seems to intensify as commitment to the ideology increases (Nyhan & Reifler, 2010; Nyhan et al., 2014).
Indeed, disconfirming evidence failed to reduce partisan conspiracy beliefs for both Democrats and Republicans (although, notably, only Republicans exhibited the backfire effect; Enders & Smallpage, 2019). This evidence suggests that group identity may be an important social barrier to addressing conspiracy theories. In addition, for debunking information to reach individuals online and be effective, it must break into echo chambers and overcome the social reinforcement that occurs within these communities. Unfortunately, little is known about which interventions may successfully do this. Attempts by debunkers to use pro-conspiracy hashtags to reach believers do not appear to be effective (Mahl et al., 2021). Some social media companies have begun to attach fact-checking information to posts containing misinformation; however, these warnings may be ineffective when addressing conspiracy beliefs that have already taken root and that are linked to social identities (Drutman, 2020). The paucity of research on the social aspects of effective debunking interventions is concerning, because purely cognitive approaches are likely to be less effective if they do not address the social barriers to debunking. One factor that emerges from our review is that conspiracy theories, and the communities that endorse them, can sharpen delineations between groups and the in-group/out-group dynamics that naturally follow. One potential way to reduce these divisions is to encourage people to focus on shared superordinate social identities (Finkel et al., 2020; Gaertner et al., 2000). This is particularly relevant to the effects social media influencers can have in combating conspiracy theories. Research has shown that "people become less divided after observing politicians treating opposing partisans warmly, and nonpartisan statements from leaders can reduce violence" (Finkel et al., 2020).
Thus, an effective intervention may be to encourage key social influencers to emphasize common identities, the importance of nonpartisanship, and fundamental tolerance of other groups in order to improve the climate and culture around the collective identities that feed online conspiracy communities. Loneliness has intensified during the pandemic (Shah et al., 2019; Jeste et al., 2020; Luiggi-Hernández & Rivera-Amador, 2020). In July 2020, during the peak of many lockdowns, 83% of respondents to a global survey said they were using social media to help "cope with COVID-19-related lockdowns" (Kemp, 2020). Unfortunately, this can drive people to invest in alternative social structures with dangerous ramifications, particularly in terms of the spread and stickiness of conspiracy beliefs. However, the turn to online connections also provides a glimmer of hope: the same digital tools that may introduce people to conspiracy theories could also provide other forms of support. For example, digital interconnectivity can improve access to mental health services, and the pandemic has increased interest in telehealth for mental health therapy and medical appointments. Specifically, teletherapy may be one way to combat loneliness and other pandemic-driven challenges (Luiggi-Hernández & Rivera-Amador, 2020). The ability to access skilled professionals may reduce the unmet need for social connection that increases the appeal of parasocial relationships with influencers, entry into echo chambers, and the catnip of social approval once one is there. Innovations in online connectivity may also help people maintain pre-pandemic relationships with friends and family. Digital tools such as Zoom, Skype, and other social connectivity apps that allow people to maintain existing social ties may prevent individuals from being swept into the enclosed spaces of online echo chambers.
While this study has emphasized the theoretical mechanisms by which social media has affected COVID-19 conspiracy theories, a core question is the empirical agenda going forward. We consider three areas for further empirical study: (a) testing specific elements of social media engagement to determine the primary drivers of conspiracy spread and stickiness, (b) the consequences of engagement in online conspiracy communities for real-world social relationships in a post-pandemic world, and (c) the motives driving online engagement with conspiracy theories.

9.1 | Internet usage patterns as an antecedent to conspiracy endorsement and spread

As we noted at the beginning of the study, researchers have debated the extent to which the internet affects the spread and stickiness of conspiracy theories. The discrepant findings may arise from changes over time in the way we use the internet. In 2005, 68% of US adults used the internet, but just 5% used social media (Pew Research Center, 2021a; Pew Research Center, 2021b). As of 2021, 93% of US adults use the internet and 72% use social media, and, as noted above, social media usage increased significantly during COVID-19. Our focus in this study has been on the role of social media, rather than online engagement in general, in promoting the spread and stickiness of conspiracy theories. If, as we suggest, it is social media in particular that is driving these effects, then as the percentage of internet users who engage with social media has increased, the impact of internet usage in general will appear to have changed. However, further research is needed to determine the extent to which social media engagement, rather than internet usage in general, is the key driver of conspiracy theories. In addition, as social media platforms continue to change over time, new features may alter their influence on conspiracy beliefs. For example, in 2016 Facebook shifted its focus to promoting communities via Facebook groups (Rodriguez, 2020a).
These groups have since been implicated in the spread of misinformation on the platform (McCammon, 2020; Rodriguez, 2020b). More research is needed to understand how past changes such as these have influenced the spread and stickiness of conspiracy theories. This knowledge is crucial for predicting the likely impact of future changes to these platforms. To more precisely gauge how specific aspects of social media affect the proliferation and staying power of conspiracy theories, experimental manipulations are useful (e.g., randomly assigning people to different types of online engagement, and measuring how rapidly a theory is elaborated into extreme permutations, how much people believe it, and whether they are willing to share it). However, such manipulations have limits in terms of external validity. More naturalistic empirical opportunities lie in studying large-scale, cross-cultural internet data. Historical analysis may help reveal how changes to social media platforms over time have altered their role in spreading misinformation. In addition, researchers might compensate for the comparatively short time span covered by social media data by comparing social conditions in different parts of the world. Events that were more localized than COVID-19 may have shaped how people use social media in particular regions, providing insight into how different types of crises affect social media use and its connections to conspiracy theories. In addition, our review of the COVID-19 conspiracy theory literature reveals that far more work is needed to understand how disrupted social structures were rebuilt online during the pandemic and how long-lasting those changes are. For example, as the pandemic subsides, will echo chambers begin to become more permeable? As people return to face-to-face interactions, how will their social networks change?
To what extent will new group identities, which have been formed online, continue to be an important part of people's self-concepts? Researchers must move quickly to examine these questions as "normal" life resumes. Of particular interest is how newly formed online group identities carry over to face-to-face interaction, blurring the distinctions between virtual and real-world networks. We reviewed evidence that this occurred during the pandemic, but it remains to be seen whether such patterns persist post-pandemic and to what extent the online conspiracy communities that were formed will outlast the crisis. Borrowing from a large body of work on conspiracy theories that existed prior to the pandemic, we argued that both control and uncertainty can play a role in the disruption of cognitive structures and the subsequent search for alternative structures online, which leads to increased spread and stickiness of conspiracy theories. However, relatively little work has compared these two mechanisms. While the pandemic undoubtedly created both uncertainty and feelings of lacking control, it is unclear which was the primary driver of the effects on conspiracy beliefs. Kofta and colleagues (2020) are among the few who have examined both uncertainty and feelings of control in the same study; they find that control, rather than uncertainty, predicts conspiracy beliefs. However, given the importance of the internet as a tool for information search, it is possible that uncertainty plays a larger role in an online environment. In addition, during events such as the COVID-19 pandemic, when both are affected on a massive scale, the two may operate differently than during non-crisis times. For example, we may find that they interact and mutually reinforce one another. More studies that simultaneously examine both uncertainty and feelings of control are needed to establish how they operate to influence conspiracy beliefs online and during periods of crisis.
References

Covid-19 and the "film your hospital" conspiracy theory: Social network analysis of Twitter data
Technologically scaffolded atypical cognition: The case of YouTube's recommender system
Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency
COVID-19 conspiracy beliefs increased among users of conservative and social media. Phys.org
Political paranoia v. political realism: On distinguishing between bogus conspiracy theories and genuine conspiratorial politics
Understanding the political representativeness of Twitter users
'Our president wants us here': The mob that stormed the Capitol. The New York Times
Science vs conspiracy: Collective narratives in the age of misinformation
Homophily and polarization in the age of misinformation
Social media teems with conspiracy theories from QAnon and Trump critics after president's positive COVID-19 test
Priming resistance to persuasion decreases adherence to conspiracy theories
The social structure of political echo chambers: Variation in ideological homophily in online networks
The relationship between burden caused by coronavirus (COVID-19), addictive social media use, sense of control and anxiety
Types, sources, and claims of COVID-19 misinformation
Climategate: A case study in the intersection of facticity and conspiracy theory
The psychological impact of quarantine and how to reduce it: Rapid review of the evidence
Recursive patterns in online echo chambers
How coronavirus has brought together conspiracy theorists and the far right. The Guardian
President Biden says Facebook, other social media 'killing people' when it comes to COVID-19 misinformation
How adolescents use social media to cope with feelings of loneliness and anxiety during COVID-19 lockdown
The disinformation dozen: Why platforms must act on twelve leading online anti-vaxxers. Counterhate
Rumor propagation is amplified by echo chambers in social media
Selective exposure shapes the Facebook news diet
The echo chamber effect on social media
The COVID-19 social media infodemic
Conspiracy theories and conspiracy theorizing
Conspiracy theories and the internet: Controlled demolition and arrested development
Michael Flynn posts video featuring QAnon slogans
QAnon groups hit by Facebook crackdown
The spreading of misinformation online
COVID-19 conspiracy theories
Reclaiming the truth. The Psychologist
The psychology of conspiracy theories
Understanding conspiracy theories
To what extent have conspiracy theories undermined COVID-19: Strategic narratives? Frontiers in Communication
Fact-checking misinformation can work
Misinformation dropped dramatically the week after Twitter banned Trump and some allies
Coronavirus: Social media 'spreading virus conspiracy theories'
Scandals, conspiracies and the vicious cycle of cynicism
Do I think BLS data are BS? The consequences of conspiracy theories
'Should I vaccinate my child?' Comparing the displayed stances of vaccine information retrieved from Google, Facebook and YouTube
Informational cues, partisan-motivated reasoning, and the manipulation of conspiracy beliefs
Social media influencers in strategic communication: A conceptual framework for strategic social media influencer communication
A longitudinal analysis of YouTube's promotion of conspiracy videos
What types of Covid-19 conspiracies are populated by Twitter bots? First Monday
Political sectarianism in America
Social media use spikes during pandemic
Conspiracy theories as quasi-religious mentality: An integrated account from cognitive science, social representations theory, and frame theory
Beyond "monologicality"? Exploring conspiracist worldviews
The concomitants of conspiracy concerns
Reducing intergroup bias: The common ingroup identity model
Vaccination persuasion online: A qualitative study of two provaccine and two vaccine-skeptical websites
Going viral: How a single tweet spawned a COVID-19 conspiracy theory on Twitter
Contemporary conspiracy culture: Truth and knowledge in an era of epistemic instability
How the 5G coronavirus conspiracy theory went from fringe to mainstream
Battling the modern behavioral epidemic of loneliness: Suggestions for research and interventions
The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one's carbon footprint
Prevention is better than cure: Addressing anti-vaccine conspiracy theories
Exposure to intergroup conspiracy theories promotes prejudice which spreads across groups
Shared identity and shared information in social media: Development and validation of the identity bubble reinforcement scale
Corona conspiracies: A call for urgent anthropological attention. Social Anthropology: The Journal of the European Association of Social Anthropologists
Yes, we long have referred to disease outbreaks by geographic places. Here's why we shouldn't anymore
The Twitter origins and evolution of the COVID-19 "plandemic" conspiracy theory
Digital 2020 July global snapshot report
Digital 2021: Global overview report
Digital 2021: April global statshot report
The impact of exposure to media messages promoting government conspiracy theories on distrust in the government: Evidence from a two-stage randomized experiment
Twitter says influencers are almost as trusted as friends. Marketing Dive
Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit's conspiracy theory forum
Conspiracy culture: American paranoia from Kennedy to the X-files
The virus changed the way we internet
What breeds conspiracy antisemitism? The role of political uncontrollability and uncertainty in the belief in Jewish conspiracy
Social, economic, and environmental factors influencing the basic reproduction number of COVID-19 across countries
The perception of conspiracy: Leader paranoia as adaptive cognition
Coronavirus conspiracy beliefs in the German-speaking general population: Endorsement rates and links to reasoning biases and paranoia
The case for motivated reasoning
Compensatory control and the appeal of a structured world
Stigmatized beliefs: Conspiracy theories, anticipated negative evaluation of the self, and fear of social exclusion
Tribalism and tribulations: The social cost of not sharing fake news. Paper presented at the 34th Annual Conference of the International Association for Conflict Management
How social media influencer tactics help conspiracy theories gain traction online
The role of conspiracist ideation and worldviews in predicting rejection of science
NASA faked the moon landing-therefore, (climate) science is a hoax: An anatomy of the motivated rejection of science
The conceptual representation and the measurement of psychological forces
He went down the QAnon rabbit hole for almost two years. Here's how he got out
Reconceptualizing social distancing: Teletherapy and social inequality during the COVID-19 and loneliness pandemics
Not just asking questions: Effects of implicit and explicit conspiracy information about vaccines and genetic modification
In pursuit of Chinese scapegoats, media reject life-saving lessons. Fairness & Accuracy in Reporting
From "Nasa Lies" to "Reptilian Eyes": Mapping communication about 10 conspiracy theories, their communities, and main propagators on Twitter
Examining the links between active Facebook use, received likes, self-esteem and happiness: A study using objective social media data
All Things Considered
Characterizing COVID-19 misinformation communities using a novel Twitter dataset
Most Americans have heard of the conspiracy theory that the COVID-19 outbreak was planned, and about one-third of those aware of it say it might be true
Expectancy models of job-satisfaction, occupational preference and effort: Theoretical, methodological, and empirical appraisal
The role of social media to generate social proof as engaged society for stockpiling behaviour of customers during Covid-19 pandemic
When corrections fail: The persistence of political misperceptions
Effective messages in vaccine promotion: A randomized trial
Facebook bans Holocaust denial, distortion posts
Algorithmic social media use and its relationship to attitude reinforcement and issue-specific political participation: The case of the 2015 European immigration movements
Content matters: Different predictors and social consequences of general and government-related conspiracy theories on COVID-19
Changing conspiracy beliefs through rationality and ridiculing
Facebook removes accounts linked to QAnon conspiracy theory
The spread of COVID-19 conspiracy theories on social media and the effect of content moderation
Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention
Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning
Internet/broadband fact sheet
Social media fact sheet
How journalists should not cover an online conspiracy theory. The Guardian
When face masks signal social identity: Explaining the deep face-mask divide during the COVID-19 pandemic
Pluralistic ignorance and the perpetuation of social norms by unwitting actors
Authoritarianism, conspiracy beliefs, gender and COVID-19: Links between individual differences and concern about COVID-19, mask wearing behaviors, and the tendency to blame China for the virus
Commanding the trend: Social media as information warfare
Social media use and perceived social isolation among young adults in the US
White House defends Trump comments on 'Kung Flu,' coronavirus testing
Americans use social media for COVID-19 info, connection
How COVID-19 changed Americans' internet habits
Mark Zuckerberg shifted Facebook's focus to groups after the 2016 election, and it's changed how people use the site
Facebook is cracking down on groups worldwide to slow the spread of political disinformation
Conspiracy theories as barriers to controlling the spread of COVID-19 in the U.S.
Rabbit hole. The New York Times
How 'Save the Children' is keeping QAnon alive. The New York Times
Fake news game confers psychological resistance against online misinformation
Global infodemiology of COVID-19: Analysis of Google web searches and Instagram hashtags
Misinformed and unaware? Metacognition and the influence of inaccurate information
2020 US social media usage: How the coronavirus is changing consumer behavior
Why you miss those casual friends so much
Effectiveness of digital technology interventions to reduce loneliness in adults: A protocol for a systematic review and meta-analysis
Conspiracy in the time of corona: Automatic detection of emerging COVID-19 conspiracy theories in social media and the news
Governance, technology and citizen behavior in pandemic: Lessons from COVID-19 in East Asia
Washington's knives are out for big tech's social media shield
Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational
The internet and the spread of conspiracy content
The dark side of social movements: Social identity, non-conformity, and the lure of conspiracy theories. Current Opinion in Psychology
Uncertainty, belongingness, and four needs for meaning
Lunar lies: The impact of informational framing and individual differences in shaping conspiracist beliefs about the moon landings
"Down the rabbit hole" of vaccine misinformation on YouTube: Network exposure study
Impact of rumors and misinformation on COVID-19 in social media
How the outbreak has changed the way we use social media
A web of conspiracy? Internet and conspiracy theory
American conspiracy theories
Why health promotion matters to the COVID-19 pandemic, and vice versa. Health Promotion International
The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance
Belief in conspiracy theories: The influence of uncertainty and perceived morality
The spread of true and false news online
Causal uncertainty beliefs and diagnostic information seeking
Lacking control increases illusory pattern perception
The emotional roots of conspiratorial perceptions, system justification, and belief in the paranormal
Regulatory focus and conspiratorial perceptions: The importance of personal control
COVID-19 in Singapore-current experience: Critical global issues that require attention and action
Some dare call it conspiracy: Labeling something a conspiracy theory does not reduce belief in it
How mainstream media helps weaponize far-right conspiracy theories. The Conversation
#Sponsored: The emergence of influencer marketing
Using psychological science to support social distancing: Tradeoffs between affiliation and disease-avoidance motivations. Social and Personality Psychology Compass
Understanding the diverging user trajectories in highly-related online communities during the COVID-19 pandemic
Conspiracy thinking in the Middle East