key: cord-0953280-3eq795to authors: Coen, Matteo; Sader, Julia; Junod-Perron, Noëlle; Audétat, Marie-Claude; Nendaz, Mathieu title: Clinical reasoning in dire times. Analysis of cognitive biases in clinical cases during the COVID-19 pandemic date: 2022-01-08 journal: Intern Emerg Med DOI: 10.1007/s11739-021-02884-9 sha: 4b626c3e70db84cc223816c507a08184d264f824 doc_id: 953280 cord_uid: 3eq795to

Cognitive biases are systematic cognitive distortions, which can affect clinical reasoning. The aim of this study was to unravel the most common cognitive biases encountered in the peculiar context of the COVID-19 pandemic. Case study research design. Primary care. Single centre (Division of General Internal Medicine, University Hospitals of Geneva, Geneva, Switzerland). A short survey was sent to all primary care providers (N = 169) taking care of hospitalised adult patients with COVID-19. Participants were asked to describe cases in which they felt that their clinical reasoning was "disrupted" because of the pandemic context. Seven cases were sufficiently complete to be analysed. A qualitative analysis of the clinical cases was performed and a bias grid encompassing 17 well-known biases was created. The clinical cases were analysed to assess the likelihood (highly likely, plausible, not likely) of the different biases for each case. The most common biases were: "anchoring bias", "confirmation bias", "availability bias", and "cognitive dissonance". The pandemic context is a breeding ground for the emergence of cognitive biases, which can influence clinical reasoning and lead to errors. Awareness of these cognitive mechanisms could potentially reduce biases and improve clinical reasoning. Moreover, the analysis of cognitive biases can offer an insight into the functioning of the clinical reasoning process in the midst of the pandemic crisis.
The COVID-19 pandemic had a major impact on the clinical reasoning processes of physicians and healthcare professionals; the term "COVID blindness" has recently been used to define this issue [1]. Clinical reasoning can be defined as "the sum of the thinking and decision-making processes associated with clinical practice (…) and it enables practitioners to take (…) the best judged action in a specific context" [2]. The pandemic context is overshadowed by multiple uncertainties, such as the ill-defined, still ongoing construction of COVID-19 illness scripts. The "illness script", i.e. the organised representation in the provider's mind of an illness, is a knowledge structure composed of four parts: "Fault" (i.e. the pathophysiological mechanisms), "Enabling Conditions" (i.e. signs and symptoms), "Consequences", and "Management" [3-6]. Illness scripts are embedded within local, organisational, socio-cultural and global factors, which are referred to as the "problem space" [2, 7] or, metaphorically, as a "maze of mental activity through which individuals wander when searching for a solution to a problem" [8]. (Matteo Coen and Julia Sader contributed equally to the article.) Further difficulties can arise when healthcare professionals are under a great deal of pressure to make decisions quickly. Time pressure is indeed a stress factor that can negatively impact diagnostic accuracy, as it can limit the number of hypotheses a physician can generate [9]. As a matter of fact, patients with COVID-19 pneumonia can worsen rapidly and require urgent care, thus critically reducing the time available for the clinical reasoning process. These contextual factors can provide a breeding ground for cognitive biases and errors in judgment and decision-making.
The concept of cognitive bias was first introduced by Tversky and Kahneman in their seminal 1974 Science paper [10]; at first extensively studied in the field of psychology, the notion of cognitive bias has since been explored in various fields, including medicine. Cognitive biases can be defined as systematic cognitive distortions (misleading and false logical thought patterns) in the processing of information that can result in impaired judgement, especially when dealing with large amounts of information or when time is limited, and thereby affect decision-making [11-14]; according to Wilson and Brekke [15], cognitive biases are, in a nutshell, "mental contaminations", capable of driving unwanted responses through unconscious and uncontrollable mental processing. Clinical reasoning can be conceptualised through the "dual process" theory (or model) [16, 17]. Stemming from the field of psychology (William James, 1890), this theory has been increasingly employed to illustrate medical reasoning since the 1970s. According to this model, two types of mental processes work jointly: System 1 (Type 1) processing, which is fast, non-analytical, intuitive, and heavily based on pattern recognition, and System 2 (Type 2) processing, which is slow, analytical, hypothetico-deductive, and involves a conscious, explicit analytical approach [18]. Diagnostic errors can occur within both systems of thinking [12, 19]. Recently, a scoping review explored the state of research on clinical reasoning in emergency medicine (of note, only a few studies focused on clinical reasoning per se, and most were treatment oriented) [20]. This review showed that the clinical reasoning process of emergency physicians is similar to that of other specialties, such as family medicine and anaesthesiology, but with contextual peculiarities.
Reasoning and decision-making during emergency situations have to be fast, which can hinder decision-making and favour the emergence of certain cognitive biases (anchoring bias, for example). A few key studies have deciphered the implications of such biases for physicians' decision-making [21, 22]. The COVID-19 pandemic, characterised by a number of clinical, diagnostic and therapeutic uncertainties and, at the same time, by hospital overload with critical patients, can disrupt physicians' clinical reasoning. Working in such a context can easily lead to errors attributable not only to inadequate knowledge, but also to cognitive biases [12, 18, 23]. This study is a case analysis aimed at unravelling the cognitive biases most commonly encountered in a hospital setting in the peculiar context of the COVID-19 pandemic. Given the exploratory nature of this study, and to allow an in-depth analysis that could bring forward new hypotheses, an analysis of several cases was chosen. A case study research design was used to investigate the cognitive biases that might arise during a health crisis. According to Hartley [24] and Yin [25], a case study is an empirical inquiry that investigates a current phenomenon within its real-life context, especially when the boundaries are not obvious. This makes this approach a tentative measure of the phenomena [26]. The use of multiple sources of evidence led us to create a small case study database, which led to what Yin would call "maintaining the chain of evidence" [27]. To explore whether the COVID-19 pandemic could affect clinical reasoning, one of the researchers (MC) sent, in May 2020, a short survey to all residents and chief residents taking care of adult patients with a diagnosis of SARS-CoV-2 infection admitted to the Division of General Internal Medicine of the University Hospitals of Geneva.
MC asked participants to describe clinical cases in which they felt that their clinical reasoning was "disrupted" because of the unique context of the pandemic. Participants were not given a text-length limit for reporting their clinical case. All participants gave informed consent to participate in the study. A trio (MC, JS, MN) with prior expertise in the understanding of cognitive structures and of clinical reasoning processes performed a qualitative analysis of the clinical cases. Using a hybrid approach (deductive and inductive), the trio created a bias grid (Table 1) encompassing the dozen cognitive biases most frequently observed in the medical setting (Table 1, N 1 to 12) [12, 16, 17, 28], as well as some well-known additional biases not frequently encountered in the medical literature (Table 1, N 13 to 17) that were identified after reading the clinical cases [9-11, 29]. The members of the trio independently analysed the clinical cases and assessed the likelihood (highly likely, plausible, not likely) of the different biases for each case. The results of the three distinct analyses were then merged. In case of disagreement between the researchers, the trio re-read and re-analysed the case reports until an agreement was reached. The results were finally cross-checked by two additional researchers (NJ and MCA) before final decisions. Out of the 169 physicians contacted, 9.4% (10; 5 females; 4 chief residents, 6 residents) provided a case. Seven reported clinical cases (approximately 200 words long each) were sufficiently complete to be suitable for analysis. The physicians who provided these cases agreed with the use and publication of the anonymised data. Transcriptions of the clinical cases are provided in Table 2. The likelihood of all cognitive biases for each referred clinical case is shown in Table 3.
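The rating procedure described above — three independent scores per (case, bias) pair, acceptance of unanimous ratings, and flagging of disagreements for joint re-analysis, followed by per-bias tallies such as "6/7" — can be sketched in a few lines of Python. This is a hypothetical illustration only; the function names, rating codes ("HL"/"P"/"NL"), and the toy scores below are invented for the example and are not the authors' actual data or tooling.

```python
from collections import Counter

# Rating codes assumed for illustration: highly likely, plausible, not likely.
LEVELS = {"HL", "P", "NL"}

def merge_ratings(ratings):
    """ratings: dict mapping (case_id, bias_name) -> list of three rater scores.
    Returns (consensus, to_discuss): unanimous scores are accepted as consensus,
    disagreements are collected for joint re-reading of the case report."""
    consensus, to_discuss = {}, []
    for key, scores in ratings.items():
        assert all(s in LEVELS for s in scores)
        if len(set(scores)) == 1:   # full agreement among the three raters
            consensus[key] = scores[0]
        else:                       # disagreement -> re-read and re-analyse
            to_discuss.append(key)
    return consensus, to_discuss

def tally_highly_likely(consensus, n_cases):
    """Count, per bias, in how many cases it was rated 'HL' (e.g. '6/7')."""
    counts = Counter(bias for (case, bias), s in consensus.items() if s == "HL")
    return {bias: f"{n}/{n_cases}" for bias, n in counts.items()}

# Toy example with two invented cases:
ratings = {
    (1, "anchoring"): ["HL", "HL", "HL"],
    (2, "anchoring"): ["HL", "P", "HL"],
    (1, "visceral"): ["NL", "NL", "NL"],
}
consensus, to_discuss = merge_ratings(ratings)
print(tally_highly_likely(consensus, n_cases=2))  # {'anchoring': '1/2'}
print(to_discuss)                                 # [(2, 'anchoring')]
```

In this sketch, only unanimous ratings enter the tally directly, mirroring the paper's description of merging three analyses and resolving disagreements by discussion before producing the counts reported in the Results.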
The most common, "highly likely", cognitive biases that stem from our analysis were cognitive dissonance (6/7), premature closure (6/7), availability bias (5/7), confirmation bias (5/7), and anchoring bias (4/7); moreover, cognitive dissonance and premature closure co-occurred as "highly likely" in 5/7 cases. Cognitive dissonance can be defined as the psychological discomfort encountered when simultaneous thoughts are in conflict with each other suggesting an incorrect diagnosis. It was encountered, for example, in case 6, in which doctors felt certain of a diagnosis of COVID-19 and discounted the multiple evidence suggesting the diagnosis was incorrect (e.g. an atypical clinical course and several negative nasopharyngeal swabs, expectorations and serologies, not supporting the initial diagnosis). The following verbatim illustrated this dissonance "often […] we asked ourselves about his (scil. the patient) safety, in a COVID-19 unit with his negative tests". Availability bias and confirmation bias occurred in 5/7 clinical cases. Availability bias, i.e. to consider a diagnosis more likely because it readily comes to mind, could be easily recognized in case 1 (verbatim: "A patient in his 70ies… coming from Italy… "Unluckily" […] at that time the epicentre of COVID-19. He was taken to intensive care in a COVID-19 area"), 3 (verbatim: [scil. patient was…] In the emergency department […] his symptoms: asthenia and mild dyspnea (possibly a "small" COVID-19)), and 7 "[verbatim] A man in his seventies consulted the emergency department because of a generalized weakness, fatigue and fever […] The patient […] was transferred to a normal medical ward for stable patients with COVID-19". Confirmation bias (i.e. 
to look only for symptoms or signs that may confirm a diagnostic hypothesis) was present in 5/7 cases. Premature closure, i.e. failing to consider reasonable alternatives after an initial diagnosis, was a major reason for the right diagnosis to go undetected. For example, in case 1, symptoms (orthopnoea), imaging findings (venous diversion on chest X-ray), and predisposing factors (ischemic/rhythmic heart disease) for acute pulmonary oedema were neglected in favour of a diagnosis of SARS-CoV-2 infection, despite a negative C-reactive protein value (i.e. no inflammation) and several SARS-CoV-2-negative RT-PCRs on nasopharyngeal swabs. In case 7, although clinical features (high fever, tachycardia, confusion, heart murmur, Osler nodules) and predisposing factors (biological heart-valve prostheses) suggested infectious endocarditis, a diagnosis of COVID-19 pneumonia was rapidly favoured, although respiratory parameters were normal and SARS-CoV-2 RT-PCR on a nasopharyngeal swab was negative. Anchoring bias (i.e. "to be unable to adjust the initial diagnostic hypothesis when further information becomes available") could be recognised in 4/7 clinical cases, for example in case 4 (verbatim: "Initially we suspected COVID-19. The initial investigations were concentrated on the viral etiology. Given the rapid deterioration of his condition, the effusions which recurred, the decline in renal function […] and 2 negative SARS-CoV-2 RT-PCR in nasopharyngeal swab… he was eventually transferred to a non-COVID-19 hospital […]"). The cognitive biases that were least likely in the analysed cases were visceral bias, choice overload bias and decision fatigue (evaluated as "not likely" in 6/7, 6/7 and 4/7 cases, respectively).
Cognitive dissonance and premature closure were associated with default bias and in-group bias in 3/7 clinical cases. In case 6, for example, where both cognitive dissonance and premature closure were "highly likely", default bias (i.e. sticking with previously made decisions regardless of changing circumstances) and in-group bias/favouritism (i.e. favouring those who belong to one's own group) could easily be evoked. Not only did doctors adhere to the prior hypothesis despite scanty evidence (verbatim: "[…] in the face of uncertainty, the patient remains in a COVID-19 unit"), but the hypothesis was also judged "positively" by the group (verbatim: "Often, with the infectious disease specialists and the resident, we asked ourselves […]"), discounting the validity of information supporting an alternative diagnosis. Stressful situations (e.g. characterised by time pressure, high stakes, multitasking, fatigue, patient overload) [30] are particularly common in the intensive care unit (ICU) and the operating room (defined as "hotbeds for human error") [31]. Likewise, emergency medicine, characterised by "poor access to information and with limited time to process it" [32], is potentially "a natural laboratory of error" [33]. Among the cognitive and affective factors (either organization- or individual-related) that can act as error-catalysing factors [34], cognitive biases, i.e. systematic cognitive distortions [11-15, 17, 35], are associated with diagnostic inaccuracies and flawed clinical decisions [21, 22, 36]. Interestingly, stress can exacerbate biases by affecting the emotion-cognition balance [37]. Likewise, a stressor like time pressure can limit the number of hypotheses a physician can generate; this is extremely suggestive of a premature closure of the diagnostic process [9]. Broad assessment of cognitive bias in the emergency setting has seldom been explored.
We could identify only one original paper [38] that explored the most common biases in emergency physicians. Using the Rationality Quotient™ Test, the authors found that, although common, cognitive biases were less represented in emergency physicians than in laypersons (in particular blind spot and representative bias). The COVID-19 pandemic shares similarities with the emergency setting, inasmuch as it is a stressful context dominated by uncertainty [7, 39]. A literature search focused on the occurrence of cognitive biases in this pandemic context retrieved only a few original articles (mostly position papers, reviews, case reports and small case series). In particular, the study by Lucas et al. [40] investigated the patients upgraded to the ICU following admission to non-critical care units during the first wave of the COVID-19 pandemic (March to July 2020; N = 18) in a US hospital. A group of physicians was asked to analyse the cases and to assign one cognitive bias to each (chosen among availability bias, anchoring bias, premature closure, and confirmation bias). They concluded that premature closure (72.2%), anchoring (61.1%), and confirmation bias (55.6%) were most likely to be "responsible" for the patients' upgrades. Compared to Lucas et al. [40], we did not focus exclusively on patients upgraded to the ICU: all patients hospitalised for COVID-19 were potentially worthy of analysis. Also, employing a hybrid approach involving deductive (theoretical) and inductive (data-driven) processes for bias grid construction allowed additional biases to emerge from the study of cases. This approach is particularly useful to capture the richness and complexity of bias research [41]. To the best of our knowledge, our study is the first to perform a broad investigation of the types of biases that occurred during the COVID-19 pandemic and of how they can affect decision-making. Moreover, we also analysed how cognitive biases interact and influence each other.
The concept of cognitive dissonance was introduced by the psychologist Leon Festinger in 1957 [42]. Cognitive dissonance occurs when contradictory cognitions intersect, producing discomfort and underlying tension. To minimise these feelings, avoidance and/or rejection responses are often elicited; this can manifest, as in the collected cases, by discounting or ignoring information (e.g. negative SARS-CoV-2 tests, an atypical clinical course) that disconfirms a previous hypothesis (e.g. a COVID-19 diagnosis). It is not surprising that premature closure was "highly likely" as frequently as cognitive dissonance. Indeed, premature closure is known to be encountered more commonly than any other type of cognitive bias, at least in medicine, and it is linked to a high proportion of diagnostic errors [43-47]. In the "liquid time" of the pandemic (i.e. characterised by rapid change, where the only constant is change itself, according to Bauman [48, 49]), time pressure can be particularly detrimental and force premature rejection of alternatives, thus leading to premature diagnostic closure. Indeed, inferring (i.e. formulating a hypothesis when clear deductions are not available) [50] that a diagnosis is possible is less time consuming than identifying all the aspects of the non-preferred, yet possible, alternatives (inference of impossibility) [51, 52]. One can conceive that premature closure can arise from an attempt to avoid cognitive dissonance by prematurely rejecting other possible, but "dissonant", diagnoses; indeed, premature closure frequently co-occurred with the bias of cognitive dissonance.
The association of cognitive dissonance and premature closure with in-group bias (or in-group favouritism) is understandable, inasmuch as adhering to ideas coming from members of one's in-group can help overcome the discomfort produced by contradictory information and prematurely close on a "shared", but wrong, diagnosis. This can be seen, for example, in case 6, where, despite uncomfortable uncertainties regarding the patient's COVID status, the in-group peers decided not to move the patient to another "non-COVID" unit. According to the psychologist Irving L. Janis, deep involvement in a cohesive in-group can lead to groupthink, whereby the striving for conformity or harmony among the members of the in-group "override[s] their motivation to realistically appraise alternative courses of action" [53]; this phenomenon can result in irrational decision-making [54]. Cognitive dissonance and premature closure also tended to associate with default (or status quo) bias, i.e. when people stick to previously made decisions and/or prefer things to stay the same by doing nothing (inertia), in spite of changing circumstances [55]. Time pressure can foster this bias, as recently suggested by Couto et al. [56]. Moreover, preferring status quo options can serve to reduce the negative emotions connected with choice making (anticipatory emotions) [57, 58]. Adhering to the status quo can be seen as a coping strategy against the stressful and cognitively expensive situation of decision making in the "liquid times" of the pandemic, dominated by unpredictability and uncertainty and involving conflicting attitudes, beliefs or behaviours (i.e. cognitive dissonance). In this context, premature closure is a rapid way to preserve a "no change" attitude. Availability bias is a mental shortcut (heuristic) that helps decision-making occur faster, since it reduces the time and effort involved in decision making [16, 17].
It often occurs as a consequence of recent experiences with similar clinical cases [29]. In such situations, physicians tend to weight their clinical reasoning toward more recent information that easily comes to mind [59]. During the pandemic waves, physicians were exposed to multiple patients with similar clinical features in rapid succession; this is indeed a suitable condition for the occurrence of availability bias (see case 3 above). Confirmation bias is the tendency to give greater weight to data that support a preliminary diagnostic assumption while failing to seek, or dismissing, evidence contradicting the favoured hypothesis [60]. This bias can lead to premature closure and become a source of error. It has been suggested that confirmation bias can arise from an attempt to avoid cognitive dissonance [61]. Closely related to confirmation bias is anchoring bias [60]. This bias occurs when interpreting evidence, and refers to the tendency of physicians to prioritise information and data supporting their initial diagnosis, making them unable to adjust their initial hypothesis when further information becomes available. This anchoring can eventually lead to wrong decisions, as in clinical case 1. It is worth noting that, akin to our study, the most frequent biases in Lucas' study [40] were premature closure, anchoring bias and confirmation bias, notwithstanding the methodological and conceptual differences. We can thus hypothesise that the pandemic context had the potential to heavily influence the clinical reasoning processes of physicians, and that each clinical case carried different embedded risks for cognitive biases. Different strategies of cognitive "debiasing" (i.e. the mental correction of a mental contamination) [15] have been suggested [11, 12, 29, 62, 63]. Among these, deliberate reflection (i.e.
looking for evidence that fits or contradicts an initial diagnostic hypothesis) has shown some success [64], while the efficiency of other approaches still requires further investigation. Debiasing is far from being simple, satisfying, and effective. The reasons are many [12, 35]: Type 2 thinking is not unequivocally less prone to cognitive bias than Type 1 thinking; debiasing involves meta-awareness and metacognition [65], i.e. self-diagnostic processes that are per se error prone [66]; and bias identification is unreliable, even among experts, and is often itself biased (hindsight bias) [67]. Paraphrasing Kahneman, despite a lifetime spent studying bias, we are far better at recognising the errors of others than our own [68]. Moreover, one should not forget that the construction of the "illness script" of COVID-19 is still ill defined and ongoing. Until solid knowledge and understanding of the disease are built up, and knowledge gaps are filled, clinical reasoning cannot but be flawed [21, 66, 69-71]. Learning which biases arise, and how, is not a golden hammer to mitigate errors in clinical reasoning. However, this awareness remains "interesting, humbling and motivating" [66], and may facilitate a much desirable shift in medical culture: improving the tolerance of uncertainty in clinical life [72, 73]. One of the key strengths of this study is that it has allowed us to explore, through an empirical design in a real clinical context, which cognitive biases were at play, and it has deepened our understanding of how they interact in these unforeseen and ever-changing contexts. This study has some limitations. Participants were asked to describe clinical cases in which they felt their clinical reasoning could have gone awry. The way they described their cases could thus potentially reflect their own feelings about cognitive biases. The overall response rate was low (9.4%), in an expected range for medical practitioners [74, 75].
This could be attributed to the context of the COVID-19 health crisis (particularly time pressure), as well as to the effort required from participants, who were asked to analyse and describe their clinical cases. Additionally, participants may have been unaware of potential clinical reasoning problems arising when taking care of patients with possible COVID-19. We also solicited a large number of physicians, without specific targeting, which explains the large denominator. This case study design is exploratory in nature and rooted in an empirical inquiry, for which Yin described "maintaining the chain of evidence" [27]. This allows the exploration of the clinical reasoning process at play in the underlying context of the pandemic, and of how this context might hinder decision-making. Our results can, therefore, be seen as clues to pursue and investigate further. Cognitive biases are part of everyday medical practice. The analysis of clinical cases in the peculiar context of the COVID-19 pandemic has highlighted several biases that can affect clinical reasoning and lead to errors. The emergence of cognitive biases in this new context, which shares similarities with emergency medicine, had never been assessed before and offers an insight into the functioning of the "clinical brain" in the midst of the pandemic crisis. Although physicians should be aware of these cognitive mechanisms in order to potentially reduce biases and improve clinical reasoning, this is far from sufficient. Knowledge and experience matter. For the moment, the illness script of COVID-19 is still under construction, and we may hope for a mitigation of clinical errors as the knowledge deficit improves.

References

Brown L (2020) COVID blindness
Clinical decision making and multiple problem spaces
Issues of generality in medical problem-solving. In: Schmidt HG (ed) Tutorials in problem-based learning
The role of illness scripts in the development of medical diagnostic expertise: results from an interview study
Scripts and clinical reasoning
Illness script development in pre-clinical education through case-based clinical reasoning training
Clinical reasoning and COVID 19 pandemic: current influencing factors. Let us take a step back! Intern Emerg Med
Problem solving
Factors underlying suboptimal diagnostic performance in physicians under time pressure
Judgment under uncertainty: heuristics and biases
Cognitive debiasing 1: origins of bias and theory of debiasing
Diagnostic errors and flaws in clinical reasoning: mechanisms and prevention in practice
Influence of perceived difficulty of cases on physicians' diagnostic reasoning
The framing of decisions and the psychology of choice
Mental contamination and mental correction: unwanted influences on judgments and evaluations
Dual processing and diagnostic errors
Diagnostic error and clinical reasoning
An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory
Critical thinking, biases and dual processing: the enduring myth of generalisable skills
A scoping review of physicians' clinical reasoning in emergency departments
The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking
Context is everything or how could I have been that stupid? Healthc Q
Conditions for intuitive expertise: a failure to disagree
Case study research. In: Cassell CSG (ed) Essential guide to qualitative methods in organizational research
Case study research: design and methods, 5th edn
The use of qualitative content analysis in case study research
Applications of case study research
The benefits of flexibility: the pedagogical value of instructions to adopt multifaceted diagnostic reasoning strategies
Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents
ED cognition: any decision by anyone at any time
The importance of cognitive errors in diagnosis and strategies to minimize them
Emergency medicine: a practice prone to error
Gating the holes in the Swiss cheese (part I): expanding professor Reason's model for patient safety
Implicit bias in healthcare: clinical practice, research and decision making
Cognitive biases associated with medical decisions: a systematic review
Stress potentiates decision biases: a stress-induced deliberation-to-intuition (SIDI) model
Cognitive biases in emergency physicians: a pilot study
Understanding clinical decision-making during the COVID-19 pandemic: a cross-sectional worldwide survey
Upgrades to intensive care: the effects of COVID-19 on decision-making in the emergency department
Transforming qualitative information: thematic analysis and code development
A theory of cognitive dissonance
Overconfidence as a cause of diagnostic error in medicine
Premature conclusions in diagnostic reasoning
Diagnostic errors
Premature conclusions in the diagnosis of iron-deficiency anemia: cause and effect
Collaborative decision making in liquid times
Liquid modernity
Multiple spaces of choice, engagement and influence in clinical decision making
The cybernetic theory of decision
Illusions in reasoning
Victims of groupthink: a psychological study of foreign-policy decisions and fiascoes
A social identity maintenance model of groupthink
Status quo bias in decision making
Investigating the origin and consequences of endogenous default options in repeated economic choices
Status quo selection increases with consecutive emotionally difficult decisions. Poster presented at the meeting of the Society for Judgment and Decision Making
The psychology of doing nothing: forms of decision avoidance result from reason and emotion
Flaws in clinical reasoning: a common cause of diagnostic error
Confirmation bias in medical decision-making
To err is human: a case-based review of cognitive bias and its role in clinical decision making
Effects of reflective practice on the accuracy of medical diagnoses
Developing checklists to prevent diagnostic error in emergency room settings
Exploring the role of salient distracting clinical features in the emergence of diagnostic errors and the mechanisms through which reflection counteracts mistakes
Dual processing and diagnostic errors
Premature closure? Not so fast
Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups
Thinking, fast and slow. Farrar, Straus and Giroux
The role of biomedical knowledge in clinical reasoning: a lexical decision study
Diagnostic error: the hidden epidemic
Disrupting diagnostic reasoning: do interruptions, instructions, and experience affect the diagnostic accuracy and response time of residents and emergency physicians
Tolerating uncertainty: the next medical revolution?
Face à l'incertitude : humilité, curiosité et partage [Facing uncertainty: humility, curiosity and sharing]
A very low response rate in an on-line survey of medical practitioners
National survey of COVID 19 challenges

The authors wish to thank all the doctors who