key: cord-0880529-cclj2eoc
authors: Murad, M. Hassan; Nayfeh, Tarek; Urtecho Suarez, Meritxell; Seisa, Mohamed O.; Abd-Rabu, Rami; Farah, Magdoleen Hassan Eltayeb; Firwana, Mohammed; Hasan, Bashar; Jawaid, Tabinda; Shah, Sahrish; Torres Roldan, Victor; Prokop, Larry; Wang, Zhen; Saadi, Samer Mohir
title: A Framework for Evidence Synthesis Programs to Respond to a Pandemic
date: 2020-06-16
journal: Mayo Clin Proc
DOI: 10.1016/j.mayocp.2020.05.009
sha: 1059074c8ac2daa5336ef61578100f061ae98604
doc_id: 880529
cord_uid: cclj2eoc

The coronavirus disease 2019 (COVID-19) pandemic requires making rapid decisions based on sparse and rapidly changing evidence. Evidence synthesis programs conduct systematic reviews for guideline developers, health systems clinicians, and decision-makers that usually take an average of 6 to 18 months to complete. We present a framework for evidence synthesis programs to respond to pandemics that has proven feasible and practical during the COVID-19 response in a large multistate health system employing more than 78,000 people. The framework includes four components: an approach for conducting rapid reviews, a repository of rapid reviews, a registry of all original studies about COVID-19, and a twice-weekly prioritized update of new evidence sent to key stakeholders. As COVID-19 will not be our last pandemic, we share the details of this framework to allow replication in other institutions and re-implementation in future pandemics.

The coronavirus disease 2019 (COVID-19) pandemic has challenged all parts of the health care enterprise with the need to make rapid decisions based on sparse and rapidly changing evidence.1 Evidence synthesis programs are research units that conduct systematic reviews for guideline developers, health systems clinicians, and decision-makers; these programs had to change their modus operandi significantly or risk becoming irrelevant to decision-making. A typical systematic review takes an average of 6 to 18 months to complete,2 making this approach outdated and impractical during pandemics. Much has been written about rapid reviews as an alternative that trades some rigor for speed and concision. Surveys of health system stakeholders have consistently shown a preference for rapid reviews.3 However, there is no consensus on how to define or conduct rapid reviews,4 and little is known about how to make such reviews operational during times of crisis. In this exposition, we present a framework for an evidence synthesis program's response to a pandemic. This framework has proven feasible and practical during the COVID-19 response in a large multistate academic center employing more than 78,000 people and caring for more than 1,200,000 patients per year from all 50 US states and 138 countries. As COVID-19 will not be the last pandemic we face, we share this experience to allow replication in other institutions and re-implementation in future pandemics.

The Mayo Clinic evidence synthesis program is one of the Evidence-based Practice Centers designated by the Agency for Healthcare Research and Quality. The program consists of two faculty members, librarians, 10 to 15 core investigators, and numerous clinicians and ad hoc collaborators, and is housed within the Mayo Clinic Robert D. and Patricia E. Kern Center for the Science of Health Care Delivery. The evidence synthesis program developed a response within 2 weeks of the World Health Organization declaration of a pandemic due to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on March 11, 2020.
The development of the response followed a quality improvement methodology in which the current state of evidence synthesis was evaluated and found insufficient, relevant stakeholders were identified, and rapid and iterative cycles of intervention were initiated and modified repeatedly based on stakeholders' feedback. The response consisted of four components (Figure).

Figure. Framework for evidence synthesis programs to respond to pandemics. The framework includes four components; the figure presents a description of each component and relevant implementation information.

We called the first component of the framework, the twice-weekly prioritized update, "What is New?" It consists of a list of COVID-19 studies published in the previous 3 days. A search query was developed early in the pandemic in collaboration with medical reference librarians with expertise in systematic reviews (Supplemental Material, available online at http://www.mayoclinicproceedings.org). This product is intended to require minimal manual processing. The search output is categorized into topic areas (eg, prevention, diagnosis, treatment, and prognosis) and is prioritized by highlighting the studies expected to impact or change practice and placing them at the top of the list. We arbitrarily assumed that such studies are those with any of these characteristics: more than 100 participants, randomized design, systematic review or guideline, or publication in one of the 10 journals with the highest impact factors (a minimal code sketch of this rule appears at the end of this section).

The second component is a registry of all studies about COVID-19. When 3 days have passed, the "What is New?" list expires and a new list is generated; the expired list is entered into a database that contains all studies about COVID-19. Consequently, this database contains all studies published since the date of the first case report of COVID-19. This database is critical for conducting any future systematic or rapid review because reviewers need only search this database for studies that fit their new question, as opposed to designing and executing searches in bibliographic databases.

The third component comprises the rapid reviews themselves. Frontline clinicians and health system guideline developers provided questions about practice dilemmas they were facing during the pandemic. The program conducted rapid reviews to answer these questions with a turnaround time of 3 to 4 days. The reviews emphasized speed and brevity. Steps that deviated from a standard systematic review process included searching a single database, restricting to English-language publications, relying heavily on general search engines such as Google, having a single reviewer select and extract studies, and rarely performing meta-analysis.

The fourth component is a repository of reviews. The repository included systematic and rapid reviews identified through the following sources: rapid reviews about COVID-19 conducted by the Mayo program, reviews identified through the twice-weekly search, reviews identified through a dedicated overview-of-systematic-reviews search (Supplemental Material), reviews retrieved from a discrete list of websites (Supplemental Material), and reviews identified through networking with other academic evidence synthesis programs. Some reviews included indirect evidence derived from studies about other coronaviruses or respiratory infections, when appropriate. Reviews were indexed in population, intervention, comparison, outcome (PICO) format with a quantitative or narrative summary of the main outcomes and were also indexed according to the same topic areas (a sketch of such an indexed entry also follows this section). Grading of the certainty in the evidence, a critical component of any synthesis,5 was extracted from the existing review when available.
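To illustrate, the following is a minimal sketch, not the authors' implementation, of the "What is New?" prioritization rule described above. The record fields (n_participants, design, journal, and so on) and the journal set are hypothetical stand-ins for whatever a search export actually provides; the thresholds follow the text.

```python
# Minimal sketch of the "What is New?" prioritization heuristic described above.
# Field names and the journal list are hypothetical; the flagging criteria follow
# the text: more than 100 participants, randomized, systematic review or
# guideline, or publication in one of the 10 highest-impact journals.

from dataclasses import dataclass

TOP_JOURNALS = {"NEJM", "Lancet", "JAMA"}  # placeholder for the 10 highest-impact journals

@dataclass
class Study:
    title: str
    topic: str                 # eg, "prevention", "diagnosis", "treatment", "prognosis"
    n_participants: int = 0
    randomized: bool = False
    design: str = "other"      # eg, "systematic review", "guideline", "cohort"
    journal: str = ""

def likely_practice_changing(study: Study) -> bool:
    """Flag a study if it has ANY of the four characteristics named in the text."""
    return (
        study.n_participants > 100
        or study.randomized
        or study.design in {"systematic review", "guideline"}
        or study.journal in TOP_JOURNALS
    )

def build_what_is_new(studies: list[Study]) -> dict[str, list[Study]]:
    """Group studies by topic area, placing flagged studies at the top of each list."""
    by_topic: dict[str, list[Study]] = {}
    for study in studies:
        by_topic.setdefault(study.topic, []).append(study)
    for items in by_topic.values():
        items.sort(key=likely_practice_changing, reverse=True)  # flagged studies first
    return by_topic
```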
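Similarly, a repository entry indexed in PICO format might be represented as below. The schema is an illustrative assumption, not the program's actual database design; the certainty grade is carried over from the source review when reported, pending the final grading pass.

```python
# Sketch of a repository entry indexed in PICO format, per the description above.
# All field names and example values are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewEntry:
    population: str
    intervention: str
    comparison: str
    outcome: str
    topic: str                       # same topic areas used for "What is New?"
    summary: str                     # quantitative or narrative summary of main outcomes
    certainty: Optional[str] = None  # eg, "low"; extracted from the review if reported
    source: str = ""                 # eg, "Mayo rapid review", "twice-weekly search"

# Example entry (hypothetical values):
entry = ReviewEntry(
    population="adults hospitalized with COVID-19",
    intervention="candidate therapy X",
    comparison="standard care",
    outcome="mortality",
    topic="treatment",
    summary="narrative summary of the main findings",
    certainty="low",
    source="twice-weekly search",
)
```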
A final, rigorous grading process is planned when time allows.

Identifying a target audience beyond the stakeholders who requested a particular review was challenging in a complex health system. A mass email strategy was considered ineffective and was not pursued. We eventually adopted the approach of identifying key partners working on developing guidelines and institutional policies and procedures, and we created an internal website that made all the data available without the need for a password. Another important audience was researchers working on various aspects of COVID-19, such as conducting trials or developing systematic review protocols.

We describe the development and implementation of a framework for evidence synthesis programs during the COVID-19 pandemic. This framework was developed using rapid, iterative cycles and was found feasible to implement in a large multistate health system. After 1 month of implementation, the team had conducted seven rapid reviews and indexed more than 100 reviews; the database contained approximately 2000 studies. This framework is likely generalizable to the extent that it can be activated in future pandemics or other urgent crises, such as large-scale novel environmental exposures.

Developing a list of studies published in the previous 3 days is surprisingly difficult and requires a manual process. The various date filters in bibliographic databases do not function well and confuse electronic publication dates with print dates (a sketch of one workaround follows the conclusion). The study prioritization scheme that we used can clearly introduce publication bias, particularly in its focus on specific journals. Existing systematic review registries (eg, the International Prospective Register of Systematic Reviews) were neither up-to-date nor helpful. A major disincentive to conducting rapid reviews is the difficulty of publishing them in peer-reviewed journals.

Rapid reviews themselves have important limitations. For example, having a single reviewer select studies, rather than two independent reviewers, can introduce an error rate of approximately 11%.6 Within weeks of the pandemic, multiple systematic reviews had been published about the same topics, which created the challenge of determining which review to use, which one is most recent, and the extent of overlap of included studies across reviews. Another limitation relates to any new or novel topic: earlier studies likely exaggerate treatment effects, a pattern known as the Proteus phenomenon.7,8 Extrapolation from studies about other coronaviruses or respiratory infections yields indirect evidence and lower certainty.

We present a framework for evidence synthesis programs to respond to pandemics that has proven feasible and practical during the response to COVID-19. Although some health systems may not have a similar infrastructure for timely implementation of rapid evidence synthesis activities, collaboration across health systems is critical so that emerging knowledge is efficiently shared across systems and globally.

Supplemental material can be found online at http://www.mayoclinicproceedings.org. Supplemental material attached to journal articles has not been edited, and the authors take responsibility for the accuracy of all data.
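To make the date-handling pitfall concrete, the following is a minimal sketch, under the assumption that each record carries hypothetical epub_date and print_date fields, of a recency check that prefers the electronic publication date. Real bibliographic exports vary, and as noted in the discussion, manual verification remains necessary.

```python
# Sketch of a recency check that avoids confusing electronic and print dates.
# The record fields (epub_date, print_date) are hypothetical; real exports vary,
# and manual verification is still required, as noted above.

from datetime import date, timedelta
from typing import Optional

def effective_date(epub_date: Optional[date], print_date: Optional[date]) -> Optional[date]:
    """Prefer the earlier of the two dates; fall back to whichever exists."""
    if epub_date and print_date:
        return min(epub_date, print_date)
    return epub_date or print_date

def published_in_last_3_days(epub_date: Optional[date],
                             print_date: Optional[date],
                             today: Optional[date] = None) -> bool:
    """True if the record's effective publication date falls within the 3-day window."""
    today = today or date.today()
    effective = effective_date(epub_date, print_date)
    return effective is not None and today - timedelta(days=3) <= effective <= today
```

Taking the earlier of the two dates reflects the goal of catching studies as soon as they appear online rather than when they reach print.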
References

1. Guide to understanding the 2019 novel coronavirus.
2. Methodology in conducting a systematic review of systematic reviews of healthcare interventions.
3. A Framework for Conceptualizing Evidence Needs of Health Systems.
4. What are the best methodologies for rapid reviews of the research evidence for evidence-informed decision making in health policy and practice: a rapid review.
5. Clinical practice guidelines: a primer on development and dissemination.
6. Error rates of human reviewers during abstract screening in systematic reviews.
7. Early studies reported extreme findings with large variability: a meta-epidemiologic study in the field of endocrinology.
8. Treatment effect in earlier trials of patients with chronic medical conditions: a meta-epidemiologic study.