Evidence Summary

 

Naming Specific Adverse Effects Improves Relative Recall for Search Filters Identifying Literature on Surgical Interventions in MEDLINE and Embase

 

A Review of:

Golder, S., Wright, K., & Loke, Y.K. (2018). The development of search filters for adverse effects of surgical interventions in MEDLINE and Embase. Health Information and Libraries Journal, 35(2), 121-129. https://doi.org/10.1111/hir.12213

 

 

Reviewed by:

Ann Glusker

Research & Data Coordinator

National Network of Libraries of Medicine, Pacific Northwest Region

University of Washington Health Sciences Library

Seattle, Washington, United States of America

Email: glusker@uw.edu

 

Received: 1 Dec. 2018    Accepted: 16 Jan. 2019

 

 

© 2019 Glusker. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

DOI: 10.18438/eblip29537

 

 

Abstract

 

Objective – “To develop and validate search filters for MEDLINE and Embase for the adverse effects of surgical interventions” (p. 121).

 

Design – From a universe of systematic reviews, the authors created “an unselected cohort…where relevant articles are not chosen because of the presence of adverse effects terms” (p. 123). The studies referenced in the cohort reviews were extracted to create an overall citation set. From this, three equal-sized sets of studies were created by random selection and used for: development of a filter (identifying search terms); evaluation of the filter (testing how well it worked); and validation of the filter (assessing how well it retrieved relevant studies).
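As a minimal illustration of this three-way split (a hypothetical Python sketch, not the authors' actual procedure; the record identifiers and function name are invented):

import random

def split_into_thirds(citation_ids, seed=0):
    # Randomly partition a citation set into three roughly equal sets:
    # development, evaluation, and validation.
    ids = list(citation_ids)
    random.Random(seed).shuffle(ids)
    third = len(ids) // 3
    return ids[:third], ids[third:2 * third], ids[2 * third:]

# Example with hypothetical record identifiers
development, evaluation, validation = split_into_thirds(range(1, 359))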

 

Setting – Systematic reviews of adverse effects from the Database of Abstracts of Reviews of Effects (DARE), published in 2014.

 

Subjects – 358 studies derived from the references of 19 systematic reviews (352 available in MEDLINE, 348 available in Embase).

 

Methods – Word and phrase frequency analysis was performed on the development set of articles to identify a list of terms, beginning with the term producing the highest recall from titles and abstracts and continuing until adding further search terms retrieved no new records. The search strategy thus developed was then tested on the evaluation set of articles. Here the strategy retrieved all of the articles that could be obtained using generic search terms; however, adding specific search terms (such as the MeSH term “surgical site infection”) improved recall. Finally, the strategy incorporating both generic and specific search terms for adverse effects was applied to the validation set of articles. The search strategies used are included in the article, as is a list, in the discussion section, of MeSH and Embase indexing terms that are specific to, or suggestive of, adverse effects.
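The term-selection process described here can be read as a greedy loop: keep the term that adds the most not-yet-retrieved relevant records, and stop once no candidate adds anything new. The sketch below is an illustrative reading of that logic, not the authors' code; all names are hypothetical, and recall is counted against the development set's known relevant records.

def build_filter(candidate_terms, relevant_ids, record_text):
    # record_text maps a record id to its title plus abstract.
    selected, retrieved = [], set()
    while True:
        best_term, best_gain = None, 0
        for term in candidate_terms:
            if term in selected:
                continue
            hits = {rid for rid in relevant_ids
                    if term.lower() in record_text[rid].lower()}
            gain = len(hits - retrieved)
            if gain > best_gain:
                best_term, best_gain = term, gain
        if best_term is None:  # no remaining term retrieves new records; stop
            break
        selected.append(best_term)
        retrieved |= {rid for rid in relevant_ids
                      if best_term.lower() in record_text[rid].lower()}
    return selected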

 

Main Results – “In each case the addition of specific adverse effects terms could have improved the recall of the searches” (p. 127). This was true for all six cases (the development, evaluation, and validation study sets, for each of MEDLINE and Embase) in which specific terms were added to searches using generic terms and the recall percentages were compared.
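For reference, the recall figures compared here are relative recall against the reference set of known relevant studies; the standard definition (not a formula reproduced from the paper) is:

\text{relative recall} = \frac{\text{reference-set records retrieved by the search}}{\text{reference-set records available in the database}}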

 

Conclusion – While no filter can retrieve 100% of the items in a given reference standard set of studies on adverse effects (since title and abstract fields may not contain any indication of relevance to the topic), adding specific adverse effects terms to generic ones when developing filters is shown to improve recall for surgery-related adverse effects, as it does for drug-related adverse effects. The use of filters requires user engagement and critical analysis; at the same time, deploying well-constructed filters can have many benefits, including helping users, especially clinicians, get a search started; managing a large and unwieldy set of retrieved citations; and suggesting new search strategies.

 

Commentary

 

This paper adds to the substantial literature on the creation and limitations of search filters for biomedical citation searching in support of systematic reviews. The authors have been prolific contributors to this literature; they appear as authors on ten of the fifteen articles referenced in this article. This paper builds on their earlier work on non-drug interventions (Golder et al., 2017). While they could not characterize search terms for non-drug interventions in general, they found that they could characterize terms for surgical interventions, and this study is the outcome of that exploration. The resulting findings, which build on their own past studies in a methodical and informed manner, create a valuable resource for both librarians and clinicians, and suggest further exploration on the part of the authors, as they note in their conclusions section.

 

The authors mention two limitations of their work: the sample size of articles examined is small, and they lack a true measure of precision. In addition, for this evidence summary, the methodology was systematically assessed using Glynn’s critical appraisal checklist (2006), which raised questions about both sample size and replicability.

 

As for the sample size, there may be existing resources that would be appropriate for further research; one possibility is the extensive McMaster PLUS citation database developed by HiRU, the Health Information Research Unit at McMaster University (Wilczynski, 2011; available at https://hiru.mcmaster.ca/hiru/HIRU_McMaster_PLUS_projects.aspx). Wilczynski’s description of McMaster’s approach to search filter development highlights the specialized nature of this work and expands on some terms and concepts that Golder et al. (2018) outlined. Another useful article for clear definitions and process descriptions is the one on MEDLINE indexing and the adverse effects of oral contraceptives (Wieland & Dickersin, 2005). This is not a criticism, but rather an acknowledgement that it may take additional reading beyond the Golder et al. (2018) article to master its content. A brief word about why MEDLINE and Embase were the chosen databases for searching would also have been valuable; a recent study by Lam et al. (2018) offers interesting insights into the nature and uses of these two resources and illuminates the context of the paper reviewed here.

 

This leaves, besides the question of a true measure of precision (which is beyond our scope), the question of replicability. The explanations of the process and decisions in the article are meticulous and complete, but complex. This means that there are decision points that might potentially be handled differently by a replicating researcher, such as determining which articles actually had adverse effects as a primary outcome (especially given that disagreements between the researchers were resolved by discussion alone, without a third party). However, this is a very minor point.

 

In conjunction with some of the other supporting pieces mentioned, this paper is overall an excellent, rigorously conducted, and clearly presented study with which to introduce oneself or one’s students to the area of search filter development. It also makes important contributions to the armamentarium of librarians and clinicians as they search for studies to guide their work. For those performing and supporting systematic reviews, it is extremely useful to have such a validated set of search strategies, for reasons of both efficiency and consistency.

 

References

 

Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. https://doi.org/10.1108/07378830610692154

 

Golder, S., Wright, K., & Loke, Y.K. (2017). The feasibility of a search filter for the adverse effects of nondrug interventions in MEDLINE and Embase. Research Synthesis Methods, 8(4), 506-513. https://doi.org/10.1002/jrsm.1267

 

Lam, M.T., De Longhi, C., Turnbull, J., Lam, H.R., & Besa, R. (2018). Has Embase replaced MEDLINE since coverage expansion? Journal of the Medical Library Association, 106(2), 227-234. https://doi.org/10.5195/jmla.2018.281

 

Wieland, S., & Dickersin, K. (2005). Selective exposure reporting and MEDLINE indexing limited the search sensitivity for observational studies of the adverse effects of oral contraceptives. Journal of Clinical Epidemiology, 58(6), 560-567. https://doi.org/10.1016/j.jclinepi.2004.11.018

 

Wilczynski, N. (2011). McMaster University – HiRU’s approach to search filter development. Retrieved from https://hiru.mcmaster.ca/hiru/HiRU_approach.pdf