Classics

 

Eysenbach, Tuische and Diepgen’s Evaluation of Web Searching for Identifying Unpublished Studies for Systematic Reviews: An Innovative Study Which is Still Relevant Today

 

A Review of:

Eysenbach, G., Tuische, J., & Diepgen, T. L. (2001). Evaluation of the usefulness of Internet searches to identify unpublished clinical trials for systematic reviews. Medical Informatics and the Internet in Medicine, 26(3), 203-218. http://dx.doi.org/10.1080/14639230110075459

 

Reviewed by:

Simon Briscoe
Information Specialist
National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South West Peninsula
University of Exeter Medical School
Exeter, United Kingdom
Email: s.briscoe@exeter.ac.uk

 

Received: 7 Mar. 2016    Accepted: 11 May 2016

 

 

© 2016 Briscoe. This is an Open Access article distributed under the terms of the Creative Commons-Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – To consider whether web searching is a useful method for identifying unpublished studies for inclusion in systematic reviews.

 

Design – Retrospective web searches using the AltaVista search engine were conducted to identify unpublished studies (specifically, clinical trials) for completed systematic reviews whose original searches had not used a web search engine.

 

Setting – The Department of Clinical Social Medicine, University of Heidelberg, Germany.

 

Subjects – n/a

 

Methods – Pilot testing of 11 web search engines was carried out to determine which could handle complex search queries. Pre-specified search requirements included the ability to handle Boolean and proximity operators, and truncation searching. A total of seven Cochrane systematic reviews were randomly selected from the Cochrane Library Issue 2, 1998, and their bibliographic database search strategies were adapted for the web search engine, AltaVista. Each adaptation combined search terms for the intervention, problem, and study type in the systematic review. Hints to planned, ongoing, or unpublished studies retrieved by the search engine, which were not cited in the systematic reviews, were followed up by visiting websites and contacting authors for further details when required. The authors of the systematic reviews were then contacted and asked to comment on the potential relevance of the identified studies.

 

Main Results – Hints to 14 unpublished and potentially relevant studies, corresponding to 4 of the 7 randomly selected Cochrane systematic reviews, were identified. Of the 14 studies, 2 were considered irrelevant to the corresponding systematic review by the systematic review authors, and the relevance of a further 3 could not be clearly ascertained. This left 9 studies which were considered relevant to a systematic review. In addition to this main finding, the pilot study to identify suitable search engines found that AltaVista was the only search engine able to handle the complex searches required to search for unpublished studies.

 

Conclusion – Web searches using a search engine have the potential to identify studies for systematic reviews. However, web search engines have considerable limitations which impede the identification of studies.

 

Commentary

 

Background

 

Eysenbach, Tuische, and Diepgen’s study is the first evidence-based evaluation of how searching the Internet using a web search engine can contribute to the identification of studies for systematic reviews, in particular unpublished clinical trials. The study deserves the status of a classic due to its originality and continuing significance, in particular for proposing and evaluating a systematic approach to web searching which is still referenced in prominent guidelines for conducting systematic reviews (Lefebvre, Manheimer, & Glanville, 2011).

 

Web searching is a common activity for information professionals in almost all library and information settings. Systematic reviews, however, are perhaps more familiar to information professionals in health care research settings. Systematic reviews answer research questions by identifying and appraising all the relevant studies (using pre-specified eligibility and quality criteria) and synthesizing the accumulated evidence (Higgins & Green, 2011). They are important in health care settings because there is too much research literature for practitioners to appraise individually. In addition, the methods and conclusions of systematic reviews are less biased than those of narrative reviews or expert opinion (Higgins & Green, 2011). It is important to identify unpublished studies, the focus of Eysenbach et al., because they may contain findings which are more up-to-date than published studies. There is also evidence suggesting that studies with negative findings are less frequently published or take longer to reach publication (Fanelli, 2010).

 

Information professionals contribute to systematic reviews by identifying studies (Harris, 2005). Research has shown that their contributions improve the quality of systematic reviews (Rethlefsen, Farrell, Osterhaus Trzasko, & Brigham, 2015). At the time Eysenbach et al. was published in 2001, there had been several years of research on the identification of studies for health care systematic reviews using bibliographic databases. Early examples of this research include studies by Dickersin et al. (1994) and Wilczynski et al. (1993); see also the historical survey of methodological developments in this area by Lefebvre et al. (2013). There were also established supplementary search methods for identifying studies, including checking reference lists, hand searching, and searching company trials registries, all of which were detailed in the systematic review guidance manual, the Cochrane Reviewers’ Handbook (now titled the Cochrane Handbook for Systematic Reviews of Interventions, hereafter the Cochrane Handbook) (Clarke & Oxman, 1999). Web searching did not have a prominent place amongst these search methods, a point Eysenbach et al. support by noting the absence of a web searching section in the otherwise comprehensive Cochrane Handbook.

 

Eysenbach et al. addressed the lack of research and guidance on web searching for systematic reviews, focusing on the use of web search engines to identify unpublished studies. The authors tested the hypothesis that retrospectively conducted web searches, which were adapted from the bibliographic database search strategies of completed systematic reviews, would retrieve previously unidentified and unpublished studies (specifically, clinical trials). They also set out to address practical issues such as the suitability of various search engines for the task.

 

Main Results

 

Following the identification of 14 unpublished studies relating to 4 of the 7 systematic reviews included in the study, Eysenbach et al. recommended that web searching using a search engine with appropriate search features should be conducted alongside other search methods. They noted, however, that there was no evidence that the searches they conducted affected the outcome of a systematic review. In particular, they emphasized that none of the studies they identified had remained unpublished because of negative results. (Had such studies been found, this would have added to the aforementioned evidence that studies with negative results are hard to publish and less likely to be included in systematic reviews (Fanelli, 2010).) The authors nonetheless concluded that web searching using a search engine should be conducted, as it has the potential to affect the outcome of a systematic review.

 

This conclusion is important for being the first evidence-based recommendation on web searching for systematic reviews. It has been noted in subsequent editions of the Cochrane Handbook, which currently states that “[t]here is little empirical evidence as to the value of using general internet search engines such as Google to identify potential studies”, citing Eysenbach et al. as evidence (Lefebvre et al., 2011). A forward citation search on the citation index Web of Science reveals a total of eighteen citations of Eysenbach et al. The Cochrane Handbook citation alone makes it likely that health care information professionals with systematic review experience will have encountered the main result and conclusion, whether directly or through mentors and training courses.

 

The web searching section in the Cochrane Handbook also advises that searchers might have more success identifying studies by targeting known key websites, such as those of pharmaceutical companies, than by using web search engines. This is an important point considering the inaccessibility of a large portion of the web, known as the invisible or deep web, to the automated web crawlers which index webpages for search engines (Devine & Egger-Sider, 2013), a point also highlighted by Eysenbach et al. To improve the efficacy of using search engines, the authors recommended that organizations involved in carrying out and funding trials should publish details “on a robot [i.e. web crawler] accessible web page…. using the standard format ‘randomized trial on (intervention) in (condition)’ … so that they can be indexed by search engines and found by systematic reviewers” (p. 216).

 

Eysenbach et al. advocated for the establishment of prospective and ongoing trials registries. This would remove some of the difficulties of finding unpublished trials using web search engines, though the authors anticipated that the web would play an important part in “linking the evidence” between different registries (p. 215). Recent developments in this area are detailed below in the discussion of specialized web resources.

 

Pilot Study Results

 

In addition to the enduring impact of the main finding of Eysenbach et al., the findings from the pilot study remain relevant. In order to effectively adapt bibliographic database search strategies for web search engines, the search engines must offer similar search features. To this end, the search features of 11 web search engines were assessed: AltaVista, Excite, FAST search, Google, HotBot, InfoSeek, Lycos, Northern Light, WebCrawler, Medical World Search, and MedHunt. Only AltaVista offered all the required search features, i.e., Boolean operators, phrase, proximity, and truncation searching, and capitalization recognition. Consequently, AltaVista was the only search engine used in the main study.

 

It remains the case today that bibliographic databases have more advanced search features than web search engines. There have been some improvements to the latter since Eysenbach et al. was published. For example, Google did not offer Boolean searching at that time but does at the time of writing, albeit with limitations. However, the main development in web search engines has been a move away from complex searches, where the user retains a degree of control, towards simple searches, where the user increasingly relinquishes control to undisclosed algorithms which determine the relevancy and ranking of the webpages retrieved (Granka, 2010; Pariser, 2011). This is a challenge for information professionals with complex and detailed information needs, in that search strategy development is limited, frequent changes to algorithms compromise the reproducibility of searches, and bias is introduced in cases where the search history of the user informs the webpages which are retrieved (Briscoe, 2015).

 

The problem of identifying relevant studies with a simple search interface has been exacerbated by the growth of the web. When Eysenbach et al. carried out their research in December 1998 there were approximately 2,400,000 websites, whereas in March 2016 there were approximately 1,000,000,000 websites ("Total number," 2016). Consequently, the search string (study or trial or random*) near asthma* near (education* or (self near management)), which retrieved 159 hits using AltaVista in December 1998 (p. 210), retrieved 389,000 hits using Google on 4 March 2016. AltaVista itself was shut down in 2013 and is unavailable for testing ("Yahoo to shut," 2013). The same search on 4 March 2016 in Google Scholar, which limits results to scholarly literature, retrieved a more focused 37,800 hits, although it is unclear whether the unpublished studies which Eysenbach et al. searched for would be indexed in Google Scholar. The high numbers retrieved indicate that the approach Eysenbach et al. used would need to be adapted: either the searches would need to be made more focused, or the screening of hits would need to be limited to a manageable number (Godin, Stapleton, Kirkpatrick, Hanning, & Leatherdale, 2015).

 

The relatively simple search capabilities of web search engines and the growth of the web highlight the importance of assessing the tools and strategies used for web searching, following the example of Eysenbach et al. In particular, in an age dominated by Google, information professionals should be mindful to seek out and assess other search engines.

 

The Development of Specialized Web Resources

 

As a solution to the limitations of using web search engines for systematic reviews, Eysenbach et al. advocated the creation of “specialized search engines” containing “expert knowledge on which [web]sites ongoing studies are published and [able to] access dynamic databases [i.e. the deep web] and meta-trial registers” (p. 214). No such search engine exists to date, although the launch of the web-based databases ClinicalTrials.gov and the ISRCTN registry (both in 2000) has made it easier to identify unpublished studies, specifically unpublished clinical trials.

 

Google Scholar is a specialized web search engine, but it is unable to access the deep web as advocated by Eysenbach et al. Nonetheless, Google Scholar is an advance in web searching for the systematic review community, and in recent years there has been research on how it can contribute to systematic reviews. In the health care literature there has been research and debate about whether Google Scholar can replace bibliographic databases as the main source of studies for systematic reviews (Boeker, Vach, & Motschall, 2013; Gehanno, Rollin, & Darmoni, 2013; Giustini & Boulos, 2013), as well as general comparisons (not primarily related to systematic review methods) of Google Scholar with the PubMed database (Anders & Evans, 2010; Nourbakhsh, Nugent, Wang, Cevik, & Nugent, 2012; Shultz, 2007); in the environmental science literature, its ability to identify grey literature has been assessed (Haddaway, Collins, Coughlin, & Kirk, 2015). There are varying views on how much Google Scholar can contribute to systematic reviews, but in most studies the inadequacy of the Google Scholar search interface for writing complex search strategies is a predominant theme, reflecting the pilot study findings of Eysenbach et al.

 

Conclusion

 

Despite the limitations of web search engines and the underwhelming result of Eysenbach et al., information professionals who contribute to systematic reviews are likely to continue to use web search engines to identify literature. Although there are web-based databases for health care literature, such as the ClinicalTrials.gov and ISRCTN trials registries, web searches using search engines have the potential to retrieve literature not indexed in these resources, or held in web resources unknown to the searcher. More research is needed on the potential role of web searching for different types of literature and different types of systematic reviews. Evaluations of search engines launched since Eysenbach et al. was published are also required. Eysenbach et al. will remain a benchmark for future research in these areas, and deserves to be recognized as a classic of the information science literature.

 

As an aid to future research, Eysenbach et al. advocated that systematic review authors should “carefully document their Internet search strategy in reports of systematic reviews (rather than just mentioning that ‘Internet searches have been performed’) so that factors influencing the effectiveness and necessity of Internet searches can be identified” (p. 215). Research suggests that this recommendation is not yet consistently followed (Briscoe, 2015).

 

References

 

Anders, M. E., & Evans, D. P. (2010). Comparison of PubMed and Google Scholar literature searches. Respiratory Care, 55(5), 578-583.

 

Boeker, M., Vach, W., & Motschall, E. (2013). Google Scholar as replacement for systematic literature searches: good relative recall and precision are not enough. BMC Medical Research Methodology, 13, 131. http://dx.doi.org/10.1186/1471-2288-13-131  

 

Briscoe, S. (2015). Web searching for systematic reviews: a case study of reporting standards in the UK Health Technology Assessment programme. BMC Research Notes, 8, 153. http://dx.doi.org/10.1186/s13104-015-1079-y

 

Clarke, M., & Oxman, A. D. (Eds.) (1999). Cochrane Reviewers' Handbook 4.0 (updated July 1999). In: The Cochrane Library, Issue 4. Oxford: Update Software.

 

Devine, J., & Egger-Sider, F. (2013). Going Beyond Google Again: Strategies for Using and Teaching the Invisible Web. London: Facet Publishing.

 

Dickersin, K., Scherer, R., & Lefebvre, C. (1994). Identifying relevant studies for systematic reviews. British Medical Journal, 309(6964), 1286-1291.

 

Fanelli, D. (2010). Do pressures to publish increase scientists' bias? An empirical support from US States Data. PLoS One, 5(4), e10271. http://dx.doi.org/10.1371/journal.pone.0010271

 

Gehanno, J. F., Rollin, L., & Darmoni, S. (2013). Is the coverage of Google Scholar enough to be used alone for systematic reviews. BMC Medical Informatics and Decision Making, 13, 7. http://dx.doi.org/10.1186/1472-6947-13-7

 

Godin, K., Stapleton, J., Kirkpatrick, S. I., Hanning, R. M., & Leatherdale, S. T. (2015). Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada. Systematic Reviews, 4, 138. http://dx.doi.org/10.1186/s13643-015-0125-0  

 

Granka, L. A. (2010). The Politics of Search: A Decade Retrospective. The Information Society, 26(5), 364-374. http://dx.doi.org/10.1080/01972243.2010.511560

 

Haddaway, N. R., Collins, A. M., Coughlin, D., & Kirk, S. (2015). The Role of Google Scholar in Evidence Reviews and Its Applicability to Grey Literature Searching. PLoS One, 10(9), e0138237. http://dx.doi.org/10.1371/journal.pone.0138237

 

Harris, M. R. (2005). The librarian's roles in the systematic review process: a case study. Journal of the Medical Library Association, 93(1), 81-87.

 

Higgins, J. P. T., & Green, S. (Eds.) (2011). Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 (updated March 2011). The Cochrane Collaboration. Available from www.cochrane-handbook.org

 

Lefebvre, C., Glanville, J., Wieland, L. S., Coles, B., & Weightman, A. L. (2013). Methodological developments in searching for studies for systematic reviews: past, present and future? Systematic Reviews, 2, 78. http://dx.doi.org/10.1186/2046-4053-2-78 

 

Lefebvre, C., Manheimer, E., & Glanville, J. (2011). Searching for studies. In J. P. T. Higgins & S. Green (Eds.), Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 (updated March 2011). The Cochrane Collaboration. Available from www.cochrane-handbook.org

 

Nourbakhsh, E., Nugent, R., Wang, H., Cevik, C., & Nugent, K. (2012). Medical literature searches: a comparison of PubMed and Google Scholar. Health Information and Libraries Journal, 29(3), 214-222. http://dx.doi.org/10.1111/j.1471-1842.2012.00992.x

 

Pariser, E. (2011). The troubling future of internet search. The Futurist, 45(5), 6.

 

Rethlefsen, M. L., Farrell, A. M., Osterhaus Trzasko, L. C., & Brigham, T. J. (2015). Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. Journal of Clinical Epidemiology, 68(6), 617-626. http://dx.doi.org/10.1016/j.jclinepi.2014.11.025

 

Shultz, M. (2007). Comparing test searches in PubMed and Google Scholar. Journal of the Medical Library Association, 95(4), 442-445. http://dx.doi.org/10.3163/1536-5050.95.4.442

 

Total number of websites. (2016). Retrieved on 4 March 2016 from http://www.internetlivestats.com/total-number-of-websites/#trend

 

Wilczynski, N. L., Walker, C. J., McKibbon, K. A., & Haynes, R. B. (1993). Assessment of methodologic search filters in MEDLINE. Proceedings of the Annual Symposium on Computer Applications in Medical Care, 601-605.

 

Yahoo to shut down pioneering AltaVista search site. (2013). Retrieved on 4 March 2016 from http://www.bbc.co.uk/news/technology-23127361