Issues in Science and Technology Librarianship | Winter 2018
DOI: 10.5062/F4T43RBN
Bradley Brazzeal
Agriculture & Forest Resources Librarian
Mississippi State University Libraries
Mississippi State, Mississippi
bbrazzeal@library.msstate.edu
The value of meta-analysis is increasingly recognized by agricultural researchers, and an important component of a proper meta-analysis is a comprehensive search of the relevant literature. This study examines the search methodologies of 140 crop science-related meta-analyses published in peer-reviewed journals. Specific information sought included databases searched, the use of reference list browsing to find additional studies, search string details, listing of publications used in the meta-analysis, and inclusion of non-journal publications. Researchers often did not document their search methodology in enough detail to allow replication, and in some cases they may not have known how to conduct an effective, comprehensive search. The results highlight the need for researchers conducting a meta-analysis or similar project to collaborate with librarians who have expertise in literature searching.
Scherm et al. (2014) define meta-analysis as "the statistical analysis of the results of multiple independent studies, usually from the published and/or gray literature, to estimate the magnitude, consistency, homogeneity of an effect of interest." Researchers in the health sciences have used this technique the most extensively, and Makowski et al. (2014) have called upon agronomists to follow suit, noting the value of meta-analysis for the "drawing up of general laws on how agroecosystems work." However, they caution that the value is often diminished by improper application of meta-analysis techniques.
The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) provides a general list of criteria to include in a meta-analysis (Moher et al. 2009). More specifically for agriculture, Philibert et al. (2012) provide eight criteria for agronomy-related meta-analyses, while Koricheva and Gurevitch (2014) provide 16 criteria for meta-analyses in plant ecology. Two criteria common to all of these lists, and perhaps the most relevant for libraries, are (1) a detailed description of the search techniques used to identify studies for inclusion in the meta-analysis, and (2) a reference list of the studies used in the meta-analysis.
Failure to include relevant studies in a meta-analysis can skew the results (Reed and Baxter 2009), and Rothstein and Hopewell (2009) recognize that "[a]ppropriate staff, including librarians or information specialists, are critical to conducting an efficient search." Librarians can create more effective searches and be of special assistance with finding gray literature, which can help to minimize publication bias.
Publication bias refers to situations in which journal editors are less likely to publish articles that do not report what they believe to be significant findings, and researchers are less likely even to submit such findings to a journal (Sutton 2009). Those results may instead appear in the gray literature, so researchers conducting a meta-analysis should be aware of the implications of excluding publications such as dissertations. On the other hand, researchers often purposely exclude gray literature because of quality concerns (Rothstein and Hopewell 2009). Both Philibert et al. (2012) and Koricheva and Gurevitch (2014) also include testing for publication bias in their lists of criteria, though their emphasis is on statistical methods for dealing with publication bias, which are beyond the scope of the present study.
This study attempts to answer the following questions regarding crop-related meta-analyses: (1) Which databases do researchers use to find studies to include in a meta-analysis? (2) Do researchers provide enough information for others to replicate their searches? (3) Do researchers typically include sources other than peer-reviewed journal articles? (4) Finally, do researchers provide a clear list of the studies used in the meta-analysis? Both Philibert et al. (2012) and Koricheva and Gurevitch (2014) investigated questions two and four, but neither addressed questions one and three.
The author chose the CAB Abstracts database to compile the list of meta-analyses because of its rich metadata and broad inclusion of agriculture-related journals. After some preliminary searching to determine the most effective thesaurus terms and other keywords, the author used the following search string in the EBSCOhost version of CAB Abstracts:
("crop production" OR "crop yield" OR "crop management" OR "cropping systems")
AND "meta-analysis"
NOT (QTL OR "quantitative trait loci")
Articles with the terms "QTL" and "quantitative trait loci" were excluded after examination of a sample of the articles that contained the terms in the metadata. Of those sampled articles, none relied on datasets discovered by searching the literature.
The author included only meta-analyses published in journals, since these are more likely to be peer-reviewed. Despite the issue of publication bias mentioned above, limiting the study to journal articles makes the results more meaningful to researchers in crop science, since the peer-reviewed journal article is still the most highly regarded type of publication in the sciences (Bornmann 2011). Furthermore, if a meta-analysis in a conference paper or dissertation were later published in a peer-reviewed journal, there would be a risk of double counting the same meta-analysis.
Articles were outside the scope of the present study if they merely discussed meta-analysis as a technique rather than reporting an actual meta-analysis, or if they were not based on published data. The author also excluded non-English articles, as well as two articles that met the criteria but were unavailable through the author's institutional library, the open web, or interlibrary loan.
To analyze the studies, the author created a database in Microsoft Access and in early 2016 performed a preliminary analysis of 119 articles published from 2011 to 2015 that met the criteria above. This period was chosen to yield a manageable number of results from a span in which the array of databases available to researchers was fairly stable (e.g., Google Scholar was already well established). To expand the study, the author repeated the search in July 2016 and found 21 additional articles, published in the first half of 2016, that met the criteria.
The 140 articles included in this study came from 66 different journals. Sixty-one of those journals, accounting for 133 of the total articles, were listed in the 2015 Journal Citation Reports (JCR). Thus 95% of the articles were from journals that met the standards for inclusion in JCR, which include the need for peer review (Testa 2016). An examination of the web sites for the remaining journals indicated that these were also peer-reviewed. It should be noted, however, that non-peer-reviewed material may appear in peer-reviewed journals, and it may not always be obvious which articles might fall into that category. Table 1 shows the distribution of articles by year. See the Appendix for a complete list of the articles.
Year | Number
---|---
2011 | 18
2012 | 16
2013 | 27
2014 | 29
2015 | 29
2016 (Jan-Jun) | 21
Total | 140
Table 1: Distribution by year of the meta-analyses examined for this study
The author was most interested in determining which databases were used to collect studies to include in the meta-analyses. Roughly three out of four (77%, n=108) of the articles included a list of databases used, while 23% (n=32) did not. The latter figure could be an indication of the relative unfamiliarity of meta-analysis as a technique in the crop sciences, for both authors and reviewers, as compared to other fields such as the health sciences.
The most commonly mentioned databases were those offered by Thomson Reuters (now Clarivate Analytics) (63%, n=88). Table 2 shows the varying terminology authors used to describe the databases from this company. In almost all cases, the descriptions provided do not adequately inform the reader because the specific version of "Web of Science" or "Web of Knowledge" searched by the authors could have included (or not included) many different databases (Clarivate Analytics 2017a, 2017b).
Author Description | Number | Percentage
---|---|---
Web of Science | 57 | 41%
Web of Knowledge | 21 | 15%
Science Citation Index | 4 | 3%
http://www.isiknowledge.com | 1 | 1%
Web of Science and Current Contents (ISI) | 1 | 1%
Web of Science Citation Index Expanded | 1 | 1%
Web of Science Core Collection | 1 | 1%
Web of Science and Web of Knowledge | 1 | 1%
Web of Sciences [sic] and Web of Knowledge | 1 | 1%
None (authors listed other databases) | 20 | 14%
Unknown (no identification of any databases used) | 32 | 23%
Total | 140 | 102%
Table 2: Use of databases provided by Thomson Reuters (now Clarivate Analytics). Total percentage exceeds 100 due to rounding.
Google Scholar is becoming a database of choice for many researchers (Wolff-Eisenberg et al. 2016), especially for early career researchers (Nicholas et al. 2017). However, only 22% (n=31) of the articles indicated that Google Scholar was one of the databases searched. Of that number, the majority (n=23) also searched one of the Clarivate Analytics databases mentioned above, while only four listed Google Scholar as the sole database used.
It is surprising that Google Scholar was not mentioned more often. This could be because Google Scholar often returns an overwhelming number of search results, or because its searches return inconsistent results. There may also be a recognition that it is best to stick to more established databases when conducting a study in which the searching itself is part of the methodology. Note again, however, that almost a quarter of the analyzed articles did not provide any information about the databases searched.
Meta-analysis authors also mentioned a number of other databases, including some publisher web sites that the authors searched directly. Table 3 lists those mentioned at least twice, in order of times mentioned.
The China National Knowledge Infrastructure (CNKI) or Chinese Journal Net (CJN), which seems to be a component of CNKI, was mentioned most frequently. Within the group of meta-analyses that used these resources, all but three focused geographically on China.
Database/Publisher Name | Number
---|---
China National Knowledge Infrastructure (CNKI) (n=13) / Chinese Journal Net (CJN) (n=3) | 16
Scopus | 14
CAB Abstracts / CAB Direct / CABI | 8
ScienceDirect* | 8
AgEcon Search | 3
Wiley / Wiley-Blackwell* | 3
AGRICOLA | 2
JSTOR* | 2
RePEc (Research Papers in Economics) | 2
SciELO (Scientific Electronic Library Online) | 2
Springer* | 2
Table 3: Databases mentioned at least twice, other than Google Scholar and Thomson Reuters (Clarivate Analytics) databases. Asterisk (*) denotes a publisher-specific or aggregator platform or database.
The small number of mentions of agriculture-specific databases such as CAB Abstracts was surprising, but it should be noted that CAB Abstracts is one of the databases available through the Web of Science platform (Clarivate Analytics 2017a), as is SciELO (Clarivate Analytics 2017b), so some of the meta-analyses represented in Table 2 could have included these two databases. The inclusion of publisher sites, such as ScienceDirect, and aggregators, such as JSTOR, could indicate that some researchers misunderstand how those resources differ from independent abstracting and indexing resources such as CAB Abstracts.
The manual technique of examining reference lists continues to be an important means of discovering additional relevant studies. As White (2009) points out, "the footnotes of a substantive work do not come as unevaluated listings ... but as choices by an author whose critical judgment one can assess in the work itself." Table 4 shows the number of studies that used this technique to find additional studies. In some cases, the authors only examined reference lists of certain articles, such as specific review articles.
Searched Reference Lists | Number | Percentage
---|---|---
Yes | 36 | 26%
No | 67 | 48%
Selected articles only | 5 | 4%
Unknown | 32 | 23%
Total | 140 | 101%
Table 4: Studies that searched the reference lists of articles to find other relevant studies. Total percentage exceeds 100 due to rounding.
There was great variation in the details of the search strategy used to identify publications for inclusion in the meta-analyses. Only 35% (n=49) provided a description sufficient to allow replication of the search string, and only 52% (n=73) gave some indication about the date the search was carried out and/or about the years covered by the search. These results are similar to those of Koricheva and Gurevitch (2014), who found that "[o]nly 32% of the studies we evaluated provided complete details of bibliographic searches."
The vast majority (91%, n=127) of the meta-analyses provided a clear listing of the publications used, either in the body of the meta-analysis article or in an appendix, while 9% (n=13) did not. However, even those that did include a list of publications sometimes mentioned using unpublished data acquired from sources such as research institutes. Again, these results are similar to the results of Koricheva and Gurevitch (2014), who found that 87% of meta-analyses examined provided a clear list of publications used.
An examination of those publication lists, as well as the methodology given for the meta-analysis, showed that 14% (n=20) specifically limited the publications used to journal articles, while 64% (n=89) did not. An additional 22% (n=31) listed only articles, but it was unclear if those authors had deliberately excluded non-journal publications.
While meta-analysis is an important technique that can shed light on many issues pertaining to crop-related research, this study has shown that many published analyses lack basic information to enable replication and inspire confidence in the studies. In some cases, the authors did not seem to realize that they were providing inadequate information about their search strategies.
Librarians can be valuable partners in the meta-analysis process, and the present study suggests that many crop science researchers would benefit from consulting with a librarian, who can help ensure that they conduct an effective and comprehensive search and that they describe their search methodology clearly and unambiguously.
With the agricultural research community calling for increased use of meta-analysis (e.g. Makowski et al. 2014; Scherm et al. 2014), science librarians should be pro-active in ensuring that they themselves are familiar with (1) PRISMA and related protocols for conducting this type of research, (2) the strengths and limitations of relevant subscription-based and freely available resources for searching the literature, (3) how to construct the complex search strings often required for a meta-analysis, and (4) how to write up the methodology. Armed with improved skills in these areas, librarians will be better able to reach out to researchers and demonstrate the value that they can add to the meta-analysis process.
The results of this study provide additional evidence of the continued relevance of science librarians in today's research environment, but it is up to librarians to make sure that they have the needed knowledge to assist researchers in this way and that they also promote this expertise to their clientele.
Bornmann, L. 2011. Scientific peer review. Annual Review of Information Science and Technology 45:197-245. DOI: 10.1002/aris.2011.1440450112
Clarivate Analytics. 2017a. Discipline-specific resources. [cited 3/2/2017]. Available from http://wokinfo.com/products_tools/specialized/
Clarivate Analytics. 2017b. Multidisciplinary resources. [cited 4/11/2017]. Available from http://wokinfo.com/products_tools/multidisciplinary/
Koricheva, J. and Gurevitch, J. 2014. Uses and misuses of meta-analysis in plant ecology. Journal of Ecology 102: 828-844. DOI: 10.1111/1365-2745.12224
Makowski, D., Nesme, T., Papy, F., & Doré, T. 2014. Global agronomy, a new field of research. A review. Agronomy for Sustainable Development 34: 293-307. DOI: 10.1007/s13593-013-0179-0
Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., & The PRISMA Group. 2009. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine 6: e1000097. DOI: 10.1371/journal.pmed.1000097
Nicholas, D., Boukacem-Zeghmouri, C., Rodríguez-Bravo, B., Xu, J., Watkinson, A., Abrizah, A., Herman, E., & Swigon, M. 2017. Where and how early career researchers find scholarly information. Learned Publishing 30: 19-29. DOI: 10.1002/leap.1087
Philibert, A., Loyce, C., & Makowski, D. 2012. Assessment of the quality of meta-analysis in agronomy. Agriculture, Ecosystems & Environment 148: 72-82. DOI: 10.1016/j.agee.2011.12.003
Reed, J.G. and Baxter, P.M. 2009. Using reference databases. In: Cooper, H. et al., editors. The Handbook of Research Synthesis and Meta-Analysis. 2nd ed. New York: Russell Sage Foundation. p. 73-101.
Rothstein, H.R. and Hopewell, S. 2009. Grey literature. In: Cooper, H. et al., editors. The Handbook of Research Synthesis and Meta-Analysis. 2nd ed. New York: Russell Sage Foundation. p. 103-125.
Scherm, H. et al. 2014. Meta-analysis and other approaches for synthesizing structured and unstructured data in plant pathology. Annual Review of Phytopathology 52:453-76. DOI: 10.1146/annurev-phyto-102313-050214
Sutton, A.J. 2009. Publication bias. In: Cooper, H. et al., editors. The Handbook of Research Synthesis and Meta-Analysis. 2nd ed. New York: Russell Sage Foundation. p. 435-452.
Testa, J. 2016. The Web of Science journal selection process. [cited 8/4/2017]. Available from http://wokinfo.com/essays/journal-selection-process/
White, H.D. 2009. Scientific communication and literature retrieval. In: Cooper, H. et al., editors. The Handbook of Research Synthesis and Meta-Analysis. 2nd ed. New York: Russell Sage Foundation. p. 51-71.
Wolff-Eisenberg, C., Rod, A.B., Schonfeld, R.C. 2016. Ithaka S+R US Faculty Survey 2015. [cited 11/7/2017]. Available from https://doi.org/10.18665/sr.277685
This work is licensed under a Creative Commons Attribution 4.0 International License.