Evidence Summary

 

Much Library and Information Science Research on Open Access is Available in Open Access, But There Is Still Room to Grow

 

A Review of:

Chilimo, W. L., & Onyancha, O. B. (2018). How open is open access research in library and information science? South African Journal of Libraries & Information Science, 84(1), 11-19. https://doi.org/10.7553/84-1-1710

 

 

Reviewed by:

Rachel Elizabeth Scott

Interim Coordinator, Cataloging, Collection Management, and Library Information Systems & Integrated Library Systems Librarian

University Libraries

University of Memphis

Memphis, Tennessee, United States of America

Email: rescott3@memphis.edu

 

Received: 20 Nov. 2018    Accepted: 7 Jan. 2019

 

 

© 2019 Scott. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

DOI: 10.18438/eblip29531

 

 

Abstract

 

Objective – To investigate the open access (OA) availability of Library and Information Science (LIS) research on the topic of OA, the relative openness of the journals in which this research is published, and the degree to which the OA policies of LIS journals facilitate free access.

 

Design – Bibliometric, quantitative dataset analysis.

 

Setting – African academic library and information science department.

 

Subjects – 1,185 English-language, peer-reviewed articles on OA published between 2003 and 2013 in journals indexed by three major LIS databases, of which the 909 articles appearing in the top 56 journals received further analysis.

 

Methods – The authors first searched LIS indexes to compile a dataset of published articles focusing on OA. They then manually identified and evaluated the OA policies of the top 56 journals in which these articles appeared. The openness of these journals was scored according to a rubric modified from the Scholarly Publishing and Academic Resources Coalition’s (SPARC’s) 2013 OA spectrum. Finally, the authors manually searched Google Scholar to determine the OA availability of the articles in the dataset.

 

Main Results – Of the 909 articles published in the top 56 journals, 602 were available in some form of OA. Of these, 431 were available as gold copies and 171 were available as green copies. Of the 56 journals evaluated for openness, 13 were considered OA, 3 delayed OA, 27 hybrid/unconditional post-print, 2 hybrid/conditional post-print, and 11 had unrecognized OA policies.

 

Conclusion – The increasing amount and significance of LIS research on OA has not directly translated into comprehensive adoption of OA publishing. Although a majority of the articles in the dataset were available in OA, the authors indicate that some measures of OA adoption and growth assessed in this study are only somewhat higher than in other disciplines. The authors call upon LIS professionals to become more conversant with journals’ OA policies. Acknowledging that not all LIS scholars who research OA are necessarily advocates of it, the authors recommend further investigation of the OA research that is not available in OA to shed light on those scholars’ perceptions and preferences.

 

Commentary

 

The study at hand builds on existing OA analyses of LIS publications, such as Vandegrift and Bowley (2014) and Grandbois and Beheshti (2014). It is unique in its analysis of LIS articles on the topic of OA, its quantification of their OA availability, and its assessment of the relative openness of the journals in which most are published.

 

Throughout this commentary, Perryman’s (2009) evaluation tool for bibliometric studies was used to assess the rigor of the research. This tool was selected for its focus on the systematic construction of bibliometric studies.

 

There is no discrete literature review; however, the authors make use of current and relevant published literature to support their objectives and methodology, and to delineate the gap that their research addresses. The authors cite bibliometric studies to discuss the limitations of their data sources and to provide a rationale for the indexes they selected and excluded.

 

They compiled their dataset by hand by searching the following databases: EBSCO’s Library & Information Science Source, EBSCO’s Library, Information Science and Technology Abstracts, and ProQuest’s Library and Information Science Abstracts. The methodology includes imprecise search procedures: the authors used the “advanced search” option to conduct a subject keyword search, but the criteria applied beyond the initial search are ambiguous. To be included in the dataset, articles had to be peer reviewed, English-language, published between 2003 and 2013, and discuss open access. These inclusion criteria are relevant to the research question, but the authors mention expanding the search to keywords such as “institutional repository,” which potentially have little to do with OA. The authors claim to have conducted a “thorough check” (p. 12) to remove irrelevant and duplicated records, but the details of that check are not provided.

 

Overall, the evaluation methods were appropriate to the objectives, but there are some discrepancies in the data. Table 2 is labeled “Top 50 journals…” whereas the text states that “Table 2 shows the fifty-six journals” (p. 15). The list of “Journals with unrecognized OA policies” includes several major titles, such as American Archivist and portal. Although it is difficult to confirm whether these policies were readily available when the authors searched in June 2015, a December 28, 2014, version of the American Archivist website indicates that its journal content “is licensed under a Creative Commons Attribution Non-Commercial 3.0 United States License” (Society of American Archivists, 2014). Because so many processes were executed manually and individually, replication would be both challenging and time consuming without defined supporting data. What may simply be a lack of clarity nonetheless casts a shadow on the data collection and analysis, both of which would benefit from additional documentation.

 

The study achieves its original objective of measuring the degree of openness of OA scholarship in LIS. Like prior research, the present study shows that neither self-archiving nor journal-based OA is yet a widely established practice. This article’s contribution is in showing that, despite increased LIS literature and advocacy on the topic of OA, LIS scholars and journals have plenty of room to grow in their adoption of OA. Although the data are not strong enough to serve as a benchmark for future measurement, the article in its current form is a strong piece of advocacy.

 

References

 

Grandbois, J., & Beheshti, J. (2014). A bibliometric study of scholarly articles published by library and information science authors about open access. Information Research: An International Electronic Journal, 19(4). http://www.informationr.net/ir/19-4/paper648.html

 

Perryman, C. (2009). Evaluation tool for bibliometric studies. Retrieved from https://www.dropbox.com/l/scl/AAAL7LUZpLE90FxFnBv5HcnOZ0CtLh6RQrs

 

Society of American Archivists. (2014, December 28). American Archivist editorial policy. Retrieved from https://web.archive.org/web/20141228053114/http://www2.archivists.org:80/american-archivist/editorialpolicy

 

Vandegrift, M., & Bowley, C. (2014, April 23). Librarian, heal thyself: A scholarly communication analysis of LIS journals. In the Library with the Lead Pipe, 2-18. http://www.inthelibrarywiththeleadpipe.org/2014/healthyself/