Issues in Science and Technology Librarianship
Fall 2014
DOI:10.5062/F4G73BP3

[Refereed]

Factors in Science Journal Cancellation Projects: The Roles of Faculty Consultations and Data

Jeanine Williamson
jwilliamson@utk.edu

Peter Fernandez
pfernand@utk.edu

Lana Dixon
ldixon@utk.edu

University of Tennessee Libraries
Knoxville, Tennessee

Abstract

The economic downturn of 2007-08 forced many academic libraries in the United States to cancel journals. We surveyed life sciences librarians from ARL libraries to find out about their experiences with journal cancellations during 2008-12. Overall, we discovered that two factors were essential in decision-making: faculty consultations and data. While faculty consultations and data have always been influential in journal cancellations, our survey allowed us to explore what roles these two factors played in the recent period of necessary reductions.

Introduction

With the ready availability of journal use data, cost information, and impact factors, as well as increasingly efficient methods of compiling information about the value of specific journals in a library's collection, do subject librarians feel any better or more confident about the process of journal cancellation projects and the resulting decisions? The literature abounds with studies that look at how libraries accomplish journal cancellations, but what is lacking is a sense of how librarians perceive the process. Three science librarians at the University of Tennessee surveyed life sciences librarians from public ARL (Association of Research Libraries) libraries for their perspectives concerning journal data availability and usefulness, their consultations (if any) with faculty, and their overall sense of confidence in cancellation decisions made. Responses indicated general satisfaction with the availability of data, but the data was not always considered adequate, timely, or convenient to use. Consultation with faculty was highly valued as a source of decision-making information, but obtaining faculty input was also a point of discomfort or frustration for some respondents. We also explored the general level of confidence in the final decisions of journal cancellation projects and found that librarians are, for the most part, confident.

For our study, we focused on science librarians, specifically those in the life sciences. Science journal subscriptions are typically among the most expensive and often incur some of the highest price increases each year (Creaser and White 2008). In particular, within the sciences, biology has the fourth most costly journal prices according to the 2014 periodical survey published by Library Journal (Bosch and Henderson 2014). Our study is part of a long line of research focused on how libraries make decisions about their collections of scientific literature. Dess (1998) compared and contrasted three different approaches taken by Rutgers science librarians to evaluate journals on an ongoing basis, including both usage data and faculty preferences. Galbraith (2002) provided a case study of a science library that emphasized the difficulties in incorporating both print and often "slippery" electronic data into ongoing collection maintenance. Over the years, science librarians have continued to publish case studies that highlight different aspects of scientific literature. Staggs-Neel (2006) framed the University of Kentucky's cancellation project as part of the scholarly communications crisis. Li and Kopper (2005) accentuated the dynamic between print and electronic resources, while Ward et al. (2006) attempted to create a systematic approach for decision-making in the field. Ten years after Dess' study (1998), Kennedy et al. (2008) summarized the perspectives of five science librarians, highlighting many of the same themes of documentation, data, and interpersonal relationships. The details and technology had changed dramatically in the years between the two studies, but primary considerations remained the same.

Our study complements and expands on these case studies by focusing on the experiences of science librarians. We surveyed science librarians who had participated in a journal cancellation project from 2008 to the present. This relatively recent perspective is important because the context of journal cancellations in the 2000s differs from that of previous decades. Whereas the emphasis prior to roughly 2000 was on print journal reviews, journal cancellation projects since then have had to take into account the complexities of canceling journals in an online environment. Factors include issues with journals in aggregator databases and package deals from scientific publishers (e.g., Sprague and Chambers 2000). Other factors affecting journal cancellations during the time of our study include the recession in the United States that began in 2007; the increased provision of information in open access journals; and the growth of institutional repositories (Curran 2008; Laakso and Bjork 2012; Pinfield et al. 2014). Journal cancellation decisions have never been easy, but the many changes and developments of the 2000s created an even more complicated environment than before.

Literature Review

Several surveys and case studies have identified factors that librarians take into consideration when making journal cancellation decisions. Spencer and Millson-Martula (2006) characterized comprehensive reviews of print serials collections as "a multi-faceted process." They stated: "[M]any reviews leading to cancellation incorporate multiple elements in the decision-making process" (Spencer and Millson-Martula 2006). The ALPSP Survey of Librarians on Factors in Journal Cancellation (Ware 2006) found that the three most important factors in journal cancellation decisions were that the faculty no longer required the journal; usage; and price. Nixon (2010) reported on a cancellation project that took into account the factors of usage and presence of titles in an aggregator's database. Rupp-Serrano et al. (2002) listed a number of considerations when deciding whether to cancel journals with multiple formats available (licensing, provider, local politics, publication structure, technological considerations, and local resources). Foudy and McManus (2005) described a decision grid process that took into account access, cost-effectiveness, breadth/audience, and uniqueness during the cancellation of electronic resources. Li and Kopper (2005) used price, usage, and faculty demand in their journal cancellation decisions. Surveys before 2000 such as Richards and Prelec (1993) identified factors such as usage, cost, duplication, indexing, ISI impact factor, and less importantly, programmatic relevance: "perhaps because the 'fat' had long since been trimmed from serials lists and the processes being used currently are based on an assumption that only relevant titles are being acquired."

Consultation with Faculty

Consultation and collaboration with faculty outside the library would seem to be a natural part of any journal cancellation process in an academic library. Extensive research exists concerning how librarians may effectively engage faculty and incorporate their input into the cancellation process. In a case study now more than three decades old, Broude (1978) described how the level of faculty involvement may vary from one library to another:

Although some degree of faculty involvement is generally considered desirable, many librarians recognize that such involvement can be time-consuming and costly, and that faculty and librarian views concerning what is important to the collection may greatly differ.

Many more recent studies speak to the considerable value of having faculty support and guidance for successful journal cancellation and review projects (Bucknell and Stanley 2002; Carey et al. 2005; Clement et al. 2008; Dess 1998; Nixon 2010; Sinha et al. 2005; Walter 1990). Nixon (2010) provides a unique look at the evolving process of engaging faculty by detailing serials cancellation projects that took place in 1992, 1997, and 2009 at Purdue University. In 1992 Nixon found that faculty resorted to canceling the most expensive title so as to get "the unpleasant task over as fast as possible." The 1997 cancellation featured "partnering with the faculty (instead of being at war with them)" which resulted in being able "to make logical, fair and effective cancellations decisions rather quickly." Armed with extensive data in 2009, Nixon described a situation of full involvement by faculty in the cancellation process--"a shared crisis rather than a battle."

Garczynski (2011) looked at the serials cancellation literature for strategies to garner faculty involvement to see if these approaches could be applied for the same purpose with database cancellations at Towson University. Garczynski adapted three serial cancellation approaches and surveyed the faculty for their preferred method of involvement in the database cancellation process. The findings indicated "for the most part, faculty do want to play a role in database cancellation decisions even if they are otherwise minimally knowledgeable about library events and resources." After starting with a question of: "how involved faculty would like to be in potential database cutbacks," Garczynski came away with "a sense of the need to involve faculty more in cutback decisions." The importance of subject librarians/liaisons as communicators with faculty is a recurring theme in the literature. For example, Sinha et al. (2005) described the success of a cancellation project in 2003 that featured stronger roles played by subject librarians in public relations and marketing. In previous cancellations, "the role of the liaison was more subtle, less proactive" resulting in faculty who felt "confused, angry, and disappointed" with their perceived lack of involvement. The 2003 approach better incorporated librarians as the bridge between faculty and the accomplishment of a journal cancellation project.

The literature shows that involvement of faculty in the decision-making process is, for the most part, highly prized and considered an essential component of a successful journal cancellation project. The pursuit of objective data to underpin the framing of decisions and to balance the more subjective aspects of journal review is also an important part of the literature concerning journal cancellation projects.

Data

It is perhaps obvious to say that the collection and analysis of data have been a crucial part of many journal cancellation projects. While "objective data" has at times been considered a different kind of information than "subjective input" (e.g., Tucker 1995), the literature does not show entirely standardized approaches for the use or interpretation of data in journal cancellations. Gallagher et al. (2005), who used an evidence-based approach, consulted usage data and SFX [OpenURL link resolver] data, but pointed out: "The shortcomings of traditional journals usage statistics have been well-documented.... Electronic journal use statistics provide similar problems." Enssle and Wilde (2002), "in spite of the uneven information from electronic resources," decided to use in-house use statistics, availability of full text, number of citations and citation impact, and provision of copies of articles quickly and at no cost.

Dewland and Minihan (2011), who compiled usage data and faculty citations, developed a "Relevant Value Scale" to standardize the de-selection decisions. This formula was "designed to balance the support of research (citations) versus the support of teaching and learning (usage statistics), and to measure it against the cost of the journal" (2011). Dewland and Minihan recognized that their formula might not be applicable to all institutions with different emphases on teaching and research. A project applying both data and some subjective input to de-selection decisions in a science and engineering library is described by Galbraith (2002). Galbraith's library examined a wide range of statistics and factors including: in-house usage statistics, cost data, and vendor-supplied usage statistics; usage patterns in the last five years; population served; degree of coverage by other titles; significance of the title; coverage in indexes; availability at nearby libraries; document delivery options; and faculty need.
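Dewland and Minihan's actual formula is not reproduced here, so the following Python sketch is purely illustrative of how a scale of this kind might balance research support, teaching support, and cost: the weights, journal names, and normalization below are assumptions for illustration, not their published scale.

```python
# Hypothetical sketch of a "relevant value"-style score. It balances
# support for research (citations) and teaching (usage statistics)
# against the cost of the journal. All numbers are invented.

def relevant_value(citations, uses, cost, w_research=0.5, w_teaching=0.5):
    """Higher scores suggest a journal is a stronger candidate to retain."""
    benefit = w_research * citations + w_teaching * uses
    return benefit / cost  # weighted benefit per dollar spent

journals = {
    "Journal A": {"citations": 120, "uses": 300, "cost": 2400.0},
    "Journal B": {"citations": 15,  "uses": 40,  "cost": 1800.0},
}

# Rank titles from weakest to strongest candidates for retention,
# so the front of the list holds the likeliest cancellation targets.
ranked = sorted(journals, key=lambda j: relevant_value(**journals[j]))
```

As Dewland and Minihan themselves noted, any such weighting reflects a local balance between teaching and research, so the weights would need tuning per institution.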

As these highlights from a review of the literature have shown, librarians are continually considering how best to gather, combine, and balance the data and input available to them for making the most reasoned decisions about journal cancellations. We wanted to find out about life sciences librarians' experiences with the process now. Do they have the desired data when needed? Are faculty consulted at an adequate level? What are some of the variables/factors that generate tension in the process?

Methods

Our study investigated the experiences and perceptions of life sciences librarians at ARL libraries who had been involved in journal cancellation projects between 2008 and 2012. This covers the period following the United States' economic recession between 2007 and 2008 (Business Cycle Dating Committee 2008).

To create the initial target group, we visited the web sites of publicly funded ARL Libraries in the United States and identified librarians whose public profiles included responsibilities in the life sciences. In addition to contacting these individuals, we posted a message to the STS-L mailing list (Science and Technology Section of the American Library Association) soliciting participants over the age of eighteen who were life sciences librarians in ARL Libraries. Our targeted solicitations focused on publicly funded institutions, but in the second e-mail to the STS-L mailing list, we allowed participants from any institution. Only two of our respondents were from privately funded institutions.

The survey was created in SurveyMonkey and made available via a web link. It was composed primarily of rating scales and multiple choice questions with opportunities for additional comments. Open-ended questions were also included. The survey was developed by three science librarians experienced with journal cancellations at the University of Tennessee and was pilot tested by several colleagues. Their feedback was incorporated to ensure the survey terms were understandable.

Forty-one librarians participated in the survey. This is a relatively small population in absolute terms, which made detailed analysis difficult and precluded most standard tests of statistical significance. However, it represents a relatively large percentage of the overall population of life science librarians at public ARL institutions. For context, there were a total of 67 institutions targeted by our survey. While in some cases there was more than one life sciences librarian at an institution, 41 participants nonetheless represents a proportionally high percentage of the overall population. Moreover, the survey captures the opinions of this relatively small group who are responsible for making decisions about collections that affect the life sciences in ARL libraries. Their collective decisions impact the overall state of research and university libraries collections at most of the largest institutions in the country.

Counts of responses for individual questions varied because not every participant answered all the questions. While we did not collect demographic characteristics or geographic locale, the survey did ask participants to provide a brief description of their jobs. These descriptions showed that participants held a number of different kinds of positions, such as subject librarians, managers of science libraries, etc.

The SurveyMonkey format allowed us to include conditional skip logic that required participants to confirm that they were life sciences librarians in ARL libraries. Because we wanted to survey respondents with fairly comparable job duties, we also included skip logic that required participants to have both research assistance and collection development responsibilities, or they were disqualified from the rest of the survey. Participants were also required to assent to a human subjects informed consent document.

Data Analysis

The small cell sizes precluded applying inferential statistics tests such as chi-square. We looked at frequency percentage distributions of responses and analyzed the open comments. The open-ended responses were coded according to themes such as data, faculty input, and cost. These themes were used in writing up the results. The open comments were examined next to the frequency information in order to elucidate the results. Our findings were initially based on the choice responses, but were also informed by the more qualitative results from the free-text responses.
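As a hypothetical sketch of this kind of analysis (the response values and theme labels below are invented for illustration, not drawn from the survey data), frequency percentage distributions for choice responses and counts of themes coded from open comments can both be tabulated with Python's standard library:

```python
from collections import Counter

# Invented choice responses to one survey question.
responses = ["usually", "sometimes", "usually", "rarely", "usually", "sometimes"]

# Frequency percentage distribution, rounded to whole percentages.
counts = Counter(responses)
percentages = {choice: round(100 * n / len(responses))
               for choice, n in counts.items()}

# Invented theme codes assigned to open-ended comments; one comment
# may carry several themes, so counts are tallied across all codes.
coded_comments = [["data", "cost"], ["faculty input"],
                  ["data"], ["faculty input", "cost"]]
theme_counts = Counter(theme for codes in coded_comments for theme in codes)
```

With small cell sizes like these, examining the distributions directly, rather than running inferential tests, matches the approach described above.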

The two themes prominent in our results concerning librarians' perceptions of the journal cancellation process were faculty consultation and data. Following is an examination of these themes as well as other factors that played a role in the process.

Consultation with Faculty Outside the Library

The survey revealed that consultation with faculty is one of the most valued sources of information in the review process. When asked to select from a list of factors taken into account by respondents when making decisions, the top three considerations were usage data, cost, and faculty concerns:

Table 1. What did you take into account when you were making decisions? Select all that apply.

Answer Options                                   Response Percent   Response Count
Usage data                                       91%                30
Cost of journal                                  88%                29
Faculty concerns                                 85%                28
Librarian subject expertise                      67%                22
Format                                           64%                21
Impact factor, or other indicators of prestige   58%                19
Student concerns                                 36%                12
Core title lists                                 33%                11
Other (please specify)                           12%                 4

Answered question: 33
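As a quick arithmetic check (in Python, with the counts transcribed from Table 1), each reported percentage matches its response count divided by the 33 respondents who answered the question:

```python
# Response counts from Table 1, checked against the base of 33 respondents.
answered = 33
table1 = {
    "Usage data": (91, 30),
    "Cost of journal": (88, 29),
    "Faculty concerns": (85, 28),
    "Librarian subject expertise": (67, 22),
    "Format": (64, 21),
    "Impact factor, or other indicators of prestige": (58, 19),
    "Student concerns": (36, 12),
    "Core title lists": (33, 11),
    "Other (please specify)": (12, 4),
}

# Each reported percentage should equal its count over 33, rounded.
consistent = all(round(100 * count / answered) == pct
                 for pct, count in table1.values())
```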

Even survey questions that did not deal directly with faculty consultation sometimes elicited comments related to faculty consideration. For example, in regard to the question, "What was your role in your most recent journal cancellation project," four respondents commented that they consulted with faculty as part of their role when that was not included in the answer options. Two other comments revealed that in some institutions, librarians do not always consult with faculty before making decisions. One stated that faculty had been left out of the decision making process, and another spoke of "limited consultation with faculty as they were told to depend on cost/usage data review."

Librarians perceived that materials designed to support faculty research were of higher priority than materials designed to support the curriculum. When asked, "How did curriculum and faculty research influence decisions?" 45% (14) of respondents rated curriculum as a major factor, and 52% (16) rated it as a minor factor. This is in contrast to faculty research, which 91% (29) rated as a major factor, and 6% (2) rated as a minor factor. Only one respondent indicated that neither was a consideration (see Appendix, Table 2). A comment from a question concerning guidelines further underscored the influence of faculty research: "Knowledge of faculty research makes you want to keep a title even though it doesn't meet guidelines." Comments related to the question, "Did the review process result in any changes to the decisions you submitted?" indicated that discussion with faculty did result in retaining a few titles.

Two survey questions asked about the kinds of information that subject librarians had and what they would have liked to have (see Appendix, Tables 3-4). 63% (20) had faculty feedback to draw upon, but the amount and quality were not surveyed or commented upon. Fifteen percent (4) of respondents indicated that they would have liked to have faculty feedback. Thus, 77% (24) of respondents either had faculty feedback or wanted it, further underscoring the perceived value of faculty feedback (see Appendix, Tables 3-4).

For librarians tasked with making decisions, faculty input sometimes served to confound as much as it clarified. Comments noted that faculty "had other metrics" and that their desire to hold or own items "forced a particular model of access." Another comment spoke of there always being at least one faculty member "angered by finding 'their' title on a cut list." Particularly interesting was a comment related to the question, "What additional data would you have liked to have." The respondent stated that faculty feedback was not actively sought, "as previous experience indicated that faculty have very limited perspectives on overall journal use and value." These comments reveal how faculty input can both complicate and inform the librarians' decision-making process.

When asked about obstacles to obtaining data, one librarian noted that faculty had to be contacted multiple times to get feedback: "I have to ask, and re-ask, and re-ask." Another spoke of how only a few faculty members responded. The way in which faculty are approached for input can certainly influence the quantity and quality of the information received as reflected by this comment: "Individual consultation with faculty about individual titles was key to getting the feedback to drive decisions. Not sure they would have responded to a big list review approach." The time constraints attached to journal cancellation projects could be significant inhibitors of faculty consultation. Faculty were not always available. One respondent commented that the time between receiving the new fiscal year budget and the window to cancel could be as short as July-August, a time when many faculty are not available for consultation. Others expressed their frustration that internal deadlines made it difficult to consult faculty as much as they would have liked. The time crunch was particularly a problem since respondents also indicated that journal cancellation projects require a significant investment of time by subject librarians. In spite of comments indicative of dissatisfaction with time constraints surrounding accomplishment of journal cancellation projects, roughly two-thirds of respondents indicated that they had sufficient time to work on the project (see Appendix, Table 5).

Another obstacle encountered by librarians was contending with discontented faculty in some cases, which caused discomfort. As one person put it: "Wear your asbestos underwear. You can't please everyone. The teaching faculty will rant, so have all the data you can muster." Another librarian expressed pain at having to make tough decisions affecting faculty: "Cancelling was painful because someone had to be affected with canceling a title." Another emotion concerning faculty reactions was frustration. When asked, "Were there titles that you wanted to cancel but could not?" one librarian stated, "IMPORTANT people make waves and we have from time-to-time had to 'protect' a pet journal."

On the whole, faculty input was highly valued and sought by the majority of survey respondents. Answers to the question, "Anything else you would like to tell us?" provided some additional insights related to the consultative relationship with faculty:

"Most faculty have grown accustomed to cutting, realize that it isn't going to go away, and accept the transition from a holdings to an access model. Some have intense feelings about cutting anything." Whether faculty feedback was contentious or not, librarians would much rather have faculty input when making journal cancellation decisions.

Data

Faculty concerns were important, as 85% (28) of respondents said they took them into account when making decisions. However, the most frequently cited factor that librarians considered was usage data, at 91% (30). Almost as many respondents, 88% (29), used cost data as well (see Appendix, Table 1). The near-ubiquity of usage and cost considerations is evidence that librarians rely heavily on quantitative data in any decision about journal cancellations.

When respondents were asked if they had the data they needed, 58% (18) indicated that they usually did, while 36% (11) had the data they needed at least "sometimes" (see Appendix, Table 6). This would seem to indicate that for the most part librarians did not encounter significant barriers to obtaining data. However, when the question was reversed and librarians were asked about the data that they would have liked to have had, only 41% (11) of respondents indicated that there was no additional data that they would have liked to have had (see Appendix, Table 4).

Respondents not only indicated the importance of data to the project when asked directly about it, but it was also frequently brought up in the comments to other questions. The most common type of data to which respondents wanted better access was usage data: 26% (7) (see Appendix, Table 4). In the comments field to this question, respondents amplified their concerns stating that they would have preferred "usage data and cost increases over a longer time period" as well as "usage data for those titles that did not provide it." Commenters also highlighted frustrations about the timeliness of the data, as well as concerns about the difficulty of accessing, compiling, and comparing often disparate data.

Despite the clear consensus around the importance of data, in an open-response question about obstacles to obtaining data, 17 respondents took the time to detail a wide variety of obstacles, from lack of internal support for using data to inconsistent standards. In some cases librarians did not feel that the data they had was the right data, complete data, or that the data was convenient enough to use. Although their precise concerns varied widely, the fact remains that a significant number of respondents experienced obstacles in obtaining or using data. For example, difficulties were posed by time constraints associated with acquiring and synthesizing data. One respondent noted the enormous amount of time required to sort through the data supplied by publishers in order to winnow out the statistics that actually applied to titles under consideration. Another commented about how time consuming it is just to get data from some publishers. A third respondent voiced concern about how there is not enough "time to compile necessary data from the various areas and provide it in a useful format."

Respondents also highlighted other difficulties with the data received from publishers. For example, in some cases the amount of data was viewed as insufficient. As one respondent said, "The information we were provided with was not enough to make informed decisions." Another problem was consistency of publisher-provided data, including that gathered from journal aggregators (for example, Academic Search Premier, a full-text database). A third problem was capturing in-house use data. For example, one person "had to do a barcoding project to be able to capture in-house use." An additional problem was that data was sometimes not readily available. As one librarian said, "Nothing was terribly easy; if I wanted impact factor I had to look it up so mostly didn't." Inconvenience was a complaint, as well. Even when one librarian had all of the data desired, he or she said, "[T]he information was not in one place, and [I] had to consult multiple sources." Perhaps due to these problems, librarians expressed frustration with acquiring and using data. One librarian went so far as to say it would have been nice to have a "crystal ball" when asked, "What additional journal data would you have liked to have had?" Still another responded to the same question, "less ambiguity, incompleteness in the usage data."

Guidelines and Big Deal Packages

Two additional features are worth noting concerning the climate in which librarians were making decisions about cancellations. First, librarians' decision making was often constrained by guidelines given to them a priori concerning the process. Second, "Big Deal" packages, such as ScienceDirect or similar publisher collections, often complicated the process. Both of these factors placed boundaries on which journals could be considered and on how librarians made the decisions.

Most respondents, 70% (23), had some form of guidelines provided other than a target dollar amount (see Appendix, Table 7). Comments about what these guidelines contained varied widely, but recurring themes were the cost of journals, the level of usage, and the inability to cut journals that were part of packages. The most common theme in the comments was a requirement in the guidelines for faculty or departmental feedback (seven comments).

In general, guidelines were thought to be helpful, with 78% (19) of respondents rating them so (see Appendix, Table 8). In the comments, respondents mentioned two primary ways that guidelines helped. One was that guidelines provided metrics relating to usage and cost data. The other was that, by their mere existence, guidelines gave the librarians something objective to point to if their decisions were challenged. This overall satisfaction may also reflect the fact that subject librarians were often involved in the development of the guidelines, 64% (15) of the time (see Appendix, Table 9).

Librarians who had guidelines were, in general, satisfied with them, and those who did not were satisfied with not having them. Librarians who said they did not have guidelines provided were asked if, in retrospect, they thought guidelines would have been helpful (see Appendix, Table 10). All respondents except for one indicated that they would not have preferred guidelines, and this was reinforced by comments such as "flexibility is key" and "[additional guidelines] could have restricted subject specialists in making informed decisions." Despite this overall satisfaction with guidelines, it is worth noting that around a quarter of respondents (28%; 8) either sometimes or frequently had their subject expertise come into conflict with the guidelines provided. Another 45% (13) of respondents indicated this occurred on rare occasions (see Appendix, Table 11).

Decision-making was also sometimes complicated by the requirements of "Big Deal" packages that prohibited cutting individual titles. For 42% (13) of respondents, "Big Deal" packages were not available for cancellation. When they were available for cancellation and the librarian had input as to whether any of them should be canceled, "Big Deal" packages were just as likely to be canceled as not (see Appendix, Table 12). Restrictions on canceling these packages were brought up in the comments to many questions. Even when allowed to cancel "Big Deal" packages, some respondents found it very difficult to figure out what was eligible for cancellation. The inability to cancel "Big Deal" packages at times evoked negative emotions. In response to the question, "Were there titles that you wanted to cancel but could not?" one person answered, "Packages. #$^& packages!"

Considering journals as a whole, 14% (4) of the librarians said there were frequently titles they wanted to cancel but could not. Another 48% (14) said that this was sometimes the case, while 31% (9) said rarely and 7% (2) said never (see Appendix, Table 13). When asked specifically about journal aggregators, 53% (16) were unable to cancel these (see Appendix, Table 14). In the instances where electronic journals were available both on their own and as part of an aggregation service, 53% (16) sometimes canceled these journals. In contrast, only five librarians (17%) stated that they almost always canceled some titles that were also in aggregators (see Appendix, Table 15). When asked what factors were considered in these circumstances, seven respondents mentioned the importance of secure, long-term access to titles. Another five mentioned the importance of the title to their collections, either for research or to the faculty.

Journal aggregators, "Big Deal" packages, and a priori guidelines made the decision process more complicated for librarians, as survey data and comments revealed. Notwithstanding these complicating factors, librarians generally made purposeful decisions based on faculty feedback and other types of data, and this coincided with a high level of satisfaction with the final outcomes. Librarians expressed a high degree of confidence that the best decisions were made: 48% (14) very confident and 48% (14) somewhat confident, with only one person who was not very confident (see Appendix, Table 16). Yet one respondent summed up the process with the comment: "All that data, and all those conversations, and yet... do you ever really feel comfortable about this stuff and confident about these decisions?"

Conclusion

We looked at how life sciences librarians feel about the journal cancellation process since 2009 in the context of more readily available data and an established tradition of faculty consultation. An interesting realization was that even with consultation of faculty and more and better data, many fundamental issues in canceling journals have not changed.

Our study found that faculty input is one of the most important components of the information-gathering process during journal cancellation projects. In spite of a greater ability now to acquire more and better data concerning usage, price, and impact of journals, the desire to consult with faculty for their perspectives, preferences, and judgments is just as strong as the need for quantitative, concrete data. In fact, journals supporting faculty research were of higher priority for librarians than those supporting the curriculum. Acquiring faculty input is not without difficulty, as noted by comments from survey respondents, but some of the obstacles are library-generated, such as when cancellation projects are inconveniently timed or have very narrow windows for completion. Even though faculty feedback is not always helpful or without controversy, librarians would much rather have faculty involvement in the journal cancellation process than not.

Our survey also found that data was sometimes a problematic source of information for decision-making, even though data was valued by more librarians than personal input from any source (librarians, students, or faculty). While the majority stated that they sometimes had what they needed, follow-up questions revealed a host of complications that made getting and using the data difficult.

Some librarians could not easily turn data into decisions. Only 41% (11) of respondents indicated that there was no additional data they would have liked to have had (see Appendix, Table 4). Usage data in particular elicited comments about accuracy throughout the survey, while other respondents found that they frequently did not have easy access to the data they needed to make decisions. Both third-party vendors and internal processes were cited as barriers, and this is an area for potential future research to uncover exactly what best practices might help alleviate these concerns.

Guidelines were common and considered largely successful in the institutions where they were implemented, while librarians in the institutions that did not have guidelines were equally appreciative of their absence. In addition, in about half of the cases, librarians were unable to cancel titles in journal aggregators and "Big Deal" packages. In instances when these journals were available for cancellation, they were sometimes, but not always, canceled. Despite these challenges, librarians were confident overall that the best decisions were made.

As much as technology has changed the context of journal cancellations, the process still boils down to librarians making decisions in the midst of many challenges. Future studies could examine the experience of librarians in other disciplines to better understand the role of the librarians' subject areas. In addition, it would be interesting to conduct in-depth interviews with librarians involved in cancellation decisions to gain a better understanding of the process.

Having examined the roles of faculty consultation and data in the cancellation process, we hope that our study will inform the development of best practices that will address the concerns identified by librarians and incorporate their successful processes.

References

Bosch, S. & Henderson, K. 2014. Steps down the evolutionary road: Periodicals price survey 2014. Library Journal 139(7):32-37.

Broude, J. 1978. Journal deselection in an academic environment: A comparison of faculty and librarian choices. The Serials Librarian 3(2):147-166.

Bucknell, T. & Stanley, T. 2002. Design and implementation of a periodicals voting exercise at Leeds University Library. Serials 15(2):153-159.

Business Cycle Dating Committee. 2008. Determination of the December 2007 peak in economic activity [Internet]. National Bureau of Economic Research. Available from: http://www.nber.org/cycles/dec2008.html

Carey, R., Elfstrand, S. & Hijleh, R. 2005. An evidence-based approach for gaining faculty acceptance in a serials cancellation project. Collection Management 31(1/2):59-72.

Clement, S., Gillespie, G., Tusa, S. & Blake, J. 2008. Collaboration and organization for successful serials cancellation. The Serials Librarian 54(3-4):229-234.

Creaser, C. & White, S. 2008. Trends in journal prices: An analysis of selected journals, 2000-2006. Learned Publishing 21(3):214-224.

Curran, R. 2008. U.S. entered a recession a year ago, NBER says. Wall Street Journal [Internet]. Available from: http://online.wsj.com/news/articles/SB122815252673269395

Dess, H.M. 1998. Gauging faculty utilization of science journals. Science & Technology Libraries 16(3-4):171-190.

Dewland, J. & Minihan, J. 2011. Collective serials analysis: The relevance of a journal in supporting teaching and research. Technical Services Quarterly 28(3):265-282.

Enssle, H.R. & Wilde, M.L. 2002. So you have to cancel journals? Statistics that help. Library Collections, Acquisitions, and Technical Services 26(3):259-281.

Foudy, G. & McManus, A. 2005. Using a decision grid process to build consensus in electronic resources cancellation decisions. Journal of Academic Librarianship 31(6):533-538.

Galbraith, B. 2002. Journal retention decisions incorporating use-statistics as a measure of value. Collection Management 27(1):79-90.

Gallagher, J., Bauer, K. & Dollar, D.M. 2005. Evidence-based librarianship: Utilizing data from all available sources to make judicious print cancellation decisions. Library Collections, Acquisitions, & Technical Services 29(2):169-179.

Garczynski, J. 2011. Making the cut: Do faculty want to be involved in library database cancellations? Practical Academic Librarianship: The International Journal of the SLA Academic Division 1(1):16-27.

Kennedy, K., Cataldo, T.T., Davis, V., Gonzalez, S.R. & Newsom, C. 2008. Evaluating continuing resources: Perspectives and methods from science librarians. The Serials Librarian 55(3):428-443.

Laakso, M. & Bjork, B.C. 2012. Anatomy of open access publishing: A study of longitudinal development and internal structure. BMC Medicine 10:9.

Li, X. & Kopper, C. 2005. Cancellation of print journals in the electronic era: A case study. Against the Grain 17(6):1-1, 18, 20, 22.

Nixon, J.M. 2010. A reprise, or round three: Using a database management program as a decision-support system for the cancellation of serials. The Serials Librarian 59(3-4):302-312.

Pinfield, S., Salter, J., Bath, P.A., Hubbard, B., Millington, P., Anders, J.H.S. & Hussain, A. 2014. Open-access repositories worldwide, 2005-2012: Past growth, current characteristics, and future possibilities. Journal of the Association for Information Science and Technology.

Richards, D.T. & Prelec, A. 1993. Serials cancellation projects: Necessary evil or collection assessment opportunity? Journal of Library Administration 17(2):31-45.

Rupp-Serrano, K., Robbins, S. & Cain, D. 2002. Canceling print serials in favor of electronic: Criteria for decision making. Library Collections, Acquisitions, and Technical Services 26(4):369-378.

Sinha, R., Tucker, C. & Scherlen, A. 2005. Finding the delicate balance: Serials assessment at the University of Nevada, Las Vegas. Serials Review 31(2):120-124.

Spencer, J.S. & Millson-Martula, C. 2006. Serials cancellations in college and small university libraries: The national scene. The Serials Librarian 49(4):135-155.

Sprague, N. & Chambers, M.B. 2000. Full text databases and the journal cancellation process: A case study. Serials Review 26(3):19-31.

Staggs-Neel, J. 2006. A serials cancellation project at the University of Kentucky. Journal of Agricultural & Food Information 7(2/3):57-63.

Tucker, B.E. 1995. The journal deselection project: The LSUMC-S experience. Library Acquisitions: Practice and Theory 19(3):313-320.

Walter, P.L. 1990. Doing the unthinkable: Cancelling journals at a research library. The Serials Librarian 18(1-2):141-153.

Ward, R.K., Christensen, J.O. & Spackman, E. 2006. A systematic approach for evaluating and upgrading academic science journal collections. Serials Review 32(1):4-16.

Ware, M. 2006. ALPSP Survey of Librarians on Factors in Journal Cancellation. Worthing: Association of Learned and Professional Society Publishers.

Appendix

Table 1. What did you take into account when you were making decisions? Select all that apply.

Answer Options                                    Response Percent   Response Count
Librarian subject expertise                       67%                22
Faculty concerns                                  85%                28
Student concerns                                  36%                12
Cost of journal                                   88%                29
Usage data                                        91%                30
Format                                            64%                21
Impact factor, or other indicators of prestige    58%                19
Core title lists                                  33%                11
Other                                             12%                4
answered question                                                    33

Table 2. How did curriculum and faculty research influence decisions?

Answer Options     Major factor   Minor factor   Not a factor   Response Count
Curriculum         45% (14)       52% (16)       3% (1)         31
Faculty research   91% (29)       6% (2)         3% (1)         32
answered question                                               32

Table 3. What journal data did you have to inform your decisions? Select all that apply.

Answer Options                                    Response Percent   Response Count
Usage data                                        88%                28
Cost data                                         91%                29
Faculty feedback                                  63%                20
Impact factor, or other indicators of prestige    53%                17
I did not need any data                           3%                 1
Other                                             13%                4
answered question                                                    32

Table 4. What additional journal data would you have liked to have had? Select all that apply.

Answer Options                                    Response Percent   Response Count
Usage data                                        26%                7
Cost data                                         7%                 2
Faculty feedback                                  15%                4
Impact factor, or other indicators of prestige    4%                 1
None                                              41%                11
Other                                             41%                11
answered question                                                    27

Table 5. In general, did you feel that you had sufficient time to work on this project?

Answer Options      Response Percent   Response Count
Yes                 66%                19
No                  35%                10
answered question                      29

Table 6. Did you feel you had the data you needed to make decisions for the journal cancellation project?

Answer Options      Response Percent   Response Count
Usually             58%                18
Sometimes           36%                11
Rarely              7%                 2
answered question                      31

Table 7. Were guidelines (other than a target dollar amount) provided for making cancellation decisions?

Answer Options      Response Percent   Response Count
Yes                 70%                23
No                  30%                10
answered question                      33

Table 8. Did you find the guidelines to be helpful?

Answer Options      Response Percent   Response Count
Yes                 78%                18
No                  22%                5
Why or why not?                        15
answered question                      23

Table 9. Were subject librarians involved in the development of the guidelines?

Answer Options      Response Percent   Response Count
Yes                 64%                14
No                  36%                8
Other                                  3
answered question                      22

Table 10. Would guidelines have been helpful?

Answer Options      Response Percent   Response Count
Yes                 14%                2
No                  86%                12
Why or why not?                        12
answered question                      14

Table 11. If guidelines were provided, did your subject expertise come into conflict with the guidelines?

Answer Options            Response Percent   Response Count
Always                    0%                 0
Frequently                10%                3
Sometimes                 17%                5
Rarely                    45%                13
Never                     3%                 1
Guidelines not provided   24%                7
Please explain                               4
answered question                            29

Table 12. What input did you have in making decisions about Big Deal Packages? (For example, ScienceDirect or similar publisher collections.) Select all that apply.

Answer Options                                          Response Percent   Response Count
Big Deal Packages were canceled with my input           13%                4
Big Deal Packages were canceled without my input        7%                 2
Big Deal Packages were retained with my input           13%                4
Big Deal Packages were retained without my input        13%                4
Big Deal Packages were not available for cancellation   42%                13
Other                                                   29%                9
answered question                                                          31

Table 13. Were there titles that you wanted to cancel but could not?

Answer Options      Response Percent   Response Count
Always              0%                 0
Frequently          14%                4
Sometimes           48%                14
Rarely              31%                9
Never               7%                 2
Please explain                         8
answered question                      29

Table 14. What input did you have in making decisions about journal aggregators? (For example, Academic Search Premier, which is a full text database.) Select all that apply.

Answer Options                                            Response Percent   Response Count
Journal aggregators were canceled with my input           3%                 1
Journal aggregators were canceled without my input        3%                 1
Journal aggregators were retained with my input           13%                4
Journal aggregators were retained without my input        20%                6
Journal aggregators were not available for cancellation   53%                16
Other                                                     23%                7
answered question                                                            30

Table 15. If a journal were available as both a subscription from a publisher and as part of a journal aggregator database (such as Academic Search Premier), how did that influence your decision about the title?

Answer Options                                                           Response Percent   Response Count
Almost always canceled subscribed titles that were also in aggregators   17%                5
Sometimes canceled subscribed titles that were also in aggregators       53%                16
Almost never canceled subscribed titles that were also in aggregators    3%                 1
Presence in an aggregator was not a consideration                        7%                 2
Other                                                                    20%                6
answered question                                                                           30

Table 16. Rate your confidence that the best decisions were made for your subject area(s).

Answer Options       Response Percent   Response Count
Very confident       48%                14
Somewhat confident   48%                14
Not very confident   3%                 1
answered question                       29


This work is licensed under a Creative Commons Attribution 4.0 International License.