Using Evidence in Practice

 

Increasing Objectivity in eResource Selection Using a Priority Matrix

 

 

Megan L. Anderson

Research & Curriculum Librarian

Library and Media Services         

Fanshawe College

London, Ontario, Canada

Email: manderson@fanshawec.ca

 

Linda L. Crosby

Research & Curriculum Librarian

Library and Media Services

Fanshawe College

London, Ontario, Canada

Email: lcrosby@fanshawec.ca

 

Received: 28 Aug. 2018                                                                  Accepted: 12 Nov. 2018

 

 

© 2018 Anderson and Crosby. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

DOI: 10.18438/eblip29499



 

Setting

 

Library and Media Services (LMS) at Fanshawe College is located in London, Ontario, Canada. LMS is an academic library providing a variety of services and resources to approximately 14,000 full-time equivalent (FTE) students and 2,800 faculty and staff. A significant number of students attend regional campuses without on-site library services, which increases the need for a strong eResource collection. There are three faculty librarians at this college, two of whom investigated the use of a priority matrix for eResource selection and renewal.

 

Problem

 

Librarians at Fanshawe College faced a major dilemma. A significant eResource budget cut, combined with a depressed Canadian dollar, made it impossible to retain all the databases in the collection. The ensuing decision-making process left the librarians repeatedly fighting their collection management instincts. The process was challenging, in part because each librarian had her own emotional investment in particular databases. The librarians believed there must be a way to objectively assess which databases should be retained or added to the collection. This objectivity is vitally important because, as Walters (2016) explains, “regardless of the library’s . . . selection model, collection development librarians must be able to explain their decisions to librarians, faculty, and administrators with primary interests in areas other than collection development” (p. 10).

 

The librarians were also curious to see if their instincts aligned with an objective, rational review of the data. The key to solving this problem was to find or create a tool that allowed eResource decisions to be made easily and systematically. A priority matrix format had already proven successful at this library when applied to other projects, so the librarians decided to see if the format could also be applied successfully to eResource collection management.

 

Evidence

 

The evidence component of this project was twofold: a literature review informed decision-making during development of the matrix, and local and vendor data were used in the matrix to rank existing eResource subscription products. The literature provided an excellent starting point for determining which factors were important to consider in this evaluation. The investigating librarians were quite familiar with the consideration and application of indicators such as usage statistics, given that this type of evaluation is “focused on demand, as indicated by usage” (Kohn, 2013, p. 89). Local data included information such as the number of students registered in a program. Vendor data provided content and coverage details. Concrete criteria, as opposed to the more abstract concepts upon which the librarians might have relied instinctively, were also discovered. For example, Walters (2016) focuses on the idea of brand recognition when stating that “relevant papers . . . will be found only if the patron first recognizes that the online resource . . . has a reasonable chance of including relevant works” (p. 13).

 

Implementation

 

The process began with an environmental scan, including a survey of electronic mail lists and completion of a literature review. Ideally, the investigating librarians hoped a “plug and play” solution already existed. After the search yielded no promising results, they resolved to create a priority matrix. Microsoft (MS) Excel seemed like a natural solution, as it supports mathematical formulas, can be customized, and is cost effective.

 

The next step was to compile a list of appropriate criteria. Table 1 lists the selected weighted criteria. “Frequency of course offering,” an unweighted criterion, is reserved for use when a resource is at risk of cancellation. At that point, the librarians need to review how often the course is offered because it affects usage statistics, particularly for very specific and specialized eResources such as QuickLaw.

 

 

Table 1

Weighted Criteria

Criteria                                          Weight
Content                                           x10
Required Resource                                 x10
Cost Sharing                                      x10
Cost                                              x8
# of Applicable Programs                          x8
Cost per Expected User                            x8
Currency of Content                               x8
Licensing & Authentication                        x6
Ease of Use                                       x6
Overlap of Content                                x6
Depth of Coverage                                 x6
Opportunity Cost                                  x4
Vendor Support                                    x2
Perpetual Access                                  x1
Brand Recognition                                 x1
% of Budget Assigned to Applicable School(s)      x1

 

 

Table 2

Priority Matrix Weights and Rationales

Priority

The priority number for a particular resource is calculated after the resource has been put through the matrix.

Content (x10)

Content of a particular resource is one of the most important factors, if not the most important factor, in determining a resource’s value. Our beliefs on this particular criterion were reinforced by Mangrum and Pozzebon’s 2012 studya and Walters’ 2016 articleb. As such, this criterion was assigned the top possible value score of 10.

Required Resource (x10)

Resources required for programs to maintain accreditation are naturally more important than others, and this criterion was therefore assigned a value score of 10.

Cost Sharing (x10)

Given the current economic climate, the amount of money a program or school is able to contribute to a resource heavily influences our ability to make a purchase, resulting in this criterion being assigned a value score of 10.

Cost (x8)

Cost is one of the most important considerations when reviewing potential purchases; however, it is not one of the top considerations, so it was assigned a value score of 8.

# of Applicable Programs (x8)

The number of programs that may find a particular resource useful speaks directly to value for money. Something may have a low initial cost but little usefulness, and therefore low value for money. This is as important as the initial cost, so it was also assigned a value score of 8.

Cost per Expected User (x8)

The cost per expected user of a particular resource is as important as the overall cost and likewise speaks to value for money. Some resources are specialized, and it is not reasonable to compare their usage statistics to those of resources intended for a more general audience; this criterion should create a more equitable playing field. It has been assigned a value of 8.

Actual Cost per Use (x8)

The number of uses a resource receives requires further context. For example, a resource may have 1,000 uses at only $0.02 per use, or 100 uses at $3.50 per use. This further contextualization allows accurate assessment of value for money and return on investment. This criterion has been assigned a value of 8, in line with the weight of the other cost criteria.

Currency of Content (x8)

Currency of content is almost as important as overall content. While a database may have many title holdings, it is important to consider how current the content is; for example, heavily embargoed resources are not particularly useful and reduce the value of the resource. A value of 8 has been assigned to this criterion.

Licensing & Authentication (x6)

Licensing, including permitted use, and authentication method are important as they influence the usability of a particular resource.  This criterion has been assigned a value of 6.

Ease of Use (x6)

Patrons are more likely to make use of a database that is intuitive and user friendly. To that end, this is a relatively important criterion, but since learning how to use databases is part of a college education, the value is lower than it would be in other types of libraries. As such, this criterion has been assigned a value of 6.

Overlap of Content (x6)

It is important to consider how much the content of a resource overlaps with content in the existing collection, both print and electronic, to ensure we are not paying for the same resource twice unless it is justified.  To reflect this, a value of 6 has been assigned.

Depth of Coverage (x6)

Backfiles, and their relative importance, vary by database and discipline, which is why this criterion has been assigned a mid-range value of 6.

Opportunity Cost (x4)

What would the cost to the library be if we had to buy all of the relevant content individually, rather than as part of the database package? This is important to consider, but not as important as many other factors and therefore has been assigned a value of 4.

Vendor Support (x2)

It is important to note how many technology-based incidents are associated with a particular database. However, how many such incidents will be tolerated depends largely on other, more heavily weighted criteria, and for this reason this criterion has been assigned a value of 2.

Perpetual Access (x1)

Lack of perpetual access is certainly not a deal breaker; however, it is an additional value that should be considered. It was assigned a value of 1 to reflect this.

Brand Recognition (x1)

As per Walters, “Relevant papers…will be found only if the patron first recognizes that the online resource…has a reasonable chance of including relevant works.”c This criterion has been assigned a value of 1.

% of Budget Assigned to Applicable School(s) (x1)

The percentage of the overall budget assigned to the applicable school(s) must be considered to ensure that all schools are being equitably represented in library holdings.

Frequency of Course Offering

This criterion is not weighted and is not routinely used in assessing resources. Its use should be limited to resources at risk of cancellation, as the frequency of course offerings may influence the use, or lack thereof, of particular resources.

a Mangrum & Pozzebon, 2012.

b Walters, 2016.

c Walters, 2016, p. 13.

 

 

After compiling the list, the investigating librarians assigned a weight to each criterion to ensure that the relative importance of each was considered. For example, if a database package is near-perfect in terms of content, should IP authentication, or lack thereof, dissuade collections librarians from making a purchase or renewing a subscription? Weighting each criterion avoids situations in which a less important criterion overrules a more important one and skews decision-making. The weights and associated rationales are found in Table 2. Settling on the criteria weighting was the last step before building the matrix in MS Excel.
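To make the weighted calculation concrete, the scoring step can be expressed in a few lines of code. The sketch below is illustrative only, since the matrix itself lives in an MS Excel workbook: the criteria and weights are those of Tables 1 and 2 (including the later-added “Actual Cost per Use”), while the function name and the 0-4 raw score range (as used in the Table 4 sample) are assumptions.

```python
# A minimal sketch of the Results worksheet calculation. Criterion names
# and weights follow Tables 1 and 2; the 0-4 raw score range follows the
# sample in Table 4. Function and variable names are illustrative.

WEIGHTS = {
    "Content": 10, "Required Resource": 10, "Cost Sharing": 10,
    "Cost": 8, "# of Applicable Programs": 8, "Cost per Expected User": 8,
    "Actual Cost per Use": 8, "Currency of Content": 8,
    "Licensing & Authentication": 6, "Ease of Use": 6,
    "Overlap of Content": 6, "Depth of Coverage": 6,
    "Opportunity Cost": 4, "Vendor Support": 2,
    "Perpetual Access": 1, "Brand Recognition": 1,
    "% of Budget Assigned to School": 1,
}

def weighted_total(raw_scores: dict) -> int:
    """Multiply each 0-4 raw score by its criterion weight and sum."""
    return sum(WEIGHTS[name] * score for name, score in raw_scores.items())
```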

 

One of the investigating librarians created an MS Excel spreadsheet that contains six worksheets: Evaluation; Results; Criteria Description; Criteria Weighting Rationale; Charts; and Database Data. The Priority Matrix then went live on November 1, 2016.

 

Evaluation: The collections librarians determine scores for each criterion for each eResource and enter the data into this worksheet. Scores are determined collectively if a resource is multidisciplinary; if a resource is discipline-specific, the librarian responsible for collections within that discipline establishes the score.

 

Results: Scores for each resource are automatically populated from the Evaluation worksheet and auto-calculated according to weight. Each resource is then assigned a rating of one to four, which determines the decision made. An explanation of the decisions is found in Table 3.

 

 

Table 3

Purchase or Renewal Decisions

Rating    Decision
1         High priority purchase / renewal; robustly meets all requirements
2         Generally meets all requirements; purchase / renew if funds available
3         Meets minimal requirements; purchase / renew with caution
4         Does not meet basic requirements; do not purchase / renew
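Table 3’s rating-to-decision mapping can be sketched in code as well. Note that the article does not specify how a weighted total is converted into the one-to-four rating inside the workbook, so the quartile cut-offs below are purely an assumption for illustration. With these assumed cut-offs, Sample Resource A’s total of 308 out of a maximum of 412 falls just under the top quartile and lands at rating 2, matching Table 4, but the workbook’s actual thresholds may differ.

```python
# Decisions follow Table 3 verbatim. The total-to-rating thresholds are
# NOT given in the article; the quartile cut-offs here are assumptions.

DECISIONS = {
    1: "High priority purchase / renewal; robustly meets all requirements",
    2: "Generally meets all requirements; purchase / renew if funds available",
    3: "Meets minimal requirements; purchase / renew with caution",
    4: "Does not meet basic requirements; do not purchase / renew",
}

MAX_TOTAL = 4 * sum(WEIGHTS.values())  # 412 with the weights sketched above

def rating(total: int) -> int:
    """Assumed quartile mapping: higher totals earn better (lower) ratings."""
    fraction = total / MAX_TOTAL
    if fraction >= 0.75:
        return 1
    if fraction >= 0.50:
        return 2
    if fraction >= 0.25:
        return 3
    return 4
```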

 

 

Criteria Description: This worksheet defines each criterion and describes what to look for when assigning a score.

 

Criteria Weighting Rationale: This worksheet lists each criterion, the weight assigned to it, and the rationale behind each weight assignment.

 

Charts: This worksheet takes the data generated in the Evaluation worksheet and displays it graphically rather than numerically for optimal visual representation.

 

Database Data: The eResource Specialist proactively inputs raw database data, such as cost, usage, and cost sharing, needed by the librarians to make their retention and selection decisions. 
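The Database Data worksheet’s raw and derived fields can be modeled as in the hypothetical sketch below; this is not the workbook’s actual layout. The field names follow the rows of Table 4, and the two derived properties mirror the worksheet’s per-user and per-use calculations.

```python
from dataclasses import dataclass

# A hypothetical model of one Database Data entry (field names follow
# Table 4); the derived properties mirror the worksheet's calculations.

@dataclass
class DatabaseData:
    cost: float            # annual subscription cost ($)
    cost_sharing: float    # contribution from a program or school ($)
    expected_users: int    # students in applicable programs
    actual_use: int        # usage count, COUNTER-compliant where possible

    @property
    def cost_per_expected_user(self) -> float:
        return self.cost / self.expected_users

    @property
    def actual_cost_per_use(self) -> float:
        return self.cost / self.actual_use
```

Using the Table 4 sample (cost $27,363, 2,637 expected users, 16,879 uses), the two properties give approximately $10.38 and $1.62, matching the worksheet.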

 

The final step was to present the product to the Senior Manager and the non-investigating librarian colleague. An example of a completed priority matrix and ranking, such as that found in Table 4, was included in this presentation.

 

Outcome

 

The Priority Matrix has been in use since November 1, 2016, as ad hoc renewals have come in. Use of the matrix revealed the need for minor tweaks, three of which are of note. While “Cost per Expected User” was included in the initial criteria, “Actual Cost per Use” had inadvertently been omitted from the list. “Actual Cost per Use” is, of course, of tremendous importance, so it was added to the criteria and assigned a weight of eight. The investigating librarians also quickly realized that two Priority Matrices are necessary: one for renewal and retention of databases, and one for new subscriptions. This is a critical differentiation, since a criterion such as “Actual Cost per Use” is not available for, and should not be applied to, a potential new resource. Additionally, the investigating librarians reworded some criteria descriptions to make their scope more encompassing and applicable when evaluating non-traditional databases such as SimplyAnalytics or Statista.

 

 

Table 4

Sample Completed Priority Matrix and Ranking

Database Data Worksheet                  Sample Resource A
Cost Sharing                             0
Cost                                     $27,363
Expected Users                           2,637
Cost per Expected User                   $10.38
Actual Use                               16,879
Actual Cost per Use                      $1.62
Depth of Coverage                        1977-
Vendor Support                           No issues
Perpetual Access                         N
% of Budget Assigned to School           22%

Evaluation Worksheet                     Sample Resource A
Content                                  4
Required Resource                        3
Cost Sharing                             0
Cost                                     2
# of Applicable Programs                 4
Cost per Expected User                   2
Actual Cost per Use                      4
Currency of Content                      3
Licensing & Authentication               4
Ease of Use                              3
Overlap of Content                       4
Depth of Coverage                        4
Opportunity Cost                         4
Vendor Support                           4
Perpetual Access                         0
Brand Recognition                        0
% of Budget Assigned to School           4

Results Worksheet                        Sample Resource A
Priority                                 2
Renew / Cancel                           R
Content                                  40
Required Resource                        30
Cost Sharing                             0
Cost                                     16
# of Applicable Programs                 32
Cost per Expected User                   16
Actual Cost per Use                      32
Currency of Content                      24
Licensing & Authentication               24
Ease of Use                              18
Overlap of Content                       24
Depth of Coverage                        24
Opportunity Cost                         16
Vendor Support                           8
Perpetual Access                         0
Brand Recognition                        0
% of Budget Assigned to School           4
Total                                    308
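As a check on the arithmetic, the scoring figures in Table 4 can be reproduced with the weighted_total and rating sketches above; the raw scores below are those of the Evaluation worksheet, and the names remain illustrative.

```python
# Reproducing Sample Resource A from Table 4 with the sketches above.
sample_a = {
    "Content": 4, "Required Resource": 3, "Cost Sharing": 0, "Cost": 2,
    "# of Applicable Programs": 4, "Cost per Expected User": 2,
    "Actual Cost per Use": 4, "Currency of Content": 3,
    "Licensing & Authentication": 4, "Ease of Use": 3,
    "Overlap of Content": 4, "Depth of Coverage": 4, "Opportunity Cost": 4,
    "Vendor Support": 4, "Perpetual Access": 0, "Brand Recognition": 0,
    "% of Budget Assigned to School": 4,
}

assert weighted_total(sample_a) == 308  # Results worksheet total
assert rating(308) == 2                 # Priority 2: renew if funds available
```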

 

 

Since implementation, the librarians have held an annual eResource Collection meeting at which all existing subscriptions, as well as desired additions, are evaluated using the Priority Matrix. The librarians pass these decisions on to the eResource Specialist, who acquires, renews, or cancels resources accordingly. The investigating librarians monitored the application of the matrix over the following year to enhance and refine it whenever necessary or possible. The possibility of applying this same approach to other resource types, such as streaming media collections, will be explored in the future. Using the matrix for decisions is a welcome change to the process: it allows for more efficient decision-making and increases the ability to articulate contentious collections decisions in a manner that is clear to practitioners and non-practitioners alike.

 

Reflection

 

The addition of evidence into eResource collections decisions was challenging in some ways, yet relatively simple in others. The librarians already used a significant amount of evidence, but not in a uniform or consistent manner. Additionally, many of the evidence-based decisions made prior to the implementation of the matrix were made at an instinctual level, so the challenge lay in slowing down the process and identifying which pieces of evidence were being used intuitively. Vendor-supplied data presented some challenges, as the type of data tracked and supplied to the library is not consistent between vendors. COUNTER-compliant statistics were used whenever possible to “compare data received from different publishers and vendors” (COUNTER, 2018, para. 3).

 

Conclusion

 

A failing Canadian dollar and a declining eResources budget compelled the librarians at Fanshawe College to address the way eResource selection and retention decisions were made. Additionally, the librarians needed to be able to articulate appropriately to non-library-science practitioners why new resources could not be added and existing resources were being eliminated. By reviewing the literature and applying local and vendor data in a consistent manner, the librarians could make objective decisions rather than relying on their instincts. After applying the matrix, it became clear that the librarians’ instincts were actually fairly consistent with what the hard data demonstrated, as there were no significantly unexpected outcomes in terms of retention decisions. Its application did, however, lead one librarian to realize she was continuing to advocate for a database that, despite being a good fit for the needs of a particular program, was simply not being used. Furthermore, the matrix allowed the librarians to demonstrate to non-library-science practitioners that the budget is at its bare minimum and that further cuts would decimate the collection.

 

 

References

 

COUNTER. (2018). About COUNTER. Retrieved from https://www.projectcounter.org/about

 

Kohn, K. C. (2013). Usage-based collection evaluation with a curricular focus. College & Research Libraries, 74(1), 85-97. https://doi.org/10.5860/crl-295

 

Mangrum, S., & Pozzebon, M. E. (2012). Use of collection development policies in electronic resource management. Collection Building, 31(3), 108-114. https://doi.org/10.1108/01604951211243506

 

Walters, W. H. (2016). Evaluating online resources for college and university libraries: Assessing value and cost based on academic needs. Serials Review, 42(1), 10-17. https://doi.org/10.1080/00987913.2015.1131519