Research Article

 

Sharing Success: A Review of Strategic Planning, Annual Reports, and Publicly Available Information from Academic Libraries

 

Kaitlin Springmier

Instruction & Learning Assessment Librarian
Sonoma State University Library

Rohnert Park, California, United States of America

Email: kaitlin.springmier@sonoma.edu

 

Elizabeth Edwards

Assessment Librarian

University of Chicago Library

Chicago, Illinois, United States of America

Email: eee@uchicago.edu

 

Michelle B. Bass

Population Research Librarian

Lane Medical Library & Knowledge Management Center

Stanford University

Stanford, California, United States of America

Email: michellebbass@stanford.edu

 

Received: 14 July 2017    Accepted: 8 Mar. 2018

 

 

© 2018 Springmier, Edwards, and Bass. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

DOI: 10.18438/eblip29316

 

 

Abstract

 

Objective – This paper reports on a study which explored web-based information sharing practices in North American academic libraries. This study specifically focused on how selected academic libraries use data, reports, and other strategic planning documents to communicate success and demonstrate impact to stakeholders, administrators, and peers.

 

Methods – An environmental scan was conducted to explore the assessment programs and communication practices of 97 North American academic libraries. The population for this study was identified on the basis of several metrics: consortial membership, Association of Research Libraries (ARL) ranking on various criteria, and institutional attendance at the 2014 and 2016 Library Assessment Conferences (LAC). Researchers conducted content analyses on the websites of the 97 libraries to identify measures of institutional support for assessment and to explore the range, depth, and quality of data made available. These iterative analyses were supported by the use of a rubric developed based on emergent criteria observed during multiple phases of review.

 

Results – Of the libraries reviewed, 57% made some form of data available to the public. The most robust and effective practice observed in this study was the use of data to tell stories about the library and its impact. While this study found a positive correlation between libraries with clear investments in assessment and their use of data in public documents, other factors, such as an institution’s consortial memberships or funding model, may more strongly influence a library’s decision to make data available.

 

Conclusions – While observations gleaned from this study may serve as a benchmark for evaluating communication practices in academic libraries, further research is necessary to understand how factors within an academic library, its parent institution, or the profession at large may contribute to this decision making process.

 

 


 


 

Introduction

 

In 2016, a group of librarians at the University of Chicago Library conducted an environmental scan to learn how academic libraries were sharing information about their assessment programs on their websites. While conducting the scan, project members were struck by the myriad ways libraries were (or were not) using data in public-facing communications. As a result, the focus of this project evolved to explore how libraries use data, reports, and other strategic planning documents to communicate success and demonstrate impact to stakeholders, administrators, and peers.

 

Literature Review

 

Formative work by McClure and Samuels shows that transparent, evidence-based decision making contributes to a more productive, innovative library. In their 1985 study, McClure and Samuels found that academic librarians had an overwhelming preference for internal information sources and tended to ignore external information (such as patron preferences) during the decision making process. The authors claimed that these internally based decisions encourage ineffective library activity because they create a “‘closed,’ inflexible environment” (p. 496) that is unable to adapt to the changing needs of library clientele. McClure continued this work in 1986, focusing on the need for staff to have a clear understanding of performance measure data for library planning and decision making. In interviews with academic public services librarians in middle management positions, he found that these librarians were unlikely to use data for library decision making because they neither trusted the quality of the data nor were motivated to make effective use of it. McClure claimed that a library must have an “organizational system in place that recognizes the interactive aspects of policy making, encourages wide access to the data, and recognizes that empirical data are used in a much broader psychological context of organizational politics, personalities, and conflicting objectives” (p. 333). While these studies are almost 30 years old, they continue to inform the scholarship on the use of information and data in decision making within libraries (Koufogiannakis, 2014), and many of their main conclusions about library communication and decision making hold true today.

 

Assessment may feel like the latest trend in libraries, but the use of data to inform library service decisions is well documented in the literature (Lundstrom, Martin, & Cochran, 2016; Manzuch & Maceviciute, 2014; Paulus, 2014; Seago, Schlesinger, & Hampton, 2002; Van House, 1989). In fact, Van House (1989) documents examples of the use of performance or output measures dating back to the early 1970s. More recently, Paulus (2014) collected data on questions asked of roving library workers and at service desks to inform staffing levels, hours, and roving locations in the library’s new learning commons. Lundstrom and colleagues (2016) partnered with academic departments to map research-related student outcomes; they made significant changes in programming after determining that existing levels of library integration were failing.

 

It is less clear what might motivate a library to make its data available to an external audience, particularly data related to strategic decision making. In their review of Association of Research Libraries (ARL) institutions’ use of data in assessment and strategic planning, Lewin and Passonneau (2012) gave examples of productive ways libraries used assessment to improve service to their stakeholders and parent institutions. Saunders (2016) performed content analyses on libraries’ strategic directions documents and found that slightly over 25% of the libraries in her cohort “integrated explicit performance related metrics into their strategic plans” (p. 10). Given that libraries have been using a range of data for decision making for several decades, it is worth measuring the profession’s progress towards making these data and the resulting decisions available to their stakeholders.

 

Aims

 

This project explored the hypothesis that an academic library’s demonstrated commitment to assessment would correlate with the web-based presence of data or other strategic documents demonstrating institutional progress or data-driven decision making. Content analyses of web-based documents made available by 97 North American academic libraries only partially supported this hypothesis; other characteristics of academic libraries appeared to be more closely correlated with the presence of data and other reports on their websites.

 

Methods

 

Research Design

 

This study was conducted between May 2016 and June 2017 by researchers affiliated with the University of Chicago Library.

 

A total of 97 North American academic research libraries comprised the population for this study. This population was identified on the basis of several metrics: consortial membership, ARL ranking, and institutional representation at the 2014 and 2016 Library Assessment Conferences (LAC).

 

At the time of this study, the University of Chicago Library was a member of two consortia: the Ivy Plus Libraries (Ivy Plus) and the Big Ten Academic Alliance (BTAA). Consortial relationships are frequently the basis of institutional comparisons; however, the researchers felt it appropriate to expand the pool beyond geography or institutional stature with other metrics. The researchers reviewed rankings data made available by ARL for fiscal years 2014 and 2015, and selected the following criteria as potentially meaningful for comparison: Investment Index ranking, total library expenditures, and total staff. Institutions ranked in the five positions above and below the University of Chicago Library were added to the population.

 

Finally, researchers looked to the attendee lists from recent Library Assessment Conferences (LAC) to identify other types of institutions with a demonstrated commitment to assessment. Given the costs associated with sending staff to a national conference, the researchers felt that the expenditures associated with sending three or more staff members could be taken as an indication of institutional commitment to assessment. Attendee rosters were procured from the LAC websites for the 2014[1] and 2016[2] conferences.

 

These three sets of criteria yielded a population of 97 institutions, with several institutions included on the basis of multiple criteria.

 

Methodology

 

The data for this study were collected between May 2016 and April 2017 through environmental scanning. Content analysis was supported by the development of a rubric which was itself based on themes that emerged from the data.

 

As a method, “environmental scanning is not just a source of data on the external world...it provides that backdrop against which internal values may be clarified” (Mitchell & Witthus, 1991, p. 162). The first phase of the environmental scan tested the study’s hypothesis by seeking evidence of library-based assessment programs and public-facing data through a content analysis, or “a systematic and iterative review,” as described by Saunders (2015, p. 287). Researchers reviewed websites and other public-facing documents made available by the institutions in the population, seeking evidence of an established assessment program; each researcher reviewed about one-third of the included institutions. Evidence of assessment programs included the word ‘assessment’ or ‘evaluation’ in a job title or in the name of a department or committee; it could also take the form of an assessment webpage, LibGuide, or stand-alone website, as described by Lewin and Passonneau (2012, pp. 89-90). If evidence was not found by browsing a library’s website, researchers would use a site search utility (if available) to search for terms like ‘assessment’ or ‘evaluation’ on the website or in the library’s staff directory. Any evidence identified was recorded; researchers also noted names of personnel holding assessment-type positions, where applicable.

 

Researchers also looked for examples of public-facing data on the libraries’ websites. Public-facing data included annual reports, strategic planning documents, visualizations, infographics, or fact sheets used to benchmark library progress, success, or impact to external stakeholders. If evidence was not found by browsing a library’s website, researchers would use a site search utility (if available) to search for terms like ‘report’, ‘annual report’, or ‘strategic plan’ on the website. If no evidence was found by searching the library’s website, researchers would repeat these searches in combination with the word ‘library’ or the library’s name using a site search utility (if available) on the parent institution’s website. Any examples of these data or of other forms of strategic communication were recorded and linked to data collected for the library’s assessment program (if applicable).

 

The researchers then reviewed the data found in order to better understand how the previously identified public-facing data were being used in library communications. Because the population had a diverse range of communication practices, a rubric (see Appendix) was created using grounded theory, a methodology that embraces “the discovery of theory from data systematically obtained from social research” (Glaser & Strauss, 2006, p. 3). During an initial scan, the researchers’ comparative analysis generated five conceptual categories that appeared to be indicators for excellence in data reporting: Accessibility, or Ease of Access[3] (e.g., reports are easily found on the library’s website), Communication (e.g., information is clear and without jargon), Data (e.g., reports include quantitative or qualitative data), Documentation (e.g., reports are up-to-date and publicly available), and Reporting (e.g., evidence of and access to historical reports). These criteria were chosen because the researchers were interested in investigating libraries’ varied use of qualitative and quantitative data in external communication; because 47% of the population included no quantitative or qualitative data in their reports, the rubric also needed to capture other methods by which libraries were communicating success. Each criterion was worth up to 3 points, allowing each institution to receive a maximum of 15 points.
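The study does not publish its scoring worksheet; as a rough illustration of the arithmetic described above, the following minimal Python sketch converts one hypothetical institution’s ratings into the 15-point total. The mapping of Excellent/Good/Satisfactory/Needs Improvement to 3/2/1/0 points is an assumption made here for illustration only, since the article states simply that each criterion was worth up to 3 points.

# Hypothetical scoring sketch: converts rubric ratings to a 0-15 total.
# The 3/2/1/0 mapping is assumed; the study reports only "up to 3 points"
# per criterion and a 15-point maximum.
POINTS = {"Excellent": 3, "Good": 2, "Satisfactory": 1, "Needs Improvement": 0}

CRITERIA = ["Accessibility", "Communication", "Documentation", "Data", "Reporting"]

def score_institution(ratings: dict) -> int:
    """Sum the points earned across the five rubric criteria."""
    return sum(POINTS[ratings[criterion]] for criterion in CRITERIA)

# Example: a hypothetical library with easy-to-find, current reports
# but jargon-heavy communication.
example = {
    "Accessibility": "Excellent",
    "Communication": "Satisfactory",
    "Documentation": "Good",
    "Data": "Good",
    "Reporting": "Excellent",
}
print(score_institution(example))  # 11 out of a possible 15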

 

 

Table 1

Rubric Scores by Institutional Assessment

 

                                         Mean a    Median b    Mode c

All libraries (97)                        8.7        10          10

Libraries with assessment (52)            9.4        10          10

Libraries without assessment (45)         7.3         9           0

a Mean, the average of a group of numbers, is calculated by adding a group of numbers and then dividing by the count of those numbers.

b Median is the middle number of a group of numbers; that is, half the numbers have values that are greater than the median, and half the numbers have values that are less than the median.

c Mode is the most frequently occurring number in a group of numbers.
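For readers who wish to reproduce the summary statistics reported in Table 1 (and in Tables 2 and 3 below), the following short sketch uses Python’s standard statistics module and a hypothetical list of rubric scores, chosen here only so that its results happen to match the “All libraries” row; it is not the study’s data. multimode is used because some cohorts in this study had no single most frequent score (see Table 2, note 5).

import statistics

# Hypothetical rubric scores for a seven-library cohort (not the study's data);
# chosen so the results mirror the "All libraries" row of Table 1.
scores = [0, 7, 9, 10, 10, 11, 14]

print(round(statistics.mean(scores), 1))  # 8.7  (arithmetic average)
print(statistics.median(scores))          # 10   (middle value of the sorted scores)
print(statistics.multimode(scores))       # [10] (most frequent value; multimode lists
                                          #      ties, which matters when every score differs)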

 

 


 

Results

 

Of the 97 institutions reviewed, the researchers identified 52 institutions with demonstrated investments in assessment that met the criteria for the first phase of content analysis: at least one position or title focused primarily on assessment (e.g., Assessment Coordinator, Assessment Data Specialist), an assessment committee, or a department dedicated to assessment. The grades of these 52 institutions ranged from 0 to 15, with the most common grade being 10 (n=10).

 

The 52 libraries identified as having investments in assessment consistently scored higher on the rubric than the 45 libraries without. While grades of the latter group also ranged from 0 to 15, these libraries received an average grade of 7.3, 2.1 points lower than libraries with assessment personnel, and the grade most commonly assigned to them was zero.

 

Researchers found that data sharing practices varied widely across different types of institutions. Of the Ivy Plus members, 9 of 12 demonstrated a commitment to assessment in accordance with the study's criteria; these institutions received an average grade of 8.4. The same number of BTAA libraries (out of 14) demonstrated a commitment to assessment; however, these libraries received an average grade of 10.3.[4] Although the same number of libraries in both consortia had dedicated resources to assessment, one consortium received a markedly higher average score as a result of the range and depth of data and documents made publicly available.

 

 

Table 2

Rubric Scores by Consortia

 

 

                                              Mean    Median    Mode

All consortial libraries (26)                  8.4      10         0

    BTAA (14)                                  9.9      11        10

    Ivy Plus (12)                              6.7       7.0       0

Consortial members with assessment (18)        9.4      10        10

    BTAA (9)                                  10.3      11        11

    Ivy Plus (9)                               8.4       9         9

Consortial members without assessment (8)      5.6       4.5       0

    BTAA (5)                                   8.2      11        n/a[5]

    Ivy Plus (3)                               1.3       0         0

 

 

Table 3

Rubric Scores by Public, Private, and ARL affiliation

 

 

                          Mean    Median    Mode

All libraries (97)         8.7      10        10

    Public (65)            9.2      10        10

    Private (32)           7.5       8         8

ARL members (65)           9.1      10        10

    Public (45)            9.5      10        10

    Private (20)           8.1       8         8

Non-ARL members (32)

    Public                 8.4      10.5      11

    Private                6.6       7.5       0

 

 


 

The notable discrepancy between ratings of Ivy Plus and BTAA libraries inspired additional investigation into factors that might contribute to libraries’ propensity for data sharing. One possible explanation for this discrepancy could be that the majority of BTAA institutions are publicly funded and so may be required by state law or mandate to make more financial and strategic planning data available to the general public. The majority of Ivy Plus institutions, on the other hand, are private and have no such mandate for financial transparency. Another possible explanation for greater transparency in some institutions could be the requirement for annual data reporting to ARL[6], which affects 67% of the institutions included in this study. Might required reporting, whether to taxpayers or professional organizations, affect a library’s data sharing?

 

Further analysis of library ratings by public, private, or ARL affiliation determined that public institutions were much more likely to score highly on the study’s rubric than private institutions, regardless of ARL membership. Membership in an organization with (possible) mandated reporting may contribute to more information-sharing, as seen when the ratings of private ARL members (average 8.1) are compared to private non-ARL members (average 6.6). However, this type of organizational mandate does not consistently correlate with higher ratings, as can be seen by comparing public ARL members (mode 10) and public non-ARL members (mode 11).

Finally, the researchers found that libraries’ dedication to assessment, as measured by funding for LAC attendance, correlated with higher grades on the study’s rubric; however, there was a notable discrepancy between LAC 2014 attendees and LAC 2016 attendees. Libraries represented by three or more individuals at LAC 2014 were rated much higher than the aggregate; they were also rated higher than institutions with three or more LAC 2016 attendees. This could be because libraries represented at the conference three years earlier have had time to expand or develop a culture of assessment and reporting; it could also be that the dramatic growth in LAC attendance is indicative of a wider range of experience with and commitment to assessment.

 

Discussion

 

In total, 57% of libraries reviewed made some form of data available to the public. The most common form of data sharing was a “Facts and Figures” type of page on the library’s website. This type of page typically presented the library’s “tombstone statistics”: data points regularly collected for external reporting, including titles or volumes held, classes taught, or gate counts.[7] It is likely that libraries share these types of pages because they are familiar library data points and are relatively easy to produce. As McClure and Samuels discovered 30 years ago, “the closer and more familiar a source is, the more it is likely to be used” (1985, p. 495). However, while libraries often provide “tombstone statistics” as a measure of library value, these data provide an incomplete picture of the library’s service to and impact on the campus, and can be incomprehensible to outside stakeholders.

 

An improvement on the “tombstone statistics” approach involved the use of data for (internal) benchmarking or (external) comparison.[8] Some libraries use their “tombstone statistics” to demonstrate change over time; they may also use these standard data points to compare themselves to peer institutions that collect and report the same data. By establishing benchmarks, libraries are able to measure and communicate ways in which they are or are not achieving their goals. In this study, libraries that provided benchmarks or comparisons tended to receive higher grades, as the presence of benchmarks or comparisons by definition exemplified good communication.

 

The most robust and effective use of data observed in this study involved telling broader stories about the library and its impact. The seven institutions receiving the highest scores in this study made use of data, qualitative or quantitative, to tell such stories on their websites or in other reports, and by doing so were able to effectively communicate the library’s impact on campus research, teaching, and learning. While these institutions tended to rely heavily on numbers to demonstrate impact, they provided context by also supplying narratives describing why the numbers mattered. These institutions often had a section of their website dedicated to assessment in the library, which directed viewers to multiple years of archived documentation of library assessment initiatives. Others included yearly initiatives in their long-term strategic plans and updated the status of these initiatives in subsequent annual reports. Infographics complemented the text-based discussion of assessment and data-based decision making; this was particularly effective in illustrating the financial reasons behind reallocation of library funds to or away from collections budgets to meet other library service demands.[9]

 

Limitations

 

This study emerged from an environmental scan conducted at the University of Chicago Library with the specific purpose of informing internal decision making related to the representation of the library’s assessment presence on its website. As a result, the first two criteria for determining the study’s population identified institutions that more closely align with the University of Chicago Library, rather than the average North American academic library.

 

Similarly, the third criterion, staff attendance at a professional conference, makes assumptions about the institutions represented at that conference. First, while many institutions provide funding for their employees to attend such events, many individuals are required to pay at least some of the costs of attendance. While an institution may be willing to support this type of professional development, staffing needs may limit the number of individuals who are able to be away from the library at any given time. Additionally, many other factors contribute to an individual’s decision or ability to attend a conference. Finally, the LAC is a relatively small conference; while it was not clear from the LAC website whether registration had reached capacity in 2014 or 2016, it is possible that the number of individuals in attendance was limited by the size of the conference itself. As a result, while an institution’s representation at this conference in recent years can be taken as a measure of its investment in assessment, it is an incomplete measure at best.

 

Finally, the use of grounded theory for the development of this study’s rubric limits the generalizability of this study’s findings. As Thomas and James note, the problem with grounded theory is that it is a scientific instrument similar to one’s everyday “practical syllogism” (2006, p. 773). Since the practice of developing theory from observation is highly subjective, this study’s rubric should not be utilized in other studies without further testing; similarly, the findings of this study should be treated as observations subject to further investigation.

 

Areas for Future Research

 

This study used publicly available data to make inferences about external factors affecting libraries’ information-sharing processes. While shared characteristics among institutions in this study’s population could be correlated with expanded sharing of data on library websites, none of these characteristics reflect internal institutional factors that contribute to this area of decision making. Further research is needed to identify these factors and to explore their implications for information sharing in the larger academic library community. Additionally, further research is needed to explore how the external factors identified in this study (e.g., consortial membership) shape the internal decision making around these processes.

 

Conclusion

 

Increasingly, libraries are relying on their assessment programs to collect the data needed to demonstrate the value libraries contribute to their institutions’ missions and goals. Evidence of this movement can be seen in ACRL’s Impact of Academic Libraries work, Megan Oakleaf’s recent work (Oakleaf, 2016; Oakleaf et al., 2017) on integrating the library in campus data collection initiatives, and emerging papers considering the library’s role in protecting student data (Jones & Salo, 2017). However, this study demonstrates that a library’s investment in an assessment program does not guarantee that the data collected by such programs will be made available to external stakeholders.

 

This study sought to explore factors that influence the ways academic libraries choose to share data and other reports on their websites. While the researchers found a slight correlation between libraries’ investment in assessment and the presence of outward-facing reporting, the correlations were observably affected by other factors. A library’s public presentation of data could be influenced by employees’ engagement in assessment projects, participation in a consortium that requires regular reporting, or the receipt of taxpayer funding. Future studies might investigate which, if any, of these factors greatly increase or diminish the likelihood of data being made publicly available.

 

It is the researchers’ hope that observations gleaned from the content analysis can serve as a benchmark for measuring changes in library communication practices. Slightly more than half of the libraries reviewed made data or strategic documents available on their websites. However, those institutions that made data or documents available frequently did so without providing meaningful context for external audiences, thus missing an important opportunity to articulate the value expressed in the data. There is clearly significant room for improvement.

 

References

 

Glaser, B. G., & Strauss, A. L. (2006). The discovery of grounded theory: Strategies for qualitative research. New Brunswick, NJ: Aldine Transaction.

 

Jones, K., & Salo, D. (2017). Learning analytics and the academic library: Professional ethics commitments at a crossroads. College & Research Libraries, 79(3), 304-323. https://doi.org/10.5860/crl.79.3.304

 

Koufogiannakis, D. (2014). McClure and Samuels’ study on information sources used for decision making and the connection to organizational climate still resonates today. Evidence Based Library and Information Practice, 9(4), 78-81. https://doi.org/10.18438/B8788Q

 

Lewin, H. S., & Passonneau, S. M. (2012). An analysis of academic research libraries assessment data: A look at professional models and benchmarking data. The Journal of Academic Librarianship, 38(2), 85-93. https://doi.org/10.1016/j.acalib.2012.01.002

 

Lundstrom, K., Martin, P., & Cochran, D. (2016). Making strategic decisions: Conducting and using research on the impact of sequenced library instruction. College & Research Libraries, 77(2), 212-226. https://doi.org/10.5860/crl.77.2.212

 

Manzuch, Z. & Maceviciute, E. (2014). Library user studies for strategic planning. Information Research, 19(4), 71-84.

 

McClure, C. R. (1986). A view from the trenches: Costing and performance measures for academic library public services. College & Research Libraries, 47(4), 323-336. https://doi.org/10.5860/crl_47_04_323

 

McClure, C. R., & Samuels, A. R. (1985). Factors affecting the use of information for academic library decision making. College & Research Libraries, 46(6), 483-498. https://doi.org/10.5860/crl_46_06_483

 

Mitchell, M., & Witthus, R. W. (1991). Planning for diversity: Strategic planning for an urban academic library. Journal of Library Administration, 13(3-4), 157-165. https://doi.org/10.1300/J111v13n03_12

 

Oakleaf, M. (2016). Getting ready & getting started: Academic librarian involvement in institutional learning analytics initiatives. Journal of Academic Librarianship, 42(4), 472-475. https://doi.org/10.1016/j.acalib.2016.05.013

 

Oakleaf, M., Whyte, A., Lynema, E., & Brown, M. (2017). Academic libraries & institutional learning analytics: One path to integration. Journal of Academic Librarianship, 43(5), 454-461. https://doi.org/10.1016/j.acalib.2017.08.008

 

Paulus, A. R. (2014). Using data to assess staffing and services: University of Iowa Main Library. Journal of Access Services, 11(3), 189-205. Retrieved 19 March 2018 from ERIC database (EJ1033784).

 

Saunders, L. (2015). Academic libraries’ strategic plans: Top trends and under-recognized areas. The Journal of Academic Librarianship, 41(3), 285-291. https://doi.org/10.1016/j.acalib.2015.03.011

 

Saunders, L. (2016). Room for improvement: Priorities in academic libraries’ strategic plans. Journal of Library Administration, 56(1), 1-16. https://doi.org/10.1080/01930826.2015.1105029


 

 


Seago, B. L., Schlesinger, J.B., & Hampton, C.L. (2002). Using a decade of data on medical student computer literacy for strategic planning. Journal of the Medical Library Association, 90(2), 202-209.

 

Thomas, G., & James, D. (2006). Reinventing grounded theory: Some questions about theory, ground, and discovery. British Educational Research Journal, 32(6), 767-795. Retrieved from http://www.jstor.org/stable/30032707

Van House, N. A. (1989). Output measures in libraries. Library Trends, 38(2), 268-279. Retrieved from http://hdl.handle.net/2142/7661

 

 

 

 


Appendix

Study Rubric

 

 

Accessibility[10]
    Excellent: Documents are easily found on the library's website (within 5 clicks)
    Good: Documents can be found on the library's website, but it takes some time (5+ clicks)
    Satisfactory: Documents can be found by searching the library's website
    Needs Improvement: Documents not available

Communication
    Excellent: Communication is clear and accessible for non-librarians (e.g., lack of jargon)
    Good: Communication is directed towards non-librarians, but contains some jargon
    Satisfactory: Library's message or assessment contains jargon and seems to be directed mainly to staff
    Needs Improvement: No direct message publicly available

Documentation
    Excellent: Most recent strategic directions and annual report publicly available, as well as archived documentation
    Good: Most recent strategic directions and annual report publicly available
    Satisfactory: Strategic directions or annual report publicly available, but out of date
    Needs Improvement: Documents not available

Data
    Excellent: Strategic directions or annual report uses qualitative and quantitative data to tell a story about the library's achievements or struggles
    Good: Draws links between qualitative/quantitative data collected by the library and strategic directions and/or annual report
    Satisfactory: Makes qualitative or quantitative data related to library assessment publicly available
    Needs Improvement: Data not available

Reporting
    Excellent: Publicly available documents are up-to-date and there is evidence of historical reporting and evaluation
    Good: Publicly available documents are up-to-date
    Satisfactory: Publicly available documents are 1 year or less out of date
    Needs Improvement: Documents are not current

 

 



[1] http://libraryassessment.org/archive/2014-library-assessment-conference.shtml

[2] http://libraryassessment.org/archive/2016-library-assessment-conference.shtml

[3] ‘Ease of Access’ better conveys the concepts intended by ‘Accessibility’, a term with a well-established meaning in the library community; however, the latter is included here as it appears in the original rubric (see Appendix).

[4] Consortial numbers exclude the University of Chicago.

[5] Each institution in this cohort received a different grade, making it impossible to determine a mode.

[6] See http://www.arl.org/publications-resources/arlstatistics/terms/summary for more information.

[7] For example, see “By the Numbers” on The University of Chicago Library’s About the Library webpage, accessed from https://www.lib.uchicago.edu/about/thelibrary/.

[8] For example, see “NCSU Libraries Strategic Plan FY14/FY16,” accessed from https://www.lib.ncsu.edu/sites/default/files/files/images/NCSU_Libraries_Strategic_Plan_FY14-FY16-062813FINAL.pdf.

[9] For example, see Creighton University Library’s “Budget Challenge” from their Library Assessment webpage, accessed from http://culibraries.creighton.edu/assessment/budget.

[10] As noted before, ‘Ease of Access’ better conveys the concepts intended by ‘Accessibility’; however, the latter is included here as it appears in the original rubric.