Issues in Science and Technology Librarianship | Summer 2006
DOI: 10.5062/F42V2D1H
URLs in this document have been updated. Links enclosed in {curly brackets} have been changed. If a replacement link was located, the new URL was added and the link is active; if a new site could not be identified, the broken link was removed.
Anne Piergrossi
MLIS candidate
Syracuse University
Syracuse, New York
akpiergr@syr.edu
Kathleen Spahn
MLIS candidate
Syracuse University
Syracuse, New York
kmspahn@syr.edu
This study focuses on the collection of a small science library within the National Estuarine Research Reserve System. It reviews the history, context, and existing constraints on its collection and development and suggests ways in which the library might overcome barriers in its effort to obtain grants for the expansion of its print materials. The library plays a three-fold role in the local community: it hosts both staff and visiting scientists conducting estuarine research; it educates the local community -- schoolchildren, adults, and policymakers; and it advocates for sensitive and sustainable stewardship of the land in partnership with local organizations and policymakers.
The Education Director has been repeatedly turned down in her attempts to obtain grant money because the library is too new (to the public) to demonstrate any user base or long-term viability. Granting organizations suggested she come back for reconsideration when rising usage statistics could be demonstrated.
This study proposes a method by which the Wells Research Library can measure the use of its collection and more accurately identify its user base, bringing together a variety of tools and methods into a single cohesive measurement process. Once established as a routine, the regular collection of data can be used to demonstrate patterns and changes in the library's use and user base over time and to present that evidence to granting organizations. This system can be adapted by any small specialized library interested in measuring its users and usage on a regular basis -- particularly one connected to a nonprofit or parent organization.
The Wells Reserve, in Wells, Maine, joined NERRS in 1986. Located on the 1,600-acre site of Laudholm Farm and containing fields, forests, wetlands, sand beach, dunes, and rivers, the Wells Reserve is overseen by the nonprofit organization Laudholm Trust.
The Wells Reserve plays a three-fold role in the local community: it hosts both staff and visiting scientists conducting estuarine research, particularly in relation to the restoration and maintenance of these coastal environments in the face of development, sprawl, erosion, and pollution; it educates the local community -- schoolchildren, adults, and policymakers alike -- about coastal ecology; and it advocates for sensitive and sustainable stewardship of the land in partnership with land trusts, conservation commissions, and local policymakers.
In any given year, as many as 100 scientists may visit Wells to conduct research, and the Reserve will typically direct or participate in around 20 studies involving scientists, students, and staff from the Reserve and from other academic and environmental institutions. Wells staff also create and lead more than 20 different interpretive tours of the habitat for all age groups and interests, with a consistent focus on environmental stewardship and issue awareness. One of the most ambitious outreach programs promises at least one day of coastal ecology education for all local elementary school students.
About six years ago, the Director of Education began to consolidate the miscellaneous materials used by researchers, volunteers, local educators, and staff into one location. Over the years the Reserve has accumulated approximately 3,000 items, mostly through donations. A selection policy has kept the collection focused, but a lack of funding prevents any planned expansion of the collection.
The collection, whose primary users are researchers, docents, and local educators, is cataloged online through a regional OPAC known as Minerva. This consortium of more than 85 Maine libraries provides electronic catalog access to more than six million library items through Innovative Interfaces' Millennium Cataloging software. On April 13, 2005, the Wells Research Library formally announced its participation in Minerva, making its collection electronically visible for the first time.
The Education Director has been repeatedly turned down in her attempts to obtain grant money because the library is too new (to the public) to demonstrate any user base or long-term viability. Granting organizations suggested she come back for reconsideration when rising usage statistics could be demonstrated.
This study proposes a method by which the Wells Research Library can measure the use of its collection and more accurately identify its user base, bringing together a variety of tools and methods into a single cohesive measurement process. Once established as a routine, the regular collection of data can be used to demonstrate patterns and changes in the library's use and user base over time. This system can be adapted by any small specialized library interested in measuring its users and usage on a regular basis -- particularly one connected to a nonprofit or parent organization.
Touch techniques count anything moved for any purpose in a given period of time. Although both time-consuming and prone to under- and over-counting, their primary advantage lies in not requiring patron participation.
Reshelving methods record items as they are put away and rely in part on patron participation. Numbers can be tallied either as "all or nothing" or in greater detail. Problems include an inherent underestimate of use, as it relies on patrons leaving books on tables, and count deflation if more than one person uses a book before it is reshelved. Because of the potential for distorted figures, it is recommended that this method be used in conjunction with other methods.
More difficult, but with the potential for richer, more accurate information, is the use of questionnaires or interviews. Biggs notes several potential dangers: selection of a representative time period and pool of users; the difficulties inherent in creating a valid questionnaire that will elicit easily interpreted answers; and self-conscious use of materials by patrons for the purposes of the survey.
Christiansen, Davis, and Reed-Scott (1983) note that the advantages of circulation studies are their flexibility of study/sample size and of data analysis, easily countable units, and the objective nature of the information. There are disadvantages as well: the data reflect successes rather than user failures, exclude in-house use and so under-represent actual use, and fail to identify low use caused by a poor-quality collection.
As a counterpoint, Mason (1998) cautions against an over-reliance on the use of circulation data as a gauge of library use and suggests that electronic catalogs, databases, electronic encyclopedias, and the web itself have become core services. Her point is a valid one for Wells Reserve Library to consider, as data gathered solely about Wells Reserve Library's "traditional" resources might serve to reinforce the limitations under which it currently operates.
Drawing on the literature of measurement and evaluation, Nicholson (2004) has developed a theoretical model enabling libraries to obtain a more holistic self-understanding. Under its current budget constraints, the Wells Research Library does not have the resources to build such an extensive system. However, as Nicholson suggests, his framework "should be seen as a guide to the selection and implementation of measurement and evaluation procedures rather than a detailed process that must be followed without deviation." The establishment of routine procedures to collect and analyze use and user data represents a first step in that direction.
Pungitore (2001) defines community information needs analysis as "a structured, planned, formal study that identifies the information requirements of the people within a library's jurisdiction." In this view, "...needs analysis consists of 10 steps: laying groundwork, preparing to conduct the study, framing questions and choosing tools, designing data-gathering instruments, using in-house data, launching a study, analyzing results, sharing results, and acting on results."
Kaarst-Brown et al. (2004) discuss the value of identifying the unique characteristics of an organization's culture and determining which of them are relevant to future success and growth, a concept that is certainly applicable to a library trying to discover who its users are and why they use this particular library.
Determining the bottom line of the total operational cost is crucial for a library seeking outside funding. Forrest (2003) discusses this trend towards outcomes assessment and recommends "a commitment to systematic and ongoing collection, compilation, analysis and reporting of data about inputs, outputs, and outcomes" to develop a culture of assessment in a library.
The University of Washington libraries have been assessing user needs, satisfaction, and library performance since 1992, and Hiller (2001) discusses the many approaches employed over the years, including large-scale user surveys, informal methods such as suggestion boxes and service desk comments, meetings between faculty and librarians, focus groups, and finally LibQUAL+ (an outcomes measurement tool adapted specifically for libraries from SERVQUAL and frequently used to determine customer satisfaction based on quality of service). In an era of dwindling budgets Eng and Gardner (2005) suggest reviewing other surveys and adapting them to meet individual library needs by focusing on performance and satisfaction.
Morris and Barron (1998) suggest a constant dialogue with library patrons as a means to uncover needs and work towards achieving user satisfaction. They discuss the need to put customers at the center of public services and emphasize the four themes of "quality, choice, standards, and value." They also point out the positive aspects of using specific communication methods such as comments and complaint procedures, surveys of users and non-users, user consultation, staff feedback, and the development of library charters in which customers are central.
These studies show that education (both K-12 and adult continuing education) forms the primary focus of library activities at these institutions, with seemingly less emphasis placed on serving researchers or scientists.
The estuarine institutions gain funding opportunities from partnerships, particularly with local universities (which provide access to resources and funding) and also with local schools, nonprofits, and community decisionmakers. The libraries are also used as a community hub, with auditoriums or public spaces available for distance learning or video conferencing (seminars, presentations, etc.).
A recent overview of zoo and aquarium libraries by Barr (2005) provides useful parallels for Wells; these libraries share with Wells a narrow subject focus, generally small size, and attachment to a larger nonprofit institution. Most zoos and aquariums have missions emphasizing education, research, and conservation, and they cannot gain accreditation unless they have a library supporting those missions. Like NERRS institutions, these libraries serve staff, outside researchers (students, scientists from other institutions), and the public. Barr ties the quality of the library to the reputation of its institution, which, if it does not "...keep up with advances in knowledge will lose credibility and ultimately suffer, very likely, with effects on attendance and funding." Zoo/aquarium libraries also participate in creative partnerships with related academic or community institutions in order to provide resources in the face of low budgets and rising costs. Finally, Barr's survey revealed that these libraries are moving towards professionalization, with nearly 20% of responding libraries reporting a librarian with an MLS. This can be compared to an earlier study by Gibbs (1993) regarding the qualifications of special science librarians, in which all 12 special librarians interviewed in the North Carolina Research Triangle held the MLS degree.
The two main sources of usage data are circulation and interlibrary loan requests, as tabulated in the regional OPAC system and through manual tallies. Circulation data are to be examined in terms of resources checked out and resources used in-house. Analysis of interlibrary loan data should be performed to shed light on which materials are most used and where the collection has gaps or weaknesses.
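As a rough illustration of the intended interlibrary loan analysis, the short Python sketch below tallies borrow requests by subject from a hypothetical export file. The file name and column headings ("subject", "direction") are assumptions for the sake of the example, not features of the Minerva system; subjects that Wells must repeatedly borrow from other libraries would point to possible gaps in its own collection.

    # Illustrative sketch only: count ILL borrow requests by subject to
    # highlight where the collection may have gaps. Field names are
    # hypothetical placeholders for whatever the OPAC export actually uses.
    import csv
    from collections import Counter

    def ill_gaps(path):
        borrowed = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["direction"] == "borrowed":  # requests Wells could not fill itself
                    borrowed[row["subject"]] += 1
        # Subjects requested most often from other libraries suggest collection gaps.
        return borrowed.most_common()

    for subject, count in ill_gaps("ill_requests.csv"):
        print(f"{subject}: {count} borrow requests")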
Surveys of reserve staff; local educators and students, both K-12 and college-level; scientists and researchers; other environmental nonprofits in the region; local government and community decisionmakers; and a random sample of the community in general will result in a comprehensive understanding of who is and is not using the collection.
By contacting staff and/or volunteers of other estuarine libraries, the Wells Reserve Library should develop benchmark data to compare the usage and users of other libraries with its own.
Mindful of the disadvantages of circulation data described by Christiansen, Davis, and Reed-Scott, we have included a variety of in-house use measures. We hope that the combination of the more subjective measures of the user data (satisfaction, etc.) with the quantitative data here will also shed light on areas where improvements might be made.
Every effort has been made to minimize staff involvement in data collection because of the absence of professional staff and the limited hours that the library is open to the public.
The Wells Reserve Library has joined a statewide online public access catalog known as Minerva. Although the patron interface to the holdings is quite basic, the catalog is built on Innovative Interfaces' Millennium software, and its Millennium Web Interface permits sophisticated collection of both usage and collection data. The methodology used in this study focuses on the usage components, but it is worth noting that extensive analysis can be done of a library's holdings and the circulation of those holdings at the item level.
Currently the Wells Reserve Library collects only a minimum of information from its registered users (name, address, phone, e-mail), and identifies only two types of user -- staff and public. The Millennium system permits the addition of multiple "Patron Statistical Codes" (Pcodes) that are used to identify subsets of users for collecting statistics based on patron characteristics. Those Pcodes are library-defined, so they can reflect any characteristics that would be useful in the gathering of data. The Millennium system can generate reports on registered patrons ranging from a simple count to an analysis of circulation and collection use based on Pcode.
The Wells community is difficult to characterize with any accuracy since it reflects an interest group rather than a geographic area. In addition, while the focus of interest is narrow, the patron perspectives are varied. Maintaining a registered user population without any identifying characteristics limits the library's ability to assess how well its collection reflects the community's needs. We suggest that the Wells Reserve Library begin to include more detailed information in its patron registrations in order to better understand both the user population and the usage patterns that will appear over time. For example, simply identifying age and education level might be a quick indicator of the level of materials most likely to be used by patrons. The collection of usage statistics will be accomplished through the library's participation in the Innovative Interfaces system. Preformatted reports are available through a simple user interface, and more customized reports can be generated by exporting data to an Excel spreadsheet.
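For example, once a circulation report has been exported, a brief script such as the following hypothetical Python sketch could total checkouts by patron statistical code (Pcode). The file name and column labels are illustrative assumptions rather than actual Millennium field names, and the same tabulation could just as easily be done within the exported Excel spreadsheet.

    # Illustrative sketch only: summarize an exported circulation report by
    # patron statistical code. Column names ("pcode", "checkouts") are assumed;
    # an actual Millennium export may label these fields differently.
    import csv
    from collections import defaultdict

    def checkouts_by_pcode(path):
        totals = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["pcode"]] += int(row["checkouts"])
        return dict(totals)

    print(checkouts_by_pcode("circulation_export.csv"))
    # e.g. {"staff": 120, "docent": 45, "educator": 30, "public": 60}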
Two main areas will be considered during data collection: circulation of materials (both owned items and those borrowed or loaned through ILL) and in-house use of resources (materials, computer use, and reference). In all cases, due to the small size of the Wells Reserve Library, complete counts will be made rather than sampling a subset of transactions.
Drawing from the measures and calculations described by Van House et al., we plan to generate the following reports:
It should be noted that there are no viable means in place with which to track queries to the OPAC system that do not result in formal loan requests. Simple queries to the catalog are not logged in a way that identifies the originating computer. Wells Reserve Library should monitor the OPAC's ability to do this tracking, as it might be an important means for measuring unmet needs.
For respondents, both print and online, an incentive to complete the survey should be offered as discussed by Eng and Gardner in an American Libraries article (2005). One possibility: a chance to win a private tour of the Wells National Estuarine Research Reserve and a one-year membership in the Laudholm Trust. According to the Wells Reserve web site, membership in the Laudholm Trust provides "free admission to the Reserve from Memorial Day through Columbus Day and program discounts" (www.wellsreserve.org/).
It is recommended that the surveys be available both online and in the library for three two-week periods (spring, summer, fall) -- the same period during which the use surveys are conducted -- and that the survey be publicized for the month prior to the survey period to build community awareness and interest. Awareness can be built using press releases to local/regional news sources, Reserve web site announcements, public service announcements on local/regional cable TV and radio stations, and flyer and poster distribution by Reserve members and volunteers. The Reserve's community partners (school district officials and teachers, local environmental nonprofits, local/regional government bodies, Chambers of Commerce) can also help spread the word about the survey.
Because of the Reserve's limited budget, the survey (see Appendix 2) is modeled after LibQUAL+ and other existing survey tools, with adaptations that take into account the specific context of Wells' situation as suggested by Chivers and Thebridge (2000).
Since the library is only open two half-days each week from January 16 to December 15, we recommend that a head count be done for at least two periods of two months each, in the winter (while school is in session but the Reserve is less busy) and in the summer (while school is out but during the Reserve's peak season). During these periods, library volunteers would perform a daily head count of persons entering the library during regular hours of operation, using a simple tick-sheet method. It is recommended that this be done annually until the library has enough gate statistics to chart patterns of use.
A high-profile thematic tie-in for the survey could be Earth Day; the survey could be publicized along with Earth Day activities on signs, posters, or in public service announcements on radio and television and in the schools themselves. As with the general survey, some form of incentive will be useful to encourage participation. Possibilities include entry into a drawing for free enrollment in a Wells Reserve program of the winner's choosing.
It is assumed that if the library is to enjoy continued growth, new users must be developed early on and added to the existing base of users. Another benefit of attracting school-aged users is the likely involvement of their parents at some level of library/Reserve use.
By what percentage is usage of the collection changing?
This will be gleaned from circulation records, ILL data, in-house circulation measurement methods (including a tally of reference materials used by in-house staff and barcode scanning to collect reshelving data), and librarian observation/experience with patrons who request research appointments or have reference questions by telephone or in person. Also included in this calculation will be usage of the public research computer. Its popularity will be tracked with a simple tally sheet on which users indicate their hours of use and nature of research. This percentage will be calculated both per resource type and in total. Due to limited staff resources, these data will be collected only during predetermined periods of time throughout the year.
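A minimal worked example of the percentage-change calculation, per resource type and in total, might look like the following Python sketch; the tallies for the two counting periods are invented numbers used only to illustrate the arithmetic.

    # Illustrative sketch only: percentage change in use between two counting
    # periods, per resource type and overall. Counts below are invented.
    def percent_change(old, new):
        return (new - old) / old * 100 if old else float("nan")

    period_1 = {"periodicals": 40, "reference": 25, "AV": 10, "general": 80}
    period_2 = {"periodicals": 46, "reference": 22, "AV": 14, "general": 95}

    for rtype in period_1:
        print(f"{rtype}: {percent_change(period_1[rtype], period_2[rtype]):+.1f}%")
    print(f"total: {percent_change(sum(period_1.values()), sum(period_2.values())):+.1f}%")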
Which resources are the most popular?
The usage of resource types (periodicals, reference, AV, docent materials, and general collection, the categories used in the catalog) will be ranked by popularity and measured on a monthly basis using the same data as above.
Which resources do users themselves indicate using?
We will measure the usage of resources as indicated by the in-house surveys, surveys mailed/e-mailed out, and online surveys, and compare these results to the actual use statistics above in order to determine any gap between stated and actual use.
When do users visit the library?
Usage ranked by day of week and block of time (9-12, 12-3, 3-6) will be measured as determined by observation, surveys, and a sign-in sheet that records time in, time out, affiliation, and date.
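The day-of-week and time-block ranking could be produced from transcribed sign-in sheet entries with a short script along these lines; the date and time format shown is an assumption about how the sheets would be keyed in, and the sample entries are invented.

    # Illustrative sketch only: rank visits by day of week and time block
    # from sign-in sheet data entered as (date, time_in) pairs.
    from collections import Counter
    from datetime import datetime

    def visit_pattern(rows):
        blocks = Counter()
        for date_str, time_in in rows:
            dt = datetime.strptime(f"{date_str} {time_in}", "%Y-%m-%d %H:%M")
            block = "9-12" if dt.hour < 12 else "12-3" if dt.hour < 15 else "3-6"
            blocks[(dt.strftime("%A"), block)] += 1
        return blocks.most_common()

    sample = [("2006-07-11", "10:15"), ("2006-07-11", "13:40"), ("2006-07-13", "15:05")]
    for (day, block), n in visit_pattern(sample):
        print(f"{day} {block}: {n} visits")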
How many people use the library?
Overall attendance will be measured twice a year, for two months each time, using observation and a sign-in sheet. We will also measure the percentage change in registered users as measured against any change in overall borrowing or in-house use rates.
How do users interact with the library?
We will measure the percentage popularity of telephone reference, e-mail reference, pre-arranged reference appointment, and walk-in usage.
What is the breakdown of users?
Users will be broken down into percentages by region. (Note: Wells has specifically chosen not to collect data beyond name, address, phone number, and e-mail for cardholders, even though it is possible to do so through the regional OPAC system.) The user surveys and sign-in sheets contain an affiliation field and, depending on the statistical accuracy of the responses to these, affiliation may or may not also be calculated. Wherever possible within these results, the overall response rate as well as the response rate within each identified grouping will be determined, as well as what percentage of the respondents used which type of measurement tool.
This survey will be run on a yearly basis to measure growth or shrinkage in user base and groupings within that base.
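Response rates, overall and within each grouping, reduce to simple ratios; the sketch below uses invented counts of surveys distributed and returned purely to illustrate the calculation, and the group labels are examples rather than the final categories.

    # Illustrative sketch only: survey response rate overall and within each
    # user grouping. All counts below are invented for demonstration.
    distributed = {"staff": 20, "educators": 60, "researchers": 30, "general public": 200}
    returned    = {"staff": 15, "educators": 25, "researchers": 12, "general public": 40}

    for group in distributed:
        rate = returned[group] / distributed[group] * 100
        print(f"{group}: {rate:.0f}% response rate")

    overall = sum(returned.values()) / sum(distributed.values()) * 100
    print(f"overall: {overall:.0f}% response rate")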
How do we compare to benchmark NERRS institutions?
Our results will be compared with the responses of the NERRS benchmark libraries and expressed as percentage differences. The accuracy and usefulness of this section will depend upon the NERRS libraries, their willingness to participate, and their possession of similar data to share, but it is hoped that at least some baseline comparisons and qualitative analysis can be made.
Biggs, M. 1990. Discovering how information seekers seek: methods of measuring reference collection use. Reference Librarian 29: 103-117.
Broome, Joellen. 2004. Science and technology library innovations without a science and technology library. Science and Technology Libraries 24(3/4): 375-388.
Burns, R. 1978. Library use as a performance measure: its background and rationale. Journal of Academic Librarianship 4(4): 5-8.
Chivers, B., & Thebridge, S. 2000. Best value in public libraries: the role of research. Library Management 21(9): 456-465.
Christiansen, D. E., Davis, C. R., & Reed-Scott, J. 1983. Guide to collection evaluation through use and user studies. Library Resources & Technical Services 27: 432-440.
Cullen, R. 2001. Perspectives on user satisfaction surveys. Library Trends 49(4): 662-686.
Eng, S., & Gardner, S. 2005. Conducting surveys on a shoestring budget. American Libraries 36(2): 38-39.
Forrest, C., & Williamson, A. J. 2003. From inputs to outcomes: measuring library service effectiveness through user surveys. Georgia Library Quarterly 40(2): 12-18.
Galvin, T. J., & Kent, A. 1977. Use of a university library collection: a progress report on a Pittsburgh study. Library Journal 102(20): 2317-2320.
Gibbs, Beth Liebman. 1993. Subject specialization in the scientific special library. Special Libraries 84(1): 1-8.
Gohlke, Dorothy Annette. 1997. Benchmark for strategic performance improvement. Information Outlook 1(8): 22-24.
Hernon, P., & Altman, E. 1998. Service quality and customer satisfaction do matter. American Libraries 29(7): 53-54.
Hiller, S. 2001. Assessing user needs, satisfaction, and library performance at the University of Washington libraries. Library Trends 49(4): 605-625.
Holt, G. E., & Elliott, D. 2003. Measuring outcomes: Applying cost benefit analysis to middle sized and smaller public libraries. Library Trends 51(3): 424-440.
Kaarst-Brown, M. L., et al. 2004. Organizational cultures of libraries as strategic resource. Library Trends 53(1): 33-53.
Kairis, R. 2000. Comparing gifts to purchased materials: a usage study. Library Collections, Acquisitions and Technical Services 24(3): 351-359.
Keyes, A. M. 1995. The value of the special library: review and analysis. Special Libraries 86(3): 172-187.
Langley, Anne, & Martinez, Linda. 1999. Learning our limits: the science libraries at Duke University retreat to respond to our changing environment. Issues in Science and Technology Librarianship 24 [Online]. Available: http://www.istl.org/99-fall/article1.html [Accessed July 25 2006].
Laughlin, Sara, Shockley, Denise Sisco, and Wilson, Ray. 2003. The Library's Continuous Improvement Fieldbook. Chicago: American Library Association.
Mason, M. G., St. Lifer, E., & Rogers, M. 1998. Cleveland Public redefines patron usage in electronic age. Library Journal 123(6).
Morris, A., & Barron, E. 1998. User consultation in public library services. Library Management 19(7): 404-415.
National Estuarine Research Reserve System. 2006. [Online]. Available: http://nerrs.noaa.gov/ [Accessed August 7, 2006].
Nicholson, S. 2004. A conceptual framework for the holistic measurement and cumulative evaluation of library services. Journal of Documentation 60(2): 164-182.
Ochola, J. N. 2002. Use of circulation statistics and interlibrary loan data in collection management. Collection Management 27(1): 1-13.
Oltmanns, G.V. 2004. Organization and staff renewal using assessment. Library Trends 53(1): 156-171.
Pandion Systems, Inc. 2003. Inventory and Assessment of K-12 and Professional Teacher Development Programs in the National Estuarine Research Reserve System. [Online]. Available: {http://nerrs.noaa.gov/Education/k-12.html} [Accessed: March 15, 2005].
Paris, Marion. 1990. A management survey as the critical imperative for a new special library. Special Libraries 81(4): 280.
Powell, R. R. 1988. The relationship of library user studies to performance measures -- a review of the literature (Occasional Paper, no 181). Urbana, Ill: University of Illinois GSLIS.
Powers, Janet E. 1995. Marketing in the special library environment. Library Trends 43(3): 478-493.
Pungitore, V. 2001. Identifying and analyzing user needs. Library & Information Science Research 23(4): 373.
Responsive Management. 2003. Coastal Training Needs Assessment and Market Inventory for the Jacques Cousteau National Estuarine Research Reserve, Volume 1. [Online]. Available: {http://web.archive.org/web/20130418034658/http://www.responsivemanagement.com/download/reports/NJCoastalReportFinaldist.pdf} [Accessed March 26, 2005].
Van House, N. A., et al. 1987. Output Measures for Public Libraries: A Manual of Standardized Procedures (2nd ed.). Chicago: American Library Association.
Weiner, Barbara. 2000. A bottom-line adventure: time and cost study at Hazelden Library & Information Resources; presented at the Substance Abuse Librarians and Information Specialists conference, April 1999. Behavioral & Social Sciences Librarian 18(2): 27-31.
Zarnosky, Margaret R. and Evans, Edward G. 2000. Developing Library and Information Center Collections (4th ed.). Englewood, Colo.: Libraries Unlimited.