Article

 

A Holistic Look at Reference Statistics: Whither Librarians?

 

B. Jane Scales

Reference Team Leader / E-Projects Librarian

Washington State University Libraries

Pullman, Washington, United States of America

Email: scales@wsu.edu

 

Lipi Turner-Rahman

Faculty

Washington State University

Pullman, Washington, United States of America

Email: ilipi@wsu.edu

 

Feng Hao

Visiting Lecturer of Sociology

University of Richmond

Richmond, Virginia, United States of America

Email: fhao@richmond.edu

 

Received: 22 June 2015   Accepted: 17 Nov. 2015 

 

 

© 2015 Scales, Turner-Rahman, and Hao. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – Washington State University (WSU) Pullman campus librarians track a diverse set of reference statistics to gain a “holistic” look at local reference transaction trends. Our aim was to aggregate virtual, reference desk, and office transaction data over the course of three years to determine staffing levels. Specifically, we asked, “Where should reference librarians be to answer questions?”

 

Methods – Using Springshare’s LibAnalytics, we generated longitudinal (2012-2014) statistics to help us assess patterns and trends in patron question numbers, types, communication modes, and locations in the Terrell Library. With these data, we evaluated current staffing patterns and considered how we could best address patron needs.

 

Results – Researchers found that compiling data across modalities of location, communication, question type, and the READ Scale led to a better understanding of user behavior trends.

 

Conclusion – Examining and interpreting a more inclusive and richer set of transaction statistics gives reference managers a better picture of how patrons are seeking help, and can serve as a basis for making staffing decisions.



Introduction

 

Washington State University (WSU), a land grant institution, was established in 1890. Its main campus is located in Pullman. The largest library on campus, Holland and Terrell, houses the humanities and social sciences collections as well as the only traditional reference desk on campus. The library occupies two buildings, with the reference desk located in Terrell.

 

Reference services on the WSU Pullman campus are coordinated by the Libraries’ Reference Steering Committee. This group establishes service hours and staffing for desk and online chat services, and coordinates services for the email-based LibAnswers. With tight budgets and a changing student population, the committee was tasked with assessing the demand for these services. In 2014, the group examined a comprehensive set of statistics covering three years of reference services to get a better picture of the behavior trends of patrons seeking assistance.

 

We had been aware that our reference questions had declined in number for several years. Between 2012 and 2014, the committee made incremental adjustments to the schedule and staffing of the reference desk, based on a cursory review of data.

 

We implemented a tiered-reference model, a term articulated by Massey-Burzio (Huling, 2002). Tiered or stratified reference models use paraprofessionals as the first point of contact for patrons needing help. In our case, these were a mix of undergraduate and graduate students, who were instructed to refer patrons to librarians and subject specialists (the next tier) when their needs required more expertise.

 

After three years of this approach, however, we lacked a clear picture of how well those changes met patrons’ needs. To understand what patrons wanted, and where reference was happening, the committee reviewed a comprehensive set of statistics spanning several years. The questions the committee posed to the data were: “Where is reference happening in Terrell Library, and at what level of complexity? In which location(s) are the librarians most needed to answer reference questions, and how does the data show this?”

 

Literature Review

 

There is little debate that academic library reference services have changed in the last several years. Tyckoson (2012) points out that, while the concepts of what constitutes reference services have remained stable since the 1870s, the tools and skills used to deliver those services have evolved dramatically. Our means of communication with patrons, staffing pressures, assessment practices, and information access have all contributed to a more complex and nuanced view of reference.

 

Many have also described the downturn in the use of traditional academic reference services. Martin (2009) cites statistics from the Association of Research Libraries (ARL) documenting a significant decline in reference transactions. Coffman (2012) describes these changes as the “decline of the library empire.” The results have been dramatic: some universities have abandoned the reference desk and replaced it with other models of patron assistance (Lederer & Feldmann, 2012), while others have tried new models such as combined reference desks and tiered services, forming new collaborative models (Meserve, Belanger, Bowlby, & Rosenblum, 2009; Deineh, Middlemas, & Morrison, 2011; Dinkins & Ryan, 2010).

 

These changes challenge library managers. Multiple models of reference mean different methods of collecting transaction statistics, and require a more intensive and inclusive look at data. King and Christensen-Lee (2014) prefaced their study by reviewing longitudinal trends in patrons’ question types, as well as overall trends specific to email and online chat reference, at the Valley Library at Oregon State University.

 

Baro, Efe, and Oyeniran (2014) looked at a more expansive set of possible reference channels used within university libraries in Nigeria by surveying librarians. Specifically, they considered in-person visits, Facebook, telephone, short message service (SMS), instant messaging, and email. The authors’ consideration of so many means of communication is unique, and necessary in order to discover patrons’ preferred methods of asking questions.

 

When the WSU Libraries’ Reference Steering Committee decided to evaluate services provided by the largest, most used library on the Pullman campus, members looked to the literature for guidance. The committee found the recommendations in Kern’s 2006 article a good basis for the study. Kern advises librarians to think “...about your reference services as a single reference service with many modes of communication.” She asks researchers to delineate a clear and precise question to ask of their statistics before surveying the data.

 

Methodology

 

In order to understand where questions were being answered, and the nature and complexity of those questions, the WSU Libraries’ Reference Steering Committee looked at a multifaceted set of data: IM (instant messaging), LibAnswers, email, phone, in-person reference desk, and in-person office visits. Implicit to this research was an examination of how the tiered reference model was working.

 

Since 2012, the reference team has used Springshare’s LibAnalytics, a popular tool reviewed by Dworak (2011). LibAnalytics facilitates the customization and consolidation of reference transaction data across communication modes. Both Flatley and Jensen (2012) and Gossett, Stephan, and Marrall (2012) describe this tool’s flexibility. 

 

After every transaction, library staff (both librarians and student employees) record their location, the type of question, the mode of communication, the READ Scale difficulty, the length of the transaction, and whether or not the exchange required the use of government documents. The interface provides reference workers with multiple form elements, including text boxes, check boxes, radio buttons, Likert scales, and multiple column categories of information to characterize a transaction (see Figure 1).

 

Table 1 outlines the various data points that staff record after each reference transaction. Only a few points need further explanation. The Contact Type includes the various communication modes patrons use to request information from us; currently, there are five options available. A schematic sketch of such a transaction record, for illustration, follows Table 1.

 

 

Figure 1

LibAnalytics Transaction screen as configured by the WSU Pullman libraries.

 

 

Table 1

Data Types Recorded for Reference Transactions

Contact Type | Person-to-person, Telephone, Instant Messaging (IM), Email, LibAnswers (a variation of email)

Level of Question Difficulty | READ Scale (Reference Effort Assessment Data): indicates the complexity of the question and the amount of effort necessary to answer it, on a scale of 1 to 6

Question Type | Policy, Technology, Directional, Reference

Location & Service Points | Terrell Library was the consistent “location”; service points included in this study were the Reference Desk and Office
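
To make the recording scheme in Table 1 concrete, the following is a minimal sketch in Python of a single transaction record as staff might capture it. The field names are illustrative assumptions, not LibAnalytics’ actual schema:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Transaction:
        # One reference transaction, mirroring the data points in Table 1.
        contact_type: str          # "Person-to-person", "Telephone", "IM", "Email", or "LibAnswers"
        read_level: Optional[int]  # READ Scale 1-6; None if not recorded
        question_type: str         # "Policy", "Technology", "Directional", or "Reference"
        service_point: str         # "Reference Desk" or "Office" (location: Terrell Library)
        minutes: int               # length of the transaction
        gov_docs: bool             # whether government documents were used

    # Example: an in-person desk question of moderate difficulty
    q = Transaction("Person-to-person", 2, "Reference", "Reference Desk", 10, False)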

 

 

The READ Scale, devised by Gerlich and Berard (2007), is a qualitative measurement of the amount of effort and knowledge necessary to answer a question. Questions deemed READ level 1 require no specialized knowledge, so staff can answer them without consulting a database or our LibGuides. Questions assessed at levels 2 and 3 require increased knowledge and effort to answer. Student employees, who participate in our tiered reference model, are trained to recognize the point at which a question should be referred to a subject specialist librarian; these are considered the higher-end level 3 questions.

 

In addition to tracking the READ Scale levels of questions, we track a set of locally identified and defined “Question Types”: Policy, Technology, Directional, and Reference (see Table 1).

 

 

Locations used by reference staff for this study included the Terrell reference desk and the librarians’ offices. These options appear in Figure 1 under the Location and Service Points columns (see also Table 1).

 

The authors extracted the data from LibAnalytics to support reliable and consistent interpretation of the information. The charts, tables, and figures were organized in a single document before our analysis began. Lastly, we reviewed Terrell Library gate counts between 2012 and 2014.

 

The committee also combined these different data points, calculating monthly and semester averages of Terrell reference transactions. To account for other factors affecting the number and nature of reference transactions, we gathered data on how our Springshare LibGuides were accessed. While it is not reasonable to conclude that a reference question was answered with every access of a LibGuide, the group saw value in looking at overall usage trends. Similarly, we tracked changes in foot traffic into the Terrell Library by gathering gate count data.
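
As an illustration of how such monthly averages can be derived, the sketch below assumes transactions exported from LibAnalytics as simple (year, month, service point) tuples; the actual export format differs:

    from collections import Counter

    def monthly_average(transactions, year, service_point):
        # Average number of questions per month at one service point in one
        # year, dividing by the number of months with any recorded activity.
        per_month = Counter(m for (y, m, sp) in transactions
                            if y == year and sp == service_point)
        return sum(per_month.values()) / max(len(per_month), 1)

    sample = [(2014, 1, "Reference Desk"), (2014, 1, "Reference Desk"),
              (2014, 2, "Reference Desk"), (2014, 2, "Office")]
    print(monthly_average(sample, 2014, "Reference Desk"))  # 1.5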

 

Results

 

Our first interest was documenting the change in the number of reference transactions at the two most common Location and Service Points: the Terrell reference desk and librarian offices. Table 2 contains the monthly average number of questions answered at each for 2012 through 2014.

 

Between 2012 and 2013, the number of average monthly questions answered at the desk declined by 16%. A more significant decline in reference desk transactions was recorded in 2014, when the average number of questions librarians received at the Terrell reference desk every month dropped to 584: a 35% decline from 2012, and a 22.5% decrease from 2013.

 

During the same time period, the average number of questions librarians answered in their offices declined 33% from 2012 to 2013, but increased slightly by 4.4% from 2013 to 2014.
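
These percentage changes follow directly from the monthly averages in Table 2; the short Python check below reproduces them:

    def pct_change(old, new):
        # Percent change from old to new
        return (new - old) / old * 100

    # Monthly averages from Table 2 (reference desk, office)
    print(round(pct_change(898, 754)))     # -16   (desk, 2012 to 2013)
    print(round(pct_change(898, 584)))     # -35   (desk, 2012 to 2014)
    print(round(pct_change(754, 584), 1))  # -22.5 (desk, 2013 to 2014)
    print(round(pct_change(135, 90)))      # -33   (office, 2012 to 2013)
    print(round(pct_change(90, 94), 1))    # 4.4   (office, 2013 to 2014)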

 

 

Table 2
Monthly Average Number of Reference Desk and Office Questions for 2012, 2013, and 2014

Year | Reference Desk Questions Answered (monthly avg) | Office Questions Answered (monthly avg)
2012 | 898 | 135
2013 | 754 | 90
2014 | 584 | 94

 

 

Table 3
Temporal Look at Communication Modes in Librarian Offices

Mode | Transactions in 2012 | Transactions in 2013 | Transactions in 2014
In-person | 427 (22%) | 71 (7%) | 85 (8%)
Telephone | 164 (8%) | 148 (14%) | 87 (8%)
IM | 640 (33%) | 311 (29%) | 87 (8%)
Email | 323 (17%) | 244 (23%) | 450 (40%)
LibAnswers | 395 (20%) | 304 (28%) | 415 (37%)

 

 

Using LibAnalytics, the reference committee then looked for trends in the data points represented in Table 1. Contact types did not change significantly at the reference desk between 2012 and 2014. For example, while the number of questions answered at the desk declined over those years, the percentage occurring in person changed by only one percentage point, from 93% in 2012 and 2013 to 94% in 2014. Telephone calls hovered between 4% and 5%. IM reference and email reference at the desk remained stable at roughly 3% and 0%, respectively.

 

Reference transactions that took place within librarian offices, however, changed much more (see Table 3). The percentage of questions answered via email and LibAnswers rose significantly, from a combined 37% in 2012 to 77% in 2014.

 

Next, we looked for trends in the READ levels recorded by library staff at the Terrell reference desk and in Terrell offices over the same three-year period. In 2012, it was more common for staff to forget to record the READ level, so data for many transactions were missing. Over time, staff became more consistent in applying the READ Scale. For example, in 2012 a monthly average of 53 office transactions, 33% of the total, was not assigned a READ number. In 2013, the share of unassigned office transactions fell to 5%, and by 2014 it was only 3%. The averages in Table 4 below include only those transactions recorded with a READ Scale number.

 

Questions of READ value 1-3 (those requiring less effort to answer) comprise the bulk of questions at the reference desk, while READ values 4 and 5 have decreased there. The situation in the librarian offices is the inverse: READ value 1 questions fell dramatically, while value 2 remained constant, and the combined monthly average of READ value 3, 4, and 5 questions rose from 49 in 2012 to 58 in 2013 and 62 in 2014. Temporary employees (TEs) providing reference service handle questions up to and including some at READ value 3. READ value 3 questions outside a TE’s area of study, and all questions of READ value 4 and up, are referred to a librarian, either by email or by furnishing contact information for the appropriate subject specialist.
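
The referral rule just described can be summarized in a few lines. This is an illustrative restatement of the tiered model’s logic, not a formal policy:

    def refer_to_librarian(read_level, in_te_subject_area):
        # TEs handle READ 1-2, plus READ 3 questions within their area of
        # study; READ 3 outside that area, and all READ 4+, go to a
        # subject specialist librarian.
        if read_level >= 4:
            return True
        return read_level == 3 and not in_te_subject_area

    print(refer_to_librarian(3, in_te_subject_area=True))   # False: TE answers it
    print(refer_to_librarian(3, in_te_subject_area=False))  # True: referred
    print(refer_to_librarian(5, in_te_subject_area=True))   # True: always referred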

 

 

Table 4
Monthly Averages of Reference Desk Questions and Office Questions by READ Values, 2012-2014

Reference Desk Questions
READ Value | 2012 Monthly Avg | 2013 Monthly Avg | 2014 Monthly Avg
1 | 187 (37%) | 287 (40%) | 248 (44%)
2 | 161 (32%) | 252 (35%) | 179 (31%)
3 | 125 (25%) | 160 (22%) | 127 (22%)
4 | 30 (6%) | 24 (3%) | 14 (2%)
5 | 6 (1%) | 2 (0%) | 1 (0%)
6 | 0 | 0 | 0

Office Questions
READ Value | 2012 Monthly Avg | 2013 Monthly Avg | 2014 Monthly Avg
1 | 33 (30%) | 4 (5%) | 5 (5%)
2 | 27 (25%) | 22 (26%) | 25 (27%)
3 | 38 (35%) | 43 (51%) | 47 (51%)
4 | 9 (8%) | 12 (14%) | 12 (13%)
5 | 2 (2%) | 3 (4%) | 3 (3%)
6 | 0 | 0 | 0

 

 

Table 5
Monthly Averages of Question Types at the Terrell Reference Desk and Librarian Offices, January 2012-December 2014

Reference Desk
Question Type | 2012 Monthly Avg | 2013 Monthly Avg | 2014 Monthly Avg
Policy | 12 (1%) | 32 (4%) | 13 (2%)
Technology | 125 (14%) | 150 (20%) | 113 (19%)
Directional | 301 (34%) | 258 (34%) | 186 (32%)
Reference | 459 (51%) | 313 (42%) | 272 (47%)

Offices
Question Type | 2012 Monthly Avg | 2013 Monthly Avg | 2014 Monthly Avg
Policy | 4 (2%) | 11 (12%) | 15 (16%)
Technology | 28 (17%) | 22 (24%) | 32 (34%)
Directional | 49 (30%) | 9 (10%) | 7 (8%)
Reference | 83 (51%) | 48 (53%) | 39 (42%)

 

 

The next set of tables displays trends in the types of questions asked. Table 5 shows a slight dip in Reference questions – those queries related to searching for and finding information. Technology questions at the desk, on the other hand, increased by five percentage points. The greater percentage of Policy questions in 2013 was partly due to confusion over what constituted that type of question; after some training and discussion, the librarians and staff reached a broader consensus on what constitutes a Policy question, which changed how those were recorded. Generally, Table 5 shows that the proportions of the Question Types have remained consistent at the Terrell reference desk.

 

Table 5 also demonstrates real changes in the types of questions librarians are seeing in their offices: increasingly, a higher percentage concerns Policy and Technology, while the percentage of Reference and Directional questions has dipped.

 

We examined how reference hours (the number of hours per week that the Terrell reference desk offered services) and the staffing level (the number of librarian and staff hours spent at the reference desk) changed over the three years, comparing like semesters to track the trends. Table 6 shows that hours of service dropped 9% during the Spring and Fall semesters.

 

Table 7 shows that between 2012 and 2014, the average number of hours the desk was staffed per month during the Spring semester dropped almost 12% (from 223 to 198). Average monthly staffed hours during the Fall dropped about 12% (from 239 to 210).

 

Next, we looked at changes in reference desk staffing in terms of library staff vs. graduate student workers, or temporary employees (TEs), assigned to the desk. The TEs consisted primarily of graduate students, along with a few select undergraduates. Table 8 shows that the role of TEs at the desk has increased while librarian time has decreased; TEs now staff the desk at almost the same level as librarians. There are rarely two librarians on the desk simultaneously. More commonly, one librarian and a TE, or two TEs, are at the desk at any time.

 

 

Table 6
Number of Regularly Scheduled Hours per Week in which the Terrell Reference Desk Provided Service

Year | Semester | Hours per Week
2012 | Spring | 46
2012 | Summer | 20
2012 | Fall | 46
2013 | Spring | 46
2013 | Summer | 20
2013 | Fall | 42
2014 | Spring | 42
2014 | Summer | 20
2014 | Fall | 42

 

 

Table 7
Terrell Reference Desk Staffing Levels 2012-2014

Spring | 2012 | 2013 | 2014
Total hours staffed | 1112 | 1299 | 988
Total questions | 9632 | 6855 | 5729
Average hours staffed per month | 223 | 260 | 198
Hours-to-questions ratio | .12 | .19 | .17

Summer | 2012 | 2013 | 2014
Total hours staffed | 285 | 302 | 295
Total questions | 4605 | 2870 | 1173
Average hours staffed per month | 95 | 101 | 98
Hours-to-questions ratio | .06 | .11 | .25

Fall | 2012 | 2013 | 2014
Total hours staffed | 955 | 742 | 843
Total questions | 8203 | 5945 | 2943
Average hours staffed per month | 239 | 186 | 210
Hours-to-questions ratio | .12 | .12 | .29

 

 

Table 8 compares staffing levels for librarians and TEs, while Table 7 provides an hours-to-questions ratio. The ratios increased between 2012 and 2014, meaning that more staffed time was spent per question (including time between questions).
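
The ratio is simply total staffed hours divided by total questions; for example, using the Fall figures from Table 7:

    # Hours-to-questions ratio = total hours staffed / total questions
    fall = {2012: (955, 8203), 2013: (742, 5945), 2014: (843, 2943)}
    for year, (hours, questions) in sorted(fall.items()):
        print(year, round(hours / questions, 2))
    # 2012 0.12, 2013 0.12, 2014 0.29 -- more staffed time per question over time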

 

The researchers also decided to incorporate data not previously considered in past reference service assessments. By collecting data on the use of LibGuides, a Springshare product that facilitates the creation of online content by non-programmers, we tracked additional patron activity (see Table 9). Over the past five years, LibGuides have replaced many of the Libraries’ web sites, and serve as informational resources for instruction and research. Between 2012 and 2014, use of the WSU Libraries’ LibGuides increased 6.4%.

 

Finally, we looked at annual gate counts for the Terrell Library to see if there was any correlation between in-library reference traffic and overall traffic. Between 2012 and 2014, foot traffic in the Terrell Library actually increased by 3.3% (see Table 10). Part of this increase stems from the library moving to a 24/7 operating schedule in 2014.

 

 

Table 8
Librarian Hours vs. Temporary Employee Hours at the Terrell Reference Desk for Spring and Fall Semesters 2012-2014

Semester | Avg Librarian Hours/Week | Avg TE Hours/Week | % of Weekly Staffed Hours by TEs
Spring 2012 | 46.5 | 21 | 31%
Fall 2012 | 53 | 12 | 18%
Spring 2013 | 54 | 24 | 31%
Fall 2013 | 32.5 | 23 | 41%
Spring 2014 | 33 | 27 | 45%
Fall 2014 | 33 | 23 | 41%

 

 

Table 9
LibGuide Views 2012-2014

Year | Views of Published LibGuides (thousands)
2012 | 217.4
2013 | 242.2
2014 | 231.3

 

 

Table 10
Gate Counts for Terrell Library 2012-2014

Year | Gate Count (millions)
2012 | 1.033
2013 | 1.037
2014 | 1.067

 

 

Discussion

 

The committee took considerable time to understand how these data informed answers to our research questions: “Where is reference happening, and at what level of complexity? Where are librarians most needed?”

 

The university population seems increasingly comfortable accessing online information from the Libraries. Evidence of this can be seen in (a) the increased use of LibGuides, and (b) the rise in the number of email and LibAnswers transactions, which together composed 77% of questions answered in librarian offices in 2014. This suggests that librarians are most needed in their offices, where LibGuide maintenance is more likely to occur, and where online transactions can happen without the interruptions or time constraints experienced at the desk.

 

The steep drop in in-person transactions at the Terrell reference desk occurred even as building hours expanded and gate counts rose. Students are entering the library to use it as a study space without seeking research assistance from traditional services.

 

The same period also marks the introduction of the tiered reference model at the Terrell desk. Evidence that the model functions as we envisioned can be seen in the decreasing difficulty of the questions answered at the reference desk, and the increasing difficulty of those addressed from offices. We hypothesize that the increase in READ value 3, 4, and 5 questions in the library offices is a result of this bifurcation of reference service.

 

Staffing changes at the reference desk have also contributed to this transition. We noted (see Table 7) that more time is spent per question at the reference desk, which, economically, is usually not optimal. However, with heavier reliance on student workers to field these questions, desk service has become less costly, because student workers do not earn as much per hour as librarians.

 

The staffing changes made over these years appear justified: they have allowed librarians more time to maintain LibGuides and address complex questions, while TEs field simpler ones. The data suggest that we continue the trend of using graduate students at the desk, and encourage librarians to provide more specialized assistance from their offices. By incorporating Kern’s (2006) call for a holistic approach, and by articulating a research question before beginning our analysis, the reference committee was better able to identify the data that address our research questions, and to plan ahead.

 

Conclusions

 

A holistic look at reference statistics means considering all modes of reference service delivery. Kern (2006) recognized reference as a system of communication modes, which should be considered as a whole.

 

The authors have looked at a comprehensive set of data from reference transactions across multiple communication modes and staffing configurations to ask, essentially, “What is happening to reference?” Analysis of our data demonstrates the growing significance of online transactions occurring in librarian offices, even as IM chat reference numbers remain low. The tiered-reference model has largely facilitated this change, allowing librarians more time in their offices for creating online guides and addressing more complex questions from patrons.

 

Many questions remain unanswered. For example, we cannot say conclusively why reference transactions dropped so quickly during the study period, but we know that the trend is not unique to WSU. The data does not necessarily support any cause-and-effect hypothesis, but rather provides us a few snapshots of our reference services over three years.

 

There are other factors affecting reference services that are difficult to quantify, and are outside the realm of this paper: for example, library instruction sessions, changes in course assignments, and changing student demographics and skill sets.

 

We will continue tiered reference, with layered points of discovery, complexity, and specialization. The increased use of email, LibAnswers, and LibGuides suggests a developing library user who is comfortable engaging with multiple sites across multiple tiers to discover information. It indicates that the university community is comfortable getting its information from the Libraries, but values having it at its fingertips.

 

References

 

Baro, E. E., Efe, B. U., & Oyeniran, G. K. (2014). Reference inquiries received through different channels: The challenges reference librarians face in university libraries in Nigeria. Reference Services Review, 42(3), 514-529. http://dx.doi.org/10.1108/RSR-09-2013-0049

 

Coffman, S. (2012). The decline and fall of the library empire. Searcher, 20(3). Retrieved from http://www.infotoday.com/searcher/apr12/Coffman--The-Decline-and-Fall-of-the-Library-Empire.shtml

 

Deineh, S., Middlemas, J., & Morrison, P. (2011). A new service model for the reference desk: The student research center. Library Philosophy and Practice (e-journal), (554). Retrieved from http://digitalcommons.unl.edu/libphilprac/554

 

Dinkins, D., & Ryan, S. M. (2010). Measuring referrals: The use of paraprofessionals at the reference desk. The Journal of Academic Librarianship, 36(4), 279-286. http://dx.doi.org/10.1016/j.acalib.2010.05.001

 

Dworak, E. (2011). LibAnalytics. Charleston Advisor, 13(2), 41-44. http://dx.doi.org/10.5260/chara.13.3.41

 

Flatley, R., & Jensen, R. B. (2012). Implementation and use of the Reference Analytics module of LibAnswers. Journal of Electronic Resources Librarianship, 24(4), 310-315. http://dx.doi.org/10.1080/1941126X.2012.732838

 

Gerlich, B. K., & Berard, L. G. (2007). Introducing the READ Scale: Qualitative statistics for academic reference services. Georgia Library Quarterly, 43(4), 7-13. Retrieved from http://digitalcommons.kennesaw.edu/glq/vol43/iss4/4

 

Gossett, J. G., Stephan, E., & Marrall, R. (2012). Implementing reference statistics collection software at multiple library service points. New Library World, 113(5/6), 235-248. http://dx.doi.org/10.1108/03074801211226328

 

Huling, N. (2002). Reference services and information access. In Schement, J. R. (Ed.), Encyclopedia of Communication and Information (2nd ed.) (pp. 867-874). New York: Macmillan Reference USA.

 

Kern, K. M. (2006). Looking at the bigger picture: An integrated approach to evaluation of chat reference services. The Reference Librarian, 46(95-96), 99-112. http://dx.doi.org/10.1300/J120v46n95_07

 

King, V., & Christensen-Lee, S. (2014). Full-time reference with part-time librarians. Reference & User Services Quarterly, 54(1), 34-43. http://dx.doi.org/10.5860/rusq.54n1.34

 

Lederer, N., & Feldmann. L. M. (2012). Interactions: A study of office reference statistics. Evidence Based Library and Information Practice, 7(2), 5-19. Retrieved from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/12282

 

Martin, P. N. (2009). Societal transformation and reference services in the academic library: Theoretical foundations for re-envisioning reference. Library Philosophy and Practice, (260). Retrieved from http://www.webpages.uidaho.edu/~mbolin/pamelamartin.htm

 

Meserve, H. C., Belanger, S. E., Bowlby, J., & Rosenblum, L. (2009). Developing a model for reference research statistics. Reference & User Services Quarterly, 48(3), 247–258. http://dx.doi.org/10.5860/rusq.48n3.247   

 

Tyckoson, D. A. (2012). Issues and trends in the management of reference services: A historical perspective. Journal of Library Administration, 52(6/7), 581–600. http://dx.doi.org/10.1080/01930826.2012.707953