Article
Understanding Patterns of Library Use Among
Undergraduate Students from Different Disciplines
Ellen Collins
Research Consultant
Research Information Network
London, United Kingdom
Email: ellen.collins@researchinfonet.org
Graham Stone
Information Resources Manager
University of Huddersfield
Huddersfield, United Kingdom
Email: g.stone@hud.ac.uk
Received: 17 Jan. 2014  Accepted: 16 July 2014
© 2014 Collins and Stone. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective – To test whether routinely-generated library usage data could be linked
with information about students to understand patterns of library use among
students from different disciplines at the University of Huddersfield. This
information is important for librarians seeking to demonstrate the value of the
library, and to ensure that they are providing services which meet user needs.
The study seeks to join two strands of library user research which until now
have been kept rather separate – an interest in disciplinary differences in
usage, and a methodology which involves large-scale routinely-generated data.
Methods – The study uses anonymized data about individual
students derived from two sources: routinely-generated data on various
dimensions of physical and electronic library resource usage, and information
from the student registry on the course studied by each student. Courses were
aggregated at a subject and then disciplinary level. Kruskal-Wallis and Mann
Whitney tests were
used to identify statistically significant differences between the
high-level disciplinary groups, and within each disciplinary group at the
subject level.
Results – The study identifies a number of statistically
significant differences on various dimensions of usage between both high-level
disciplinary groupings and lower subject-level groupings. In some cases,
differences are not the same as those observed in earlier studies, reflecting
distinctive usage patterns and differences in the way that disciplines or
subjects are defined and organised. While music students at Huddersfield are heavy library users within the arts subject-level grouping, arts students overall use library resources less than those in social science disciplines, contradicting findings from studies at other institutions. Computing and engineering students were relatively similar, although computing students were more likely to download PDFs, and engineering students were more likely to use the physical library.
Conclusion – The technique introduced in this study
represents an effective way of understanding distinctive usage patterns at an
individual institution. There may be potential to aggregate findings across
several institutions to help universities benchmark their own performance and
usage; this would require a degree of collaboration and standardisation. This
study found that students in certain disciplines at Huddersfield use the
library in different ways to students in those same disciplines at other
institutions. Further investigation is needed to understand exactly why these
differences exist, but some hypotheses are offered.
Introduction
Libraries and librarians have often been accused of
deciding on what’s best for the user without consultation (Wells, 1996; Wilson,
2000; Tilley, 2013). “One of the most complex issues to deal with in acquiring
knowledge about students is concerned with the assumptions library staff make
about student behaviour” (Tilley, 2013, p.84).
However, in times of austerity in higher education funding, with increased competition for financial resources within a university as well as between universities, this approach is no longer adequate. Simply counting data, such as anonymized usage statistics, or assuming that librarians and libraries know ‘best’, is no longer enough.
Libraries must justify both their value and impact to university senior
management and to the student body who want to see their fees are invested in
services that will add value to their studies. However, as Oakleaf suggests, “Librarians can develop systems that will allow data collection on individual user library behaviour … Until librarians do that, they will be blocked in many of their efforts to demonstrate value” (Oakleaf, 2010, p. 96).
One important aspect of this work is recognizing
different patterns of usage among different groups of library patrons. We have
long known that information behaviours are very different in different
disciplines (Covi, 1999; Whitmire, 2002). In order to develop services which
meet these different needs, and to thereby show that the library has value,
librarians must first understand patterns of need and usage among different
groups.
The first stage of the Library Impact Data Project
(LIDP), based at the University of Huddersfield, established that a
statistically significant relationship existed across a number of UK
universities between library activity data and student attainment (Stone & Ramsden,
2013). The second phase of the project looked at the data in more detail to
establish whether there is a relationship between subject discipline and
undergraduates’ use of academic libraries. The paper will outline the
methodology of the research and present findings that show that there is a
statistically significant difference between various disciplines on several
different dimensions of physical and electronic library usage. The paper
concludes with a discussion of the findings and recommendations for further
study.
Literature review
The literature shows a longstanding interest in the
differences between disciplines, and how these affect the way students and
researchers use the library. A large number of approaches, methodologies, and
definitions have been used in attempts to answer this
question. Studies have used surveys, both purpose-built (Chrzastowski &
Joseph, 2006; Housewright, Schonfeld, & Wulfson, 2013) and re-analysis of
pre-existing responses (Whitmire, 2002), case studies (Meyer et al., 2011;
Bulger et al., 2011), or a combination of the two (Maughan, 1999) to try to
understand disciplinary differences. The specific definitions of disciplines
have been shaped to fit the needs of research methods or of organisational
structures. For example, the case-study approach adopted by both Meyer et al.
(2011) and Bulger et al. (2011) demanded an intense focus on very small and
tightly-defined groups of researchers, while Housewright et al.’s 2013 survey
used high-level categories to define disciplines in order to permit statistical
analysis. Chrzastowski and Joseph (2006) use high-level categories in order to
fit with their university’s organisational structure, but Whitmire (2002) is
forced to exclude the life scientists at her institution from her analysis,
because the theoretical structure of the study does not allow for them. Studies
have also looked at different groups of library users: undergraduates (Wells,
1996; Bridges, 2008; Cox & Jantti, 2012), postgraduates (Chrzastowski &
Joseph, 2006), and researchers at all stages of their careers (Meyer et al.,
2011; Bulger et al., 2011; Housewright et al., 2013; Tenopir & Volentine,
2012). Finally, they have adopted various definitions of what constitutes
library use – from gate entries to e-resource usage, book borrowing to
searching behaviours – to explore how different groups engage with the library
and its services.
The differences in methodology and approach limit
librarians’ ability to make use of the findings in their own context. In some
cases, findings are relatively consistent across studies: for example, arts and
humanities are usually found to be the biggest users of library materials (De
Jager, 2002; Maughan, 1999; Whitmire, 2002). Nackerud et al. (2013) found, at a
more granular level, that College of Design undergraduates were the highest
library borrowers in their study. But in other instances, different ways of
defining subjects and user groups can lead to confusion in understanding
exactly how findings may apply in other settings. For example, many studies
found engineering students to be the least engaged library users across
resources (Kramer & Kramer, 1968; Bridges, 2008; Cox & Jantti, 2012;
Nackerud et al., 2013). However, Chrzastowski and Joseph (2006) found that
graduate students from the physical sciences and engineering used online
resources more than graduates in other disciplines. Their study looked at a smaller group of students (graduate students only) but across a broader selection of disciplines (physical sciences and engineering). How is a reader
to tell which change has made the difference, or whether there is something
inherent to the University of Illinois at Urbana-Champaign, where their study
was carried out, that is affecting the results?
In recent years, a new group of studies have begun to
take a more data-driven approach to understanding library usage, deriving value
from data that is routinely generated by people who use the library – gate
entries or e-resource logins for example (Jisc, 2012). This data is then linked
with information from student registry or central administration systems,
including degree classifications, demographic characteristics, and discipline.
The advantage of this methodology is twofold. First, unlike survey or interview-based
studies, it does not rely upon self-reported data to understand the phenomenon
being investigated. Second, it can capture data for every student in the
institution, removing the possibility of bias on the part of either the
researcher or the survey respondents.
Most studies using this methodology have been directed
towards understanding the relationship between student library usage and degree
result, usually in order to engage university management with the importance
and value of the library. So, for example, Wong and Webb (2011), Cox and Jantti
(2012), Stone and Ramsden (2013), and Soria, Fransen, and Nackerud (2013) have
looked at various measures of library usage to understand their relationship
with final degree outcome. All of these studies have demonstrated a
statistically significant relationship, though they hold back from inferring
what kinds of cause and effect mechanisms may be at work.
Some of these studies have begun to incorporate other
variables into their work such as the demographic characteristics of library
users (Stone & Collins, 2013). Other studies have looked specifically at
usage by discipline. Nackerud et al. (2013) broke down all types of library use by college, finding, for example, that 100% of pharmacy students visited the library in one semester. Nonetheless, much of this work continues to
examine usage in the context of attainment. Jantti and Cox (2013) broke down
their analysis by department in order to show that the science faculty got the
most academic benefit from books and electronic resources, while health and
behavioural sciences obtained the least academic benefit from books, and
creative arts the least from electronic resources. While very informative for
librarians seeking to demonstrate the impact of their work, this analysis does
not provide information to identify how different groups use the library.
This study attempts to fill a gap in the literature
by using routinely-generated data to understand different usage patterns across
disciplines within a single institution. Studies based upon a survey
methodology do not typically achieve high response rates: 14% in the case of
Chrzastowski and Joseph (2006) and 7.8% in the case of Housewright et al.
(2013). There can also be problems around recollection: Tenopir and Volentine
(2012) deal with this through a critical incident technique which asks about
the last time the respondent used the library in a particular way, but this
relies upon large numbers of respondents. Case study techniques, while
providing considerable depth of understanding, have similar problems around
recall, and cannot always be generalised to wider communities of interest.
Using routinely-generated data circumvents the problems of generalizability and
recall, and presents an interesting opportunity to understand exactly how
students at a particular institution use their library.
Aims
This study explores how full-time undergraduate
students in a range of disciplines at the University of Huddersfield use the
library and information resources. The aims are twofold: first, to explore
whether routinely-generated usage data can be used to provide an insight into
working patterns, and second, to analyze the different patterns of usage to
inform librarian practice and the support services offered to students.
Methods
There were two sources of data for this analysis. The
first was data that are routinely generated when students use Huddersfield’s
physical or electronic library resources, such as library gate entries, logins
to e-resources, or hours spent on library computers. E-resource data do not identify the specific resource used, but simply record that the student logged into a database; the same approach was used in the Minnesota study (Nackerud et al., 2013). The second was data from Huddersfield’s student registry, such as
information on demographic characteristics, course and mode of study, and final
degree result (where available). These datasets were amalgamated using unique
identifiers and then anonymized.
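To make this data preparation concrete, the following is a minimal sketch, in Python with pandas, of how two such datasets might be amalgamated and anonymized. The file, dataframe, and column names are illustrative assumptions, not the project’s actual code, which used institutional systems.

```python
import hashlib

import pandas as pd

# Hypothetical inputs: 'usage' holds one row per usage event and
# 'registry' one row per student, both keyed on a student identifier.
usage = pd.read_csv("usage_events.csv")          # illustrative file name
registry = pd.read_csv("student_registry.csv")   # illustrative file name

# Amalgamate the two sources on the shared identifier.
merged = usage.merge(registry, on="student_id", how="inner")

# Replace the identifier with a salted one-way hash, so records stay
# linkable to each other but not to a named student.
SALT = "project-specific-secret"
merged["student_id"] = merged["student_id"].astype(str).map(
    lambda s: hashlib.sha256((SALT + s).encode("utf-8")).hexdigest()
)
```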
Both datasets underwent considerable processing before
analysis could be undertaken. Only full-time undergraduate students based at
Huddersfield’s main campus were included. The usage data were restructured to
create new variables that permitted more sensitive analysis. For example, the
data on e-resource logins were aggregated to give the hours spent logged into
e-resources, counting the number of hours in a year when students logged into
e-resources at least once.
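As an illustration of this restructuring, here is a minimal sketch in Python with pandas; the dataframe and column names are assumptions for illustration. Each login is truncated to its hour, and the distinct hours per student are counted.

```python
import pandas as pd

# Hypothetical raw data: one row per e-resource login event.
logins = pd.DataFrame({
    "student_id": ["a1", "a1", "a1", "b2"],
    "timestamp": pd.to_datetime([
        "2013-01-07 09:05",  # same hour as the next row: counted once
        "2013-01-07 09:40",
        "2013-01-07 14:20",
        "2013-01-08 10:00",
    ]),
})

# Truncate each login to its hour, then count the distinct hours in a
# year in which each student logged in at least once.
logins["hour"] = logins["timestamp"].dt.floor("h")
hours_logged_in = (
    logins.drop_duplicates(["student_id", "hour"])
          .groupby("student_id")["hour"]
          .count()
)
print(hours_logged_in)  # a1 -> 2, b2 -> 1
```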
The analysis method required the 105 full-time
undergraduate courses offered by Huddersfield at the time of the research to be
grouped into a small number of categories, ideally no more than six. In discussion with project stakeholders, we established that grouping at this level alone would lose a great deal of detail and produce findings that, while useful, would be too broad. To permit both rigorous analysis and useful outputs, we adopted a two-tier approach: grouping courses into subject-level
groups, and then aggregating these subject groups into higher-level
disciplinary groupings. We could then compare subject groups within each
disciplinary grouping, and also compare the disciplinary groupings for some
high-level results. Note that it is not possible to compare subjects from
different disciplinary groupings using the results we have provided here.
These groupings reflect the distribution of students
and courses within Huddersfield and were determined by library staff. In some
cases, only a top-level disciplinary grouping exists, because there is no
logical way to subdivide into smaller groups – usually because Huddersfield
does not offer many courses in this area. Universities wishing to replicate
this study will need to identify a disciplinary structure which suits the
profile of courses at their institution.
Complete lists of library usage variables and their
definitions are shown in Table 1. A list of disciplines and their respective
student enrolment by course is shown in Table 2.
The data were analyzed using SPSS. They were tested
for normality and found to be non-normal. We therefore used Kruskal-Wallis and
Mann-Whitney tests to establish whether a relationship existed between
discipline and the usage variables. For disciplinary groupings containing three or more subject groups, we used an initial Kruskal-Wallis test to identify whether a statistically significant difference existed, followed by pairwise Mann-Whitney tests to identify which groups differed from each other. A Bonferroni correction was applied to these Mann-Whitney tests to compensate for the increased chance of Type 1 errors from multiple tests. For groupings containing only two groups, we simply used the Mann-Whitney test.
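The analysis itself was run in SPSS, but the two-stage procedure can be sketched in Python with scipy. This sketch shows the all-pairs variant used at the subject level, with placeholder group names and data; the control-group variant described below would simply restrict the pairs to those involving the control group.

```python
from scipy import stats

def compare_usage(groups, alpha=0.05):
    """groups: dict mapping group name -> list of values for one
    usage measure (e.g., number of items borrowed per student)."""
    # Stage 1: Kruskal-Wallis omnibus test across all groups.
    h_stat, p_omnibus = stats.kruskal(*groups.values())
    if p_omnibus >= alpha:
        return {}  # no evidence of any overall difference

    # Stage 2: pairwise Mann-Whitney tests, with the significance
    # threshold Bonferroni-corrected by the number of comparisons.
    names = list(groups)
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
    threshold = alpha / len(pairs)
    significant = {}
    for a, b in pairs:
        u_stat, p = stats.mannwhitneyu(groups[a], groups[b],
                                       alternative="two-sided")
        if p < threshold:
            significant[(a, b)] = (u_stat, p)
    return significant
```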
For tests with six or more groups, we used a control group in our second stage of testing (the Mann-Whitney tests). This was to ensure that the significance threshold did not become unacceptably small after the Bonferroni correction, which divides the threshold by the number of comparisons. In each case, we selected the
largest group as our control, in order to identify differences from the
majority which might not be noticed by librarians in their day-to-day work. At
the disciplinary level, social sciences was selected as the control as it was
the largest group (contained the highest number of students). There was no need
to use a control group for any of the subject-level analysis as these all
contained five or fewer groups.
Throughout our analysis, we have followed Cohen (1992)
in classifying effect sizes:
.1 – small effect
.3 – medium effect
.5 – large effect
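The r effect sizes reported in the tables below can be derived from the Mann-Whitney Z statistic, which SPSS reports directly. As a minimal sketch, assuming the common normal approximation to U without tie correction:

```python
import math

from scipy import stats

def mann_whitney_r(x, y):
    """Effect size r = |Z| / sqrt(N), with Z taken from the normal
    approximation to the Mann-Whitney U statistic (no tie correction)."""
    n1, n2 = len(x), len(y)
    u_stat, _ = stats.mannwhitneyu(x, y, alternative="two-sided")
    mean_u = n1 * n2 / 2.0
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u_stat - mean_u) / sd_u
    return abs(z) / math.sqrt(n1 + n2)

def cohen_label(r):
    """Classify r against Cohen's (1992) thresholds used in this study."""
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    if r >= 0.1:
        return "small"
    return "negligible"
```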
Table 1
Library Usage Definitions

| Variable | Definition |
| --- | --- |
| Number of items borrowed | Items checked out from the library; not limited to books |
| Number of library visits | Measured via gate entries – all students must swipe their ID card to enter the library, and these data are recorded on library systems |
| Hours logged into library PC | Number of hours in a year in which a student was logged into a library PC (maximum possible number of PC hours per year is 8,760 = 24 hours x 365 days). Multiple logins within a single hour on a single day are not counted |
| Hours logged into e-resources | Number of hours in a year in which a student was logged into e-resources, both on-site and remote logins (maximum possible number of e-resource hours per year is 8,760 = 24 hours x 365 days). Multiple logins within a single hour on a single day are not counted |
| Number of PDF downloads | |
| Total number of e-resources accessed | The number of different e-resources accessed both on-site and through remote logins. Within Huddersfield’s data, a single e-resource varies from an individual journal subscription to a large multi-journal platform or database, so these data must be treated with some caution |
| Number of e-resources accessed 5 or more times | |
| Number of e-resources accessed 25 or more times | |
Table 2
Course Enrolment

| Discipline | Subject | Number of students |
| --- | --- | --- |
| Science | Science | 30 |
| | Discipline total | 30 |
| Health | Health | 138 |
| | Discipline total | 138 |
| Computing and engineering | Computing | 74 |
| | Engineering | 43 |
| | Discipline total | 257 |
| Arts | Music | 74 |
| | Architecture | 59 |
| | Fashion | 130 |
| | 2D Design | 29 |
| | 3D Design | 47 |
| | Discipline total | 339 |
| Humanities | English | 70 |
| | Drama | 41 |
| | Media and Journalism | 111 |
| | Discipline total | 222 |
| Social sciences | Business, management and accountancy | 352 |
| | Law | 60 |
| | Behavioural sciences | 236 |
| | Social work | 85 |
| | Education | 70 |
| | Discipline total | 803 |
Results
Table 3 shows the median values for each measure of
library usage at the discipline level. Table 4 shows the effect sizes, in a
range from 0 to 1, and the statistical significance of Mann-Whitney tests on
each measure when comparing the discipline to the control group of social
sciences. Social sciences has been used as a control because it is the largest
disciplinary group (containing the highest number of students). Blank cells indicate no statistically significant difference between the group and the control group; the direction of the significant differences is discussed in the text below. All results shown are significant at the .005 level, which is the value generated by the Bonferroni correction for a .05 significance level.
Table 4 shows that students within the social science
grouping are, in most respects, significantly higher users of library content
and resources than any other disciplinary grouping. Arts students are the
lowest users, with a large effect size for the number of PDF downloads, and
medium effect sizes for most of the variables associated with e-resource use.
The courses which make up arts disciplines may explain this lower level of
usage. Many of them rely upon visual or audio content rather than the journal
articles available via Huddersfield’s e-resources.
Tables 5 and 6 show the breakdown of the arts group in
more detail. In this case, we compared all of the groups against each other, so
Table 6 is slightly more complex. The first column shows the two groups being compared, and the letter beside each effect size indicates which group was higher, as per the key below the table. As before, a blank cell indicates no significant
difference between the two groups. All results are significant at the .001
level, which is the value generated by the Bonferroni correction for a .05
significance level.
Table 3
Median Values for Library Usage Measures, by Discipline

| Discipline | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Science | 14.0 | 180.5 | 11.5 | 16.0 | 32.0 | 11.0 | 1.5 | 0.0 |
| Computing and engineering | 10.0 | 48.0 | 4.0 | 6.0 | 10.0 | 6.0 | 0.0 | 0.0 |
| Arts | 29.0 | 132.0 | 18.0 | 5.0 | 1.0 | 5.0 | 0.0 | 0.0 |
| Humanities | 43.0 | 116.5 | 16.0 | 28.5 | 46.0 | 14.0 | 3.0 | 0.0 |
| Health | 57.5 | 111.5 | 13.0 | 47.0 | 111.5 | 26.5 | 6.0 | 0.0 |
| Social sciences | 43.0 | 112.0 | 16.0 | 26.0 | 47.0 | 14.0 | 2.0 | 0.0 |
Table 4
Effect Sizes and Statistical Significance of Mann-Whitney Tests by Discipline

| Discipline | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Science | .232 | | | | | | | |
| Computing and engineering | .337 | .214 | .106 | | .283 | .281 | .272 | .157 |
| Arts | .193 | | | .435 | .559 | .485 | .432 | .183 |
| Humanities | | .113 | .064 | | .138 | | | .087 |
| Health | .064 | .295 | .147 | | .057 | .114 | | .147 |
Table 5
Median Values for Library Usage Measures for Arts Discipline, by Subject

| Subject | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Music | 107.0 | 162.0 | 10.5 | 17.5 | 5.0 | 8.0 | 1.0 | 0.0 |
| Architecture | 26.0 | 81.0 | 21.0 | 12.0 | 18.0 | 10.0 | 1.0 | 0.0 |
| Fashion and textiles | 21.0 | 124.5 | 18.0 | 2.0 | 0.0 | 2.0 | 0.0 | 0.0 |
| 2D design | 2.0 | 162.0 | 42.0 | 4.0 | 0.0 | 2.0 | 0.0 | 0.0 |
| 3D design | 43.0 | 164.0 | 18.0 | 8.0 | 1.0 | 5.0 | 0.0 | 0.0 |
Table 6
Effect Sizes and Statistical Significance of Mann-Whitney Tests in Arts Discipline, by Subject*

| Subjects compared | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Music / Architecture | .646 (M) | .434 (M) | | .300 (M) | | | .322 | .256 |
| Music / Fashion | .524 (M) | | | | .315 (M) | .292 (M) | .248 | |
| Music / 2D design | .621 (M) | | | .361 (M) | .293 (M) | .322 (M) | .401 | .363 |
| Music / 3D design | .676 (M) | | .280 (3D) | .430 (M) | .488 (M) | .427 (M) | .428 | .316 |
| Architecture / Fashion | | .352 | | | | | | |
| Architecture / 2D design | | .328 (2D) | | | | | | |
| Architecture / 3D design | | | | | .324 (3D) | .299 (3D) | | |
| Fashion / 3D design | | .363 | | | | | | |

*Music (M); Architecture (A); Fashion (F); 2D Design (2D); 3D Design (3D)
Clearly, music students dominate usage relative to all other subjects on a number of variables and, for the number of items borrowed, with a large effect size. This may be because the music subject group includes some
courses that might have fitted alongside English or drama in the humanities
group, as well as some that are more technology-focused and rightly belong in
the arts group. It is also worth noting that fashion students visit the library
frequently; this may be because they are making extensive use of the art and
design resource area which has traditionally been strong in their discipline.
Architects have a separate resource area outside the library, which may explain
their lower levels of usage. We found no statistically significant differences
in usage when comparing 2D design with fashion and with 3D design.
Table 7 shows the breakdown for subject groups within
the social science discipline, and Table 8 shows the results of the statistical
tests. Again, all the groups are compared with each other. All results are
significant at the .001 level, which is the value generated by the Bonferroni
correction for a .05 significance level.
Many of the effect sizes in this group are large,
indicating very different patterns of usage between subjects. Overall, students
in behavioural sciences tend to show the highest usage on most measures, when
compared to other subjects. Business students have higher usage than law,
social work, and education students on several dimensions but not on the number
of items borrowed, which is consistently lower (and with a large effect size).
Law students are extremely low users of library resources, particularly e-resources; we hypothesize that this may be because, more than students in any other discipline, they rely upon a few core texts which they purchase for themselves. We observed no
difference in usage for social work and education, which may reflect a
similarity in how these two groups of vocational courses are taught.
Table 7
Median Values for Library Usage Measures for Social Sciences Discipline, by Subject

| Subject | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Business | 26.0 | 113.0 | 17.0 | 33.0 | 74.5 | 13.5 | 3.0 | 0.0 |
| Law | 24.0 | 159.5 | 25.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Behavioural sciences | 89.0 | 132.5 | 22.0 | 34.5 | 74.0 | 18.0 | 3.0 | 0.0 |
| Social work | 81.0 | 74.0 | 8.0 | 18.0 | 29.0 | 16.0 | 1.0 | 0.0 |
| Education | 72.0 | 76.5 | 4.0 | 21.0 | 42.0 | 17.0 | 2.0 | 0.0 |
Table 8
Effect Sizes and Statistical Significance of Mann-Whitney Tests in Social Sciences Discipline, by Subject*

| Subjects compared | Number of items borrowed | Number of library visits | Hours logged into library PC | Hours logged into e-resources | Number of PDF downloads | Number of e-resources accessed | Number of e-resources accessed 5 or more times | Number of e-resources accessed 25 or more times |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Business / Law | | | | .477 (B) | .456 | .459 (B) | .421 (B) | .200 |
| Business / Behavioural sciences | .590 (BS) | | | | | .175 (BS) | | |
| Business / Social work | .409 (SW) | .264 (B) | .185 (B) | .155 (B) | .168 | | .139 (B) | |
| Business / Education | .405 (E) | .154 (B) | .177 (B) | | | | | |
| Law / Behavioural sciences | .537 (BS) | | | .573 (BS) | .549 | .576 (BS) | .477 (BS) | .188 |
| Law / Social work | .642 (SW) | .354 (L) | .265 (L) | .636 (SW) | .626 | .679 (SW) | .565 (SW) | .257 |
| Law / Education | .715 (E) | | .276 (L) | .744 (E) | .713 | .775 (E) | .724 (E) | .358 |
| Behavioural sciences / Social work | | .358 (BS) | .220 (BS) | | | | | |
| Behavioural sciences / Education | | .213 (BS) | .219 (BS) | | | | | |

*Business (B); Law (L); Behavioural Sciences (BS); Social Work (SW); Education (E)
The computing and engineering subgroups had very few differences between them. Engineering students were more likely to visit the library (median = 61.0, r = .362) and spent more hours logged into library PCs (median = 8.0, r = .235). We think that this may be because computing students are more likely to use their own personal computing equipment, compared to the engineers.
Among the humanities subgroups, there were no
statistically significant differences in usage between students on the English
and drama courses. However, both groups showed higher levels of usage than media students on most of the e-resource dimensions, with English students showing slightly larger effect sizes. This probably reflects the way that the courses
are taught, and in particular the importance of written texts and criticisms to
English and drama students.
Discussion
Our results demonstrate the value of a data-driven
approach for librarians seeking to understand usage patterns among library
users from different disciplines. Comparing our findings to previous studies,
several disparities appear. Arts and humanities students are not particularly
heavy library users, as they have been found to be in earlier work (De Jager,
2002; Nackerud et al., 2013; Maughan, 1999); in fact, they are lower users than
social scientists on most dimensions. Earlier research found computing and
engineering students to be relatively low users of library resources (Kramer
& Kramer, 1968; Bridges, 2008; Cox & Jantti, 2012; Nackerud et al.,
2013), although Chrzastowski and Joseph (2006) found that graduate students
from the physical sciences and engineering used online resources more than
graduates in other disciplines. Again, our results show that although students
from the computing and engineering discipline are low users (relative to the
control group of social sciences), they are not particularly different from
some other disciplines, such as arts, in this respect.
This study was also able to show quite nuanced
differences in library usage within the high-level subject groupings. This
information, for example – showing the high usage level of musicians compared
to other “arts” subjects, or the strong usage by behavioural scientists
compared to other social science groupings – helps librarians develop a more
realistic understanding of how students use resources and to target areas of
particularly low uptake which may be masked by the behaviour of bigger groups
within a subject. This is a distinct advantage of this methodology over earlier
survey-based methodologies, where response numbers were too small to permit
statistical analysis at this level of granularity.
Findings from this phase of LIDP regarding subject disciplines give the library evidence that a one-size-fits-all approach to services such as information literacy sessions could be enhanced by intelligence from library analytics. For example, known “low-use” subjects could be targeted differently from known “high-use” subjects in order to give a more personalized, boutique service to the end user. This addresses one of Tilley’s (2013) success factors of the boutique model: “[k]nowledge of users’ needs and activity – their preferences, the irritants – and their methods of working” (p. 82). Furthermore, if we accept the assumption that increased use of library resources may lead to increased achievement, knowledge of subject cohorts’ methods of working could be used to guide students to appropriate resources.
Of course, this methodology retains some limitations.
The usage measures are accurate representations of student behaviour, but
we must be cautious about how we interpret them. For example, we cannot claim
that students only entered the library in order to study, as other student
services were also located there at the time of the study; gate entries
recorded by library systems might represent students seeking help with issues
completely outside the library. Interestingly, previous research indicated that
gate entries are one of the library usage measures that are not correlated with
student outcomes (Stone & Ramsden, 2013).
We must also be cautious about over-interpreting why
usage patterns look the way they do. Qualitative methodologies are more useful in understanding this kind of issue; face-to-face discussions with the cohort provide a much richer seam of information. Tilley (2013) explores this in her discussion of what is known about English students at the University of Cambridge and the implications for the library service. But library analytics
can help to identify the “context” that Tilley (2013) describes, which, “…
allows us to prioritize areas of our service for improvement” (p. 91). This is
also supported by Poll (2012) who suggests a mixed methods approach as the most
effective way of exploring library impact. At Huddersfield, this mixed methods
approach has been adopted and used to support the findings of the LIDP. Towards
the end of the study, a focus group was held with a cohort of computing students
– a cohort that had been identified as low users in the study. This proved valuable as a way to evidence the data from the project in a real-life situation, where students could explain their reasons for library use. As Tilley (2013) states, this should not be a one-off conversation, but the beginning of frequent knowledge collection.
There have also been two spin-off projects at Huddersfield that were heavily influenced by the study. The first is the ‘Roving Librarian’ project, which was being piloted at the time of the study and was continued using the findings of LIDP in order to target areas of low use. “The statistics gathered showed that many students are not using our resources…”; the Roving Librarian project therefore extended its roving “… to take it to social spaces and resource centers within all schools to reach students who may otherwise be library non-users” (Sharman & Walsh, 2012).
The other project to come out of LIDP was Lemontree (Running in the Halls,
2012), which was designed to be a fun, innovative, low input way of engaging
students through new technologies and increasing use of library resources. When
registering for Lemontree, students sign terms and conditions that allow their
student number to be passed to Computing and Library Services (CLS), which
allows CLS to track usage of library resources by Lemontree gamers versus
students who do not take part. Lemontree was only planned as a proof of concept; however, over 850 users had registered by October 2012, providing a solid base for further analysis to establish whether intervention using gamification can have an impact throughout a student’s academic course. Since completion of the study, Lemontree, now known as Librarygame (Running in the Halls, 2013), has been adopted by the universities of Huddersfield, Glasgow, and Manchester.
Just as this study identifies findings that contradict
earlier research, we would not expect that the findings at Huddersfield will
necessarily translate into other institutions. The subject groupings reflect
Huddersfield’s structure and strengths, and may not be typical of other
universities in the U.K., let alone in the wider higher education sector. The specificity which makes our findings so useful at Huddersfield makes them much less useful to other institutions, and means that it can be rather difficult to
benchmark the library’s strengths and weaknesses against comparable
institutions, or to aggregate data to get a better picture of usage patterns
across institutions (a strength of the first phase of the work, which worked
with eight institutions altogether) (Stone & Ramsden, 2013). With this in
mind, towards the end of the study, the project collaborated with colleagues at
Mimas (2013) to produce a library analytics survey in order to assess the
demand for a national library analytics tool. The survey found that 94.6% of
those who replied wanted to benchmark their data with other institutions and
that 87.7% were interested in the richer data that was used as part of this
study (Showers & Stone, 2014). As a result of the LIDP findings and the
LIDP-Copac survey, Jisc have commissioned a new project, the Library Analytics
and Metrics Project (JiscLAMP), which in 2013 produced “a prototype shared
library analytics service for U.K. academic libraries” (Jisc, 2013).
Conclusions
This study examined whether large datasets could be
used to understand disciplinary differences in student library usage. It used
statistical analysis to explore routinely-generated data from the University of
Huddersfield’s library, linked to information about students from the student
registry.
This technique revealed significant differences among
groups of students and found that these differences were not always the same as
those identified by previous studies. In doing so, it demonstrated the value of
undertaking this analysis on an institution-by-institution basis in order to
avoid developing services based upon information from other universities or
studies which may not reflect usage patterns across all institutions.
Unlike more qualitative methods, the technique is
unable to say much about why these different usage patterns exist. However,
findings could be followed up with focus groups or interviews with the groups
of students in question, in order to gain a greater depth of understanding.
The Jisc-funded (2013) Library Analytics and Metrics
Project (LAMP) is an interesting attempt to automate this analytics service for
libraries that are able to supply the relevant data; it also offers
opportunities to develop standardised definitions for subject, ethnicity,
country of residence, and other demographic variables, so that libraries can analyse their data on their own terms or compare it with that of other institutions. In 2014
LAMP produced an “ugly prototype”, which was able to manipulate the raw data
from this study and other partner institutions (Showers, Palmer & Stone,
2014). LAMP has now received additional funding to produce a shared service for the U.K., enabling libraries to submit their own data for analysis, including statistical significance testing. This will allow follow-up research to be conducted by libraries that join the service.
Both phases of the LIDP have produced toolkits to aid institutions wishing to collect and analyze their own data (Stone & Collins, 2012; Stone, Ramsden, & Pattern, 2011). In addition, a value impact starter kit (Oakleaf, 2012), comprising 52 exercises for librarians and an outcome of the Value of Academic Libraries project (Oakleaf, 2010), is also available. The LAMP project is also considering a toolkit approach in order to address concerns over the level of statistical knowledge required by users to interpret the outputs of the system. One possible outcome would be a collaboration with Oakleaf on a new toolkit; initial discussions are underway.
At the University of Huddersfield, discussions are now
underway to consider how the results of the study can be used to improve the
student experience. Now that the library can evidence the results of the study, a set of briefing papers is planned for specific subject areas, presenting the evidence in areas that relate specifically to academic staff; it was decided at an early stage that low usage is not acceptable in any discipline. Furthermore, longitudinal data are required to look at usage over time so that the library can start to benchmark and show whether interventions have made a difference.
References
Bridges, L. M. (2008). Who is not using the library? A comparison of undergraduate academic disciplines and library use. portal: Libraries and the Academy, 8(2), 187-196. http://dx.doi.org/10.1353/pla.2008.0023
Bulger, M., Meyer, E. T., de la Flor, G., Terras, M., Wyatt, S., Jirotka, M., Eccles, K., & Madsen, C. (2011). Reinventing research? Information practices in the humanities. London: Research Information Network. Retrieved from http://www.rin.ac.uk/our-work/using-and-accessing-information-resources/information-use-case-studies-humanities
Chrzastowski, T. E., & Joseph, L. (2006). Surveying graduate and
professional students’ perspectives on library services, facilities and
collections at the University of Illinois at Urbana-Champaign: Does subject
discipline continue to influence library use? Issues in Science and Technology Librarianship, 45(Winter).
Retrieved from http://www.istl.org/06-winter/refereed3.html
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155-159. http://dx.doi.org/10.1037/0033-2909.112.1.155
Covi, L. M. (1999). Material mastery: Situating digital library use in
university research practices. Information
Processing & Management, 35(3),
293-316. http://dx.doi.org/10.1016/S0306-4573(98)00063-6
Cox, B.L., & Jantti, M. (2012). Capturing business intelligence
required for targeted marketing, demonstrating value, and driving process
improvement. Library and Information
Science Research, 34(4), 308-316.
http://dx.doi.org/10.1016/j.lisr.2012.06.002
De Jager, K. (2002). Impacts and outcomes: searching for the most
elusive indicators of academic library performance. In Meaningful Measures for Emerging Realities, proceedings of the 4th
Northumbria International Conference on Performance Measurement in Libraries
and Information Services. Washington, DC, Association of Research
Libraries. Retrieved from http://www.libqual.org/documents/admin/4np_secure.pdf
Housewright, R., Schonfeld, R. C., & Wulfson, K. (2013). Ithaka S+R, Jisc, RLUK, UK Survey of
Academics 2012. Retrieved from http://www.ulrls.lon.ac.uk/wesline/documents/UK_Survey_of_Academics_2012.pdf
Jantti, M., & Cox, B. (2013). Measuring the value of library
resources and student academic performance through relational datasets. Evidence
Based Library and Information Practice, 8(2), 163-171. Retrieved from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/19574/15217
Jisc (2012). Activity Data:
Delivering Benefits from the Data Deluge. Retrieved from http://www.jisc.ac.uk/publications/reports/2012/activity-data-delivering-benefits.aspx
Jisc (2013). Library Analytics and
Metrics Project. Retrieved from http://jisclamp.mimas.ac.uk/
Kramer, L. A., & Kramer, M. B. (1968). The college library and the drop-out. College and Research Libraries, 29(4), 310-312. Retrieved from http://crl.acrl.org/content/29/4/310
Maughan, P. D. (1999). Library resources and services: A cross-disciplinary survey of faculty and graduate student use and satisfaction. Journal of Academic Librarianship, 25(5), 354-366. http://dx.doi.org/10.1016/S0099-1333(99)80054-8
Meyer, E. T., Bulger, M., Kyriakidou-Zacharoudiou, A., Power, L., Williams, P., Venters, W., Terras, M., & Wyatt, S. (2011). Collaborative yet independent: Information
practices in the physical sciences. London, Research Information Network.
Retrieved from http://www.iop.org/publications/iop/2012/page_53560.html
Mimas (2013). Copac Collections
Management. Retrieved from http://copac.ac.uk/innovations/collections-management/
Nackerud, S., Fransen, J., Peterson, K., & Mastel, K. (2013).
Analyzing demographics: Assessing library use across the institution. portal:
Libraries and the Academy, 13(2),
131-145. Retrieved from http://www.press.jhu.edu/journals/portal_libraries_and_the_academy/portal_pre_print/current/articles/13.2nackerud.pdf
Oakleaf, M. (2010). The value of academic libraries: A comprehensive research review and report. Chicago, IL: Association of College & Research Libraries. Retrieved from http://www.ala.org/ala/mgrps/divs/acrl/issues/value/val_report.pdf
Oakleaf, M. (2012). Academic library value: The impact starter kit. Syracuse, NY: Della Graphics.
Poll, R. (2012). Can we quantify the library's influence? Creating an
ISO standard for impact assessment. Performance
Measurement and Metrics, 13(2), 121–130. http://dx.doi.org/10.1108/14678041211241332
Running in the Halls (2012). Lemontree. Retrieved from https://library.hud.ac.uk/lemontree/
Running in the Halls (2013). Librarygame. Retrieved from http://librarygame.co.uk/
Sharman, A., & Walsh, A. (2012). Roving librarian at a mid-sized, UK-based university. Library Technology Reports, 48(8), 28-34. http://dx.doi.org/10.5860/ltr.48n8
Showers, B., Palmer, J., & Stone, G. (2014). JiscLAMP: Shining a Light on our Analytics and Usage Data. Paper
presented at UKSG 37th Annual Conference and Exhibition, HIC, Harrogate.
Retrieved from http://eprints.hud.ac.uk/19260/
Showers, B., & Stone, G. (2014). Safety in numbers: Developing a shared analytics service for academic libraries. Performance Measurement and Metrics, 15(1/2), 13-22. http://dx.doi.org/10.1108/PMM-03-2014-0008
Soria, K. M., Fransen, J., & Nackerud, S. (2013). Library use and
undergraduate student outcomes: New evidence for students' retention and
academic success. portal: Libraries and the Academy, 13(2), 147-164. Retrieved from http://www.press.jhu.edu/journals/portal_libraries_and_the_academy/portal_pre_print/current/articles/13.2soria.pdf
Stone, G., & Collins, E. (2012). Library
impact data project toolkit: Phase 2. Manual. University of Huddersfield,
Huddersfield. Retrieved from http://eprints.hud.ac.uk/16316/
Stone, G., & Collins, E. (2013). Library usage and demographic characteristics of undergraduate students
in a UK university. Performance
Measurement and Metrics, 14(1), 25-35. http://dx.doi.org/10.1108/14678041311316112
Stone, G., & Ramsden, B. (2013). Library impact data project:
Looking for the link between library usage and student attainment. College and Research Libraries, 74(6), 546-559. Retrieved from http://crl.acrl.org/content/74/6/546.abstract
Stone, G., Ramsden, B., & Pattern, D. (2011). Library impact data project toolkit. Manual. University of
Huddersfield, Huddersfield. Retrieved from http://eprints.hud.ac.uk/11571/
Tenopir, C., & Volentine, R. (2012). UK Scholarly Reading and the Value of Library Resources: Summary
Results of the Study Conducted Spring 2011, Knoxville, University of
Tennessee, Center for Information and Communication Studies. Retrieved from http://www.jisc-collections.ac.uk/Reports/ukscholarlyreadingreport/
Tilley, E. (2013). Personalized boutique service: Critical to academic
library success? New Review of Academic
Librarianship, 19(1), 78-97. http://dx.doi.org/10.1080/13614533.2012.753464
Wells, J. (1996). The influence of library usage on undergraduate
academic success. Australian Academic and
Research Libraries, 26(2),
121-128. http://dx.doi.org/10.1080/00048623.1995.10754923
Whitmire, E. (2002). Disciplinary differences and undergraduates’ information-seeking behaviour. Journal of the American Society for Information Science and Technology, 53(8), 631-638. http://dx.doi.org/10.1002/asi.10123
Wilson, M. (2000). Understanding the needs of tomorrow’s library user: Rethinking library services for the new age. Australasian Public Libraries and Information Services, 13(2), 81-86. Retrieved from http://library.sau.edu/jpollitz/reserves/libraryforthenewage.htm
Wong, S. H. R., & Webb, T. D. (2011). Uncovering meaningful
correlation between student academic performance and library material usage. College and Research Libraries, 72(4), 361-370. Retrieved from http://crl.acrl.org/content/72/4/361.abstract