Conference Paper

 

Measuring the Value of Library Resources and Student Academic Performance through Relational Datasets

 

Margie Jantti

University Librarian

University of Wollongong

Wollongong, New South Wales, Australia

Email: margie@uow.edu.au

 

Brian Cox

Manager, Quality and Marketing

University of Wollongong Library

Wollongong, New South Wales, Australia

Email: brian_cox@uow.edu.au

 

 

2013 Jantti and Cox. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – This article describes a project undertaken by the University of Wollongong Library (UWL) to identify whether a correlation exists between usage of library resources and academic performance.

 

Methods – A multidimensional approach to systems design was implemented, requiring collaboration among the library, university administration, Performance Indicator Project team (PIP), and information technology services. The project centres on the integration and interrogation of a series of discrete datasets containing student performance, attrition, demographic, borrowing, and electronic resources usage data. PIP built a cube for the library that links usage of library resources to student demographic data and academic performance (the “Library Cube”). Other cubes will be linked later.

 

Results – While initial reports are rudimentary and do not yet incorporate data on e-resource usage, results are favourable in demonstrating the value of using library information resources in coursework. Based on the data generated to date, students who borrow library resources do outperform students who do not. Early trend data shows up to a 12-point difference in grades.

 

Conclusion – The Library Cube signals a new milestone in the UWL’s quality assessment journey. Well-established measures of effectiveness and efficiency will be further complemented by measures of impact and value, allowing the library to step even closer to the goal of having effective and valued partnerships with the university community to realise teaching, learning, research, and internationalisation goals.

 

Introduction

 

When the University of Wollongong Library (UWL) first commenced its quality assessment journey in 1994, there was a paucity of measures within the library and information sector to guide the evaluation of quality and effectiveness and to supplement the data demonstrating efficiency. Performance indicators and measures primarily consisted of those mandated by government agencies or professional associations. The emphasis, typically, was on inputs and outputs. This situation is somewhat different now. A Quality and Service Excellence program (QSE), conceived in 1994, provided the catalyst to critically review and evaluate UWL’s capacity to deliver services of value to its clients and stakeholders. The QSE encapsulated the improvement goals of the Library: an emerging commitment to total quality management and a recognised need for an overall planning and management framework to replace the well-intentioned, but somewhat fragmented, improvement efforts of the past.

 

To complement the QSE program, UWL adopted the Australian Business Excellence Framework (ABEF) as a change management model (McGregor, 2004). The ABEF provides descriptions of the essential features, characteristics, and approaches of organisational systems that promote sustainable and excellent performance, with emphasis on determining and evaluating customer needs, expectations, and perceptions of excellent service. The ‘customer focus’ category of the ABEF encourages organisations to assess their ability to understand the needs and expectations of their customers, how customer relationships are managed, and customers’ perceptions of value. At UWL, the term client is used to describe the individuals seeking and/or utilising services and resources.

 

Early forays into assessment indicated that clients’ perceptions of library services were mostly favourable; however, success was difficult to measure and promote due to the lack of robust performance indicators and measures. To address this deficit, the collection and interpretation of information and data was essential to facilitate and sustain the vision for transformational change. A Performance Indicator Framework (PIF), mapped to stakeholders’ needs and expectations, was developed, providing a foundation for the systematic review of services and processes using quantitative and qualitative measures. Through the reporting mechanisms embedded in the PIF, it became possible to systematically measure and evaluate performance (i.e., how effectively and efficiently we manage and improve processes) and to assess clients’ satisfaction with services and resources. This represented a significant shift in the way that data and information were viewed and used; the emphasis was starting to change from inputs and outputs to measures of outcomes.

 

The introduction of a new element within the ABEF, customer perception of value, revealed an area addressed less rigorously by UWL; that is, how clients perceived the Library’s competency in meeting their value goals or whether clients believed they received fair value for the ‘investment’ or cost of engaging with a service. While surveys and feedback systems provide data and information on a range of service elements, they are limited in their capacity to provide information and insight into the perceived value gained by engaging with the library (i.e., the return on the client’s effort for using services and resources).

 

Measuring the Value of Using Library Resources

 

While the processes for evaluating expectations, performance, and satisfaction with available resources are robust and sustainable, measures of impact or effect are less well addressed. For UWL the critical impact question is: what is the value to the student when they use library information resources? This question cannot be answered adequately through satisfaction indices, or by de-identified usage rates of resources.

 

Typically, information resources funds represent a significant proportion of the total allocation to libraries. In academic libraries, millions of dollars are committed annually to the acquisition of and subscription to information resources to meet the research, teaching, and learning needs of their clientele. However, anecdotal evidence and local research (Cooper, 2010) show that many students bypass the Library and almost exclusively use commercial search engines and websites (e.g., Google, Wikipedia) to fulfil their information needs.

 

The challenge for this Library (and others) is to maintain visibility and relevance as a reputable interface for coursework and research resources in the context of an expanding information market. What is needed is a credible hook to show the value of engaging with library resources. We need to produce evidence showing that students can improve their academic performance by using library resources – that students who use the Library get better grades.

The approach chosen to measure the impact or value of library information resources differs from more traditional approaches to measuring return on investment (ROI). ROI can be defined as income received as a percentage of the amount invested in an asset (Luther, 2008). A positive ROI indicates that more benefit than cost has been generated by the process, investment, or result; a negative ROI indicates that less benefit was generated than the resources provided (White, 2007). The approach chosen at UWL has focused not purely on monetary return or loss. Rather, we have sought a way to unambiguously demonstrate to students why using library resources is worth their time and effort (Holt, 2007).

 

It turns out that there is a lot of useful information already being collected that can potentially speak to the value generated by the Library. This information is managed by the Library and by other units on campus. Internally, we have our Library Management System (LMS). This system, like all LMSs, contains a large amount of information about our clients, both borrowing and demographic data. There are also other systems on campus used to manage students’ university experience; systems that contain information collected before, during, and after student enrolment. These systems include information managed by the recruitment arm of the University, information managed by campus administration, and information managed by the campus IT department; and include details on enrolment, academic performance, demographics, attrition, equity, alumni, and usage of the Library’s resources. Each of these information silos is useful to this assessment effort; collectively, they have allowed the Library to make more informed decisions about the services and resources it provides, and the communication styles it has adopted. However, the real power of this information can only be unlocked by joining these data silos together. Separated, these information silos tell a small and fragmented story about one facet of the student experience. Together, the joined datasets tell a richer story (Beckerle, 2008). Without a joined dataset, for example, we can only know the demographic composition of the overall student population. If, however, the student demographic data were joined to data relating to usage of our resources, then we would be in a position to know the demographic profile of Library users and to compare that profile to the demographic profile of non-Library users.

 

The project we have embarked on involves joining as many datasets as is ethically, politically, and technically possible to join; with the aim of producing data that will allow the Library to:

 

 

The main requirement for joining any two datasets together is that each must contain a common unique identifier. All of the systems mentioned above do contain a unique personal identifier: the student number. The political, ethical, and technical accessibility of the datasets varies from system to system. As an absolute minimum, we needed to be able to join information about the usage of our resources to student demographic and academic performance data. Anything less would not deliver a worthwhile return on effort. The joined datasets are encapsulated in a “cube” (Romero & Abelló, 2009) and managed via business intelligence software.
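To make the mechanics concrete, the following is a minimal sketch (in Python with pandas) of the kind of join involved. The file names and column names (student_id, faculty, items_borrowed) are illustrative assumptions, not UWL’s actual schema, and in practice the joins are performed inside the business intelligence cube rather than in code like this.

```python
import pandas as pd

# Illustrative extracts; in practice this data sits in the student system and the LMS.
demographics = pd.read_csv("students.csv")   # student_id, faculty, gender, ...
borrowing = pd.read_csv("loans.csv")         # student_id, items_borrowed

# Join on the common unique identifier (the student number), keeping
# students with no loans so they appear as non-users.
joined = demographics.merge(borrowing, on="student_id", how="left")
joined["items_borrowed"] = joined["items_borrowed"].fillna(0)
joined["library_user"] = joined["items_borrowed"] > 0

# Compare the faculty profile of Library users with that of non-users.
profile = (
    joined.groupby("library_user")["faculty"]
    .value_counts(normalize=True)
    .rename("share")
)
print(profile)
```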

 

The University Performance Indicator Project Team has built a cube for the Library that links usage of library resources to student demographic data and academic performance (the “Library Cube”). Other cubes that will be linked to the Library Cube later in the year include the course and subject cube and the student attrition cube. Later plans include linking to the student satisfaction, equity, recruitment, and admission cubes. The Library Cube is currently still under development, and should be completed by the end of 2010.

 

Converting data about usage of our resources into a usable form proved to be one of the more challenging aspects of the project. Information about usage of our resources is held in two places. Information about anything that is borrowed from our physical collection is held in the LMS. Unfortunately, the information contained in the LMS is locked inside a “black box” that for the most part only allows access to aggregated data or individual records. We can, however, export a flat file containing a snapshot of all current clients and the books they have borrowed to date. This is not as much information as we need, but it is information we can use. We export this ‘snapshot’ each week, and the difference between two snapshots represents the amount borrowed by each client over the period between the snapshots.
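As a rough sketch of the snapshot-difference approach (the export format and column names are assumptions), subtracting each client’s cumulative loan count in last week’s snapshot from this week’s yields the number of items borrowed during the intervening week:

```python
import pandas as pd

# Weekly flat-file exports from the LMS: one row per client with a
# cumulative count of everything they have borrowed to date.
prev = pd.read_csv("snapshot_previous.csv")  # student_id, total_loans
curr = pd.read_csv("snapshot_current.csv")   # student_id, total_loans

merged = curr.merge(prev, on="student_id", how="left",
                    suffixes=("_curr", "_prev"))
# Clients appearing for the first time have no previous total.
merged["total_loans_prev"] = merged["total_loans_prev"].fillna(0)

# Loans made between the two snapshots.
merged["loans_this_week"] = merged["total_loans_curr"] - merged["total_loans_prev"]
print(merged[["student_id", "loans_this_week"]])
```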

 

As at most libraries, demand for our physical collection is diminishing, while demand for our electronic resources is rising. Consequently, the long-term success of the project hinges upon being able to access information about usage of our electronic resources. Fortunately, this information is captured in logs as part of the authentication process. These logs do not contain all the information we need, but they do contain information we can use.

 

Each time a user accesses any of our electronic resources, a record is written to our EZproxy log. This log contains the student’s unique ID, the electronic resource they accessed, and the time they accessed the resource. The number of log entries generated depends upon the content and code of the website that contains the resource the client is accessing. Consequently, the number of log entries is arbitrary, so there is no value in counting the number of entries. However, we do know which database platform they used, and in many cases the actual database. So, in the spirit of pragmatism (i.e., take what you can use), we decided to convert the logs into meaningful data as follows:

•	Count each distinct database (or, where the specific database cannot be identified, the database platform) that a student accessed on a given day.
•	Divide each day into ten-minute periods, and count a period once for each database the student accessed within it, regardless of how many log entries were written.

Using these rules, we will be able to identify how many different electronic resources a user accessed during the day, and for how many ten-minute periods they accessed these databases. The number of ten-minute periods can be converted into a score (count) with a maximum score of 144 for a day for a given database. This method will provide a proxy measure for sessions, which, despite its limitations, should give a reasonably reliable and valid indication of the depth and scope of usage.
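A minimal sketch of this conversion is shown below. The log extract, column names, and pre-parsing are assumptions (a real EZproxy log would first need to be parsed for the student ID, database, and timestamp), but the counting logic follows the rules above:

```python
import pandas as pd

# Assumed pre-parsed EZproxy log extract: one row per raw log entry.
log = pd.read_csv("ezproxy_log.csv", parse_dates=["timestamp"])
# Columns: student_id, database, timestamp

# The raw number of entries is arbitrary, so reduce each entry to a
# ten-minute slot; there are at most 144 slots per day.
log["day"] = log["timestamp"].dt.date
log["slot"] = (log["timestamp"].dt.hour * 60 + log["timestamp"].dt.minute) // 10

usage = (
    log.groupby(["student_id", "day", "database"])["slot"]
    .nunique()                    # distinct ten-minute periods of use
    .rename("ten_minute_periods")
    .reset_index()
)

# Each row is a proxy 'session score' for a student, day, and database,
# with a maximum possible value of 144.
print(usage.head())
```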

 

Aside from the technical challenges, there were also ethical, legal and political issues to resolve.

 

Privacy

 

The primary ethical and legal consideration was privacy. The University of Wollongong’s Privacy Information Sheet outlines the 12 principles with which the University must comply regarding the collection, storage, access, use, and disclosure of personal information (“Privacy Information Sheet,” 2010). Fortunately, there are no legal barriers, as UOW has obtained consent to use personal information for the project via its Privacy Policy, to which students must agree as part of their enrolment.

 

At an ethical level, the additional privacy risks potentially posed by the project have been eliminated by the way the personal information will be managed. Privacy is only an issue to the extent that it involves the use, disclosure, and handling of personal information. Information is only personal if it is possible to uniquely identify an individual from the information in question. The project will result in the construction of a cube built by joining several datasets, all of which will contain personal information. However, the Library will not be able to use the cube to drill down to see a specific individual’s personal information. In other words, the data that the Library can view in the cube will always be aggregated, which means we will not be able to identify a specific individual’s usage, except in the highly unlikely situation where a very small number of individuals belong to the variable contained within a dimension in the cube (e.g., hypothetically, if we only have five students from Botswana, then it may be possible to identify those individuals by manipulating various aggregated views filtered to citizenship) (Aggarwal & Yu, 2008). In all cases, the personally identifiable data that could be gleaned from the cube is significantly less than that which can already be ethically and legally obtained by the Library from its LMS, usage logs, and access to student management systems. Moreover, access to the cube will be even more restricted than is the case for the other systems that contain the same information.
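One common safeguard against the small-cell risk described above is to suppress any aggregate that falls below a minimum cell size before it reaches report users. The sketch below is illustrative only: the threshold and column names are assumptions, and the actual protection sits inside the business intelligence layer rather than in code like this.

```python
import pandas as pd

MIN_CELL_SIZE = 10  # assumed threshold; the appropriate value is a policy decision


def suppress_small_cells(aggregates: pd.DataFrame,
                         count_col: str = "students") -> pd.DataFrame:
    """Blank out aggregated counts that describe too few individuals."""
    out = aggregates.copy()
    # mask() replaces values where the condition holds with NaN.
    out[count_col] = out[count_col].mask(out[count_col] < MIN_CELL_SIZE)
    return out


# Example: borrower counts by citizenship, with tiny groups suppressed.
by_citizenship = pd.DataFrame({
    "citizenship": ["Australia", "Botswana", "China"],
    "borrowers": [4210, 5, 380],
})
print(suppress_small_cells(by_citizenship, count_col="borrowers"))
```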

 

Executive Support

 

The project involves doing something that is quite different for a library, and it requires the support of other units and their executives. Consequently, it is only healthy and expected that the project should encounter some resistance and inertia in places. The Library Senior Executive provided full and enthusiastic support for the project from the beginning. Without this support, the project could not have succeeded.

 

The Library has been very fortunate in the sense that the campus Vice-Principal (Administration) has been, and continues to be, a major force behind improving performance measures at the University, notably through the creation of the Performance Indicators Project Team (PIP). Our goal to improve our ability to measure our performance sits very well with the Vice-Principal’s vision. The PIP Team’s vision is “to improve University performance through enhancing business decision-making by offering a seamless and secure architecture that provides business users with access to accurate, meaningful and shared data in a timely manner” (Performance Indicators Project Team, 2009). Through carefully planned communication and demonstrated goal alignment, we were easily able to obtain the external senior executive support we needed for the project to succeed.

 

Other libraries considering pursuing a similar project may not be as fortunate as we have been in obtaining support, and may benefit from reading Lombardo and Eichinger’s writings on Political Savvy and Organisational Agility (2009). From a practical point of view, anyone considering such a project should allow their Library Executive at least a month to absorb, understand, and commit to undertaking such a project; and allow at least six months to obtain support from all the necessary units. Most importantly, undertaking such a project is only feasible if most of your student data is housed in online analytical processing (OLAP) cubes, or managed by other business intelligence software with similar functionality. Our project could not have got off the ground without PIP; they are the team that built the Library Cube.

 

The Library plans to use the information in three broad ways: to improve accountability, to support process improvement, and to support marketing.

 

Accountability

 

UOW makes a significant investment in its Library. In 2009, the Library had a budget of over $12M (AUD), representing 4% of the campus budget (“Library Annual Report,” 2009). The campus expects, and is entitled to know, the return it is obtaining from investing in the Library. It is highly unlikely that the Library will ever be able to provide a hard answer to this question, given that many of our activities generate real but largely unquantifiable value. For example, what value could be placed on rekindling an individual’s interest in learning? How much of that value can be attributed to the Library? Nevertheless, the project will allow us to provide better performance data than we have in the past.

 

We have actually seen a positive correlation between borrowing activity and academic performance for the data we have put into the Cube so far. But we have not yet put in all the desired data elements (e.g., e-resources use) for that correlation to have much meaning. Most importantly, the Library understands and recognises that it cannot claim all the credit for increased academic performance. Clearly, students would not perform nearly as well without the guidance, support, research, and teaching activities of academic staff. But it is equally true that a student could fail their degree if they do not read anything. This point cannot be overemphasised. Academic learning is about exploration and intellectual growth, and there are many paths to this destination. However, despite all the technological changes, the best way to grow academically is still by reading from and engaging with the body of knowledge generated by scholarly enquiry (Levy & Levy, 2005). Students read from many places, and we hope to show that students are better off reading material from our collection.

 

The data we obtain from this project will allow us to demonstrate that students who do not use our resources are at a disadvantage academically, and we will be able to quantify the degree of disadvantage. We will be able to quantify this disadvantage both in terms of lower academic performance and higher attrition rates.
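As an illustration of the kind of comparison intended (the column names and the presence of a withdrawal flag are assumptions, and the figures would come from the Library Cube rather than a flat file), average marks and attrition rates can be compared between users and non-users once the datasets are joined:

```python
import pandas as pd

# Assumed joined extract: one row per student.
students = pd.read_csv("joined_students.csv")
# Columns: student_id, library_user (bool), average_mark, withdrawn (bool)

summary = students.groupby("library_user").agg(
    students=("student_id", "count"),
    average_mark=("average_mark", "mean"),
    attrition_rate=("withdrawn", "mean"),  # share of students who withdrew
)

# The gap between the two average_mark rows quantifies the performance
# difference between Library users and non-users; the attrition_rate
# column does the same for retention.
print(summary)
```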

 

Process Improvement

 

The Library Cube will provide the information we need to further support continuous improvement in three areas: collection development; academic relationships; and marketing.

 

The Library spends a significant proportion of its budget subscribing to electronic databases. We are able to obtain information on the number of downloads associated with subscriptions, and we combine this with cost data to create rough indices, such as cost per download. The Library uses this information, in consultation with academic staff, to continually improve and develop its collection. There are, however, two major limitations of this data: it is not linked to academic performance, and it takes far too long to get the data.
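The cost-per-download index mentioned here is simple arithmetic; the sketch below uses invented subscription costs and download counts purely for illustration:

```python
# Hypothetical annual figures for two subscribed databases.
subscriptions = {
    "Database A": {"annual_cost_aud": 60_000, "downloads": 48_000},
    "Database B": {"annual_cost_aud": 15_000, "downloads": 2_500},
}

for name, figures in subscriptions.items():
    cost_per_download = figures["annual_cost_aud"] / figures["downloads"]
    print(f"{name}: ${cost_per_download:.2f} per download")
```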

 

The Library Cube will be updated weekly, which will allow us to view in a much more timely fashion how our electronic resources are being used. We will also be able to see, at the end of each session, which resources had a significant impact on academic performance, and which did not. We will be able to use this information to make more informed decisions about electronic resource collection development, and to identify and replicate the processes that led to specific resources facilitating higher academic performance.

 

On this last point, we hope and expect that the Cube will provide information that will support the Library in taking a more holistic systems-based approach to improving the contributions the Library makes to academic learning. For example, we will have enough information to be able to differentiate between those courses that have a higher proportion of Library users, and those that do not. We will know which academics run those courses; so we will be in a position to begin to investigate what specifically some academics are doing differently that results in their students being more likely to use the Library. This will allow us to identify what behaviours and practices support greater library usage, which in turn will provide the information we need to champion and support the rollout of best practices across the campus.
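A sketch of the course-level comparison described above (the course codes, a coordinator field, and the library_user flag are assumed for illustration) groups the joined data by course and computes the share of enrolled students who used the Library:

```python
import pandas as pd

# Assumed joined extract: one row per student enrolment in a course.
enrolments = pd.read_csv("enrolments.csv")
# Columns: student_id, course_code, coordinator, library_user (bool)

by_course = enrolments.groupby(["course_code", "coordinator"]).agg(
    enrolled=("student_id", "nunique"),
    library_users=("library_user", "sum"),
)
by_course["user_share"] = by_course["library_users"] / by_course["enrolled"]

# Courses with unusually high or low shares of Library users indicate
# where to look for teaching practices worth investigating and replicating.
print(by_course.sort_values("user_share", ascending=False))
```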

 

Marketing

 

The Library Cube will also allow us to integrate marketing more closely with our core business activities, and to do so with surgical precision. For example, we will be able to provide academics with the evidence they need to effectively promote the Library to their students. We will also be able to draw on this information in our own teaching activities, to convincingly demonstrate the research behaviours that led to academic success. We will know which specific group we should target to improve take-up. Most importantly, we will know almost immediately whether our marketing efforts succeeded, which in turn will help us to make informed decisions about whether to change tack, or continue with more of the same.

 

Conclusion

 

The ability to demonstrate the value of libraries and their collections is becoming all the more important, and undeniably challenging, in a period of generational change embodied in a fundamental shift in students’ attitudes to using information. Not only do we need to convince the university executive and faculty of the value of libraries; our most challenging audience is increasingly the student body itself. We needed to garner evidence that would unequivocally demonstrate that academic performance can improve by using a library’s information resources.

 

To address this problem, a multidimensional approach to systems design was implemented, requiring not inconsiderable collaboration and cooperation among the Library, University Administration, PIP, and Information Technology Services (ITS). The project centred on the integration and interrogation of a series of discrete datasets (e.g., student performance, attrition, demographic, borrowing, and electronic resources usage data). Although the time required to establish the problem statement, business rules, and reporting requirements has been lengthy, the genesis of the Library Cube is proving worthwhile. While initial reports are rudimentary, and do not yet incorporate data on e-resource usage (e.g., online journals), results are favourable in demonstrating the value of using Library information resources in coursework. Based on the data generated to date, students who borrow Library resources do outperform students who do not. Early trend data shows up to a 12-point difference in grades. Such improved performance could influence a student’s decision to stay at university or leave; the overall quality of the learning experience; or the capacity to produce students who embody the University’s Graduate Qualities, notably that of being an independent learner who values scholarly information resources. Importantly, the Library Cube will help to identify those students who use the Library’s resources infrequently, or not at all. Through this knowledge, highly tailored and tightly focused promotion and marketing strategies can be deployed, with immediate feedback on the effectiveness of chosen strategies.

 

The Library Cube signals a new milestone in the UWL’s quality assessment journey. Well-established measures of effectiveness and efficiency will be further complemented by measures of impact and value, allowing us to step even closer to the goal of having effective and valued partnerships with the University community to realise teaching, learning, research, and internationalisation goals.

 

 

References

 

Aggarwal, C. C., & Yu, P. S. (2008). Privacy-Preserving Data Mining: Models and Algorithms. New York: Springer.

 

Beckerle, M. (2008). How business intelligence systems deliver value: Interview with Mike Beckerle of Oco, Inc. Journal of Digital Asset Management, 4(5), 277-291. doi:10.1057/dam.2008.22

 

Cooper, L. (2010). Online education program for transitioning students. Incite, 31(6), 25. Retrieved 20 May 2013 from http://archive.alia.org.au/incite/2010/v31.06.pdf

 

Holt, G. (2007). Communicating the value of your libraries. The Bottom Line: Managing Library Finances, 20(3), 119-124. doi:10.1108/08880450710825833

 

Levy, P., & Levy, S. (2005). Developing the new learning environment: The changing role of the academic librarian. London: Facet.

 

Library Annual Report 2009. (2009). In University of Wollongong. Retrieved 20 May 2013 from http://www.library.uow.edu.au/content/groups/public/@web/@lib/documents/doc/uow077099.pdf

 

Lombardo, M. M., & Eichinger, R. W. (2009). FYI: For your improvement, a guide for development and coaching (5th ed.). Lominger International: A Korn/Ferry Company.

 

Luther, J. (2008). University investment in the library: What’s the return? A case study at the University of Illinois at Urbana-Champaign. San Diego: Elsevier.

 

McGregor, F. (2004). Excellent libraries: A quality assurance perspective. In A. A. Nitecki (Ed.), Advances in librarianship. (pp. 17-53). San Diego: Elsevier Academic Press.

 

Performance Indicators Project Team. (2009). Vision. Unpublished document, University of Wollongong, Wollongong, New South Wales, Australia.

 

Privacy information sheet – General. (2010). In University of Wollongong. Retrieved 20 May 2013 from http://www.uow.edu.au/content/groups/public/@web/@fin/@lcu/documents/doc/uow061010.pdf

 

Romero, O., & Abelló, A. (2009). A survey of multidimensional modelling methodologies. International Journal of Data Warehousing and Mining, 5(2), 1-23. doi:10.4018/jdwm.2009040101

 

White, L. N. (2007). An old tool with potential new uses: Return on investment. The Bottom Line: Managing Library Finances, 20(1), 5-9. doi:10.1108/08880450710747407