Evidence Summary

 

Information Literacy Course Yields Mixed Effects on Undergraduate Acceptance of the University Library Portal

 

A Review of:

Chen, Y. (2015). Testing the impact of an information literacy course: Undergraduates' perceptions and use of the university libraries' web portal. Library & Information Science Research, 37(3), 263-274. http://dx.doi.org/10.1016/j.lisr.2015.04.002

 

Reviewed by:

Heather Coates

Digital Scholarship & Data Management Librarian

University Library

Indiana University-Purdue University Indianapolis (IUPUI)

Indianapolis, Indiana, United States of America

Email: hcoates@iupui.edu

 

Received: 7 Mar. 2016    Accepted: 15 Apr. 2016

 

 

© 2016 Coates. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 


Abstract

 

Objective – To determine the effects over time of a 3-credit semester-long undergraduate information literacy course on student perception and use of the library web portal.

 

Design – Mixed methods, including a longitudinal survey and in-person interviews.

 

Setting – Information literacy course at a comprehensive public research institution in the northeastern United States of America.

               

Subjects – Undergraduates at all levels enrolled in a 3-credit general elective information literacy course titled “The Internet and Information Access.”

 

Methods – A longitudinal survey was conducted by administering a questionnaire to students at three different points in time: prior to instruction, near the end of the course (after receiving instruction on the library portal), and three months after the course ended, during the academic year 2011-2012. The survey was created by borrowing questions from several existing instruments. It was tested and refined through pre-pilot and pilot studies conducted in the 2010-2011 academic year, for which results are reported. Participation was voluntary, though students were incentivized to participate through extra credit for completing the pre- and post-instruction questionnaire, and a monetary reward for completing the follow-up questionnaire. Interviews were conducted with a subset of 14 participants at a fourth point in time.

 

Main Results – 239 of the 376 (63.6%) students enrolled in the course completed the pre- and post-instruction questionnaire. Fewer than half of those participants (111, or 30% of students enrolled) completed the follow-up questionnaire. Participants were primarily sophomores and juniors (32% each), with approximately one-quarter (26%) freshmen, and only 10% seniors. Student majors were concentrated in the social sciences (62%), with fewer students from science and technology (13%), business (13%), and the humanities (9%). The 14 participants interviewed were drawn from both high- and low-use students.

 

Overall, the course had a positive effect on students’ perceived usefulness (PU) and perceived ease of use (PEOU) of the library portal, as well as on their usage of it. This included significant positive changes in perceived ease of use and information quality in the short term (from pre-instruction to post-instruction). The results were mixed for perceived usefulness and system quality. Though there was a mixed long-term impact on usage, the course does not appear to have had a long-term effect on PU and PEOU. The interviews explored why and how participants used the library portal, revealing that both high- and low-use students used it for similar reasons: to find information for research papers or projects, to search the library catalogue for books, and in response to a mandate or encouragement from instructors.

 

Conclusion – The study supports the theory that an information literacy course could change student perception and use of the library portal in the short term. Replicating this design in other settings could provide a systematic approach for assessing whether information literacy courses address learning outcomes over time. A longitudinal approach could be useful for comparing the proficiency and information behaviors of students who take information literacy courses with those of students who do not.

 

Commentary

 

This well-designed study has several strengths and offers a model for future research. The use of technology acceptance models to assess library resource use is an interesting approach, particularly when combined with instructional intervention. Applying Glynn’s (2006) critical appraisal checklist indicates that overall validity is good, particularly in relation to the study design, data collection, and results. However, readers should be cautious in generalizing the results, given that the study drew a non-random sample from a single student population that may not be representative of readers’ local student populations.

 

The primary strengths of the study are its careful design and execution. Two well-tested models, the Technology Acceptance Model and the information systems success model, informed the development of the questionnaire, which was piloted twice. Its face validity appears to be good. Although timing may have been a factor in the attrition between the post-instruction and follow-up phases, this possibility was not discussed. The results are clearly reported and connected back to the hypotheses.

 

Statistical analysis is an area for improvement in future studies. Use of a one-tailed t-test detects changes in the constructs (PU, PEOU, and portal usage) in only one direction. This choice increases the ability to detect positive changes at the expense of detecting negative ones. Additionally, it is unclear whether a key assumption for using the dependent t-test is met: the author does not report whether the differences between the paired scores are normally distributed.
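
To make the concern concrete, the following minimal sketch (in Python, with entirely hypothetical scores; neither the data nor the analysis code comes from the study) shows how a two-tailed dependent t-test could be paired with a check that the paired differences are normally distributed:

# Minimal sketch: two-tailed dependent (paired) t-test with a normality
# check on the paired differences. All scores below are hypothetical,
# not data from the study.
import numpy as np
from scipy import stats

# Hypothetical pre- and post-instruction PEOU scores for the same students
pre = np.array([3.2, 4.0, 3.5, 2.8, 4.1, 3.9, 3.0, 3.6])
post = np.array([3.8, 4.2, 3.9, 3.1, 4.0, 4.4, 3.5, 3.7])

differences = post - pre

# Check the assumption that the paired differences are normally distributed
shapiro = stats.shapiro(differences)
print(f"Shapiro-Wilk on differences: W={shapiro.statistic:.3f}, p={shapiro.pvalue:.3f}")

# A two-tailed paired t-test detects change in either direction;
# alternative='greater' would reproduce the one-tailed choice critiqued above.
result = stats.ttest_rel(post, pre)
print(f"Paired t-test: t={result.statistic:.3f}, p (two-tailed)={result.pvalue:.3f}")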

 

The smaller sample size at follow-up raises two questions. Were the long-term effects of instruction undetected because the sample size was too small? Were the students who completed the follow-up phase different in some meaningful way from the students who did not? Neither of these considerations is explored in the article. Finally, readers would have benefited from a deeper examination of the partially supported hypotheses. In particular, what implications do they have for the validity of the questionnaire and its use in future studies? How could those concerns be addressed or explored in future studies?
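
As a rough illustration of the first question, the sketch below (again hypothetical; the effect size is assumed for illustration rather than estimated from the study) shows how statistical power at the follow-up sample size of 111 could be examined for a paired comparison:

# Rough sketch: power of a paired comparison at the follow-up sample size.
# The effect size (Cohen's d = 0.2, a small effect) is assumed, not taken
# from the study.
from statsmodels.stats.power import TTestPower

analysis = TTestPower()  # power for a one-sample / paired t-test

# Power to detect a small effect with the 111 follow-up participants
power_followup = analysis.solve_power(effect_size=0.2, nobs=111, alpha=0.05,
                                       alternative='two-sided')
print(f"Power with n=111: {power_followup:.2f}")

# Sample size needed for 80% power at the same assumed effect size
n_needed = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05,
                                alternative='two-sided')
print(f"n needed for 80% power: {n_needed:.0f}")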

 

This study is particularly relevant to librarians engaged in course-integrated information literacy instruction, instructional coordinators, and assessment librarians. It provides a model for examining the impact of information literacy instruction on student use of library resources. Considerations for future studies include gathering additional information on student demographics and experience with particular library resources, as well as carefully considering the timing of the follow-up survey and interviews. Finally, a pre- and post-skills assessment administered in conjunction with the technology acceptance questionnaire could be powerful for identifying potential relationships between information literacy skill level and acceptance of library resources.

Reference

 

Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692154