Evidence Summary

 

Mixed-Method Survey Research Is Useful to Incrementally Improve Library Homepage Design

 

A Review of:

Deschenes, A. (2014). Improving the library homepage through user research – without a total redesign. Weave, 1(1). http://dx.doi.org/10.3998/weave.12535642.0001.102

 

Reviewed by:

Kathleen Reed

Assessment & Data Librarian

Vancouver Island University

Nanaimo, British Columbia, Canada

Email: kathleen.reed@viu.ca

 

Received: 3 Jun. 2015     Accepted: 11 Aug. 2015

 

 

© 2015 Reed. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – To assess content organization and wording of links on the library’s homepage.

 

Design – Mixed-methods survey.

 

Setting – Small college, United States of America.

 

Subjects – 57 library users.

 

Methods – Library staff distributed paper surveys at the entrance to the library, with the goal of collecting a minimum of 30 responses. The survey asked participants to indicate their preferred terms from a list, as well as their preferred ordering of the menu items on the library’s homepage. Qualitative data were also collected via several open-ended questions that began with prompts such as “I really love…” and “I can never find…”

 

Main Results – The search box tab labelled “Library Catalogue” was preferred over “Books and Media,” which staff had believed to be the more user-friendly term. Using a pre-defined list, participants ranked Library Catalogue as the most important tab, followed by E-Resources, Articles, and Library Guides. A link to the Library Catalogue was also selected as the most important resource sidebar link, followed by E-Resources, Full-Text Journals, Library Guides, and RefWorks. The service sidebar links, in order of importance, were: Library Hours, Group Study Rooms, Writing & Citing, Interlibrary Loan, and Chat with a Librarian. Qualitative feedback demonstrated a lack of understanding of what the terms “Library Guides” and “A-Z List” mean, as well as difficulty finding a complete list of databases. Feedback also indicated that the Library Hours and Account Log In links should be made more prominent.

 

Conclusion – Library staff updated the website to reflect user preferences for the wording and order of links on the homepage. Google Analytics showed a 30-second decrease in the average visit duration after the changes, which the author attributes to better wording and organization. No complaints about the website were received in the first three months after the change. The author concludes that a paper survey is an effective tool for librarians who would like to make incremental changes to their homepages.

 

Commentary

 

Website design and usability are much-discussed topics both within libraries and more broadly. With so much research already published on the wording and design of an optimal library homepage, this study adds few original findings to the conversation and makes few links to previous research. Rather, the value of this article lies in the type of evidence collection that the work discusses and models.

 

This study is an excellent example of a small, incremental assessment activity undertaken between major user experience studies, and it exemplifies the difference between research and assessment. As a research study, this work has significant flaws, which the author does acknowledge. The sample size is small and not representative of all students; because the college is home to a library and information school, students in this program may have skewed the results. The survey design assumes that people who physically visit the library also use the website, and that those who use the website visit the physical library. The survey results are not generalizable to other institutions and fail Glynn’s (2006) critical appraisal checklist.

 

However, understood as an assessment activity, a short mixed-methods survey can be helpful to an institution. Instead of leaving website design entirely to library staff, the college was able to update website terms and link order based on evidence. For example, the survey identified library hours as the most important service-related link, so it was listed at the top of the navigation sidebar. Without the survey, the homepage would have been shaped entirely by library staff opinions. This information was gained with a small investment of staff time (20 hours) and funds, and did not conflict with institutionally mandated branding or content management systems. Not every activity intended to collect evidence for decision-making can or should be a thorough research study.

 

An exercise like the one outlined in this article offers significant benefit for collecting evidence to support small, continual changes. The process would be useful in any type of library and does not require staff to be well versed in web design or user experience testing. The author details the process thoroughly, making it easily actionable for librarians at other institutions. The author is to be commended for publishing an example of evidence-based practice that any librarian could pick up and use, regardless of his or her familiarity with research methods.

 

References

 

Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692154