Discovering How Students Search a Library Web Site: A Usability Case Study

Susan Augustine and Courtney Greene

Susan Augustine is an Assistant Reference Librarian and Assistant Professor in the Richard J. Daley Library at the University of Illinois at Chicago; e-mail: saugusti@uic.edu. Courtney Greene is a Resident Librarian and Visiting Instructor in the Richard J. Daley Library at the University of Illinois at Chicago; e-mail: crgreene@uic.edu.

Have Internet search engines influenced the way students search library Web pages? The results of this usability study reveal that students consistently and frequently use the library Web site's internal search engine to find information rather than navigating through pages. If students are searching rather than navigating, library Web page designers must make metadata and powerful search engines priorities. The study also shows that students have difficulty interpreting library terminology, have trouble discerning differences among library resources, and prefer to seek human assistance when they encounter problems online. These findings imply that library Web sites have not alleviated some of the basic and long-range problems that have challenged librarians in the past.

Web sites have become one of academic libraries' most commonly used mediums for communicating with patrons. Not only do library Web sites offer information about policies, items, and services available in the physical library, they also conveniently deliver electronic resources, such as electronic journals, reference tools, research guides, and electronic books, directly to the patron's computer screen. Libraries must make the interfaces of these sites intuitive and easy to use if they expect patrons to identify and effectively utilize the ever-increasing print and electronic resources and services made available through this venue.

Numerous ways exist to test the effectiveness of Web sites, including online surveys, focus groups, analysis by staff, and usability testing. According to a recent survey of Association of Research Libraries member libraries, 37 percent of respondents selected usability testing as a method for evaluating their Web sites.1

Usability testing involves observing members of targeted user groups as they perform a series of tasks intended to address specific functions or portions of a Web site. Observers look for repeated patterns of use to determine strengths and problems with the site. This systematic process of analysis provides information that can lead to a user-centered design and also reveals information about how patrons search.

The Library at the University of Illinois at Chicago (UIC) launched a new Web site in summer 2000. Although the library solicited feedback from users during the design process via a Web form, the site had not yet been subjected to a controlled study involving its patrons. To this end, the authors, two reference librarians from the Richard J. Daley Library at UIC, undertook a usability test of the library Web site in spring 2001. The authors' goals were twofold. First, they wanted to test the clarity and ease of navigation of the recently redesigned site. Anecdotal experience of public service staff at the reference desk had revealed potential design and language issues that could be confirmed or disproved through a controlled study.
Second, they wanted to observe the way users searched for information in order to tailor services to the library's patrons.

Literature Review

Until very recently, much of the library literature on usability testing was process oriented, giving explanation, instruction, and encouragement to those intending to undertake such a test. Alison J. Head addressed many fundamental procedural issues, recommending the testing of three to five users and allowing no more than four to five minutes per question for a total of no more than an hour for each test.2 The number of testers necessary is a matter of some debate: Janet K. Chisman, Karen R. Diller, and Sharon L. Walbridge recommended eight users, and Ruth Dickstein and Vicki Mills recommended eight to twelve.3,4 Brenda Battleson, Austin Booth, and Jane Weintrop, in presenting their case study, replicated Jakob Nielsen and Thomas Landauer's "curve showing the relationship between the number of users tested and the number of problems found in a usability test."5,6 The curve showed that with fifteen users, 100 percent of the problems should be revealed; eight users revealed approximately 90 percent of the problems, and five users revealed approximately 80 percent of the problems. For what Nielsen characterized as "medium-size development projects" with "relatively homogeneous user groups," he considered the use of three to five subjects to produce the maximum cost-benefit ratio.7

Based on the recommendation of Chisman, Diller, and Walbridge, the authors used Jeffrey Rubin's 1994 Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, an excellent source on how to prepare for and implement a usability test.8,9 The authors also heeded the recommendations of Dickstein and Mills, who emphasize the establishment of clear goals before undertaking the actual testing: "You can't test everything. Decide what are the most important tasks you want users to be able to perform on your site."10

In addition to the how-to advice found in the literature, findings of usability testing have been published as well. A consistent problem cited across studies is excessive use of library terminology. Although not a new issue in the field, the use of highly technical language and jargon frequently poses difficulties for users of library Web sites as well.11,12 Battleson, Booth, and Weintrop reported that users had "obvious confusion with terminology."13 Louise McGillis and Elaine G. Toms found that "terminology used in the set of menus was not meaningful despite the fact that it is standard in libraries."14 Realizing that this is an important issue, and one that is sometimes hard to detect by those in the field, the authors focused part of the testing on detecting the use of jargon throughout the site.

Methodology

As part of the necessary human subjects protection approval process in place at UIC, the authors submitted a summary of the project and all supporting documents (questionnaires, task lists, etc.) to
the Institutional Review Board. Approval was obtained for the study, and the authors began to recruit participants.

As suggested in the literature, the authors attracted participants through an advertisement placed in the student newspaper and through flyers posted in public buildings on campus.15 They also offered an incentive to encourage participation. Each participant successfully completing the test would receive a gift certificate in the amount of $15, redeemable at any student bookstore.

To preserve the confidentiality of participants' identities, only the first names and phone numbers of those who responded to the advertisements were collected; names were later replaced with coded identifiers (e.g., 1A, 1B). A screening survey asked for information about computer expertise and familiarity with library resources so as to better recruit a group of participants with diverse levels of skill.16 The survey also ensured greater consistency of approach. The results are shown in table 1.

TABLE 1
Results of the Screening Survey

How long have you been using a computer?
  Less than 5 years: 1   5-10 years: 10   More than 10 years: 1

What is your status at UIC?
  Undergraduate: 10   Graduate: 2

How much experience do you have using the UIC library?
  Never used before: 0   Occasional use (monthly): 6   Frequent use (weekly): 5   No answer: 1

Have you attended a library instruction session?
  Yes: 4   No: 8

Which of the following do you have experience using on the World Wide Web?
  General searching: 11   E-mail: 12   Online catalogs: 12   Article databases: 10   Other electronic resources (e.g., online journals, reference sources): 6

How often do you use the Internet?
  Never or rarely: 1   Occasionally (weekly): 1   Frequently (daily): 10

Approximately sixteen students responded to the advertisement; with the exception of two participants, all reported that they "frequently" used the Internet. The authors hoped to observe at least eight participants but arranged sessions with twelve students in order to account for possible no-shows or dropouts and any other unforeseen problems.17 The original goal was to recruit several subgroups of participants with diverse levels of computer experience, but those who responded to the advertisement reported relatively homogeneous computer experience.18 It was found, however, that the profiles corresponded with data about the UIC student body as a whole. Statistics gathered on the entering freshman population of 2000 show that computers have penetrated deeply into students' households. Seventy-six percent have a personal computer at home, and over 90 percent have access to a computer at home or work or both. Almost 70 percent of the students indicated that they were familiar with the Internet and its applications from experiences while in high school.19

The authors developed tasks based on previous interactions with patrons. If certain issues were addressed time and time again at the reference desk, they warranted closer inspection in this study. Tasks also were based on the authors' experience with the site. If certain pieces of information were difficult for reference librarians to find, the design needed to be tested. The authors acknowledge that the choices were somewhat subjective. For reasons of time and money, this is a limitation of nearly all usability studies. Twenty tasks were created.
Because the authors wanted to keep the test under an hour so that students would not become overwhelmed and burned out, users were given three minutes to complete each task.20 This time restriction is also subjective. Rubin recommends numerous ways for deciding on time restrictions, including everything from looking at previous usability tests performed on the system in question to guessing how long a task should take.21 To determine time restrictions, the authors created expert search paths for each task (i.e., the quickest and shortest path one could take to complete the task) and tested how long it took to follow each expert path. The time required for the longest expert path was approximately one minute. Therefore, it was deemed that three minutes would be ample time to complete each task. Shelley Gullikson, Ruth Blades, Mark Bragdon, et al. determined in their usability study that "when the answer was humanly 'findable,' it was locatable in much less time than the three minutes allotted to participants."22 The goal was not to make users give up or to demand perfect efficiency but, rather, for the authors to see all the problems that users encounter along the way. Naturally, some tasks took longer to complete than others, but time was only one measurement of user performance.

As recommended by other researchers, the authors conducted a pilot test composed of three volunteers, which revealed some difficulties with the wording of the tasks.23-25 Each identified problem was restated in a clearer fashion. In addition, the authors discovered that the pilot test participants had difficulty fully comprehending the tasks when they were verbally requested to perform certain functions. To address this problem, participants were given a print copy of the tasks to refer to during testing. Rubin also suggested this approach.26

Appointments were made for hour-long sessions with twelve participants, during which each was asked to complete the list of twenty tasks. The authors each scheduled appointments with six participants; the two groups executed the tasks in a different order, as suggested by Rubin.27 One group performed the tasks in their original numeric order; the second group performed the latter ten tasks first. This was done to ensure that increasing familiarization with the site throughout the session did not unduly affect completion of the latter tasks. If certain questions gave one group trouble but were easily accomplished by the other group, one could infer that the learning curve might be influencing performance.
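As an illustration of the counterbalanced ordering just described, the short sketch below generates the two task sequences. The task numbering is only illustrative; the twenty tasks themselves are abbreviated in table 2, and the study does not describe any scripting of this step.

```python
# Sketch of the counterbalanced task ordering described above: one group of
# six participants received tasks 1-20 in numeric order, the other received
# tasks 11-20 first and then 1-10. Task numbers here are illustrative only.
tasks = list(range(1, 21))

group_a_order = tasks                    # 1, 2, ..., 20
group_b_order = tasks[10:] + tasks[:10]  # 11, ..., 20, 1, ..., 10

print(group_a_order)
print(group_b_order)
```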
During each session, one of the authors worked with the participant in a private space within the library, using an IBM-compatible personal computer and viewing the Web pages with Netscape Navigator 4.7. (During the screening interview, students were asked whether they preferred an IBM-compatible or Macintosh personal computer. All responded that they preferred IBM-compatible; hence, that was the only platform used.) The authors read the tasks and recorded a variety of data. They also audiotaped the comments of each participant, manually noted the search path for each task as well as any other germane details, and timed each task. Participants were asked to reset their browser to "Home" (the top-level library Web page) after completing each task to ensure consistency in recording their paths. A verbal protocol method was used, recommended by many studies as a way to elicit qualitative feedback.28-31 Participants were encouraged to speak out loud throughout the process, explaining their decisions and describing their thoughts and feelings about the site. At the completion of the session, participants received the incentive and contact information for the authors.

Process of Analysis

The authors examined both quantitative and qualitative results. One important measurement was whether participants finished each task within the time benchmark; however, there were other revealing data to consider, such as whether the user struggled along the way, complained about the design of the pages, or took eight clicks to complete a task with an expert path of two clicks. Although usability studies are meant to disclose trends in behavior, the authors found individual comments revealing as well.

In addition to observing and measuring the users during testing, the authors took into account users' prior computer and library experience, as shown in table 1. Every respondent reported that he or she had experience using online catalogs, and ten out of twelve reported that they had used article databases before. In addition, eleven out of twelve reported that they use the Internet on a daily or weekly basis. These responses reveal a group that is, with the exception of one participant, quite familiar with searching the Web and using online library resources. Although two-thirds of the participants described themselves as frequent users of the library or as having attended at least one library instruction session, performance on tasks across the board revealed a general lack of understanding about library services and resources. This issue is addressed in greater detail in the results below.

Results

Two kinds of information were gained from this usability study. First, specific problems relating to the UIC library Web page were discovered. Some of these discoveries, such as the use of library jargon, are widely applicable to those working in other libraries who are trying to create user-friendly Web pages. Second, useful information was gained about patrons' search patterns. For example, all participants but one used the internal Web site search engine to complete tasks rather than navigating through the pages by following links.

The authors examined four quantitative measurements; the results are shown in table 2.

1. Was the user able to complete the task in the time allotted?
2. How long did each task take to complete (including mean time and standard deviation)?
3. How many clicks were required in the expert search path (the most efficient way to complete the task)?
4. How many clicks on average did it take for the users to complete each task?
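As a concrete illustration of how measurements of this kind can be tabulated from observation logs, the minimal sketch below computes the completion rate against the three-minute benchmark, the mean time and standard deviation, and the median click count for a single task. The record format and field names are hypothetical; the authors do not describe the tools they used to tabulate their data, and the population standard deviation is used here only as one possible choice.

```python
from statistics import mean, median, pstdev

# Hypothetical observation log for one task: one record per participant, with
# the time taken (in seconds) and the number of mouse clicks noted by the
# observer. Field names are illustrative, not taken from the study.
observations = [
    {"participant": "1A", "seconds": 83, "clicks": 5, "completed": True},
    {"participant": "1B", "seconds": 180, "clicks": 9, "completed": False},
    {"participant": "2A", "seconds": 47, "clicks": 4, "completed": True},
    # ... one record per participant
]

BENCHMARK_SECONDS = 3 * 60   # the three-minute limit used in the study
EXPERT_PATH_CLICKS = 2       # fewest clicks needed, taken from the expert path

# Percentage of participants completing the task within the benchmark
pct_correct = 100 * sum(
    o["completed"] and o["seconds"] <= BENCHMARK_SECONDS for o in observations
) / len(observations)

times = [o["seconds"] for o in observations]
clicks = [o["clicks"] for o in observations]

print(f"Completed within benchmark: {pct_correct:.0f}%")
print(f"Mean time: {mean(times):.0f}s, standard deviation: {pstdev(times):.0f}s")
print(f"Expert path: {EXPERT_PATH_CLICKS} clicks, median clicks: {median(clicks)}")
```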
TABLE 2
Results

Task (abbreviated for reasons of space) | % performing correctly (within 3 minutes) | Mean time (minutes.seconds) | Standard deviation (minutes.seconds) | Expert search path clicks (1) | Average (median) clicks (2)
Finding the online index ERIC | 92% | 1.23 | 0.36 | 2 | 5.7
Does the library own the journal Notes and Queries | 58% | 1.48 | 0.36 | 2 | 10.7
Submit a reference question online | 100% | 0.24 | 0.22 | 2 | 2.2
Find electronic copy of journal Gender and History | 100% | 0.57 | 0.26 | 2 | 3.3
Find hours of Architecture and Art Library | 100% | 0.29 | 0.23 | 2 | 3.2
Find the page for making an interlibrary loan request | 100% | 1.02 | 0.34 | 3 | 3.8
Find information about jobs at the library | 8% | 2.16 | 0.50 | 4 | 8.7
Find services available to students with disabilities | 100% | 0.11 | 0.05 | 2 | 2
Find a map of the Library of the Health Sciences | 100% | 0.39 | 0.32 | 2 | 3.4
Find article database for the field of history | 75% | 1.18 | 0.44 | 5 | 5.75
Find online guide for doing women's studies research | 42% | 1.59 | 0.33 | 4 | 6
Find technical help online | 50% | 1.15 | 0.40 | 1 | 3.4
Find information about workshops at the library | 67% | 0.57 | 0.32 | 3 | 4.4
Does the library own The Bluest Eye | 67% | 2.04 | 0.40 | 2 | 8
Access the Web version of ILLINET, an online catalog | 67% | 0.42 | 0.25 | 2 | 2.3
Find the page for renewing a book | 80% | 1.31 | 0.29 | 3 | 5.7
Access the online catalog for Loyola University | 75% | 2.00 | 0.51 | 3 | 7.5
Find an electronic copy of the OED | 100% | 1.05 | 0.36 | 3 | 4
Find an online article index with full-text articles | 83% | 1.03 | 0.34 | 2 | 5.1
How long can grad students have books checked out | 42% | 1.16 | 0.38 | 3 | 5.6

1. The fewest number of mouse clicks required to complete the task.
2. The average (median) number of mouse clicks made by users, regardless of whether they accomplished the task, ran out of time, or gave up.

Clearly, the tasks with a low percentage of completion highlight some of the most problematic parts of the Web site. By categorizing the questions, as shown in table 3, it was possible to see which areas of the site were most difficult to navigate. Participants consistently demonstrated difficulty evaluating the functions and uses of a wide variety of electronic resources. On several occasions, they attempted to find information about library holdings in article databases. Thus, in many cases, the participants' basic lack of understanding and awareness of library resources impacted their ability more than the organization of the site did.

Many participants experienced confusion interpreting records in the library's online catalog. For instance, four users could not find out if the library owned a particular book, even though every single user searched the appropriate tool: the library's online catalog. Participants also struggled when attempting the task asking them to determine whether the UIC library has print and/or electronic holdings of a particular journal. Five of the twelve users could not determine whether the library owned the journal in either format. These results suggest a possible lack of intuitive design in the catalog interface. However, the literature suggests that this problem may be common in most user interactions with library catalogs.
Several recent studies reported similar difficulties; results indicate that users employ a trial-and-error method when searching online catalogs, are frequently unable to interpret the information they retrieve, and struggle to interpret commonly used terminology (e.g., "Webcat").32,33

Moreover, most participants had problems completing the task asking them to find an online guide for doing research in women's studies. The authors altered the original wording of the question, which had used the term pathfinder, after pilot test participants were unable to interpret the request. Although the revised language of this question was descriptive, it did not match the language on the screen; those test participants who appeared to have some understanding of pathfinders/subject guides located it easily.

TABLE 3
Questions Divided by Category
(The percentage of participants performing correctly within the time benchmark follows each question.)

Instruction Related: Pathfinders, Workshops, etc.
  Can you find an online guide for doing research in women's studies at the UIC library? 42%
  Find out where to sign up for a workshop in the Daley (Main) Library. 67%

Access to resources
  Access the Web version of ILLINET Online, the library catalog for a number of academic libraries in Illinois. 67%
  Can you find an electronic copy of the Oxford English Dictionary? 100%
  Can you find an online article index that offers full-text articles? 83%
  Find an online article index that covers the field of history. 75%
  Access the online article index ERIC. 92%
  Can you access the online catalog for Loyola University through the UIC library Web site? 75%

Access to materials
  Does the UIC Library own the journal Notes and Queries? 58%
  Does the library own a copy of The Bluest Eye by Toni Morrison? Is it available? 67%
  Does the library have an electronic/online copy of the journal Gender and History? 100%

Hours/Locations
  What are the hours of the Architecture and Art Library for this semester? 100%
  Find a map showing the location of the Library of the Health Sciences (LHS) in Chicago. 100%

Circulation/ILL
  If you are a graduate student, how long can you have a book checked out from the Daley (Main) Library? 42%
  How would you submit a reference question via e-mail? 100%
  Find the online form to submit an interlibrary loan request (to borrow a book not held at UIC from another library). 100%
  Find a page where you can renew a book online. 75%

Systems
  If you were having technical difficulties accessing UIC library electronic resources from your home computer, where would you look for help? 50%

Other
  What services are available to students with disabilities? 100%
  Can you find information about jobs at the UIC library? 17%
Generally speaking, tasks related to online instructional materials and resources, such as "pathfinders," "workshops," and "article indexes," caused difficulties. Although this may imply poor architecture of the pages, in consideration of the literature, it may more likely indicate that patrons are unfamiliar with the vocabulary of librarians or are unaware of the types of resources and services offered by libraries. Though there was descriptive text within the pages to help guide users to these resources, it was not always sufficient. Karen Eliasen, Jill McKinstry, Beth Mabel Fraser, et al. suggested that undergraduates, in particular, benefit from descriptive text.34 Dickstein and Mills discovered through iterative testing that they needed to describe the various types of resources in many different ways for users at different levels of understanding: icons, short descriptive text, and some library language for those who are familiar with library terminology.35

Participants fared better when completing tasks that were phrased in language used within the Web site. For example, one task asked participants to "find the online article index ERIC." Although 92 percent of the subjects successfully completed the task, many remained confused as to what exactly they had found and why it might be useful to them. One participant, despite success completing the task, commented, "I don't even know what ERIC is." This pattern was repeated throughout testing: questions phrased in the language used within the Web site provided clues to the participants, who then performed more successfully. Performance on those tasks whose language did not mirror the language used within the Web site was lower. The users were good at recognizing terms but often struggled to understand their meaning. This suggests a need for more user instruction (in the classroom, online, or at the reference desk) and the use of clearer language.

Participants also had problems completing basic tasks involving the circulation of books. Only 42 percent of the users were able to find the page that details information about the lending periods for graduate and undergraduate students. Similarly, although 80 percent found the page for renewing a book online, it took users an average of 11.3 clicks to locate it (the expert search path required three clicks). The high level of overall success coupled with the low level of efficiency apparent in these results does not indicate a lack of understanding of library resources. Therefore, the authors surmised that these paths are candidates for streamlining.

Ideally, there would have been a narrow range of times for the completion of each task, suggesting some consistency in the subjects' abilities to solve them. For many of the questions, however, the mean time and standard deviation were nearly the same, indicating that what posed a problem for some was very simple for others. The researchers looked for reasons to explain these variations.

For example, the mean time for finding a map for the health science/medical library was thirty-nine seconds; the standard deviation was thirty-two seconds. Although this task can be completed in two clicks of the mouse, three individuals took more than a minute to find it. The wording of the link to the map assumes that the user knows the general layout of the campus. Because some of the participants were unsure of this, it took them longer to find the map. Almost all participants who quickly completed this task referred to some prior knowledge of the organization of the campus.

Every participant but one used the "Web site search" during testing, and several utilized it while completing a majority of the tasks; the individual who did not search in this fashion was the only participant who indicated she had limited experience using the Internet. Many of the result sets retrieved via the Web site search contained misleading or unrelated links. Despite the search engine's lack of efficiency, participants continued to utilize it rather than attempting to retrieve information through the site's subject organization. In fact, participants opted to use the site search function in all but six tasks, and during those six they consistently displayed high efficiency and reported low frustration. One participant commented that the "Web site search is the easiest way to find" a page, despite the fact that only one of her five Web site searches was successful. Other participants made similar comments and had similar rates of success: "I want the thing where it says advanced search and you can type in a bunch of words. That would be way easier for me." Participants' comments emphasized their focus on ease of use rather than on the retrieval of accurate results, which corresponds to the results of Barbara Valentine's study on undergraduate searching behaviors: "many students' … desire for knowledge seemed to have little influence on how the [research] process was negotiated."36

Because subjects were being asked to "find" information in the Web site, they reverted to what appeared to be a familiar way of finding things online: using search engines rather than navigating through a hierarchical order of Web pages. Again, this finding corresponds with Valentine's results indicating students preferred to begin research with familiar tools, regardless of their appropriateness to the task at hand.37 What does this say about the Web-savvy, Google-bred college student? More research needs to be done to determine whether this is the way library patrons prefer to find things online, but the implication is that the quality of the internal search engine and the wording of the metadata may be as important as, if not more important than, the structure of the pages. (In this context, metadata refers to the descriptive text within the header of a Web page.)
Another discovery was that when participants reported frustration or confusion, they frequently expressed a desire to contact library staff, whether via e-mail or by telephone, to receive personalized assistance, most particularly when trying to find technical assistance. In some cases, participants opted to bypass finding information on their own entirely in favor of contacting library staff. One participant commented, "I probably would ask a librarian, try to contact someone, rather than try and find something." Other times, participants wanted to contact a librarian because they could not complete a task. This latter scenario could be partly alleviated by redesigning the pages and improving the search engine of the site; however, there is clearly a need for the availability of library staff.

Conclusions

The authors accomplished both their purposes in undertaking this usability test: evaluating the user-friendliness and ease of navigation of the recently redesigned library Web site, and observing the interactions of users with the Web site in hopes of gathering information about how to best meet user needs. No faculty members were included in the test, so the findings can only be extrapolated to the student population. The collective results reveal examples of a few easy-to-fix problems, recurring use of jargon, and arrangement of information that requires prior knowledge of the library, as well as several findings that call for closer examination.

Especially noteworthy were the participants' difficulties with library terminology and lack of knowledge of library resources. These phenomena were observed long before libraries built Web sites. Web access has clearly not alleviated the more basic and long-range issues of how students learn about the library and its organization. The online environment gives librarians another venue for educating patrons, and this should be kept in mind when choosing the icons, layout, and language used within library Web sites.

Another striking observation was the participants' tendency to use the site's internal search engine rather than attempting to navigate through the site by clicking. Participants consistently desired an easy and familiar search process, regardless of the accuracy of the search's results. These search habits revealed that students had a high familiarity with Internet search engines and a low familiarity with library terminology, organization, and resources. They also indicate that more attention should be paid to metadata and a strong internal search engine so that library Web sites can be searched as easily as other prominent online sites.

Although all participants but one reported a high level of proficiency with searching the Internet and using computers, they consistently expressed a need for human contact when searching for information. This demonstrates that librarians are still greatly needed as educators, regardless of the venue.

More research is recommended in these areas, and to test the validity of the authors' conclusions, a future usability test, performed after changes are made to the site, would be most helpful. Usability studies are a straightforward and cost-effective way to get design input from a library's intended audience and to learn more about its patrons.

Notes

1. Mary Pagliero Popp, "Testing Library Web Sites: ARL Libraries Weigh In," paper presented at the Association of College and Research Libraries, 10th National Conference, Denver, Colo., Mar. 15-18, 2001.
2. Alison J. Head, "Web Redemption and the Promise of Usability," Online 23 (Nov./Dec. 1999): 20-28.
3. Janet K. Chisman, Karen R. Diller, and Sharon L. Walbridge, "Usability Testing: A Case Study," College & Research Libraries 60 (Nov. 1999): 552-69.
4. Ruth Dickstein and Vicki Mills, "Usability Testing at the University of Arizona Library: How to Let the Users In on the Design," Information Technology & Libraries 19 (Sept. 2000): 144-51.
5. Brenda Battleson, Austin Booth, and Jane Weintrop, "Usability Testing of an Academic Library Web Site: A Case Study," Journal of Academic Librarianship 27 (May 2001): 188-98.
6.
Jakob Nielsen and Thomas Landauer, "Guerilla HCI: Using Discount Usability Engineering to Penetrate the Intimidation Barrier," in Cost-justifying Usability, ed. Randolph G. Bias and Deborah J. Mayhew (Boston: Academic Pr., 1994), 245-72.
7. Jakob Nielsen, "Why You Only Need to Test with 5 Users." Available online from http://www.useit.com/alertbox/20000319.html.
8. Chisman, Diller, and Walbridge, "Usability Testing."
9. Jeffrey Rubin, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests (New York: Wiley, 1994).
10. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
11. Rachael Naismith and Joan Stein, "Library Jargon: Student Comprehension of Technical Language Used by Librarians," College & Research Libraries 50 (Sept. 1989): 543-52.
12. Mark A. Spivey, "The Vocabulary of Library Home Pages: An Influence on Diverse and Remote End-users," Information Technology & Libraries 19 (Sept. 2000): 151-56.
13. Battleson, Booth, and Weintrop, "Usability Testing of an Academic Library Web Site."
14. Louise McGillis and Elaine G. Toms, "Usability of the Academic Library Web Site: Implications for Design," College & Research Libraries 62 (July 2001): 355-67.
15. Chisman, Diller, and Walbridge, "Usability Testing."
16. Rubin, Handbook of Usability Testing.
17. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
18. Nielsen, "Why You Only Need to Test with 5 Users."
19. "Interesting Facts about UIC Freshmen," Student Trends 3.3 (Apr. 2001): 2-3.
20. Head, "Web Redemption and the Promise of Usability."
21. Rubin, Handbook of Usability Testing.
22. Shelley Gullikson, Ruth Blades, Mark Bragdon, et al., "The Impact of Information Architecture on Academic Web Site Usability," Electronic Library 17 (Oct. 1999): 293-304.
23. Battleson, Booth, and Weintrop, "Usability Testing of an Academic Library Web Site."
24. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
25. Gullikson, Blades, Bragdon, et al., "The Impact of Information Architecture on Academic Web Site Usability."
26. Rubin, Handbook of Usability Testing.
27. Ibid.
28. Battleson, Booth, and Weintrop, "Usability Testing of an Academic Library Web Site."
29. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
30. Gullikson, Blades, Bragdon, et al., "The Impact of Information Architecture on Academic Web Site Usability."
31. McGillis and Toms, "Usability of the Academic Library Web Site."
32. Chisman, Diller, and Walbridge, "Usability Testing."
33. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
34. Karen Eliasen, Jill McKinstry, Beth Mabel Fraser, et al., "Navigating Online Menus: A Quantitative Experiment," College & Research Libraries 58 (Nov. 1997): 509-16.
35. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
36. Barbara Valentine, "Undergraduate Research Behavior: Using Focus Groups to Generate Theory," Journal of Academic Librarianship 19 (Nov. 1993): 300-304.
37. Ibid.