Student-Centered Design: Creating LibGuides Students Can Actually Use

Amy E.G. Barker and Ashley T. Hoffman*

* Amy E.G. Barker is Instructional Design Librarian and Ashley T. Hoffman is eLearning Librarian in the Kennesaw State University Library System; email: abarke24@kennesaw.edu, ahoffm18@kennesaw.edu. ©2021 Amy E.G. Barker and Ashley T. Hoffman, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

In this mixed-methods study, librarians at Kennesaw State University Library System conducted a year-long design research project to create a flexible subject guide "blueprint" for undergraduate students using LibGuides. Methods included a card sorting study with 18 undergraduate students and usability testing with 40 undergraduate students. The study's goals were to identify what content, aesthetic design, organization, and structure students preferred on a subject guide. This paper addresses the current literature on research guide usability, provides an overview of the design and implementation of the study, and highlights practical results that will be easily transferable to other libraries.

Introduction

Creating and maintaining subject guides that students can, and will, actually use is an ongoing challenge for many academic libraries. Source types, research tools, and library instructional content vary widely among disciplines, requiring distinct individual guides for each subject or major offered at a university. Depending on an institution's size, breadth of majors offered, and number of librarians, creating a large number of unique guides that meet the same standard of quality can seem an impossible task. Standardized templates can aid in the rapid production of multiple subject guides. However, these templates can be too rigid to modify for different subjects or so generalized that they offer little subject-specific content to students. Worst of all, subject guide templates are often designed to fit librarians', and not students', ideas of how information should be organized.

One complication is that, while librarians are experts in subject-specific research, they are often not trained in best practices for web design and usability. Even the most thorough subject guide is useless if students are not able to find the information they need or if students choose easier, more attractive web sources with which they are already familiar. Therefore, librarians need to employ design research methods to create subject guide templates that are useful, attractive, and easy for students to use and librarians to create.

In this mixed-methods study, librarians at Kennesaw State University (KSU) conducted a year-long design research project using card sorting and usability testing methods to create a flexible subject guide "blueprint" for undergraduate students with SpringShare's LibGuides software Version 2. The study goals were to identify the type of content undergraduate students wanted on a subject guide, what aesthetic design they preferred, and what organization and structure best enabled students to find the information they sought.

Situational Context

The KSU Library System supports a large R2 state university of more than 35,000 students across two campuses, making the problem of creating usable research guides especially acute.
Using LibGuides, 25 undergraduate librarians maintain approximately 50 undergraduate subject research guides. In 2017, these guides were faced with several issues. Subject guide usage had been static or declining for years, even after the implementation of a standard template in 2014. Librarians complained that this guide template was too inflexible to accommodate subject areas with nontraditional research components, such as foreign languages or art. In addition, Kennesaw State University had consolidated with Southern Polytechnic State University since the creation of the old template, adding a variety of academic programs that did not fit neatly into the template's design. Many librarians also noted that the guides were unattractive, outdated, and unable to meet web accessibility guidelines. Additionally, while many librarians had contributed to the old template design, the library had completely failed to consult students. In response to these issues, a group of five librarians came together in 2017 to form the Research Guides Assessment Task Force. This research team aimed to conduct a design research project to create a new guide blueprint based on student preferences and needs for guide content, aesthetics, structure, and organization.

Literature Review

A review of the recent literature highlights the importance of user-centered design in academic library research guides. How well students are able to use guides has a direct impact on their ability to learn. As Thorngate and Hoden write, "There is a growing recognition, both in the library community and in the e-learning community more broadly, that user experience and student learning are intimately connected."1 Therefore, efforts to evaluate the ways students use research guides are necessary to ensure the best outcomes for student learning.

Often design research unveils a gap between the expectations of guide users and creators. Unlike librarians, students do not have a mental model of research that includes subject research guides;2 and, even when students do think of research guides, they often have dramatically different ideas of how those guides should be organized.3 Students expect subject guides to contain subject-specific material, not general content such as links to the catalog or generalized citation information.4 In addition, a 2017 study found that students were confused by instructional content in subject guides, expecting instead to see recommendations for sources.5 To counteract student confusion, Little recommended using the "worked example effect," wherein librarians provide a direct link to a source for users when the process of teaching them how to find it themselves would cause extraneous cognitive load.6

There is much evidence that the structure of research guides impacts students' ability to use the guide. Even before the debut of LibGuides Version 2, which enabled side navigation, libraries conducting usability testing noticed problems with guide navigation using a row of tabs on the top of the page.
As Ouellette noted, "Overall, students do not like the [top] tab navigation system of LibGuides at all both for aesthetic reasons and because left-side local navigation menus have become quite standard on the web."7 In addition to this convention, users commonly suffer from "banner blindness" that causes them to overlook things at the top of the page.8 If guide designers do choose to use top tabbed navigation, they should make these tabs more visible and match existing heading conventions on the page, such as font style and color. For either top or side navigation, tab titles should be kept brief9 and their total number should be limited.10

When it came to the number of columns, Alverson et al. noted that students fail to notice content in side columns, particularly if that information is generic, unactionable, or irrelevant to the guide's subject.11 Thorngate and Hoden found that their study participants preferred a two-column layout over a one-column layout, particularly if the columns were divided between main and supplemental content. Of all variations, the three-column layout was the least preferred.12 One negative consequence of choosing fewer columns to display content is the necessity to scroll, which was found to be universally unappealing to students.13

On the issue of guide organization, several studies found that students preferred research guides to be organized by information need (such as how to research an author),14 the research process,15 or subject-specific content.16 Regardless of organization scheme, it is important to organize guides into broader categories that are then subdivided into smaller pieces with a clear hierarchy.17 Despite best efforts to organize information into a system students understand, many studies reported that students still desired a search box and will often look for one before attempting to navigate through a guide by clicking on tabs.18 Often in looking for this search box, they may confuse a site search box for one that searches the library's discovery tool.19

Much of the literature discussed the effect of a guide's visual appearance on cognitive load.20 Perhaps the most cited offense of poor guide design is that of "clutter," or excessive words, links, images, tabs, and other visual forms that contribute to user confusion.21 Libraries can avoid clutter by using a clean, simple layout and creating a template to maintain consistency of guide look and feel.22 Guide creators should curate small selections of specific journals or databases rather than a comprehensive list of all pertinent resources.23 Multiple visual forms, such as images or infographics, can be combined into a single form that uses movement or audio, such as a short video.24 Indeed, at least one study noted a student preference for multimedia and interactive content over text-based content.25

Despite the wealth of research on library guides and their design, a gap exists in the area of LibGuides Version 2 and the newly available side navigation option. This article aims to help fill in this gap through sharing a mixed-methods study that included two design research methods and a large sample size, as well as new LibGuides features. The results of the present study both reaffirm some of the literature and reveal new best practices for guide content, aesthetics, structure, and organization, all based on student preferences revealed through design research.
Design and Methodology

In spring 2017, the research team met to design the scope and methods to be used in the study. The scope was limited to 58 undergraduate subject guides, accounting for approximately 33 percent of the public-facing guides. As the largest cohesive type of guide, as well as the only type required to follow a template, targeting the undergraduate subject guides provided the biggest impact possible while still keeping the research project manageable. The goal was to learn more about three essential questions relating to these guides:
1. What type of content do students want?
2. Is the overall look and design of the template accessible and aesthetically pleasing to students?
3. Can students find what they're looking for?

To answer these questions, the research team designed a mixed-method design research study that took place during a 10-month period. To address the first question, the research team decided to use card sorting to provide students an opportunity to select and eliminate proposed content. This method also addressed the third research question by asking students to organize information into a structure they thought made sense. Usability testing then allowed the research team to gauge whether the new structure worked in practice and observe student reactions to the overall look and feel of the guides. Both methods benefited from being low in cost and not requiring large sample sizes to be effective and accurate. To conform to the research team's constraints of budget, time, and recruitment abilities, as well as Institutional Review Board requirements, specific logistics for both research methods were chosen and developed as described below.

Card Sorting Method

The research team conducted several card sorting sessions with undergraduate students at both campus libraries using a physical, rather than digital, method as described by Spencer.26 Initially, students were recruited via a one-question paper survey asking students what they would expect to find on a research guide. However, a lack of promised attendance at the first three sessions led the research team to recruit additional students via the university's daily email digest. The participation issue also necessitated changing from students working in groups to working individually to complete open card sorts as described below. Additional sessions were also scheduled to achieve adequate participation and data. At the end of each session, participants received small bags of library promotional items as an incentive.

The research team created a set of 40 cards based on information included in the original subject guide template and additional information frequently included on course guides. Most content boxes in the original template became the basis for a single card. Cards were labeled with the box title and contained a brief description of the content of the box (see figure 1).

FIGURE 1. Example LibGuides box and corresponding card, showing the "Citation Resources" box as it appeared in the old template, as well as the card students sorted during the study.

In addition to the provided cards, each participant was also given a small set of blank cards they could use to add or duplicate content.
To facilitate data collection, the cards were also numbered on the back for ease of entry into the analysis spreadsheet created by Spencer.27

A total of 18 undergraduate students participated in seven sessions, with one to four students per session. These students represented 0.05 percent of the total undergraduate student body and spanned all four undergraduate years of study. Based on the data collected, about half had completed multiple research assignments, but only a handful had experience working with a librarian or using research guides. Each session was facilitated by the lead investigator of the research team while one or more other team members took notes. Students were led step by step through the open card sorting process for 45 minutes, followed by a 10- to 15-minute group discussion on research guides in general. An open sort allows participants to create their own categories of information, rather than using predeveloped categories. For this study, the research team provided a "Trash" category from the beginning to encourage the option to remove content entirely. During the session, participants were prompted first to sort the cards, then label their groups of cards, and finally to arrange their groups into their desired order. Participants were also given the opportunity to write down their method, reasoning, or any other comments on their sorts for the research team.

Following the card sorting activity, the lead investigator facilitated a group discussion of 10 questions related to subject guides, such as what they should be called, whether and how students would use these guides, and what structure and aesthetic design was preferred. The discussion portion was audio-recorded, transcribed for the research team, and then deleted to maintain participant confidentiality. After the completion of all seven sessions, the research team conducted data analysis following Spencer's method.28

Usability Testing Method

After analyzing the card sort data, the research team created an initial subject guide prototype for usability testing.29 This prototype was then subjected to four rounds of task-based usability testing based on the method described by Krug.30 Each round of testing consisted of 15-minute one-on-one sessions with an average of 10 undergraduate students held once a month. The rounds of testing were iterative, meaning that the team made improvements to the prototype guide between rounds based on results. Over the course of four rounds, 40 undergraduate students participated in testing. Based on the demographic data collected, these students represented majors from all but one of KSU's degree-granting undergraduate colleges. Most participants were in their second or third year of study, but first-, fourth-, and fifth-year students were included. Thirty-one were between 18 and 22 years of age, with the other eight ranging from 23 to 39. Participants were recruited through a combination of advance advertisement in the university's daily email digest and on-site recruitment. At the end of their session, participants received a candy bar as an incentive.

The research team created a list of tasks for testing based on the most frequently asked questions in the reference desk question logs and chat reference transcripts.
The five initial topics identified for testing were the following: (1) getting started on a research assignment; (2) finding a specific source type; (3) finding the library's databases; (4) citations; and (5) getting help. Tasks were adjusted between rounds if issues of understanding arose or the problem the task was meant to measure had been resolved. Due to the need to change the design between rounds, data was analyzed at the end of each round of testing.

While sessions were structured around a script based on Krug's method,31 they were open-ended and nonquantitative. Sessions were held in a centrally located, private room with a computer on the ground floor of one of the campus libraries. Blackboard Collaborate Ultra was used to record the screen and audio for the benefit of the research team, who were not present during testing except for the session facilitator. The session facilitator led participants through the tasks in a random order and prompted along the way for verbal feedback using open-ended, probing questions. After the tasks had been completed, the participant was asked a list of scripted questions about their experience, such as whether they were able to find the information the tasks required and their impressions of the design.

Data—Quantitative

Card Sorting

One of the primary aims in interpreting the card sorting data was to identify patterns in the categories participants created out of cards, which would later inform the organization of information on the subject guide blueprint. The research team was able to analyze 17 of the 18 participant "sorts."32 Participants sorted the cards into anywhere from 3 to 11 different categories, with an average of 6 categories across all sorts. In reviewing the results to determine standardized categories, some common themes were readily apparent, such as groups labeled "searching" or "getting help." Other themes were not apparent until the research team was well into data analysis. Regardless of the number of categories and their names, one consistent trend across most sorts was organization based on the research process, as illustrated by the creation of a "Getting Started" type of category across all 17 sorts. Another indication of the importance of organizing the content by a research process was that 10 of the sorts started with information on how to do research, and only 5 suggested starting with searching. Within each standardized category, some patterns were also apparent with cards frequently assigned to that category. For example, a card labeled "The Research Cycle" appeared in the "Getting Started" category across 14 (82%) of the sorts. Additionally, some cards were repeatedly sorted into the same standardized category by different students (see table 1).

TABLE 1. Selected Card Sorting Results. Shows a selected portion of the 40 cards sorted and the frequency with which each card was sorted into the standardized categories (Choosing a Topic, Databases, Getting Started, Help, Search, Sources, Trash, Writing Help, Miscellaneous).
Card 21, The Writing Center: 11%, 56%, 11%, 17%, 6%
Card 22, Citation Resources: 6%, 11%, 17%, 22%, 6%, 28%, 11%
Card 23, Chat with a Librarian: 89%, 11%
Card 24, More Ways to Ask: 78%, 6%, 6%, 6%, 6%
Card 25, Contact Me: 89%, 11%
Card 26, Data and Statistics: 28%, 6%, 22%, 28%, 17%
Card 27, Dissertations and Theses: 11%, 22%, 6%, 11%, 28%, 6%, 11%, 6%
Card 28, The Research Cycle: 6%, 78%, 6%, 11%
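The percentages in table 1 are simply the share of usable sorts in which a card landed in a given standardized category. The study tallied these in Spencer's analysis spreadsheet; purely as an illustration, a minimal Python sketch of the same kind of tally, using hypothetical sort data and a handful of cards, might look like this:

```python
from collections import defaultdict

# Hypothetical raw data: for each participant, the standardized category each
# numbered card ended up in. (The study entered its real sorts into Spencer's
# analysis spreadsheet; this sketch only illustrates the same tally.)
sorts = {
    "P01": {23: "Help", 25: "Help", 28: "Getting Started"},
    "P02": {23: "Help", 25: "Help", 28: "Getting Started"},
    "P03": {23: "Help", 25: "Help", 28: "Sources"},
}

def category_frequencies(sorts):
    """For each card, return the percentage of sorts placing it in each category."""
    counts = defaultdict(lambda: defaultdict(int))
    for participant_sort in sorts.values():
        for card, category in participant_sort.items():
            counts[card][category] += 1
    total = len(sorts)
    return {
        card: {cat: round(100 * n / total) for cat, n in cats.items()}
        for card, cats in counts.items()
    }

for card, freqs in sorted(category_frequencies(sorts).items()):
    print(card, freqs)  # e.g. 28 {'Getting Started': 67, 'Sources': 33}
```

Keeping the data per participant also makes it easy to check which cards always travel together, such as cards 23 and 25, which were never separated.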
The most consistent set of cards was frequently found in the "Help" category, with cards 23, 24, and 25 appearing together in this category in 14 sorts; in fact, cards 23 and 25, "Chat with a Librarian" and "Contact Me," were never separated. One surprising result was the lack of pattern regarding the "Trash" category, which students were instructed to use to discard any unimportant cards. Only eight of the participants used this option, however, and no significant trend was discernible. With such a lack of data on which information was not important, the research team decided to keep all of the original information in the initial prototype created for usability testing.

Usability Testing

During usability testing, one of the most important data points was examining how well students were able to complete assigned tasks. The research team scored each task completion rate according to a rubric33 that reflected the amount of time and difficulty participants experienced in completing a task (see table 2). For each task, the research team decided what would generally be considered successful completion. The Citation Task, for example, was successfully completed if students found and identified links leading to the Purdue OWL. After testing, each participant's attempt to complete a task was scored from 0 to 3, with 0 indicating failure to complete the task and 3 indicating successful completion with very little difficulty. It is important to note that sometimes students themselves considered the task to have been completed successfully, even when they did not find the "correct" solution decided upon by the team. In these cases, that task completion was evaluated as a 0.

TABLE 2. Average Task Completion Success. Shows the average success rate for the different tasks in each round of usability testing, as scored by the research team. Scoring key: 0 = Fail; 1 = Succeed very slowly, in a roundabout way; 2 = Succeed a little slowly; 3 = Succeed quickly.
Round 1: Get Started 1.1; Scholarly Article 0.6; Source Type 2.5; Citation 1; Help N/A; Evaluating Sources N/A; Database N/A; Search Tips N/A
Round 2: Get Started 2.5; Scholarly Article 2; Source Type 2.2; Citation 2.1; Help N/A; Evaluating Sources N/A; Database N/A; Search Tips N/A
Round 3: Get Started 2.6; Scholarly Article N/A; Source Type 1; Citation 2.5; Help 2.6; Evaluating Sources N/A; Database 1.6; Search Tips N/A
Round 4: Get Started 2.7; Scholarly Article N/A; Source Type 2; Citation N/A; Help 2.3; Evaluating Sources 0.1; Database N/A; Search Tips 1.6

As the team progressed through the four rounds of usability testing, certain tasks were removed, modified, or added based on the results of the previous round. For example, once the majority of participants scored well on the Scholarly Article Task, that issue was considered solved and the task was replaced. The Getting Started Task represents the most constant task, which prompted participants to pretend they had been assigned a paper and asked them where they would begin. This task was necessary by way of introduction to the entire series of tasks, and so it remained despite receiving higher and higher scores with each round. By the last round, it scored as high as 2.7, meaning participants were able to complete this task very easily and accurately. Two of the most dramatic successes were with the Scholarly Article Task, which improved from 0.6 to 2 after redesigning the Find Sources page, and the Citation Task, which improved from a 1 to 2.5 after moving the box of citation links from the Help page to the heavily used Find Sources page.
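The round averages in table 2 are the mean of the 0-to-3 rubric scores given to each participant's attempt at a task in that round. Only the averages were published, so the individual scores below are hypothetical; the sketch simply shows the arithmetic behind the table:

```python
from statistics import mean

# Hypothetical 0-3 rubric scores for each participant's attempt at a task in a
# given round; only the round averages appear in table 2, so these individual
# scores are invented purely to show the calculation.
scores = {
    ("Round 1", "Get Started Task"): [0, 1, 1, 2, 1, 2, 0, 1, 2, 1],
    ("Round 1", "Scholarly Article Task"): [0, 0, 1, 0, 1, 1, 0, 1, 1, 1],
    ("Round 2", "Get Started Task"): [3, 2, 3, 2, 2, 3, 2, 3, 2, 3],
}

def round_averages(scores):
    """Average the rubric scores for each (round, task) pair, as reported in table 2."""
    return {key: round(mean(vals), 1) for key, vals in scores.items()}

for (rnd, task), avg in round_averages(scores).items():
    print(f"{rnd}, {task}: {avg}")  # Round 1, Get Started Task: 1.1 ...
```

Tracking these averages between rounds is what signaled when a task, such as the Scholarly Article Task, could be retired and replaced.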
In addition to independently assessing the success of each task, participants were asked whether or not they were "able to find what [they] were looking for?" at the end of their session during the first three rounds of testing. This question represented the crux of the research team's goal with the usability study and also allowed for comparison of a student's perception of success against their actual success, as measured using the Kuniavsky rubric. Unsurprisingly, given the difficulty students experienced with most Round One tasks, the majority of students did not feel they were successful (see table 3). However, after subsequent rounds of prototype improvement and testing, participants were most likely to answer "yes," with a few giving answers like "for the most part." This aligns well with improvements overall in task completion rates.

TABLE 3. Student Perception of Success. Shows how many participants in each of the first three rounds felt they were able to find what they were looking for overall, across all tasks attempted. Q: Were you able to find what you were looking for?
Round 1: Yes 5; No 3; "For the most part" 3
Round 2: Yes 9; No 0; "For the most part" 3
Round 3: Yes 5; No 0; "For the most part" 2

Data—Qualitative

Throughout the study, qualitative information was also collected from student participants to clarify and expand on the basic data described above. Student comments were solicited through specific discussion questions at the end of each card sorting and usability testing session. Additional comments arose naturally as students talked through their search process or reacted to the guide. These qualitative responses centered on four main themes: potential usage of research guides and preferences for guide structure, content, and aesthetics.

The research team asked students to share not only whether or not they thought they would use the guides once the study was done, but also what devices they would be likely to use to access the guides. During card sorting, the majority of participants believed they would use research guides for multiple subject areas. For this reason, they preferred a similar layout for each subject but also wanted it to be immediately clear what each guide was about. Usability testing confirmed both trends, as students easily navigated the same general structure for two different subjects and iterative improvements to the welcome page helped them readily identify the focus of the guide. In regard to the type of device they thought they would use, students were somewhat more divided, but about half the card sorting participants thought they would use a phone only rarely, depending on how mobile-friendly the site was. During usability testing, some students could tell the site was mobile-friendly, based on the minimalist design and navigation placement.

Considering the overall structure of the guide, the decision between side and top navigation menus was a significant point of discussion among librarians prior to conducting the study. During card sorting, students were specifically asked about their preference and were almost evenly split, with seven preferring side navigation (vertical tabs), nine preferring top navigation (horizontal tabs), and two expressing no preference. Interestingly, their reasoning tended to be similar regardless of which layout they preferred, with most mentioning familiarity, immediate access to all subtopics, or reducing clicks and scrolling.
Indeed, during usability testing, students tended not to scroll past the center of the page, often resulting in poor task completion for any task relying on content below the middle of the page. For the second round of testing, the research team designed an alternate version of the Find Sources page that had a single tabbed box with a tab for every resource type (as opposed to the original version, which had subsequent boxes for different source types). Notably, 11 participants preferred the tabbed box design and only two preferred the original design.

Confirming similar issues reported in the literature, students often overlooked content in the left column under the menu. According to one student who was asked why they did not read a section, "I didn't read this at all because it's kinda over to the left." Another student explained, "There's this thing of ignoring the side bar, you know—like you see these first three things and then you're like, the rest of that is unimportant and you go [to the main content]." This led the research team to conclude that, aside from the menu, only supplemental information should be placed in the left column.

Interestingly, throughout the study, students seemed to expect to find—and appreciated having—both instructional and resource-list content. At the beginning of card sorting sessions, students were asked to write down what they would expect to find on a "research guide," and at the end were asked how well the information aligned with their expectations. Fifteen of the participants agreed that it did align, and eight of these also mentioned, usually positively, that the information was more than they would expect. During both card sorting and usability testing, some students mentioned that the information provided would either help them improve as researchers or be useful for new researchers.

In fact, although there were some sections of the guide that were often missed by participants during usability testing, such as sections on finding and developing a topic, students very rarely suggested removing any of these sections entirely. When led directly to sections they had avoided, most students said that the information was useful, even if they would probably only turn to those sections after they had run into trouble finding sources on their own. A few participants said they did not find those sections useful because they felt confident in their own abilities. For example, one student stated, "I probably wouldn't use the Advanced Search Tips because I've developed my system from having used the library system." However, they tended to agree that the information needed to be there, either for less experienced researchers or as a refresher for themselves. This recognition of the value of the instructional content explains why an overall organization following a research process made sense to students. It may also explain why even those students who preferred to jump straight to a search box were able to find it quickly.

Unsurprisingly, one of the most common student behaviors during usability testing was to immediately look for a search box to fulfill most of the assigned tasks, even "How would you get started on this assignment?"—which did not have a specific answer. Once they found one of the database search box widgets, they usually reported that the task had been satisfied, even if they had not reached the conclusion the research team intended.
One student explained their behavior by saying, "I'm looking for something and we're in the age of Google…click 'search,' type what you need." As a result, many students commented that the most useful section for them was the Find Sources page, which contained a search box for the library's discovery tool. Despite this searching focus, when directly asked if they would recommend any changes during the final round of usability testing, none of the students mentioned moving the database links or search box to make them more prominent.

An important finding regarding content was that students did not read large blocks of text. The first prototype guide was very text-heavy, and throughout testing the research team gradually condensed the text to a few, web-friendly bullet points due to continued comments. By the final round of testing, most content had been rewritten to use short, informal sentences and rely on bulleted lists wherever possible. According to one student, the new boxes were easy to read "because you guys have bullet points, it's not like reading for days." During the final round of testing, the facilitator asked students to read a random section of the guide for brevity, clarity, and usefulness. Students generally responded well to the informal, jargon-free text, remarking positively on some of the formatting such as bolding important words or examples.

Students expressed a preference for interactive content over static text. The level of interactivity, however, did not need to be high to satisfy this preference. Several students remarked that the simple act of clicking tabbed or gallery boxes made content more engaging. Similarly, students commented positively on a fill-in-the-blanks exercise designed to help develop research topics.

The research team found that students particularly cared about the visual design and aesthetic appeal of the guide. The importance of the aesthetic aspect was first seen in the card sorting sessions, even though none of the discussion questions were designed to look at this issue. When asked about their preferences for navigation placement, many students also commented on the use of images and amount of white space in each example shown. However, during usability testing it became clear that the style of image used was important, not just the use of images. Initially, the pictures in the template were images of generic research scenes with the photo credit listed beneath (for example, the Write box featured a picture of a person writing). When asked about these images, students were indifferent. In a later prototype, these images were replaced by images modified with filters and text to resemble popular social media graphics, such as those on Instagram. During the final two rounds of testing, these images received a variety of positive comments from students.

The appreciation of visual elements was not universal, however. The first round of testing included a video, which students tended to either ignore or respond to negatively. Only one participant even clicked on the video, and another said, "I don't know why anyone would watch the video, unless they just have extra time on their hands…" In response, no videos were used in later versions of the guide. However, students did appreciate the animated GIF included at the top of the Search Effectively page.
This simple GIF demonstrates how to identify the main keywords of a sample research topic related to the guide's subject and received positive comments from students through all rounds after it was included.

Students also appreciated a variety of visual elements functioning as visual cues. Some participants noted that they skimmed headings until they found one that applied to their need and, during early rounds of testing, requested larger headings and more distinct coloring for active tabs. After making these changes, the research team noticed an improvement in how well participants completed tasks. In some cases, images were also used as visual cues for box content. For example, each step of the research process was represented with an image and a relevant word, such as a photo of a tower viewer with the word "SEARCH." This picture appeared in the Research Process box on the homepage as well as at the top of the Find Sources page. Additionally, the research team tried using small icons or logos next to links to make them more visually distinct. During the final round of testing, students were much more likely to click links accompanied by an icon. The use of icons with links also helped draw attention to content in the left-hand column, which participants tended to overlook in earlier rounds of testing. Because students responded so well to visual elements, the research team decided to include links, images, and other visually distinct content in as many boxes of the final design as possible.

Final Design

By the end of usability testing, the research team had developed a new blueprint aligned to student preferences for subject guide organization, structure, content, and aesthetics. Analysis of the card sorting data had revealed that a significant majority of the students organized content along a research process. Because none of the students used the same process, the research team synthesized the general categories identified by the students to create a research process that was used as the organizational basis for the prototype guide during usability testing. Although the organization was refined throughout testing, the basic process remained the same and included five steps: (1) Explore Your Topic, (2) Refine Your Topic, (3) Search for Sources, (4) Evaluate Your Sources, and (5) Write. In the final blueprint, the process is laid out in a gallery box on the Welcome page, with a corresponding picture for easy identification that links to the appropriate section of the guide.34

TABLE 4. Final Blueprint Guide Organization
Welcome: Subject-branded banner • Research process gallery box • Popular links box • Guide owner's profile box
Develop a Topic: "Explore Your Topic" research step • "Refine Your Topic" research step • Supplemental tips on finding an idea
Find Sources: Discovery tool search box • "Search for Sources" research step (tabbed box organized by source types) • Supplemental box on citation resources
Search Effectively: Basic search tips tabbed box • Advanced search tips tabbed box • "Evaluating Sources" research step
Get Help: Searchable FAQ LibAnswers widget • Live Library Help chat widget • Descriptions and links to research help services at the library • Another instance of citation resources box
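Because table 4 amounts to a checklist of required pages and boxes, the blueprint can also be written down as a small data structure. The study distributed the blueprint as a LibGuides template rather than as code, so the sketch below is purely illustrative, with shortened, hypothetical box names; it shows how a team might verify that a newly created subject guide still contains the required elements:

```python
# A hypothetical encoding of the table 4 blueprint. Box names are shortened and
# the data structure is illustrative only; the study distributed the blueprint
# as a LibGuides template, not as code.
BLUEPRINT = {
    "Welcome": ["Subject-branded banner", "Research process gallery box",
                "Popular links box", "Guide owner's profile box"],
    "Develop a Topic": ["Explore Your Topic step", "Refine Your Topic step",
                        "Tips on finding an idea"],
    "Find Sources": ["Discovery tool search box", "Search for Sources tabbed box",
                     "Citation resources box"],
    "Search Effectively": ["Basic search tips tabbed box",
                           "Advanced search tips tabbed box",
                           "Evaluating Sources step"],
    "Get Help": ["Searchable FAQ widget", "Live chat widget",
                 "Research help services links", "Citation resources box"],
}

def missing_content(guide_pages):
    """Report any blueprint pages or boxes that a draft guide is missing."""
    gaps = {}
    for page, required in BLUEPRINT.items():
        present = set(guide_pages.get(page, []))
        missing = [box for box in required if box not in present]
        if missing:
            gaps[page] = missing
    return gaps

# Example: a draft guide that dropped the citation resources box from Find Sources.
draft = {page: list(boxes) for page, boxes in BLUEPRINT.items()}
draft["Find Sources"].remove("Citation resources box")
print(missing_content(draft))  # {'Find Sources': ['Citation resources box']}
```

A check like this would matter most at scale, when many librarians are individualizing dozens of guides from the same blueprint.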
One of the first structural decisions the research team had to make was whether to use LibGuides' top navigation (horizontal tabs) or side navigation (vertical tabs). During card sorting, students were largely divided on their preferences between the two navigation options, resulting in the research team making a decision based on other factors. The side navigation option was chosen because the literature revealed side navigation as a new web standard.35 In addition, side navigation aligns with other university websites (including the parent library website) and is also the more mobile- and accessibility-friendly interface of the two. The choice of side navigation was bolstered by student feedback during usability testing.

FIGURE 2. The Find Sources page from the Political Science guide, showing the guide structure, including side navigation on the left and a tabbed box in the main content column.

When it came to the number of tabs, or pages, the prototype guide went through several changes during the four rounds of iterative usability testing. Initially, each step of the five-step process had its own page, in addition to Welcome and Help pages. However, students responded more favorably to fewer tabs in the navigation menu, resulting in the final guide blueprint having five pages total, with some steps in the research process combined into a single page. The final pages are listed in table 4 along with a summary of their content.

In addition to creating the overall structure, the research team redesigned the structure of individual guide pages. During usability testing, students were unlikely to scroll past the middle of the page. To compensate for this, the research team employed tabbed boxes or gallery boxes to organize subcontents on a page. Most pages on the new subject guide blueprint have the navigational tabs in the top left, supplemental information in the left column, and the main content condensed into one or two boxes in the right column (see figure 2). These main content boxes are divided into tabbed boxes where necessary.

Because no students discarded any of the content cards during card sorting, the first prototype subject guide included all the information from the original cards. Usability testing revealed that, while students found most of the content on the guide valuable or interesting, they were most likely to go to the Find Sources page and use the search box first. After discussion, the research team decided that, while helping students find sources was one of the goals of subject guides, instructing them on the research process was the main objective. Therefore, the research team decided to keep sections of the guide that were less visited but combined several of them to minimize the number of tabs. This decision was also based on the fact that, when asked, most students also wanted to keep this information on the guide.

The final blueprint guide's aesthetic design was tightly linked with organization and structure. Structurally, the research team limited text content to small blocks with plenty of white space around them. Based on student feedback during the card sorting and usability testing discussions, most content boxes were converted to "floating boxes" to eliminate borders, allowing the content to stand out better on a clean, white background. Organizationally, the research team redesigned the look and feel of all tabs and headings based on student feedback during usability testing. In the final blueprint, minimalist light and dark gray are used for tabs in the side navigation menu, while tabbed boxes use gray and the university's official gold color to highlight the active tab.
Headings, as well as all other text, were increased in font size across the blueprint guide to improve readability. Finally, the problem of "banner blindness"—in which students did not read the guide title and therefore did not understand what the guide was for—was addressed by creating a subject-branded banner image for the guide's Welcome page (see figure 3).

FIGURE 3. An example subject-branded banner image from the completed Political Science guide. An example of the blueprint template can be seen at http://libguides.kennesaw.edu/2018blueprintexample.

The final blueprint was introduced in May 2018 to librarians, who then spent the summer creating their assigned subject guides using the new blueprint as a template. Each librarian was encouraged to individualize the guide with a unique banner image, curated source types and links, and subject-specific example topics, as well as other minor changes. The new subject guides were launched in fall 2018.

Discussion

Across these two methods of design research, the research team identified several trends in student preferences for LibGuide organization, structure, content, and aesthetics that can be seen in the final blueprint design. Several of these findings also bolster trends found in other studies, lending support for general themes that may be broadly applicable to other institutions. The card sorting data collected strongly supports the case for organizing research guide content according to a research process, rather than by types of sources, as has been found in several other studies.36 Little research has been done so far on the preferences of students between the two navigation options available in the current version of LibGuides, but this study supports Thorngate and Hoden's finding that students prefer side navigation and two columns of content with a clear focus.37 However, since participants in this study were somewhat divided on that question during the card sorting phase, and usability testing only used the side navigation layout, those considering a redesign may want to test this point before coming to a decision. The results related to aesthetic preferences are also seen elsewhere in the literature, with students stating a preference for modern images as well as a clean and uncluttered layout.

Regarding content, this study contributes something new to the conversation. Although the idea that students prefer interactive content is not new, the research team was surprised to learn that easily implemented features were sufficient for students to find the content intellectually engaging. Additionally, results throughout the study revealed that students felt the information provided was valuable for either themselves or other researchers; in some cases, they expressed surprise at the depth and variety of content offered. This suggests that the biggest hurdle to increasing guide usage is promoting awareness of the resource and how it can benefit students—assuming the guide itself is well designed. Of course, a well-designed guide was the goal of this study, a goal the research team felt was accomplished. This can be seen from the last two rounds of testing, when four students asked when the guide would go live. This accounts for 25 percent of the students in the final two rounds, during which the overall design was very similar to the final blueprint.
Additionally, a significant number of students throughout the study said they would definitely recommend the guides to friends due to the value of the content, another positive sign.

Despite the success of this study, there are several limitations that should be considered if preparing a similar study. First, the research team felt the card sorting methodology was somewhat flawed, in that the cards were too closely modeled on extant LibGuides content boxes. In reviewing the results, questions arose about whether or not students actually read the description of several cards or if they simply made a decision based on the title of the card. Card titles also sometimes contained jargon that students clearly felt was confusing, likely impacting their ability to sort the card. Another limitation of the card sorting portion was the lack of direct feedback from participants. This resulted primarily from the last-minute change from group to individual sorting, which allowed no time to revise the protocol to include a think-aloud component, as was done during usability testing.

Usability testing had fewer overall limitations, although it was sometimes difficult to compare results from one round of testing to the next due to changes in the prototype guide. Despite this minor issue, however, consistent trends emerged. Additionally, constraints of physical space resulted in the research team only being able to conduct usability testing at one library location. This created a significant hurdle for participation by any students based on the other campus.

Finally, one limitation applicable throughout the study was heavy recruitment of participants from the pool of regular library users. Although some of the recruitment methods used may have reached students outside the library, the majority of participants were recruited while already in one of the library buildings. However, because very few students had interacted with the library research guides before, the research team felt that drawing from a pool of library users was unlikely to skew the results of the study in any significant way. Additionally, although online-only students were incidentally excluded from the study based on the use of face-to-face methods, the majority of students take multiple online courses during their time at the university, so the perspectives of online students were at least somewhat represented. However, a similar study focused on online learners would be a boon to scholarship in this area.

Conclusion

A wide range of issues arise in creating and maintaining a collection of subject guides that students can actually use. By creating a blueprint guide, librarians are able to save time in creating guides; however, it is important that the blueprint is created with the end user in mind. Design research methods are an effective strategy for evaluating student needs and preferences for subject guides. In this study, card sorting measured student preferences in guide content and organization, and usability testing evaluated the structure, aesthetics, and overall design. These methods were low-cost, required a minimal number of staff to organize, and are easily reproducible in most library settings. The result was a subject guide blueprint that reflects undergraduate student preferences in organization, structure, content, and aesthetics. In the end, this new blueprint guide was easily reproduced by a large team of librarians and adapted to a variety of subject guides.

Notes
1. Sarah Thorngate and Allison Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2," College & Research Libraries 78, no. 6 (2017): 845, https://doi.org/10.5860/crl.78.6.844.
2. Jennifer J. Little, "Cognitive Load Theory and Library Research Guides," Internet Reference Services Quarterly 15, no. 1 (2010): 8.
3. Caroline Sinkinson et al., "Guiding Design: Exposing Librarian and Student Mental Models of Research Guides," portal: Libraries and the Academy 12, no. 1 (January 2012): 63–84.
4. Jessica Alverson et al., "Creating Audience and Environment-Friendly Research Guides: Findings from a User Study," in Creating Sustainable Community: ACRL 2015 Conference Proceedings (ACRL 2015, Portland, OR: Association of College and Research Libraries, American Library Association, 2015), 127, http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2015/Alverson_Schwartz_Brunskill_Lefager.pdf.
5. Linsey Ford, Jennifer Holland, and Clarke Iakovakis, "I Don't Know What I'm Looking At," in At the Helm: Leading Transformation: The Proceedings of the ACRL 2017 Conference, ed. Dawn M. Mueller (ACRL 2017, Chicago, IL: Association of College and Research Libraries, American Library Association, 2017), 311–20, http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2017/IDontKnowWhatImLookingat.pdf.
6. Little, "Cognitive Load Theory and Library Research Guides."
7. Dana Ouellette, "Subject Guides in Academic Libraries: A User-Centred Study of Uses and Perceptions," Canadian Journal of Information and Library Science 35, no. 4 (December 2011): 447.
8. Alverson et al., "Creating Audience and Environment-Friendly Research Guides"; Kimberley Hintz et al., "Letting Students Take the Lead: A User-Centred Approach to Evaluating Subject Guides," Evidence Based Library and Information Practice 5, no. 4 (December 17, 2010): 39–52, https://doi.org/10.18438/B87C94; Rachel Hungerford and Lauren Ray, "LibGuides Usability Testing: Customizing a Product to Work for Your Users," Library and Staff Publications 49 (October 2010), https://digital.lib.washington.edu:443/researchworks/handle/1773/17101; Thorngate and Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2."
9. Hungerford and Ray, "LibGuides Usability Testing."
10. Alec Sonsteby and Jennifer DeJonghe, "Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study," Journal of Web Librarianship 7, no. 1 (2013): 83–94.
11. Alverson et al., "Creating Audience and Environment-Friendly Research Guides."
12. Thorngate and Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2."
13. Hintz et al., "Letting Students Take the Lead"; Ouellette, "Subject Guides in Academic Libraries"; Thorngate and Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2."
14. Sinkinson et al., "Guiding Design."
15. Patricia Gimenez, Stephanie Grimm, and Katy Parker, "Testing and Templates: Building Effective Research Guides" (September 25, 2015), http://digitalcommons.georgiasouthern.edu/gaintlit/2015/2015/37; Sinkinson et al., "Guiding Design."
16. Sonsteby and DeJonghe, "Usability Testing, User-Centered Design, and LibGuides Subject Guides."
17. Gimenez, Grimm, and Parker, "Testing and Templates"; Little, "Cognitive Load Theory and Library Research Guides."
18. Sonsteby and DeJonghe, "Usability Testing, User-Centered Design, and LibGuides Subject Guides"; Nora Almeida and Junior Tidal, "Mixed Methods Not Mixed Messages: Improving LibGuides with Student Usability Data," Evidence Based Library and Information Practice 12, no. 4 (December 30, 2017): 62, https://doi.org/10.18438/B8CD4T.
19. Gimenez, Grimm, and Parker, "Testing and Templates."
20. Little, "Cognitive Load Theory and Library Research Guides."
21. Little, "Cognitive Load Theory and Library Research Guides"; Ouellette, "Subject Guides in Academic Libraries."
22. Alverson et al., "Creating Audience and Environment-Friendly Research Guides"; Thorngate and Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2."
23. Alverson et al., "Creating Audience and Environment-Friendly Research Guides"; Hintz et al., "Letting Students Take the Lead"; Little, "Cognitive Load Theory and Library Research Guides."
24. Little, "Cognitive Load Theory and Library Research Guides."
25. Almeida and Tidal, "Mixed Methods Not Mixed Messages."
26. Donna Spencer, Card Sorting: Designing Usable Categories (New York, NY: Rosenfeld Media, 2009).
27. Donna Spencer, "Card Sorting: Resources," Rosenfeld Media, http://rosenfeldmedia.com/books/card-sorting/ [accessed 13 December 2017].
28. Spencer, Card Sorting.
29. The first three rounds of usability testing used a prototype English guide, a subject at least somewhat familiar to all students. However, the research team also wanted to ensure the design worked for multiple subjects, so the fourth round of testing used a Political Science guide.
30. Steve Krug, Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems, Voices That Matter (Berkeley, CA: New Riders, 2010).
31. Krug, Rocket Surgery Made Easy.
32. One student created duplicate cards for most of the card deck and used a scheme whose meaning was unclear, rendering their results unusable.
33. The rubric used can be found here: Mike Kuniavsky, Observing the User Experience: A Practitioner's Guide to User Research, The Morgan Kaufmann Series in Interactive Technologies (San Francisco, CA: Morgan Kaufmann, 2003).
34. Except the final step, which links to the university's writing center website, as the library does not provide writing support.
35. Ouellette, "Subject Guides in Academic Libraries."
36. Gimenez, Grimm, and Parker, "Testing and Templates"; Sinkinson et al., "Guiding Design."
37. Thorngate and Hoden, "Exploratory Usability Testing of User Interface Options in LibGuides 2."