Information Literacy Instruction and Assessment in an Honors College Science Fundamentals Course

Corey M. Johnson, Carol M. Anelli, Betty J. Galbraith, and Kimberly A. Green

Corey M. Johnson is Head, Library Instruction, in the Holland Library at Washington State University; e-mail: coreyj@wsu.edu. Carol M. Anelli is Professor of Entomology at Washington State University; e-mail: carol_anelli@wsu.edu. Betty J. Galbraith is Science Librarian and Instruction Coordinator in Owen Library at Washington State University; e-mail: bettyg@wsu.edu. Kimberly A. Green is Director, Office of Assessment of Teaching and Learning at Washington State University; e-mail: kimberly_green@wsu.edu. © Corey M. Johnson, Carol M. Anelli, Betty J. Galbraith, and Kimberly A. Green

The Washington State University Honors College course, UH 290, "Science as a Way of Knowing," engages freshmen in scientific inquiry and scholarly literature research. The UH 290 instructor, a learning design consultant, and two librarians collaborated to develop and deliver the course's information literacy curriculum. The team used student surveys, research blogs, case studies, library instruction sessions, homework problem sets, and exams. Students gained from scaffolded instruction, including hands-on practice activities and feedback; embedded assessments informed adjustments to the course syllabus and activities. This case study details the progressively improved use of this approach and these tools over two semesters.

I. Background

Information Literacy at Washington State University and Its Honors College

In 2005, the Washington State University (WSU) Faculty Senate endorsed the "Six Learning Goals of the Baccalaureate," which were developed by the President's Teaching Academy. These goals (critical and creative thinking, quantitative and symbolic reasoning, information literacy, communication, self in society, specialty knowledge) were envisioned as a means to (1) articulate educational learning goals for all undergraduates irrespective of major; (2) serve as a framework for course and curricular design; and (3) help faculty and administrators align course and curricular goals for assessment and accreditation.1

The WSU Honors College adopted WSU's "Six Learning Goals," distributing them on bookmarks in student-friendly language and providing faculty workshops to help students achieve the goals. Serendipitously, endorsement of the goals coincided with a complete overhaul of the Honors College curriculum, which satisfies general distribution requirements of the university. In the new curriculum, all 200-level courses emphasize information literacy (including research in the primary literature), critical thinking, and "ways of knowing" in the social sciences, arts and humanities, or sciences; student acquisition of these skills is prioritized over course content, which varies among instructors. This last point is important, as teaching faculty often prioritize subject coverage above skill-based goals (such as enhancement of information-seeking abilities).2 By formal agreement, all Honors English Composition (Engl 298) sections have at least one library instruction session. Many other 200-level honors courses include library instruction; this is often the case for UH 290, depending on the instructor.
Information Literacy at Research Universities: Library Involvement

Like WSU, many research universities across the nation have taken steps to embed information literacy into their overarching goals and curriculum. Current roles for librarians in these processes vary greatly. It is clear that university stakeholders are increasingly deemphasizing traditional library quality measures (such as volume counts or number of books checked out) and instead want to know what students are able to do as a result of their interaction with library services and resources.3 While this mindset bodes well for robust librarian involvement in student information literacy skill building, there are still barriers to full participation. For example, at large institutions where departmental autonomy is often paramount, opportunities for developing anything beyond the traditional "one-shot" library session are uncommon.4 Also, university administrators often disagree about whether faculty or librarians should teach information literacy and do not regularly recognize and reward faculty/librarian collaborative information literacy projects.5 Some higher education accreditation agencies view libraries primarily as resource providers and minimize their role concerning information literacy instruction.6 Finally, within the arena of honors undergraduate education, many instructors falsely assume that these motivated students already possess the skills needed to identify and locate scholarly resources.7

One task is for librarians to gain quality opportunities to teach information literacy; it is additionally critical to assess information literacy skill development. In other words, librarians assume they contribute to student learning and, in many instances, have created outcomes and are doing more teaching; now they need to assess the impact of this work.8 In 2008, a survey of Louisiana schools showed that, while information literacy was more formally defined across the respondent institutions, only half the institutions identified any type of assessment.9 The UH 290 information literacy instruction described in this paper features course planning, teaching, and assessment.

Assessment of Information Literacy

There are many dimensions to successful assessment. The American Association for Higher Education and Accreditation (AAHEA) has created a set of best practices for assessment. They posit that assessment is most effective when it "reflects an understanding of learning as multidimensional, integrated and revealed in performance over time."10 Learning entails not only what students know but what they can do with what they know. For UH 290, the authors devised an array of ongoing assessment techniques, including online threaded discussions, culminating assignments, and group exams that challenged students to apply concepts learned to new contexts.

Another key to effective assessment is involving representatives from across the educational community. "Faculty play an important role, but assessment's questions can't be fully addressed without participation by student affairs educators, librarians, administrators and students."11 The UH 290 team included active participation and contributions from a subject expert (professor and instructor of record), an assessment specialist from the university's center for teaching excellence, and two instruction librarians.
Traditionally, a central facet of most assessment efforts is the student exam, but the higher education literature concerning testing continues to evolve. Educators and other learning experts increasingly find that fixed-choice tests are limited in their accurate measurement of student learning because, for example, they impose unrealistic time constraints and do not test higher-level thinking skills.12 More generally, tests typically create artificial situations that do not judge how the learner would react in a real-world situation.13 Performance-based tests are preferable, as they simulate instances where students would appropriately use new skills and knowledge.14 UH 290 includes small-group take-home exams (3–4 students/group) and several case studies that demand critical thinking and performance to address authentic problems. Small-group activities allow students to discuss and potentially deepen their thinking; when this process is captured online (or by other means), instructors can see the students' strategies and successes as well as conceptual blocks and/or misperceptions. This information can in turn guide instructional improvement. Loanne Snavely and Carol Wright have successfully used research portfolios with honors students to both "track the [research] process as well as individual progress."15

Information Literacy Instruction: Examples in Honors and Biology

Despite a general scarcity in the library literature about information literacy instruction for honors students, there are several projects of note.16 In the early 1990s, librarians Abigail Loomis and Patricia Herrling collaborated with biology professors in the development and execution of an honors course focusing on evolution, ecology, and genetics. The librarians were pleased that their efforts constituted what they described as "course-integrated instruction." The biology professors and librarians worked jointly to design the information literacy teaching sessions and the student work accompanying the lessons. The student deliverables were assessed by both the teaching faculty and the librarians, and credit was assigned to the work.17 This model closely parallels the information literacy components of UH 290.

While Loomis and Herrling were largely satisfied with their collaborative project, they outline many factors that inhibited success. One central problem was the abbreviated time the professors allotted for information literacy activities. As the syllabus was developed, information literacy became marginalized; the library modules were pushed from the main lecture days to discussion recitations led by teaching assistants.18 The faculty essentially felt that subject coverage was more important than the development of information-seeking skills within the discipline.19 As discussed later, in UH 290 the instructor initially did not provide sufficient class time for the librarians to thoroughly teach information literacy skills, a problem that was quickly rectified.

Loomis and Herrling also found some biology professors averse to the idea of teaching the process of information seeking, reasoning that they did not have formal training and that information literacy skills are proficiencies one "picks up here and there."20 Ironically, the same professors who classified information literacy skills as relatively intuitive also expressed concern about the potentially high level of difficulty embodied in the librarian-generated assessment pieces.
More specifically, many professors felt the use of scientific reference materials and scholarly journal articles would overwhelm students.21 Unfortunately, because the librarians created class assignments under the influence of these professors' ideas, honors students judged the assignments as too simplistic.22 These problems were averted in the case of UH 290, as the professor viewed information literacy skill-building as a critical course goal, along with the adept use of library resources for scholarly research.

In addition to Loomis and Herrling, Elizabeth Kraemer wrote about her experiences developing information literacy instruction for honors students at Oakland University. Kraemer draws a number of conclusions about honors students' responses to information literacy. They place high demands on themselves in terms of academic performance, yet they are similar to other students in their tendency to experience "library anxiety" and to be ill-informed about information techniques and strategies.23 Regarding pedagogy for honors students, Kraemer recommends small-group work, "advanced reference book usage," time for class discussion, and problem-solving work.24 All of these activities were used in UH 290. Kraemer provided library instruction as part of her Honors College Introduction to the Thesis course. During the first semester of group work, students had two class sessions with librarians; this was determined to be too few and thus was expanded to four sessions. This change helped the librarian become better acquainted with the students, and more of them sought research consultations from the librarians in the second semester.25 In the case of UH 290, the authors learned from experience that one session was insufficient to thoroughly teach information literacy skills.

Librarian Ignacio Ferrer-Vincent and biology professor Christy Carello collaborated on a biology laboratory course, devising an overarching plan that included specific learning outcomes and assessment activities.26 Library instruction included the following conceptual areas: scholarly vs. popular sources, primary vs. secondary sources, selection of appropriate databases, procurement of scholarly articles, and identification of articles as peer reviewed. Teaching strategies consisted of pretask instruction, a ten-minute group presentation based on a research assignment, and assessment via an end-of-semester survey.27 Each element of this overarching plan was also part of the UH 290 course.

In the end-of-semester feedback, Ferrer-Vincent and Carello's students self-reported an increase in their use of subject-specific databases for scholarly information.28 However, student difficulties persisted in two areas: (1) procuring articles through the library catalog; and (2) successfully locating peer-reviewed articles.29 As discussed below, the authors' experiences with UH 290 mirror the successes and lingering problems described by Ferrer-Vincent and Carello.

II. UH 290 Collaborative Design: Pilot and Redesign

First Semester: Classroom Activities, Surveys, Assignments

To address its information literacy learning outcome, UH 290 was designed to include instruction by librarians and a series of activities requiring students to identify their information need and to search and evaluate scholarly sources. The course also included several assessment activities to identify students' achievement, their process and perceptions, and any bottlenecks hindering this outcome.
During the first semester of collaboration, the instructor presented several lectures on using scientific primary research articles. To augment the lectures, students read Gillen,30 which targets scientific information literacy skills, and Moran,31 which served as a model for homework assignments. The instructor asked the librarians to use one 50-minute session to cover how to select science databases and how to search for and procure scholarly sources. The instructor additionally requested that the librarians address: parts of a scientific article, popular vs. scholarly sources, primary vs. secondary sources, and criteria for evaluating sources. The librarians prepared a lesson that included a series of short lecture segments about each topic.

As a direct measure of the effectiveness of the instruction session (and the instructor's lectures and homework assignments), students were asked to complete a take-home group midterm exam with questions based on the aforementioned library research skills (see Appendix A). In general, students performed poorly: 40 percent of student teams (groups of 4 students) could not locate scholarly articles using library resources; comments in their online threaded discussions revealed that students were frequently searching Google because they did not know how to locate resources through the library. The students' exam performance was disappointing not only because it came directly after a library instruction session but also because, on a precourse self-assessment survey, 50 percent of the students had reported being "comfortable conducting a literature search." The high self-assessment of honors students can mask their real skill level and pose challenges to library instruction if it is perceived to be unnecessary or even remedial.

Assessment activities also showed that students had difficulty classifying the articles they found. On the precourse self-assessment, students were asked to respond to this open-ended prompt: "In your own words, define a primary research article and explain its purpose in the scientific world." They did so with a high rate of success; eight of the fourteen students provided quality explanations, four had some correct elements, and only two were wholly incorrect. Thus, the vast majority at least conveyed the basic idea that a primary research article reports the results of the author's empirical research. Yet, while students could supply definitions, they were not nearly as successful at applying that knowledge. One homework assignment asked students to select an article from the bibliography of the Moran article. Only ten of the fifteen students correctly identified their article as primary or secondary; all five errors resulted from incorrectly naming primary sources as secondary sources. Upon discussion with the students, the instructor and librarians uncovered a core misunderstanding: because a literature review analyzes research in which the authors of the article did not themselves engage, the students reasoned that any article containing a literature review must be secondary.

In light of student difficulties locating and classifying scholarly resources, the instructor and librarians modified the syllabus, adding two additional instructional sessions.
During these teaching lessons, the librarians took the time to demonstrate how to find scholarly articles from the item's citation and showed examples of primary and secondary articles in the field of evolutionary biology: in effect, modeling these key information literacy skills. Students' responses on subsequent assignments and exams showed marked improvement (discussed below). On the postcourse survey, 82 percent of students reported that they could retrieve electronic resources through the library, a substantial increase (31%) over the precourse self-assessment of this skill.

The precourse survey included two other questions related to information literacy: (1) explain how scientists communicate findings with their peers; and (2) explain the differences among primary research articles, research review articles, and research articles in Scientific American. Concerning how scientists communicate findings, 93 percent of the students' precourse answers were on target. As one might imagine, the postcourse answers were correct as well, but this time they included relevant vocabulary terms/phrases such as "peer-reviewed" and "primary research article." Regarding the second question and the definition of a "research review article," about one-third of the students had initially stated, incorrectly, that a review article was a critical outside examination of one other study/primary article, instead of a synthesis of multiple articles in the same research field or subfield.

First Semester: Exams

During the first collaborative semester, student performance on group take-home exams (see Appendix A) demonstrated some successful applications of target information literacy skills. The first exam challenged students to locate an original, peer-reviewed article that had been the focus of a news wire story. The students were then asked 10 subquestions (Parts A–J), several of which required information literacy. Part A asked the students whether the article was a peer-reviewed publication and what evidence would allow them to answer with "absolute certainty." All five groups said correctly that the article was peer reviewed, but two of the groups did not indicate that using the journal's Web site or Ulrich's Periodicals Directory was the way to be certain of this fact. In Part I, the students had to classify a particular article from the bibliography by type of source. Only two of the five groups correctly identified the paper as a review article. Part J asked the students to retrieve another paper from the bibliography via library resources. Four of the five groups successfully found the article and were able to provide an accurate summary of the piece.

First Semester: Culminating Class Project

In addition to the three library instruction sessions in the early and middle parts of the semester, the librarians worked with the class on a five-day case study about the Galápagos Islands.32 The case study features events from the life of Kate, a graduate student new to the islands and seeking a research study for her doctoral work. The first two parts of the three-part activity challenge the students to understand the geological past and present of the islands, explore the development of their tropical flora and fauna, and define a host of evolutionary and ecological terms. The students worked in groups of four to five to tackle nineteen short-answer questions on these topics.
The librarians aided the students' efforts by providing a small collection of relevant print reference works and showing them pertinent scholarly studies. The students were to cite their sources for each answer and, when they used open Web resources, to explain why the source was credible.

Part III of the case study is entitled "The Tortoise and the Sea Cucumber." After many years of conflicting interests among constituent groups (scientists, tour guides, environmental groups, and sea cucumber fishermen), a sea cucumber crisis erupted in the Galápagos in the early to mid-1990s. The students worked in groups, researching positions for each of these constituent groups and, ultimately, through class discussion and negotiation, devising a compromise solution. The librarians taught the students about newspaper and magazine databases they would find fruitful in researching this topic. The groups' final case study reports illustrated research sophistication at a much higher level than at the beginning of the semester. Just over three-quarters (76%) of resources cited in the students' final case study derived from library sources. More than half (53%) of their citations were to scholarly journal sources, and only 13 percent were to nonacademic Web sites.

Student responses on the final course evaluations reflected their view that information literacy was a central component of the course. When asked their judgment about the most significant course outcomes, 5 of the 14 students specifically mentioned understanding and using scientific research (see figure 1 for the responses). Most students also gained confidence in doing quality literature searches (see figure 2).

FIGURE 1
First Semester Course Evaluations: Most Significant Concepts Learned
"I learned how to use the library system and look up journal articles relevant to my research."
"To learn how to work with others effectively. Also, how to find articles through the libraries system."
"I got experience reading scientific papers…"
"Learning how to read scientific papers."
"This course taught me to critically analyze scientific research…"

FIGURE 2
First Semester Course Evaluations: New Skills Acquired
100% agree that in this course they made judgments about the value of information.
84% of students reported that they felt comfortable searching for primary scientific literature (up 35% from the precourse survey).
100% of students reported that they could locate related scholarly information.
71% of students reported that the library sessions helped them learn how to find and retrieve electronic resources through the WSU Libraries.

Second Semester: Classroom Activities, Surveys, Assignments

Assessment of student performance data and student feedback from the first semester guided several key changes in the course design and instructional approach, including library instruction, during the second semester. Changes included more modeling of target skills combined with increased hands-on practice and timely feedback. In addition, activities were more carefully scaffolded during the semester for improved skill development.

The experiences of the first semester led the instruction team to incorporate the expanded set of four library instruction sessions for the second semester, adjust the instructional approach, and add practice activities with feedback. The librarians focused on proactively teaching skills to search for and evaluate scholarly articles, the two primary student problem areas from the first semester.

To address the troubles students previously had using library resources to find articles, the librarians demonstrated the process using recent scientific articles, monitored students as they found several articles in class, gave feedback as needed, and added a new short homework assignment requiring students to find another article. All 18 students successfully found an article.
In the second semester, only 27 percent of students were unable to find the Moran homework article, an improvement from 60 percent in the first semester.

A new activity was added to the course to address the difficulties students had categorizing articles as primary or secondary. First, the students accessed a WSU Libraries' Web page (www.wsulibs.wsu.edu/usered/UH290.html) that provides a comprehensive overview of the differences among primary, secondary, and tertiary sources and review articles. After reading a specified subset of the learning modules, the full class debriefed about knowledge gained. Only two of the 20 students subsequently erred on the homework assignment by labeling primary sources as secondary, an improvement over the one-third who answered incorrectly the first semester. However, the teaching experience still needed enhancement, as the two students faltered due to the misapprehension that inclusion of a literature review makes an article secondary (the same issue as the first semester).

Additionally, second-semester homework assignments were changed to require use of CSE citation style. The instruction team wanted the students to have experience with the citation style most closely related to the disciplinary focus of the course. Although the librarians provided the URLs for several online CSE guides, the students struggled to produce accurate citations. First, the librarians realized they had neglected to explain that CSE has two formats; this fact created student confusion. Second, students had numerous issues with author name abbreviation and order, as well as article title capitalization. Third, students largely cited articles as if they were retrieved in print, when they should have included electronic database information. Fourth, and most important, the librarians had not scaffolded the process, leaving out the steps of modeling the skill and then providing guided in-class practice with feedback.
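To make the format confusion concrete, here is an illustration of the two CSE systems the students encountered. This example is ours, not part of the original course materials: the article and journal are invented, and the punctuation approximates CSE conventions, so it should be checked against a current CSE guide.

Name-year system (cited in text as "(Smith 2010)"):
Smith AB. 2010. Effects of egg size on larval survival in a hypothetical marine bivalve. J Hypothet Biol. 12(3):45–52.

Citation-sequence system (cited in text by number, in order of first mention):
1. Smith AB. Effects of egg size on larval survival in a hypothetical marine bivalve. J Hypothet Biol. 2010;12(3):45–52.

Note how the placement of the publication year shifts between the two systems; this is exactly the kind of detail the guided in-class practice described above was later designed to catch.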
During the second semester, the librarians added a library research skills self-assessment at the beginning of the first library instruction session, which provided additional information regarding students' prior experience and understanding of specific information literacy skills. (This tool was not used during the first semester because of lack of time.) The key results of this self-assessment are presented in figure 3.

FIGURE 3
Second Semester Pre-Library Instruction Student Self-Assessments
1. 15 of the 16 students had had a prior library instruction session at WSU
2. 7 of 16 had not used library resources beyond the catalog and our general multidisciplinary database
3. less than half of the students knew basic library research vocabulary such as serial, manuscript, literature search, review article
4. only 3 of the 16 knew where to find library subject guides
5. none of the students knew the four ways to contact reference librarians
6. only 25% knew the basic difference between material in our catalog vs. article databases
7. only 1 of 16 was familiar with WSU's central article interlibrary loan service: Article Reach
8. only half knew how to use truncation in searching
9. only 6 of 16 could put three Library of Congress call numbers in shelf order

Item 2 in figure 3 points to why the students struggled to find scholarly articles through discipline-specific article databases: nearly half had not previously used them. Item 3 illustrates that students largely did not know what a review article was. However, "primary source" was a vocabulary term in the list that 15 out of the 16 students believed they could confidently define. This matches the precourse survey results from both semesters, in which nearly all students correctly defined the term. Despite students' ability to define the term "primary source," and despite their past library instruction experiences, the overall assessment results illustrated that they needed to learn more about accessing and using library materials and services.

Second Semester: Exams and Culminating Class Project

The group take-home exam from the second semester (see Appendix B), though slightly different from the first-semester exam, required students to demonstrate many of the same skills. Question 1, Part A (re: scientist Randy Thornhill), challenged students to classify an article as primary or secondary. Given all of the prior in-class and out-of-class work on this topic earlier in the semester, it was gratifying that all five groups made the correct identification. Question 2, Parts A and B, asked whether an article was peer reviewed and what evidence would allow the students to answer with "absolute certainty." All five groups said correctly that the article was peer reviewed (the same as the first semester), and only one of the groups did not indicate that using the journal's Web site or Ulrich's Periodicals Directory was the way to be certain of this fact (which two groups had missed on the first-semester exam).

The latter parts of Question 1 focused on functionalities of Web of Knowledge (Science) and contextual use of criteria to evaluate scientific authority. Question 1, Part E, read as follows: "Search Steve Gangestad using the Web of Science 'Cited Reference Search' from 2000–2009 (scroll down to set search limits). How many of his publications have been cited? What are his two most cited publications?" With hindsight, the instruction team realized that the wording of the exam question was problematic. First, there are actually two date limiters on the Cited Reference Search interface, and the instruction team neglected to specify which one to use. One limits the search to articles that have received cites during the specified years; the other includes only articles published during the specified years and receiving cites during those same years. Four of the five student groups opted to use the latter search functionality. Second, students were confused by the term "publication." The Cited Reference Search results list contains a column titled "Cited Work" that features the name of the journal for that article record; a subsequent click in a separate column is necessary to show the title of the article. In the exam question, the instruction team meant "publication" as individual article titles, but one group interpreted "publication" to mean the journal title.
To answer the question correctly, students had to grapple with the number of S. Gangestads in the Web of Science database, of which there were six at the time. Three of the five groups examined the Web of Science results and "our" Gangestad's Web site closely enough to discover that his middle initial is "W" and that he uses it in his professional work. The instruction team was impressed that 60 percent of the groups were able to successfully navigate name variations in the Web of Science even though this particular issue was not discussed in any library instruction session.

Overall, the instruction team learned a few important lessons from this experience. First, although the librarians had demonstrated topic searching in Web of Science and shown an example of how article records can be a launching point to prior works (from the bibliography) and future works (cited references), the librarians had modeled neither the use of the Cited Reference Search feature nor, consequently, its set of limiters. Also, the librarians could have shown the "Author Finder" feature, which would have helped as well. The exam question asked students to stretch their knowledge to a new area of Web of Science, but unfortunately the wording of the question lacked specificity. In the future, the librarians and instructor will collaborate more closely on question creation.

Question 1, Part F, read: "Find Randy Thornhill's webpage and scrutinize his curriculum vitae. By what criteria would you evaluate his scientific authority? THINK. Cite detailed evidence to support your evaluation—use specific criteria and evaluate his performance according to those criteria." Figure 4 shows the grading criteria that were used for this question.

FIGURE 4
Second Semester Exam Question 1, Part F, Answer Key
(1) Professional affiliation: U NM, Distinguished Professor
(1) Education: advanced degrees (BS, MB, PhD) in zoology or related science
(.5) Grad students (there are more than those listed)
(4) Selected pubs (there are more than those listed)
(.5) Length of career: began in 1980 (29 yrs ago)
(.25) Consistency of scientific output: pubs every year
(1) Academic discipline: human sexuality, evolution of sexuality, etc.
(1) Primary source quality & number: numerous v. high impact, peer reviewed
(1) Review articles quality & number: many v. high impact
(.25) Other scientific pubs, quality & number: numerous book chapters in best university presses (Cambridge U, Oxford U, Harvard U)
(.5) Research featured in prominent, respected popular outlets (Nat. Geog., BBC, many national and international radio and TV programs)

The students did well with this question in terms of listing facets of his educational background and current employment. All five groups noted the quantity of his publications; three of the five outlined quality measures as well, including Journal Impact Factors and peer-reviewed status. The students largely missed the length of his career and the exceptional consistency and high impact of his work as indicators of strong scientific authority. Overall, out of 30 points available for this question, the groups earned an average total of 26. In the future, the instruction team will design a class activity on comparison of strong vs. weak researcher qualities as a measure of scientific authority.

The Galápagos case study was the culminating project both semesters. Changes made to the second semester's instruction included the following: students received a specific handout outlining ways to search for the Galápagos project in the newspaper and magazine databases, and the instructor highlighted the importance of the Galápagos research presentations. These were held in the Honors College Lounge (a fancier venue than the regular classroom), and a set of teaching faculty members and librarians were invited to hear the presentations and ask questions. A student evaluation of the library instruction portion of the Galápagos experience was added, with mixed results. Although some students classified the library instruction as repetitive of prior sessions and of library instruction sessions from other courses, many were very appreciative of guidance concerning the specific article indexes for the Galápagos work.
First and Second Semester Comparative Summary

The second-semester syllabus maintained the increased number of library instruction sessions from the first semester. From the student comments above, it seems important that the librarians poll the class beforehand and then tailor instruction and activities to (1) ensure these sessions are not perceived as repetitive; (2) clearly articulate the sophisticated skill development required for completion of the culminating case study; and/or (3) design practice activities to accommodate a variety of skill levels. Citation style formatting (CSE) and the features of Web of Science are two areas in which students could use more instruction (see above). The groups' final case study reports illustrated a high level of research sophistication. During the first semester, 76 percent of resources cited in the students' final case study derived from library sources, and 53 percent of their citations were to scholarly journal sources, with only 13 percent from Web sites. In the second semester, these values were 77 percent, 43 percent, and 23 percent, respectively.

In both semesters, when students assessed the course as a whole, they identified information literacy as a course focus, specifically noting the value of learning how to read and interpret the different sections of a science research article and how to find credible sources when doing research.

III. Discussion/Conclusion

It has been shown repeatedly that undergraduate students underestimate their need for advanced library instruction. Perhaps buoyed by their high school research experience, they enter college convinced that they know all they need and express confidence regarding their research skills. Testing this entry-level mindset against actual college-level research often shows that they lack the skills
to locate scholarly resources. Honors students in UH 290 fit this pattern. Early assessment of students' collaborative online work revealed that 40 percent of the groups could not locate scholarly journals using library resources. Students were searching with Google because they did not know how to locate resources through the WSU Libraries.

In this course, frequent assessment of student skills revealed gaps, misperceptions, and bottlenecks that limited success in target information literacy outcomes. Timely assessment results gave the instructional team information that they used to make instructional changes. Realizing that the students had not learned core information literacy techniques, the instructional team engaged in midcourse corrections to the syllabus and learning activities, resulting in significant improvements in student learning.

Multiple assessment techniques were integrated into the course design. Performance-based homework assignments and exam questions required students to apply and explain information literacy skills, providing direct measurements. Pre- and post-tests allowed the instructor and librarians to measure changes in student comprehension. Student surveys were used to gauge students' self-perceptions of their knowledge and skills. Following online threaded group discussions allowed the instructor and librarians to immediately see misperceptions and bottlenecks and to address them via timely feedback and adjustments to the syllabus, teaching approach, and learning activities.

In the final research project, students demonstrated their increased understanding of the types of material suitable for academic papers and their ability to locate the material. The final works-cited lists for each semester contained, respectively, 53 percent/43 percent citations from scholarly sources and 76 percent/77 percent from library sources. The Web sites used (13%/23%) were authoritative and reliable. The remaining sources from newspapers and magazines were appropriate to the opinion analysis the Galápagos project entailed. A total of 29 percent of the bibliographic entries were noted as excellent by the professor. This was a great improvement from roughly half the students not being able to find articles in the library and the initial inability of many students to judge quality and authority of resources at the beginning of each semester.

It was gratifying that, according to postcourse surveys from both semesters, students strongly agreed/agreed that they are now comfortable conducting a literature search for primary scientific articles and that they have enhanced their ability to critically analyze a scientific paper. The authors feel that this final result was a reflection of our truly integrative and collaborative instruction.

The primary course instructor, librarians, and assessment expert were involved in all facets of the course from the outset, with librarians contributing to course design. Library instruction sessions were integral to the course: librarians provided instruction, in-house practice with feedback, and selected assignments. Library homework assignments were integral to course goals and content, not just "add-ons" for the purpose of teaching library skills. This close collaboration enabled the instruction team to make effective midcourse corrections, addressing deficiencies in students' understanding and performance. Over the two semesters, the instruction team implemented scaffolded instruction, and the assessment specialist helped ensure that multiple measures were complementary and that changes were assessed.

Advice to Teaching Librarians

• Don't be afraid to ask to become more involved in the course on which you are collaborating. You will be more effective if your library sessions are integral to the course rather than merely appended.

• Look over assignments and offer suggestions. By including information literacy elements in many assignments, the students will start to equate good library research skills with good practice in the science they are studying. This also offers more opportunities to assess their working skills.
• Don't be apprehensive about suggesting information literacy questions on exams that call on students to apply their research skills to the discipline. These types of questions illustrate students' authentic use of research techniques.

• Major course projects offer a perfect opportunity to ensure that the students can effectively incorporate library research with their science while impressing on them the critical role of library research in the scientific process.

• Assessment is not just a way to analyze how successful you were at teaching library skills. It is a method for evaluating student performance and refining your approaches.

• Assessment techniques should be varied: for example, pre- and postcourse surveys, graded and ungraded class assignments, and exams (timed, take-home, individual, group).

• Don't become disheartened. Next semester offers the chance to do it better!

Appendix A: Question Four from First Semester Take-Home Exam

4. Go to the following link: http://www.reuters.com/article/email/idUSN0742189220080508 and read about "sexy orchids." Then, locate the original publication (via WSU Libraries) and answer the following questions:

a. Is this a peer-reviewed publication? What evidence can you provide that allows you to answer this question with absolute certainty? (Hint: Remember our trip to Owen Science Library.)
b. From the Introduction: What two null hypotheses do the authors test here? (Note: I am not referring to their meta-analyses, and your answer must be in the form of null hypothesis statements.)
c. How many blobs of sperm ejaculate were brought from the field to the lab for analysis?
d. How many total L. excelsa did the researchers use to generate the data presented in Fig. 2?
e. Where did the researchers obtain the data that they used to conduct their first meta-analysis?
f. Figure 3 states, "Orchid species causing ejaculation…have higher pollination success than orchids stimulating less extreme sexual behavior…" Is this statement supported by statistically significant data? On what do you base your opinion?
g. In the Discussion section, the researchers state, "Pollinators of Cryptostylis…do learn to avoid sexually deceptive orchids." Did they demonstrate this in their study? On what specifically do you base your opinion?
h. In their Abstract the researchers state, "…female insects deprived of matings by orchid deception could still produce male offspring, which may even enhance orchid pollination." What evidence do they present in support of that hypothesis?
i. In the Literature Cited section, what type of source is the publication by Wedell et al. (2002)?
j. The authors cite a publication by Schiestl et al. (1999). Retrieve this via WSU Libraries. In your own words (50 or less!), what did Schiestl et al. demonstrate?

Appendix B: Question One from Second Semester Take-Home Exam

Information Literacy, re: Human evolutionary biology. Retrieve these articles via WSU Libraries and answer the questions that follow.

• Thornhill and Gangestad. 1996. Trends in Ecology & Evolution 11(2):98–101.
a) Is this a primary or secondary article?
b) What is this journal's impact factor?
c) From p. 98: What do studies in evolutionary psychology indicate about the sexual psyches of heterosexual men vs. women? (Use specific examples and state in your own words.) Why does theory predict the evolution of these behavioral differences?
d) From p. 100: In plain English, what is fluctuating asymmetry (FA)? Name 2 things that cause it to increase, 4 different things that it predicts, and explain why physical attractiveness may be of evolutionary significance.
e) Search Steve Gangestad using the Web of Science "Cited Reference Search" from 2000–2009 (scroll down to set search limits). How many of his publications have been cited? What are his two most cited publications?
f) Find Randy Thornhill's Web page and scrutinize his curriculum vitae. By what criteria would you evaluate his scientific authority? THINK. Cite detailed evidence to support your evaluation—use specific criteria and evaluate his performance according to those criteria.

• Koehler et al. 2002. Animal Behavior 64:233–238. [Note: you only need p. 233]
a) Is the journal peer reviewed? How do you know for certain?
b) State the null hypothesis for this study.
c) Did these researchers accept or reject the null?
d) What is this journal's impact factor?

• Kuukasjärvi et al. 2004. Behav. Ecol. 15(4):579–584.
a) What is concealed ovulation?
b) Do dogs have this? What's your evidence?
c) What is the reason the study includes women who were on oral contraceptives?
d) What is the reason for including women as raters?
e) In the Methods section, "Odor rating sessions," why do the authors state that the supervising researchers did not know who had worn the T-shirts?
f) Which of the regressions (a through d) depicted in Figure 1 is/are significant? How do you know this?
g) What is this journal's impact factor?

• Miller et al. 2007. Evolution and Human Behavior 28:375–381.
a) What is the impact factor for this journal?
b) From Introduction: What are the two competing views regarding human female estrus? Summarize the findings of the four "real-world" situation studies cited by Miller et al. What critiques do the current authors note regarding these studies?
c) From Introduction: Why do Miller et al. argue that estrus attractiveness effects may be stronger in their study than in other kinds of psychology research studies?
d) From Results, 4.3: In your own words, what two planned contrasts did the researchers make (Fig. 2), and which were statistically significant?
e) From Discussion: According to the authors, what do their findings suggest about the concealed ovulation model?
f) For fun: The authors accepted a 2008 IgNobel award for their research (see lower video @ time=1:18:00): http://improbable.com/ig/2008/webcast/

Notes

1. President's Teaching Academy, Washington State University, available online at http://vpue.wsu.edu/overview/sixgoals/ [accessed 22 June 2010].
2. Abigail Loomis and Patricia Herrling, "Course-Integrated Honors Instruction: Pros and Cons," in What Is Good Instruction Now? (Ann Arbor, MI: Pierian Press, 1993), 84.
3. Megan Oakleaf, "Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches," portal: Libraries and the Academy 8, no. 3 (2008): 233–34.
4. Loomis and Herrling, "Course-Integrated Honors Instruction," 84.
5. Debra Cox Rollins, Jessica Hutchings, Melissa Ursula Dawn Goldsmith, and Anthony J. Fonseca, "Are We There Yet? The Difficult Road to Re-Create Information Literacy," portal: Libraries and the Academy 9, no. 4 (2009): 454, 464.
6. Ibid., 456–57.
7. Catherine Frasier Riehle, "Partnering and Programming for Undergraduate Honors Students," Reference Services Review 36, no. 1 (2008): 49; Renee B. Bush and Margaret R. Wells, "Bibliographic Instruction for Honors Students: The University at Buffalo Experience," Research Strategies 8, no. 3 (1990): 137.
8. Oakleaf, "Dangers and Opportunities," 234.
9. Rollins et al., "Are We There Yet?" 463.
10. American Association for Higher Education and Accreditation (AAHEA), "9 Principles of Good Practice for Assessing Student Learning" (1996), available online at http://assessment.uncg.edu/9Principles.pdf [accessed 22 June 2010].
11. Ibid.
12. Oakleaf, "Dangers and Opportunities," 237.
13. Ibid., 238.
14. Ibid., 240.
15. Loanne L. Snavely and Carol A. Wright, "Research Portfolio Use in Undergraduate Honors Education: Assessment Tool and Model for Future Work," The Journal of Academic Librarianship 29, no. 5 (2003): 298–303.
16. Elizabeth W. Kraemer, "Developing Information Literacy Instruction for Honors Students at Oakland University: An Information Consulting Approach," College & Undergraduate Libraries 14, no. 3 (2007): 64.
17. Loomis and Herrling, "Course-Integrated Honors Instruction," 83.
18. Ibid., 85.
19. Ibid., 85, 89.
20. Ibid., 85, 86.
21. Ibid., 87.
22. Ibid., 86, 87.
23. Kraemer, "Developing Information Literacy Instruction," 64, 65; Snavely and Wright, "Research Portfolio Use," 299.
24. Kraemer, "Developing Information Literacy Instruction," 66, 67.
25. Kraemer, "Developing Information Literacy Instruction," 70.
26. Ignacio J. Ferrer-Vincent and Christy A. Carello, "Embedded Library Instruction in a First-Year Biology Laboratory Course," Science & Technology Libraries 28, no. 4 (2008): 326.
27. Ibid., 329–31.
28. Ibid., 335.
29. Ibid., 335, 337.
30. Christopher M. Gillen, Reading Primary Literature: A Practical Guide to Evaluating Research Articles in Biology (San Francisco: Pearson Education, 2007), 44 pp.
31. Amy L. Moran, "Egg Size Evolution in Tropical American Arcid Bivalves: The Comparative Method and the Fossil Record," Evolution 58, no. 12: 2718–33.
32. Nancy A. Schiller and Clyde Freeman Herreid, "The Galápagos," available online at http://ublib.buffalo.edu/libraries/projects/cases/galapagos1.html [accessed 22 June 2010].