Information Literacy and Student Engagement: What the National Survey of Student Engagement Reveals about Your Campus

Amy E. Mark and Polly D. Boruff-Jones

Amy E. Mark is Head of Library Instruction and Assistant Professor at the University of Mississippi; e-mail: aemark@olemiss.edu. Polly D. Boruff-Jones is Reference Team Leader and Assistant Librarian at Indiana University-Purdue University Indianapolis; e-mail: pboruffj@iupui.edu.

The annual National Survey of Student Engagement (NSSE) measures undergraduate “participation in programs and activities that institutions provide for their learning and personal development.”1 Each item on the survey correlates to one of five benchmarks of “empirically confirmed ‘good practices’ in undergraduate education.”2 The NSSE is an excellent diagnostic fit with the Information Literacy Competency Standards for Higher Education because learning outcomes can be correlated with student engagement. This article presents case studies from the University of Mississippi and Indiana University-Purdue University Indianapolis to demonstrate how librarians can apply NSSE results for the purpose of assessment.3

The National Survey of Student Engagement (NSSE) annually surveys randomly selected freshmen and seniors at four-year colleges and universities regarding their engagement in the college experience.4 The survey is administered through the Indiana University Center for Postsecondary Research and Planning in cooperation with the Indiana University Center for Survey Research, under the direction of George Kuh. At the completion of the most recent NSSE survey (2002), 285,000 students at 618 institutions had participated in the surveys.5 Schools pay a participation fee, determined by FTE, that ranges from $1,500 to $7,500. There is no charge to the students who complete the survey or to libraries for using the data.

Generalized data, in the form of national or annual reports, are available free on the NSSE Web site (see the National Survey of Student Engagement: The College Student Report home page at http://www.iub.edu/~nsse/). This Web site includes an excellent overview of the survey project, a list of participating institutions, information on obtaining institutional reports, and the survey instrument. The survey method is sound and produces statistically significant results that have generated benchmarks for comparison groups.

Each survey question is associated with one of five National Benchmarks of Effective Educational Practice: level of academic challenge, active and collaborative learning, student interactions with faculty, enriching educational experiences, and supportive campus environment.6

The authors believe that the NSSE provides a compelling, but underused, tool for measuring the extent to which information literacy is incorporated into the curriculum on a particular campus. This article shows how to use the NSSE for more effective benchmarking within Carnegie peer groups in order to compare student learning. The NSSE is also beneficial for measuring student learning outcomes in the acquisition of information literacy skills.

A core idea of the NSSE is that of analyzing student learning through student engagement.
This article shows how instruction librarians can demonstrate the extent to which the institutional curriculum incorporates information literacy experiences by examining levels of student engagement. Readers will learn how to frame local NSSE results to have more meaningful conversations with campus constituencies about the importance of information literacy for student engagement.

Literature Review: Information Literacy Assessment on a Programmatic Level

Three areas have been indicated for further research on information literacy assessment: evaluation of instructors and programs; assessment of learning outcomes; and transferability.7 This review considers the trends in the library literature on assessment regarding five points that correlate to this research agenda. The first section, “Reliable and Universal Survey Instrument,” relates to the “evaluation of instructors and programs” directive, which includes an outcome for discovering “the most effective tools for assessing the impact of library instruction,” as well as “transferability.” The second section, “Using Outcomes for Assessment,” relates to the charge for “assessment of learning outcomes.” The third and fourth sections, “Stakeholders and Alignment with Institutional Goals” and “Campuswide Involvement,” also fall under the research agenda of “assessment of learning outcomes.” These outcomes include integrating information literacy into institutional assessment efforts. The fifth section, “Benchmarking,” includes the need for effective benchmarking tools.

Reliable and Universal Survey Instrument

There is more literature on assessing library instruction than there is specifically addressing programmatic assessment of information literacy. The literature that touches on programmatic-level assessment indicates that the two biggest barriers to success are designing a reliable survey instrument and locating a survey instrument applicable to a variety of campus environments and levels of instruction programming.

There is a wide call for sound assessment methods. Nancy Wootton Colborn and Rosanne M. Cordell discuss both the move toward more objective methods and data problems in the assessment of library instruction.8 Lorrie A. Knight delineates four steps for systemic assessment, including the development of assessment devices, but, like many, focuses on a pre- and posttest.9 Ralph Catts notes that poor and invalid assessment practices are prevalent.10 This seems to be the consensus both inside and outside the profession. In the case of psychology students, Eric R. Landrum and Diana M. Muench discuss librarians’ constant reinvention of the assessment wheel instead of using proven methods for evaluation and write “that many of these scales lack important psychometric properties such as validity and reliability.”11

Lisa G. O’Connor and others address the need for both programmatic assessment and a reliable assessment instrument. The authors encourage instruction librarians to work together on “a project to develop an instrument for programmatic-level assessment of information literacy skills that is valid—and thus credible—to university administrators and other academic personnel.”12 They make an excellent argument for a valid instrument and point out how difficult it will be for librarians attempting to create an instrument applicable to a wide range of information literacy programs.
Building on Richard Hume Werking’s 1980 literature review, Christopher Bober, Sonia Poulin, and Luigina Vileno note that the need for assessment and accountability has been continually reflected in the library literature from 1980 to 1993 and that “the need to justify expenditures on user education programs will necessitate a more systemic approach to evaluation.”13,14 Bober concludes that informal assessment methods still predominate. Christine Bruce examines the literature on information literacy research to trace signs of a collective consciousness in the body of research. Her outlook on the future of research is “a firmer, more consolidated, research agenda.”15 ACRL’s Information Literacy and Assessment Web site encourages librarians to develop an assessment program that “should reach all students, pinpoint areas for further program development, and consolidate learning goals already achieved.”16

The NSSE has not been referred to in any of the library literature surveyed, with the exception of an article by Cecilia Lopez that discusses how surveys provide only a biased view of the learning experience. She states that the NSSE is a successful instrument “because it has the potential to influence how institutions can improve specific educational practices that promote student engagement in the learning process.”17 Lopez also discusses the strong points of the NSSE, including benchmarking and institutional reflection.18

Using Outcomes for Assessment

General discussion of learning outcomes is frequent in the library literature, though not all authors define or discuss learning outcomes. Lois M. Pausch and Mary Pagliero Popp frame learning outcomes as three questions: What should students learn? How well are they learning it? How does the institution know?19 Mark Battersby defines learning outcomes as contextual and conceptual rather than formulaic.20 Similar to Carol Kuhlthau’s 1993 definition of information literacy as a “way of learning,” Battersby states that “outcomes are not discrete skills or mere collections of knowledge but the integrated complexes of knowledge, abilities and attitudes.”21,22 Kenneth R. Smith sees learning outcomes as a trend because institutions and accrediting bodies have been shifting their focus from input measures (faculty, courses, books) to outcome measures (what students learn).23 Bonnie Gratch-Lindauer broadens the definition of the word outcomes to include “the realized goals valued by various campus constituents, also called stakeholders” and addresses the need to assess campuswide outcomes.24 She notes that the library literature is “internally focused” and that academic libraries need to find measures or methods that assess their impact on campuswide educational outcomes.25

Most outcome-focused assessments are not done on a programmatic level, looking only at a specific class.26–29 These often used a pre-/posttest method, questioning the students’ perceptions and opinions about the library rather than determining proof of learning.30,31 Hannelore B. Rader’s review of information literacy over the past thirty years notes that the amount of literature on assessment is relatively small.
In the early literature, the focus is on librarians as teachers and on how well students compile bibliographies.32 A recent shift in the literature has focused more on student learning outcomes, possibly due to pressure from various external evaluation bodies for higher education requiring evidence of quality as measured by outcomes.33

Stakeholders and Alignment with Institutional Goals

The AAHE’s Principles of Good Practice of Assessing Student Learning stresses the importance of using assessment for accountability to stakeholders, through which educators meet responsibilities to students and to the public.34 Assessment as a means to assist with institutional accreditation and with accountability to state legislatures and other stakeholders has become a major trend.35–43

Many articles emphasize the practicality of libraries aligning assessment with the mission of the institution.44–52 “Ideally,” says Ilene F. Rockman, “libraries want to be able to show that the role of the library has a strong impact on campus mission and goals by strengthening the quality of a student educational experience.”53 Gratch-Lindauer examines the need for an external focus when assessing information literacy, showing that librarians often do not use data, measures, or language that are common to academic administrators and accreditation teams.54 This is a valuable point when resources are scarce: assessment takes time and should have many outcomes, one being clearly communicating library successes and needs to administrators. Iannuzzi agrees that the most meaningful assessment aligns information literacy outcomes with the general education outcomes of the campus.55 Lopez and Catts both note that good assessment can result in increased and sustained funding.56,57 When libraries are involved at an institutional level and demonstrating results, priority funding should be forthcoming.

Campuswide Involvement

Linked to the importance of assessment to stakeholders is a noted emphasis on campuswide involvement in information literacy assessment. The AAHE’s Principles of Good Practice of Assessing Student Learning notes that “assessment fosters wider improvement when representatives from across the educational community are involved” and that “assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change.”58 A common snapshot of information literacy on campuses has been the Information Literacy IQ (Institutional Quotient) Test.59 This thirteen-question test is excellent as a quick tool for librarians to measure the integration of information literacy. Compared to the Information Literacy IQ, the NSSE takes the assessment even further out of the librarians’ viewpoint and into the realm of student learning.
Peggy L. Maki says that assessment needs to be systemic and conducted with participation across the institution, with a shared commitment among faculty, staff, and administrators.60 Gratch-Lindauer sees assessment and the ACRL standards documents as a necessary tool not just for assisting with accreditation, but also as a promotional tool to inform colleagues that librarians share responsibility for educating students.61 When discussing the difficulty of measuring outcomes, Patricia Davitt Maughan notes that “although information competencies are easier to assess, the assessment of information literacy outcomes, by contrast, must be a shared responsibility between librarians and faculty.”62 Successful outcomes have multiple perspectives.

Librarians also can use assessment to be change agents at their institutions. More than just a way to report results, assessment can change a campus culture—if there is campuswide involvement. Smith uses the learning outcomes included in assessment to ask “whether student learning is enhanced by the way we teach, by the organization of the university, by the structure of the academic program, and by the activities of faculty and other professionals.”63 Oswald M. T. Ratteray outlines the difficulties reported by librarians in the Middle Atlantic States in trying to effect a change in campus culture where information literacy programs were stalled.64 One explanation is that information literacy is seen as only a librarian’s responsibility. A new handbook for the Middle Atlantic States group has encouraged greater collaboration. Iannuzzi argues that librarians do not have to carry the full burden of information literacy; it is more than library instruction and includes both critical thinking and IT education.65 Iannuzzi pushes for institutional involvement and states that “when considering campuswide information literacy assessment, it may be difficult to separate information literacy from the overarching goals of undergraduate education and the overall assessment of student learning. So do not separate it!”66 Smith suggests a link between assessment success and shared needs.67 Going beyond traditional library goals and viewing assessment from other departments’ viewpoints creates opportunities for librarians.

Benchmarking

Why measure information literacy competencies? Maughan states, “to establish a baseline of student skills around which an information literacy program might be built; to assess the effectiveness of particular library instruction sessions or approaches to instruction; to determine the impact of library instruction programs on student information literacy skills and academic success; and to generate data with which to communicate with faculty.”68 All of these points include benchmarking, at either the class, institutional, or national level. Iannuzzi lays out four levels at which we can assess information literacy outcomes: within the library; in the classroom; on campus; and beyond the campus.69 One of the ALA’s primary agendas reflects the necessity of assessing information literacy on a programmatic level.70 Because library instruction does not have an undergraduate program of its own, in contrast to the student learning of psychology or biology majors, one route for programmatic assessment is to take a snapshot of information literacy at an institutional level.
A comprehensive evaluation of student learning can reveal the overall level of student information literacy in a given year and present insights as to which learning outcomes students have and have not mastered.

Reliance on outside peer review is one type of current benchmarking. Ratteray suggests outside reviews in which “thousands of volunteer peer reviewers monitor periodically how [each] institution defines and implements its own solutions.”71 Consolidating the few regions that have such programs would be difficult. A national comparison of similar institutions is ideal. Maughan describes a need for more systematic and widespread assessment that is shared—from library to library and from institution to institution.72 Perhaps part of the difficulty in creating a reliable survey instrument has been the lack of a shared form of assessment.

The NSSE Results

Institutions of higher education may use the student engagement survey to compare themselves to peer institutions and to the national benchmarks.73 The 2002 NSSE reports that schools are using their results in many productive ways: for assessment and improvement, curricular reform, benchmarking, advising, institutional research and advancement, retention, self-studies, accreditation, and performance reviews. To this list the authors add institutional information literacy.

Survey results have shown that “student engagement results appear to have the best chance of guiding institutional change efforts when:
a. Faculty and staff understand the concept of student engagement.
b. Enough results are available to use the information at the department or unit level.
c. Institutions understand what student engagement data represent and use the results wisely.
d. Institutional performance is reported in a responsible way.
e. Results are placed in the proper context and interpreted carefully.
f. Results are examined from multiple perspectives.
g. Results are linked to other information about the student experience and institutional performance.
h. Institutions form consortia or other collaborative arrangements to work on improvement initiatives.”74

To apply the NSSE as an assessment tool, you will need to find out whether your institution participates, obtain your local results, and familiarize yourself with the national trends. Be sure to note in which group your institution is benchmarked (e.g., the University of Mississippi is a Research I & II university).

Each school receives a confidential institutional report. This report is often on file in your school’s office of institutional research. Request permission to borrow the confidential report to copy, or request your institution’s password to access it on the NSSE Web site. Ask whether your institution has already used its individualized report to create a contextualized internal report or to make recommendations to school administrators.

Evidence of Student Learning Outcomes and Acquisition of Information Literacy Skills

Of the five benchmarks, two are easily applicable to the assessment of information literacy: Level of Academic Challenge, and Active and Collaborative Learning. The authors chose the latter as an example.

In the NSSE annual report, Active and Collaborative Learning is described as follows:
“Students learn more when they are intensely involved in their education and are asked to think about and apply what they are learning in different settings. Collaborating with others in solving problems or mastering difficult material prepares students to deal with the messy, unscripted problems they will encounter daily during and after college.”75

Survey questions created to measure activities associated with this benchmark are: “[A]bout how often have you done each of the following?
• Asked questions in class or contributed to class discussions
• Made a class presentation
• Worked with other students on projects during class
• Worked with classmates outside of class to prepare class assignments
• Tutored or taught other students
• Participated in a community-based project as part of a regular course
• Discussed ideas from your reading or classes with others outside of class (students, family members, co-workers, etc.)”76

Methods

To correlate the NSSE results with the ACRL Information Literacy Competency Standards for Higher Education, the authors selected, from the seven questions associated with the Active and Collaborative Learning benchmark, the five survey questions most closely related to information literacy concepts. These questions were: asked questions in class or contributed to class discussions; made a class presentation; worked with other students on projects during class; worked with classmates outside class to prepare class assignments; and discussed ideas from your reading or classes with others outside class. Because the questions are very specific, they were correlated not only with the ACRL standards but also, more narrowly and specifically, with performance indicators and outcomes. In anticipation of using the NSSE scores not just as a tool for assessment of information literacy but also as a focal point for improvement, the questions and standards were correlated with Bloom’s Taxonomy. (See table 1.)

The authors charted their institutional mean scores alongside the peer group mean scores and the national mean scores for comparison, including both the first-year scores and the senior scores. (See table 2.)

The authors expanded on the taxonomy by listing student skills appropriate to each taxonomic level and level-appropriate verbs to be incorporated into assignments. (See table 3.)
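Because this correlation is essentially a lookup from survey question to ACRL performance indicators, it can be kept as a small data structure and reused as the analysis grows. The Python sketch below is illustrative only and is not code from the study: the dictionary and helper names are invented for this example, while the indicator codes follow the summary column of table 2.

# Illustrative sketch: the five selected NSSE Active and Collaborative Learning
# questions mapped to the ACRL performance indicator codes summarized in table 2.
# The names NSSE_TO_ACRL and questions_touching are hypothetical conveniences.
NSSE_TO_ACRL = {
    "Asked questions in class or contributed to class discussions": ["3.6.a", "4.3.d"],
    "Made a class presentation": ["4.1.a-d", "4.3.b-d"],
    "Worked with other students on projects during class": ["1.1.a", "4.1.a-d", "4.3.a-b"],
    "Worked with classmates outside of class to prepare class assignments": ["1.1.a", "4.1.a-d", "4.3.a-b"],
    "Discussed ideas from readings or classes with others outside of class": ["3.6.b-d"],
}

def questions_touching(standard: str) -> list:
    """Return the NSSE questions whose correlated ACRL indicators fall under the given standard number."""
    return [question for question, codes in NSSE_TO_ACRL.items()
            if any(code.startswith(standard + ".") for code in codes)]

# Example: which of the selected questions map to outcomes under ACRL Standard 4?
print(questions_touching("4"))

The same structure could later be extended with the Bloom’s Taxonomy levels from table 1 or with questions from the other benchmarks discussed in the conclusion.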
This single benchmark covers several levels or orders of thinking in the taxonomy, and it is important to remember that:

the term information literacy instruction, in addition to the lower-order skills, includes higher-order abilities such as assessing search results for quality and relevance; evaluating the reliability, validity, authority, and timeliness of retrieved information; and applying new information to the planning and creation of scholarly and professional projects and products.77

TABLE 1. NSSE Survey Questions, ACRL Standards, and Bloom’s Taxonomy
(For each NSSE 2001 survey question: the corresponding ACRL standards, performance indicators, and possible ACRL outcomes, followed by the related levels of Bloom’s Taxonomy.)

Asked questions in class or contributed to class discussions
3.6.a: Participates in classroom and other discussions
4.3.d: Communicates clearly and with a style that supports the purposes of the intended audience
Bloom’s Taxonomy: knowledge, comprehension, application, analysis

Made a class presentation
4.1.a: Organizes the content in a manner that supports the purposes and format of the product or performance (e.g., outlines, drafts, storyboards)
4.1.b: Articulates knowledge and skills transferred from prior experiences to planning and creating the product or performance
4.1.c: Integrates the new and prior information, including quotations and paraphrasings, in a manner that supports the purposes of the product or performance
4.1.d: Manipulates digital text, images, and data, as needed, transferring them from their original locations and formats to a new context
4.3.b: Uses a range of information technology applications in creating the product or performance
4.3.c: Incorporates principles of design and communication
4.3.d: Communicates clearly and with a style that supports the purposes of the intended audience
Bloom’s Taxonomy: application, analysis, synthesis

Worked with other students on projects during class
1.1.a: Confers with instructors and participates in class discussions, peer workgroups, and electronic discussions to identify a research topic or other information need
4.1.a: Organizes the content in a manner that supports the purposes and format of the product or performance (e.g., outlines, drafts, storyboards)
4.1.b: Articulates knowledge and skills transferred from prior experiences to planning and creating the product or performance
4.1.c: Integrates the new and prior information, including quotations and paraphrasings, in a manner that supports the purposes of the product or performance
4.1.d: Manipulates digital text, images, and data, as needed, transferring them from their original locations and formats to a new context
4.3.a: Chooses a communication medium and format that best supports the purposes of the product or performance and the intended audience
4.3.b: Uses a range of information technology applications in creating the product or performance
Bloom’s Taxonomy: comprehension, application, analysis

Worked with classmates outside of class to prepare class assignments
1.1.a: Confers with instructors and participates in class discussions, peer workgroups, and electronic discussions to identify a research topic or other information need
4.1.a: Organizes the content in a manner that supports the purposes and format of the product or performance (e.g., outlines, drafts, storyboards)
4.1.b: Articulates knowledge and skills transferred from prior experiences to planning and creating the product or performance
4.1.c: Integrates the new and prior information, including quotations and paraphrasings, in a manner that supports the purposes of the product or performance
4.1.d: Manipulates digital text, images, and data, as needed, transferring them from their original locations and formats to a new context
4.3.a: Chooses a communication medium and format that best supports the purposes of the product or performance and the intended audience
4.3.b: Uses a range of information technology applications in creating the product or performance
Bloom’s Taxonomy: application, analysis, synthesis

Discussed ideas from your readings or classes with others outside of class (students, family members, coworkers, etc.)
3.6.b: Participates in class-sponsored electronic communication forums designed to encourage discourse on the topic (e.g., e-mail, bulletin boards, chat rooms)
3.6.c: Seeks expert opinion through a variety of mechanisms (e.g., interviews, e-mail, listservs)
4.3.d: Communicates clearly and with a style that supports the purposes of the intended audience
Bloom’s Taxonomy: analysis, synthesis

TABLE 2. NSSE Survey Questions, ACRL Standards, and Institutional Scores
(Mean scores on a four-point scale: 1 = never, 2 = occasionally, 3 = often, 4 = very often. Columns: IUPUI 2000, Urban 2000, UM 2000, Research I & II 2000, National.)

Asked questions in class or contributed to class discussions (ACRL 3.6.a, 4.3.d)
First year: IUPUI 2.64, Urban 2.70, UM 2.73, Research I & II 2.54, National 2.75
Senior: IUPUI 3.02, Urban 2.99, UM 2.81, Research I & II 2.83, National 3.05

Made a class presentation (ACRL 4.1.a-d, 4.3.b-d)
First year: IUPUI 2.08, Urban 2.06, UM 2.08, Research I & II 1.94, National 2.14
Senior: IUPUI 2.52, Urban 2.68, UM 2.57, Research I & II 2.55, National 2.76

Worked with other students on projects during class (ACRL 1.1.a, 4.1.a-d, 4.3.a-b)
First year: IUPUI 2.48, Urban 2.47, UM 2.49, Research I & II 2.37, National 2.42
Senior: IUPUI 2.42, Urban 2.49, UM 2.51, Research I & II 2.43, National 2.49

Worked with classmates outside of class to prepare class assignments (ACRL 1.1.a, 4.1.a-d, 4.3.a-b)
First year: IUPUI 1.99, Urban 2.15, UM 2.37, Research I & II 2.36, National 2.39
Senior: IUPUI 2.38, Urban 2.51, UM 2.82, Research I & II 2.78, National 2.71

Discussed ideas from your readings or classes with others outside of class (students, family members, coworkers, etc.) (ACRL 3.6.b-d)
First year: IUPUI 2.55, Urban 2.66, UM 2.71, Research I & II 2.37, National 2.74
Senior: IUPUI 2.75, Urban 2.87, UM 2.84, Research I & II 2.57, National 2.88

The NSSE scores for first-year students and seniors differ, and assignments created for information literacy instruction at different academic levels should reflect the development in information literacy sophistication from freshman year to senior year.
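The comparisons in table 2 can be automated in the same spirit. The following minimal sketch, an illustration rather than the authors’ procedure, uses the published first-year mean scores to flag the questions on which an institution trails its peer group (the senior means could be handled identically); the function and variable names are assumptions made for this example.

# Minimal sketch using the published first-year mean scores from table 2.
# Tuple order: IUPUI 2000, Urban 2000, UM 2000, Research I & II 2000, National.
FIRST_YEAR_MEANS = {
    "Asked questions in class or contributed to class discussions": (2.64, 2.70, 2.73, 2.54, 2.75),
    "Made a class presentation": (2.08, 2.06, 2.08, 1.94, 2.14),
    "Worked with other students on projects during class": (2.48, 2.47, 2.49, 2.37, 2.42),
    "Worked with classmates outside of class to prepare class assignments": (1.99, 2.15, 2.37, 2.36, 2.39),
    "Discussed ideas from readings or classes with others outside of class": (2.55, 2.66, 2.71, 2.37, 2.74),
}

def flag_gaps(institution: int, peer_group: int, label: str) -> None:
    """Print each question on which the institution's mean falls below its peer-group mean."""
    for question, means in FIRST_YEAR_MEANS.items():
        gap = means[institution] - means[peer_group]
        if gap < 0:
            print(f"{label} trails its peer group by {-gap:.2f} on: {question}")

flag_gaps(0, 1, "IUPUI")  # IUPUI against the urban peer group
flag_gaps(2, 3, "UM")     # UM against the Research I & II peer group

Questions flagged this way can then be traced, through tables 1 and 3, to the ACRL outcomes and the level-appropriate verbs used in assignment design.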
Discussion

Although the correlation between the NSSE survey questions and the ACRL standards does not unequivocally demonstrate that the rankings were necessarily enhanced by library instruction, the authors’ intent is to correlate NSSE benchmarks with the ACRL standards in order to identify areas of strength and weakness. These areas may be addressed through an applied information literacy program that concentrates on the ACRL standards and associated learning outcomes. This is not to imply knowledge of what role an information literacy program might have played in an institution’s NSSE results but, rather, to begin to address those issues related to information literacy and to correlate future results to those efforts. Furthermore, the authors are suggesting that one can develop teaching strategies and assignments aimed at addressing those areas of information literacy weakness by employing Bloom’s Taxonomy of Educational Objectives.

TABLE 3. Student Skills & Bloom’s Taxonomy

Knowledge
Skills: understand terminology; recall of major facts; knowledge of dates, events, places; knowledge of trends
Verbs: arrange, define, examine, identify, label, list, memorize, name, recognize, record, recall, select

Comprehension
Skills: translation into other terms or form of communication; interpretation of ideas; reordering of ideas; determine implications and consequences
Verbs: choose, contrast, demonstrate, describe, discuss, distinguish, employ, estimate, interpret, solve, summarize

Application
Skills: apply principles; use methods, concepts, theories in new situations; use abstraction to solve a problem
Verbs: apply, change, classify, discover, explain, identify, illustrate, locate, modify, report, restate, review, select, show

Analysis
Skills: determine relationships, connections, and interactions; distinguish fact from hypotheses; identify components; recognize form and pattern
Verbs: analyze, arrange, calculate, categorize, compare, criticize, differentiate, discriminate, examine, order, question, test

Synthesis
Skills: generalize from given facts; recognize subjectivity; predict consequences; draw conclusions
Verbs: assemble, compose, construct, create, design, develop, integrate, manage, modify, organize, plan, propose, rearrange, rewrite, solve, substitute

Evaluation
Skills: compare major theories and facts; relate knowledge from several areas; apply given criteria; judge by appropriate external standards
Verbs: appraise, argue, assess, conclude, convince, decide, defend, estimate, evaluate, judge, measure, rate, rank, recommend, summarize, support, value

Adapted from: B. S. Bloom, “Taxonomy of Educational Objectives: The Classification of Educational Goals,” in Handbook I, Cognitive Domain (New York: David McKay, 1974).

Leadership Initiatives: Sharing Results

When the state of information literacy on a particular campus has been determined, what should be done with the results? Maki sums up the general recommendations of the library literature on assessment into three crucial stages: first, determining your institution’s expectations; second, determining timing, identifying cohort(s), and assigning responsibility; and third, interpreting and sharing results to enhance institutional effectiveness.78 The primary stakeholders are fellow librarians and the university administration. Begin by preparing a brief report that can be presented to both.
The report should include an explanation of information literacy and the contribution information literacy can make in graduating lifelong learners. Note the impact that a sound assessment of information literacy can have on both library goals and the mission and vision of the university as a whole.

The report should touch on the credibility of the NSSE and briefly explain the correlation of benchmark areas to information literacy outcomes. Relate a few areas of success to library and administrative goals. Mention areas where the university lags behind peer institutions. Suggest that these weaknesses can be overcome through cooperation across the university. Frame local results in the context of assisting with annual departmental assessments and with upcoming regional and state accreditation. Finally, outline the desire to build a collaborative relationship among the library, the administration, and academic departments.

With the support of these two constituencies, determine a campus partner, perhaps congenial faculty in the English department or the chair of an area where the library has had previous success in providing library instruction, and present your results again. Collaboration is the key to taking a leadership role.

Conclusion: The Future of Student Engagement

The authors see the possibility of a wealth of continuing research stemming from student engagement, information literacy, and assessment. The first avenue is expanding the application of information literacy outcomes to the rest of the NSSE benchmark areas. After Active and Collaborative Learning, the next benchmark most applicable to information literacy outcomes is Level of Academic Challenge. This benchmark fits well with ACRL Standard 2, Accessing Information Effectively and Efficiently. Another benchmark worth investigating is Supportive Campus Environment, which focuses on a topic being given increasing consequence in the library literature: diversity. Correlating information literacy outcomes to this benchmark takes on importance as the face of higher education becomes more diverse. Next, because library research competes timewise with other student interests, research correlating information literacy learning outcomes to the benchmark Student Interactions with Faculty Members has implications for reference librarians in addition to instruction librarians.

A second area of research lies in examining surveys. The NSSE is interested in developing questions that test student engagement in all areas. This opens opportunities for instruction librarians to ask which information literacy outcomes are not being measured by the NSSE. Suggestions are always welcomed by the NSSE. Researchers also can investigate customizing research by institution. The NSSE can help librarians create institution-level questions to address specific local library issues.

A third and final area of research stemming from student engagement, information literacy, and assessment is using the NSSE to look beyond institutional results. One option would be to compare institutional-level data with peer libraries. Another promising area to study is why some information literacy programs are so successful in integrating certain information literacy outcomes on their campuses. A survey could be administered inquiring which information literacy programs have been successful in relating their assessment results to institutional missions and goals.
The most important thing is for library research to continue striving to find sound measures of information literacy outcomes.

Notes

1. George D. Kuh, The College Student Report (Bloomington, Ind.: Indiana University, National Survey of Student Engagement, Center for Postsecondary Research and Planning, 2002), 5.
2. ———, The College Student Report (Bloomington, Ind.: Indiana University, National Survey of Student Engagement, Center for Postsecondary Research and Planning, 2000), 25.
3. Permission to present data was granted by the University of Mississippi Office of Institutional Research and by the Indiana University-Purdue University Indianapolis Office of Information Management and Institutional Research.
4. Kuh, The College Student Report (Bloomington, Ind.: Indiana University, National Survey of Student Engagement, Center for Postsecondary Research and Planning, 2001).
5. ———, The College Student Report (2002).
6. Ibid.
7. ACRL Research and Scholarship Committee, Instruction Section, “Research Agenda for Library Instruction and Information Literacy: The Updated Version,” College & Research Libraries News 64 (Feb. 2003): 108–13.
8. Nancy Wootton Colborn and Rosanne M. Cordell, “Moving from Subjective to Objective Assessments of Your Instruction Program,” Reference Services Review 26 (fall/winter 1998): 125–37.
9. Lorrie A. Knight, “The Role of Assessment in Library User Education,” Reference Services Review 30 (2002): 15–24.
10. Ralph Catts, “Some Issues in Assessing Information Literacy,” in Information Literacy around the World: Advances in Programs and Research, comp. Christine Bruce and Philip Candy (New South Wales: Center for Information Studies, 2000).
11. Eric R. Landrum and Diana M. Muench, “Assessing Students’ Library Skills and Knowledge: The Library Research Strategies Questionnaire,” Psychological Reports 75 (1994): 1620.
12. Lisa G. O’Connor, et al., “Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills,” College and Research Libraries 63 (Nov. 2002): 528.
13. Richard Hume Werking, “Evaluating Bibliographic Instruction: A Review and Critique,” Library Trends 29 (summer 1980): 153–72.
14. Christopher Bober, Sonia Poulin, and Luigina Vileno, “Evaluating Library Instruction in Academic Libraries: A Critical Review of the Literature, 1980–1993,” Reference Librarian 51–52 (1995): 66.
15. Christine Bruce, “Information Literacy Research: Dimensions of the Emerging Collective Consciousness,” Australian Academic and Research Libraries 31 (June 2000): 95.
16. ACRL, “Information Literacy and Assessment,” in Information Literacy Competency Standards for Higher Education. Available online from http://www.ala.org/Content/NavigationMenu/ACRL/Standards_and_Guidelines/Standards_and_Guidelines.htm.
17. Cecilia Lopez, “Assessment of Student Learning: Challenges and Strategies,” Journal of Academic Librarianship 28 (2002): 361.
18. Ibid.
19. Lois M. Pausch and Mary Pagliero Popp, “Assessment of Information Literacy: Lessons from the Higher Education Assessment Movement.” Available online from http://www.ala.org/Content/ContentGroups/ACRL1/Nashville_1997_Papers/Pausch_and_Popp.htm.
20. Mark Battersby and the Learning Outcomes Network Centre for Curriculum, Transfer and Technology, “So, What’s a Learning Outcome Anyway?” Available online from http://www.c2t2.ca/goodpractice/Exchange/curriculum-frameworks/sowhatsa.html.
21. Carol Kuhlthau, “A Principle of Uncertainty for Information Seeking,” Journal of Documentation 49 (Dec. 1993): 339–55.
22. Battersby, “So, What’s a Learning Outcome Anyway?” 1.
23. Kenneth R. Smith, “New Roles and Responsibilities for the University Library: Advancing Student Learning through Outcomes Assessment,” ARL: A Bimonthly Newsletter of Research Library Issues and Actions 213 (Dec. 2000): 2–5.
24. Bonnie Gratch-Lindauer, “Defining and Measuring the Library’s Impact on Campuswide Outcomes,” College and Research Libraries 59 (1998): 550.
25. Ibid.
26. Donald A. Barclay, “Evaluating Library Instruction: Doing the Best You Can with What You Have,” RQ 33 (winter 1993): 195–202.
27. Elizabeth W. Carter, “Doing the Best You Can with What You Have: Lessons Learned from Outcomes Assessment,” Journal of Academic Librarianship 28 (Jan./Mar. 2002): 36–41.
28. Knight, “The Role of Assessment in Library User Education.”
29. Ilene F. Rockman, “Strengthening Connections between Information Literacy, General Education and Assessment Efforts,” Library Trends 51 (fall 2002): 185–98.
30. Carter, “Doing the Best You Can with What You Have.”
31. Knight, “The Role of Assessment in Library User Education.”
32. Hannelore B. Rader, “Information Literacy 1973–2002: A Selected Literature Review,” Library Trends 51 (fall 2002): 242–59.
33. Patricia Iannuzzi, “We Are Teaching, But Are They Learning? Accountability, Productivity, and Assessment,” Journal of Academic Librarianship 25 (1999): 304–5.
34. “AAHE’s Principles of Good Practice of Assessing Student Learning,” appendix A in Effective Grading: A Tool for Learning and Assessment, comp. Barbara E. Walvoord and Virginia Johnson Anderson (San Francisco: Jossey-Bass, 1998).
35. Ronald L. Baker, “Evaluating Quality and Effectiveness: Regional Accreditation Principles and Practices,” Journal of Academic Librarianship 28 (Jan./Mar. 2002): 3–7.
36. Catts, “Some Issues in Assessing Information Literacy.”
37. Gratch-Lindauer, “Defining and Measuring the Library’s Impact on Campuswide Outcomes.”
38. ———, “Comparing the Regional Accreditation Standards: Outcomes Assessment and Other Trends,” Journal of Academic Librarianship 28 (Jan./Mar. 2002): 14–25.
39. Arlene Greer, Lee Weston, and Mary Alm, “Assessment of Learning Outcomes: A Measure of Progress in Library Literacy,” College and Research Libraries 52 (1991): 549–57.
40. Iannuzzi, “We Are Teaching, But Are They Learning?”
41. Lopez, “Assessment of Student Learning.”
42. Pausch and Popp, “Assessment of Information Literacy.”
43. Oswald M. T. Ratteray, “Information Literacy in Self-study and Accreditation,” Journal of Academic Librarianship 28 (2002): 368–75.
44. Baker, “Evaluating Quality and Effectiveness.”
45. Catts, “Some Issues in Assessing Information Literacy.”
46. Gratch-Lindauer, “Defining and Measuring the Library’s Impact on Campuswide Outcomes.”
47. Greer, Weston, and Alm, “Assessment of Learning Outcomes.”
48. Iannuzzi, “We Are Teaching, But Are They Learning?”
49. Peggy L. Maki, “Moving from Paperwork to Pedagogy: Channeling Intellectual Curiosity into a Commitment to Assessment,” AAHE Bulletin (May 2002). Available online from http://www.aahebulletin.com/public/archive/paperwork.asp.
50. O’Connor, et al., “Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills.”
51. Rockman, “Strengthening Connections between Information Literacy, General Education and Assessment Efforts.”
52. Smith, “New Roles and Responsibilities for the University Library.”
53. Rockman, “Strengthening Connections between Information Literacy, General Education and Assessment Efforts,” 192.
54. Gratch-Lindauer, “Defining and Measuring the Library’s Impact on Campuswide Outcomes.”
55. Iannuzzi, “We Are Teaching, But Are They Learning?”
56. Lopez, “Assessment of Student Learning.”
57. Catts, “Some Issues in Assessing Information Literacy.”
58. AAHE, “AAHE’s Principles of Good Practice of Assessing Student Learning.”
59. Cerise Oberman, et al., “Integrating Information Literacy into the Curriculum: How Is Your Library Measuring Up?” C&RL News 59 (May 1998): 347–52.
60. Peggy L. Maki, “Developing an Assessment Plan to Learn about Student Learning,” Journal of Academic Librarianship 28 (Jan./Mar. 2002): 8–13.
61. Gratch-Lindauer, “Comparing the Regional Accreditation Standards.”
62. Patricia Davitt Maughan, “Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience,” College & Research Libraries 62 (Jan. 2001): 75.
63. Smith, “New Roles and Responsibilities for the University Library,” 2.
64. Ratteray, “Information Literacy in Self-study and Accreditation.”
65. Iannuzzi, “We Are Teaching, But Are They Learning?”
66. Ibid., 305.
67. Smith, “New Roles and Responsibilities for the University Library.”
68. Maughan, “Assessing Information Literacy among Undergraduates,” 74.
69. Iannuzzi, “We Are Teaching, But Are They Learning?”
70. Patricia Senn Breivik, et al., “A Progress Report on Information Literacy: An Update on the American Library Association Presidential Committee on Information Literacy: Final Report” (Chicago: ACRL, 1998). Available online from http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/A_Progress_Report_on_Information_Literacy.htm.
71. Ratteray, “Information Literacy in Self-study and Accreditation,” 369.
72. Maughan, “Assessing Information Literacy among Undergraduates.”
73. Kuh, The College Student Report (2001).
74. ———, The College Student Report (2002).
75. Ibid.
76. Ibid.
77. Maughan, “Assessing Information Literacy among Undergraduates,” 73.
78. Maki, “Developing an Assessment Plan to Learn about Student Learning.”