Testing Future Teachers: A Quantitative Exploration of Factors Impacting the Information Literacy of Teacher Education Students

Samantha Godbey*

This study assesses the information literacy skills of a sample of undergraduate teacher education students, as measured by the iSkills assessment, and aims to determine student demographic and academic characteristics that may predict success on this assessment. The study repeats the methodology of a study of first-year students at the same institution two years before to provide insight into the information literacy proficiency of future teachers. Using hierarchical multiple regression analysis, transfer credits were found to be a statistically significant predictor of higher iSkills performance. Results are also discussed in the context of the adoption of the ACRL Framework for Information Literacy for Higher Education.

* Samantha Godbey is Education Librarian, Assistant Professor at University of Nevada, Las Vegas Libraries; e-mail: samantha.godbey@unlv.edu. Special thanks to Jennifer Fabbi for her guidance in the planning and implementation of this study. ©2018 Samantha Godbey, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

Introduction

A widely accepted definition of information literacy, from a now twenty-seven-year-old report from the Presidential Committee on Information Literacy, refers to an information-literate person's ability "to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information."1 This same report stresses the importance of information literacy to lifelong learning and one's ability to navigate information needs in a rapidly changing society. Further, it encourages educational institutions to play a leadership role in integrating information literacy into their programs. The prevalence of information and technology "in every possible setting" means that information literacy remains an essential skill for all,2 a requirement for "full participation in contemporary Western societies."3 Likewise, information literacy has retained its prominence in higher education, bolstered in part by the widespread use by librarians of the Association of College and Research Libraries' (ACRL) Information Literacy Competency Standards for Higher Education.4 Readers will note, of course, the rescinding of these standards in June 2016 in favor of the Framework for Information Literacy for Higher Education (Framework), a theoretical framework consisting of "interconnected core concepts" rather than standards or outcomes.5 The adoption
of this new document has only increased the discourse among academic librarians around information literacy, defined in the Framework as "the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning."6 Information literacy is widely discussed and taught in higher education institutions and is unlikely to be abandoned.7 While the extent to which this is true varies among institutions, information literacy has been integrated into disciplinary coursework, stand-alone information literacy classes, and cocurricular workshops.8 Nonetheless, higher education students can experience what has been referred to as "an illusory comfort within vast pools of information."9 In today's information- and technology-filled world, "Helping students become information literate is more critical than ever before."10

This study assesses the information literacy skills of a sample of undergraduate teacher education students, as measured by the iSkills assessment, an online assessment developed by the Educational Testing Service. It also aims to determine student demographic and academic characteristics that may predict success on this assessment. The current study is a follow-up to a study of first-time college freshmen at this same institution11 and was initiated in collaboration with the author of that previous study. While the previous study examined the information literacy skills and possible predictors of information literacy skills among first-year students who had not yet declared a major, this study focuses on teacher education students in junior-level courses. This enabled the researcher to assess the information literacy skills of teacher education students in particular, as well as to compare the results among students further along in their studies.

The iSkills assessment emphasizes critical thinking and measures the real-time use of problem-solving skills. This study is grounded in constructivist theory, a learner-centered theory that describes learning as the active construction of knowledge and not its passive acquisition. As in the original study, the assessment and theoretically important variables were chosen based on the assumption that students increase proficiency with information literacy through active engagement with higher-order thinking activities. In particular, this study addresses the following research questions:
1. What information literacy skills, as measured by the iSkills assessment, do preservice teachers possess?
2. To what extent is preservice teachers' performance on the iSkills assessment predicted by background and academic characteristics?
Literature Review

Standards for pre-kindergarten to twelfth grade (PK–12) students, such as the Common Core State Standards and the American Association of School Librarians' Standards for the 21st-Century Learner, incorporate information literacy.12 Despite these guidelines, high school students and entering college freshmen often lack information literacy skills13 or experience conducting library research.14 For college students preparing to become teachers, proficiency with these skills becomes even more important given their future role as facilitators of student learning. Farmer, for example, stresses the importance of information literacy in teacher education programs: in addition to developing the skills they will need for their own professional growth, preservice teachers must also develop the skills they will need to teach PK–12 students to become information literate.15

Studies have shown the importance of teacher preparation for student achievement and have emphasized the importance of training teachers to convey higher-order thinking skills or critical thinking skills.16 In the PK–12 literature, the terms "critical thinking" or "higher-order thinking" are used more commonly than "information literacy" skills. However, this is where information literacy and PK–12 standards intersect. It has been noted that, "While critical thinking skills provide the theoretical basis for the process, information literacy provides the skills for practical, real world application."17 For example, the Interstate Teacher Assessment and Support Consortium (InTASC) Model Core Teaching Standards state that a teacher "understands critical thinking processes and knows how to help learners develop high level questioning skills."18 Among librarians, these questioning skills would be classified within information literacy. Further, the InTASC Standards state that, in addition to deep content knowledge, teachers must be able to "work with learners to access information, [and] apply knowledge in real world settings."19 Accessing information and applying knowledge or information would likewise easily be labeled critical thinking or information literacy.
Nonetheless, research has shown that PK–12 teachers often lack the information literacy skills and knowledge required for their work.20 Teachers are not necessarily familiar with information literacy as a concept; they are also not prepared to teach information literacy to their students and do so inconsistently.21 Even school librarians, who are better versed in information literacy as a concept, lack preparation in information literacy pedagogy.22

Research has demonstrated the importance of integrating information literacy into teacher education programs.23 Lee, Reed, and Laverty explored the degree to which one teacher education program had prepared preservice teachers for teaching information literacy and found that more than half of the participants had neither acquired new skills nor felt that they had the opportunity to improve research skills in their program.24 In a survey of education majors and school media specialists, Stockham and Collins found that many education students and recent graduates were unfamiliar with information literacy terminology and concepts.25 To ameliorate this deficiency, teacher educators and librarians alike have encouraged collaboration between the two groups to effectively teach information literacy skills to students.26 Nonetheless, a review of the literature reveals a lack of studies that provide a direct assessment of the information literacy skills of teacher education students. Studies that do address this topic use surveys in which these students self-report knowledge and familiarity with various information literacy skill areas and topics.27

Methodology

This study explored factors affecting the information literacy competency of students studying to become elementary and secondary education teachers during the 2013–2014 academic year at the University of Nevada, Las Vegas (UNLV), as measured by their scores on the ETS iSkills assessment. UNLV is a large public research university in the western United States recognized as a minority-serving institution. Undergraduate students primarily come from within the state, and most graduates of the teacher education program become teachers in the large, diverse school district surrounding the university.

The purpose of the study was to assess the information literacy skills of students in the undergraduate teacher preparation programs at UNLV and to examine whether student demographic and academic characteristics predict success on the iSkills assessment. In addition to completing the iSkills assessment, participants completed a demographic survey and provided access to demographic and academic data, such as official grade-point average. The researcher acquired approval through the campus Institutional Review Board to collect data from the students and from the campus student information system.

The study described in this article uses iSkills, an assessment of Information and Communications Technology literacy (ICT literacy skills) developed by the Educational Testing Service (ETS). ICT literacy is defined as "using digital technology, communications tools, and/or networks to access, manage, integrate, evaluate, and create information in order to function in a knowledge society,"28 or information literacy within the context of technology.
Aligned with the Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education, this 60-minute test requires students to complete 14 scenario-based tasks that assess competency with information in seven skill areas: define, access, evaluate, manage, integrate, create, and communicate. The seven iSkills skill areas are defined by ETS as follows:
• Define: Understand and articulate the scope of an information problem in order to facilitate the electronic search for information.
• Access: Collect and/or retrieve information in digital environments.
• Evaluate: Judge whether information satisfies an information problem by determining authority, bias, timeliness, relevance and other aspects of materials.
• Manage: Organize information to help you or others find it later.
• Integrate: Interpret and represent information using digital tools to synthesize, summarize, compare and contrast information from multiple sources.
• Create: Adapt, apply, design or construct information in digital environments.
• Communicate: Disseminate information tailored to a particular audience in an effective digital format.29

ETS provides a detailed document identifying iSkills performance indicators for each of the five ACRL Standards.30 As an example, for ACRL Standard One, "The information literate student determines the nature and extent of the information needed," ETS identifies iSkills performance indicators such as the "specificity of terms/concepts used in research question or topic statement" and the "appropriateness of resource chosen while browsing to define a topic." Test takers complete a series of tasks within the test interface, and their selections are evaluated according to the iSkills performance indicators.

The iSkills assessment has been used extensively with first-year students31 and had been used at this institution previously.32 This assessment tool was selected in part due to its nature as a performance assessment that allows for the measurement of higher-order thinking; in contrast to other options for performance-based assessment, such as portfolio assessment, the iSkills assessment is less time-consuming.33 Just as important, ETS allows access to a detailed download of data that can be analyzed using statistical software and provides reports that compare student results against a national cohort.

Participants were drawn from students enrolled in junior-level courses in the UNLV Department of Teaching and Learning during the fall 2013 and spring 2014 semesters. This academic program provides undergraduate degrees in elementary and secondary education. Junior-level courses were chosen in an attempt to recruit students further along in the course progression for the undergraduate degrees. Instructors for these junior-level courses were invited to offer the iSkills assessment within their sections, and five instructors agreed to participate, with a total of nine sections of students participating in the study. Class time was used for test administration; classes met in a computer classroom in the university libraries to complete the iSkills assessment. All students in participating sections were asked to complete the assessment, but participation in the study was optional. Participants gave informed consent in the computer classroom prior to beginning the survey and assessment. Of the 163 students who took the assessment, 153 agreed to participate in the study.
However, data for one participant, whose score fell below three standard deviations from the mean on the iSkills assessment, were removed from the analysis. This sample of 152 students represents 21.7 percent of the students majoring in Teaching and Learning during the 2013–2014 academic year.

In this sample, 133 students (88.1%) were female, and 18 (11.9%) were male. When asked about language, 134 (88.7%) students responded that English only was their best language, and 17 (11.3%) students responded that either "English and another language" or "another language" was their best language. Regarding ethnicity, 96 participants (63.6%) identified as white, 28 (18.5%) as Hispanic, 9 (6.0%) as Asian, 6 (4.0%) as black or African American, and 12 (7.9%) as multiple ethnicities or other. Finally, approximately half of the sample indicated some coursework at multiple institutions of higher education, with 71 students (47.0%) reporting having transfer credits.

Variables

The sole dependent variable in this study was iSkills score, which measures a person's information literacy competency in a digital environment. Six variables were chosen as possible predictors of iSkills score: gender, best language, ethnicity, transfer credits, grade point average, and significant courses. Data for three of these variables (gender, best language, and transfer credits) were self-reported in the demographic survey at the time of the assessment. For "best language," participants responded to the question, "What language do you know best?" "Transfer credits" refers to the number of credits earned at another higher-education institution that a student intends to apply toward a degree at this institution. Many students took courses at community colleges or other postsecondary institutions prior to or concurrent with enrollment at this institution.

Data for the remaining variables (ethnicity, grade point average, and significant courses) were retrieved from the official campus student information system. For ethnicity, students self-identified ethnicity upon enrollment using categories employed by the university. "Grade point average" (GPA) refers to the participant's cumulative grade point average on a 4-point scale for courses completed at this university, at the time of testing. "Significant courses" refers to the total number of research- and library-intensive courses at this institution that a student has previously completed or in which that student is enrolled at the time of testing. This number includes research-intensive courses in the teacher preparation program, in addition to general education and college-specific courses with significant library involvement, as determined by library involvement in course design, assignment design, and/or provision of one or more course-integrated library instruction sessions. This includes, for example, the required educational psychology course, as well as the second-year seminar course for which the librarian worked with a group of faculty members to develop the syllabus. This number excludes courses completed at other postsecondary institutions.
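Although the analysis code itself is not part of this article, a minimal sketch of the data preparation described in this section might look as follows in Python with pandas. The file name and column names here are hypothetical stand-ins, and the dichotomous recoding anticipates the coding described under Results/Analysis below.

```python
# A minimal sketch, not the author's actual code; file and column names
# are hypothetical stand-ins for the merged survey and student-system data.
import pandas as pd

df = pd.read_csv("iskills_study.csv")

# Dichotomous (0/1) recoding of the predictors (see Results/Analysis below)
df["gender"] = (df["gender"] == "male").astype(int)
df["best_language"] = (df["best_language"] != "English only").astype(int)
df["race"] = (df["ethnicity"] != "white").astype(int)
df["transfer_credits"] = (df["transfer_credit_hours"] > 0).astype(int)

# Screen described above: one score more than three standard deviations
# below the mean was removed from the analysis
m, s = df["iskills_score"].mean(), df["iskills_score"].std()
df = df[df["iskills_score"] >= m - 3 * s]

# Descriptive checks: share at or above the ETS cut score of 260, and the
# bivariate correlations summarized in table 1
print((df["iskills_score"] >= 260).mean())
print(df[["iskills_score", "gender", "best_language", "race",
          "transfer_credits", "cumulative_gpa", "significant_courses"]].corr())
```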
These variables were chosen to parallel the variables examined in a prior study of the information literacy skills of 93 first-year students at this same institution.34 In that study, Fabbi examined the potential impact of gender, best language, ethnicity, type of admission, high school grade point average, and number of honors classes and research assignments on participants' information literacy skills as measured by the iSkills assessment. The selection for the present study of the variables gender, best language, and grade point average intentionally parallels this earlier study. In place of admission type (that is, whether a first-year student was an alternate admit or exploring major), the variable of transfer credits was used. In place of number of honors classes and research assignments in high school, the current study used "significant courses" as defined above.

Results/Analysis

The purpose of this quantitative analysis is twofold: to get a better sense of the information literacy skills of these students, as measured by the iSkills assessment, and to determine whether certain variables are predictive of performance on that assessment. Descriptive statistics and correlations among study variables were examined to provide an indication of possible patterns and relationships in the data. Hierarchical linear regression analyses were conducted to examine the unique effects of demographic and academic variables as predictors of overall performance on the iSkills assessment.

Several of the predictor variables were recoded to be dichotomous. These variables included gender (0 = female, 1 = male), best language spoken (0 = English as best language, 1 = other language reported as best language), race (0 = Caucasian, 1 = other race reported), and transfer credits (0 = no transfer credits, 1 = has transfer credits). This coding for some of the variables differs from Fabbi's due to the composition of the sample in this study. Among this study group, for example, the number of individuals in different ethnic groups was too small for unique comparisons.

Table 1 presents descriptive statistics and bivariate correlations among variables. On the iSkills assessment, possible scores range from 0 to 500 in 10-point increments. The cut score, or minimum score for a test taker to be considered at a foundational level of ICT literacy skill, is designated by ETS as 260. For this sample, student scores ranged from 100 to 440, with a mean score below the cut score (M = 249.14, SD = 66.75). A total of 66 students, or 43.7 percent of the sample, received a score at or above the cut score of 260.

Participants' overall iSkills assessment score shared a statistically significant, positive correlation with transfer credits and cumulative GPA. This indicates that, as transfer credits and cumulative GPA increase, so do information literacy skills as measured by iSkills. Positive and significant correlations were also found between transfer credits and gender, as well as between transfer credits and cumulative GPA. These correlations indicated that males were more likely to have transfer credits in this sample and that students with transfer credits also had higher academic performance as measured by GPA. A negative relationship was found between transfer credits and significant courses. However, this relationship is to be expected, as more transfer credits would indicate fewer courses taken at the current institution.
For example, a transfer student will likely have completed the equivalent of the first- and second-year seminar courses, in which the library is heavily involved, at a different institution.

TABLE 1
Descriptive Statistics and Correlations among Study Variables

                         1       2       3       4       5       6       7
1. iSkills Score         –
2. Gender              –.07      –
3. Best Lang.          –.12    –.07      –
4. Race                –.10     .15    .39**     –
5. Transfer Credits     .20*    .19*   –.04    –.04      –
6. Cum. GPA             .19*   –.14     .02    –.06    .22**     –
7. Sig. Courses        –.06    –.12     .15    –.02   –.42**    .15      –
Mean                 249.14     .12     .11     .36     .47    3.35    1.53
SD                    66.75     .33     .32     .48     .50     .47    1.09
Note. **P < .01. *P < .05.

A hierarchical regression was also conducted, in an effort to determine the extent to which each variable might predict a participant's iSkills score. Regression analysis allows a researcher to remove or control for the effects of distinct variables. Hierarchical regression requires that the researcher make a decision regarding the likely significance of each of the variables and, accordingly, the order of the regression. This hierarchical regression was conducted in six steps, with each step adding a single variable to the model in an effort to isolate the relative contribution of each variable in predicting iSkills score after controlling for the effects of the prior variables. In this case, the order of the regression was modeled after Fabbi's. The first three predictors were demographic in nature; the remaining three predictors sought to describe the effects of academic history and performance on a student's information literacy skills as measured by the iSkills assessment, after accounting for demographic characteristics.

Regression results are depicted in tables 2 and 3. Analysis revealed that including academic variables in the models explained significant variance in iSkills score. Specifically, transfer credits were a statistically significant and positive predictor of iSkills assessment scores after controlling for demographic effects. The strength of the effect of transfer credits was lessened after including cumulative grade point average and significant courses but remained significant at the .10 level (β = .18, R² = .09). The value of R² indicates the proportion of variability in the dependent variable explained by the model as a whole; thus, while transfer credits have a statistically significant effect, the model including them (step 4) accounts for just 7 percent of the variance in iSkills scores. None of the other predictors had a significant effect on the models.

The order of regression means that the researcher predicted that the variables introduced in later steps of the regression (in other words, transfer credits, GPA, and significant courses) would be the most significant. Transfer credits were the only one of these found to be significant; however, it was a surprise that the relationship turned out to be positive, meaning that having transfer credits was correlated with a higher iSkills score. In fact, when significant courses were added to the model in the final step of the hierarchical regression, there was no measurable effect on the model (R² change = .00).

TABLE 2
Hierarchical Regression Analyses Predicting Performance on iSkills (N = 151)

            Step 1   Step 2          Step 3   Step 4             Step 5    Step 6
            Gender   Best Language   Race     Transfer Credits   Cum GPA   # Sign. Courses
R²          .01      .02             .03      .07                .09       .09
F           .78      1.54            1.26     2.77               2.73      2.26
R² Change   .01      .02             .01      .05                .02       .00
F Change    .78      2.30            .69      7.14**             2.47*     .01+
+P < .10; *P < .05; **P < .01

TABLE 3
Hierarchical Regression Coefficients for Variables Predicting Performance on iSkills (N = 151)

Step 1                     B      SE B      β
Gender                  –14.79    16.78   –.07

Step 2                     B      SE B      β
Gender                  –16.48    16.74   –.08
Best Language           –26.00    17.16   –.12

Step 3                     B      SE B      β
Gender                  –18.32    16.91   –.09
Best Language           –20.03    18.62   –.10
Race                    –10.32    12.38   –.07

Step 4                     B      SE B      β
Gender                  –26.52    16.84   –.13
Best Language           –18.68    18.25   –.09
Race                    –10.31    12.13   –.07
Transfer Credits         28.94    10.83    .22**

Step 5                     B      SE B      β
Gender                  –21.27    17.09   –.10
Best Language           –20.14    18.18   –.10
Race                     –8.50    12.13   –.06
Transfer Credits         24.52    11.14    .18*
Cumulative GPA           18.65    11.86    .13

Step 6                     B      SE B      β
Gender                  –21.28    17.15   –.10
Best Language           –19.86    18.52   –.09
Race                     –8.59    12.22   –.06
Transfer Credits         24.01    12.61    .18+
Cumulative GPA           18.94    12.35    .13
Significant Courses       –.49     5.67   –.01

+P < .10; *P < .05; **P < .01
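Because the article reports only the output, the following is a minimal sketch, not the author's actual procedure, of how a six-step hierarchical regression with the change statistics shown in tables 2 and 3 above could be computed with statsmodels. It assumes the hypothetical DataFrame df from the earlier sketch; the F change at each step is the incremental F test comparing the model at that step against the nested model from the previous step.

```python
# Sketch only: reproduces the form of tables 2 and 3, not the study data.
# Assumes the hypothetical DataFrame `df` prepared in the earlier sketch.
import statsmodels.api as sm

steps = ["gender", "best_language", "race",
         "transfer_credits", "cumulative_gpa", "significant_courses"]

previous = None
for i in range(1, len(steps) + 1):
    X = sm.add_constant(df[steps[:i]])            # predictors entered through step i
    model = sm.OLS(df["iskills_score"], X).fit()  # ordinary least squares fit
    r2_change = model.rsquared - (previous.rsquared if previous else 0.0)
    print(f"Step {i}: R2 = {model.rsquared:.2f}, R2 change = {r2_change:.2f}")
    if previous is not None:
        # Incremental F test: does this step explain significantly more
        # variance than the previous, nested model?
        f_change, p_value, _ = model.compare_f_test(previous)
        print(f"        F change = {f_change:.2f}, p = {p_value:.3f}")
    previous = model
```

The unstandardized coefficients (B) and standard errors in table 3 correspond to model.params and model.bse; statsmodels does not report standardized betas directly, so the β columns would require z-scoring the variables before fitting.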
In addition to the raw data, the researcher had access to two reports from ETS, which offer a comparison between the study group and a reference group. This ETS reference group is a sample of 642 college students (incoming freshmen and students transitioning to upper-level coursework) at 2- and 4-year programs, from 2006 through the time of the report in 2014. The Aggregate Task Performance Feedback Report shows the number and percentage of students who achieved the highest score for each of the components of the tasks in the assessment. The Institutional Skill Area Report shows the study group's performance in each of the iSkills assessment areas (which are, the reader will remember: define, access, evaluate, manage, integrate, create, and communicate) as compared to the reference group. Students who take insufficient time or complete fewer than four tasks in either section of the test are automatically excluded from the report. Both reports were run within the ETS administrator portal in the summer following test administration.

For the Institutional Skill Area Report, ETS provides the median percentage score on each skill area as compared to the reference group. This is calculated using the student's raw skill area score as a percentage of possible points for that skill area. This report is provided in an imprecise chart that does not reproduce well here, but it does provide a broad view of how students fared in each of the skill areas.35 For this cohort, students performed better than the reference group on the Evaluate skill area, comparably to the reference group on the Create and Communicate tasks, and less well than the reference group on the Define, Access, and Manage skill areas, with the ability to access information the lowest in comparison to the reference group.

The Aggregate Task Performance Feedback report provides detail regarding student performance on specific tasks within each of the skill areas. For example, on a task within the Evaluate skill area in which students were asked to evaluate a database to determine its usefulness for a project about opposing viewpoints, 78 percent of the participants (in contrast to 14 percent of the reference group) correctly evaluated the usefulness of a database without needing explicit criteria. This report identified student performance on each individual subcomponent of each of the fourteen scenario-based tasks.

Discussion

The primary goal of this study was to better understand the information literacy skills of this particular group of students.
The researcher works closely with students in this program and was able to make adjustments to her instruction as a result of these findings. The descriptive statistics were useful: notably, the mean score for these students in junior-level courses was below the cut score, and fewer than half of the students reached the cut score. This was expected given that prior research has noted a lack of information literacy skills among PK–12 teachers36 and is in keeping with anecdotal experience with students in this program.

The Institutional Skill Area Report was also useful in breaking down areas of student need into broad categories, indicating a need for explicit instruction related to defining an information need and accessing information. However, the Aggregate Task Feedback report was too explicitly tied to specific subtasks within the iSkills assessment to be as useful to the researcher. For example, a report indicating that a certain percentage of test takers had answered a single question within a seven-part task correctly while a different percentage had answered a similar question within a different task correctly was difficult to parse, especially given that ETS does not provide copies of the scenarios and test questions. Ultimately, the Institutional Skill Area Report was more useful for gaining a broad understanding of student skills for use in future semesters, as it provided insight into student performance on the seven skill areas rather than on individual test items.

Prior instruction in first- and second-year courses emphasized the evaluation of information, incorporating topics such as the difference between popular and scholarly sources or identifying criteria for the evaluation of information sources. The results of this study indicated that upper-level students are performing well in this area but less so in defining an information need and accessing that information. As a result, instruction in lower-division courses has been adjusted to more deliberately address those skill areas highlighted by the iSkills results.

It should be noted that the mean score for this study population (M = 249.14, SD = 66.75) is higher than that of the incoming freshmen studied by Fabbi two years prior (M = 207.85, SD = 58.18). This is not the same group of students, and, as Fabbi studied students still exploring majors, it is unknown which majors that group of students selected or in which courses they enrolled. Whether the higher iSkills scores are due to any particular factors in the two years between the two studies, such as student exposure to certain assignments or instruction, or whether any individual students' scores did or would have increased over time, is unknown.

The second goal of this study was to determine student demographic and academic characteristics that may predict performance on the iSkills assessment. In the previous study on which this study was modeled, Fabbi found a significant correlation between iSkills score and three variables: best language, cumulative GPA, and students' curricular track in high school. In this study, the researcher had hypothesized that GPA would be a significant predictor of score. A positive correlation was identified between the two (see table 1); however, the hierarchical regression is able to isolate the effect of individual variables while controlling for the effects of other variables. In this case, GPA was found not to have a significant effect.
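This point, that a bivariate correlation can vanish once another variable is controlled, may be easier to see with a toy simulation. The example below uses invented data, not the study data; the variable names and parameter values are arbitrary.

```python
# Toy simulation with invented data: "gpa" correlates with "score" only
# because both depend on "transfer", so its unique effect in the regression
# is expected to be near zero and nonsignificant.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 152                                             # same size as the study sample
transfer = rng.binomial(1, 0.47, n).astype(float)   # roughly the sample's mix
gpa = 3.1 + 0.5 * transfer + rng.normal(0, 0.3, n)  # GPA tied to transfer status
score = 230 + 50 * transfer + rng.normal(0, 65, n)  # only transfer moves the score

print(np.corrcoef(gpa, score)[0, 1])        # positive zero-order correlation
X = sm.add_constant(np.column_stack([transfer, gpa]))
print(sm.OLS(score, X).fit().pvalues)       # gpa's p-value: expected nonsignificant
```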
The researcher had also hoped that the number of significant courses taken by a student would be predictive of iSkills performance. The researcher had hypothesized that a positive correlation would be found between iSkills score and student enrollment in courses with a heavy research component and/or library involvement (significant courses). As shown in table 2, however, this was not the case. This does not mean that there is no correlation between library involvement and student performance. Limitations of this study should be acknowledged: the researcher did not have data on student performance in those courses, only whether a student had passed the course or not. Additionally, there were no data available on whether students actually attended a library session in any of the applicable courses. A student's potential lack of motivation in this kind of low-stakes testing37 must also be acknowledged.

The only variable found to be statistically significant after controlling for demographic effects was whether a student had transfer credits. These students earned higher scores on the iSkills assessment than students who had exclusively taken courses at UNLV. It was not surprising that many students in the sample would have transfer credits, as there are several other institutions within the Nevada System of Higher Education that offer cross-listed courses at lower cost. The researcher had anticipated that, if there was any effect, transfer credits would be a negative predictor, as students might experience inconsistent exposure to research and information literacy instruction if taking core courses at different institutions. Moreover, researchers have called for librarians to develop targeted programs to address the specific needs of transfer students.38 This unexpected result prompts additional questions that cannot be answered with the data gathered here. This university has a high number of first-generation college students: are students taking courses at community colleges before entering UNLV better prepared for college work? There is also a high number of nontraditional students at this university: are there characteristics of these older students and their exposure to life experience that affect their performance on this assessment? Age was not considered as a variable in this study. Additionally, the study design meant that focus group discussions were held in the week following testing to keep students' memory of the test experience recent. Had the initial quantitative analysis been performed before the focus groups, those discussions might have given the researcher the opportunity to explore some of the additional questions raised by the data.

Finally, it is important to note that in the two academic years since this study was conducted, the standards on which the iSkills assessment is based have been rescinded, effective June 2016, and will be removed from the ACRL website in 2017. Although the information literacy standards are widely used, they are also widely criticized.39 It has been noted that it is problematic to reduce information literacy to a list of skills,40 which is precisely what the iSkills assessment measures, as it was developed in alignment with the ACRL Standards.
The Standards have been criticized for their tendency to "promote the idea that information literacy is a universal, coherent, and consistent process that good students can master."41 Additionally, the Standards lack a social component that many consider to be an essential component of true information literacy.42 This researcher, and many others, consider the Standards to be a useful complement to more theoretical documents such as the Framework, but the organization has at this point committed to using the Framework, and the iSkills assessment is no longer being sold by ETS as of December 2016.43

Conclusion

This study explores one method of assessing undergraduates' information literacy skills. Although the hierarchical regression did not yield the correlations the researcher had predicted or hoped for, sharing the results of this study is important in exploring a potential approach to getting to know one's students. Likewise, although the standards on which this particular assessment is based are in the process of being replaced, the debate about the utility of identifying and measuring skills is likely to continue well into the future. With the Framework comes an expanded definition of information literacy as "the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning," a definition intended to "emphasize dynamism, flexibility, individual growth, and community learning."44 Identifying ways to assess these integrated abilities will challenge librarians and campus partners to take an integrated approach to information literacy and information literacy assessment. Librarians can and will find ways to assess aspects of information literacy threshold concepts within one-shot instruction sessions, but engaging with students and student work beyond the scope of these sessions will be required to meaningfully approach the concepts included in the Framework. For example, an assessment aligned with the Framework is currently under development, the Threshold Achievement Test for Information Literacy (TATIL), which aims to provide "data-driven insights" into "the information literacy capabilities of their students" as defined by the Framework.45

This study was undertaken to explore the information literacy skills of future teachers. This sample suggests that students in these courses have a distinct need for improvement in information literacy skills in general, with particular skill areas in need of attention. Teachers must be adequately prepared to be effective educators. Perhaps an integrated approach using information literacy threshold concepts will enable us to effectively address both the affective and cognitive domains of learning. That said, our teachers also need to be competent in specific skills. These are the teachers who will be guiding our nation's children until they reach us in our colleges and universities, working with these children to develop the skills they need to be successful in school and life. This study confirmed that, at least among this study population, work is still needed to improve the information literacy competency of our future educators.

Notes

1. American Library Association, "Presidential Committee on Information Literacy: Final Report" (1989), para. 3, available online at www.ala.org/acrl/publications/whitepapers/presidential [accessed 1 August 2017].
2. Michael Eisenberg, "Information Literacy: Essential Skills for the Information Age," DESIDOC Journal of Library & Information Technology 28 (2008): 39.
3. Heidi Julien and Susan Barker, "How High-School Students Find and Evaluate Scientific Information: A Basis for Information Literacy Skills Development," Library and Information Science Research 31, no. 1 (2009): 13.
4. Association of College and Research Libraries (ACRL), Information Literacy Competency Standards for Higher Education (2000), available online at www.ala.org/acrl/standards/informationliteracycompetency [accessed 1 August 2017].
5. Association of College and Research Libraries (ACRL), Framework for Information Literacy for Higher Education (2015), available online at www.ala.org/acrl/standards/ilframework [accessed 1 August 2017].
6. Ibid.
7. James W. Marcum, "Rethinking Information Literacy," Library Quarterly: Information, Community, Policy 72, no. 1 (2002): 21.
8. Suzanne Lipu, "A Flying Start for Our Future Teachers: A Comprehensive Information Literacy Program for Pre-service Education Students at the University of Wollongong, Australia," Behavioral & Social Sciences Librarian 22, no. 1 (2003): 47–73; Jennifer L. Branch, "Teaching, Learning and Information Literacy: Developing an Understanding of Pre-service Teachers' Knowledge," Behavioral & Social Sciences Librarian 22, no. 1 (2003): 33–46; Marlene M. Asselin and Elizabeth A. Lee, "'I Wish Someone Had Taught Me': Information Literacy in a Teacher Education Program," Teacher Librarian 30, no. 2 (2002): 10–18; Mark Emmons, Elizabeth B. Keefe, Veronica M. Moore, Rebecca M. Sánchez, Michele M. Mals, and Teresa Y. Neely, "Teaching Information Literacy Skills to Prepare Teachers Who Can Bridge the Research-to-Practice Gap," Reference & User Services Quarterly 49, no. 2 (2009): 140–50; Vanessa Earp, "Integrating Information Literacy into Teacher Education: A Successful Grant Project," Behavioral & Social Sciences Librarian 28, no. 4 (2009): 166–78.
9. Todd J. Wiebe, "The Information Literacy Imperative in Higher Education," Liberal Education 101/102, no. 4/1 (Fall 2015/Winter 2016): 54.
10. ACRL Information Literacy Competency Standards Review Task Force, Task Force Recommendations (2012), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/ils_recomm.pdf [accessed 1 August 2017].
11. Jennifer L. Fabbi, "Fortifying the Pipeline: A Quantitative Exploration of High School Factors Impacting the Information Literacy of First-Year College Students," College & Research Libraries 76, no. 1 (2015): 31–42.
12. National Governors Association Center for Best Practices, Council of Chief State School Officers, Common Core State Standards (Chicago: National Governors Association Center for Best Practices, Council of Chief State School Officers, 2010), available online at www.corestandards.org/read-the-standards/ [accessed 1 August 2017]; American Association of School Librarians, Standards for the 21st-Century Learner (2007), available online at www.ala.org/aasl/sites/ala.org.aasl/files/content/guidelinesandstandards/learningstandards/AASL_LearningStandards.pdf [accessed 1 August 2017].
13. Julien and Barker, "How High-School Students Find and Evaluate Scientific Information"; Jorden K. Smith, "Secondary Teachers and Information Literacy (IL): Teacher Understanding and Perceptions of IL in the Classroom," Library & Information Science Research 35, no. 3 (2013): 216–22.
14. Alison J. Head, "Learning the Ropes: How Freshmen Conduct Course Research Once They Enter College," in Project Information Literacy Research Reports: The Passage Studies (Dec. 5, 2013): 31, available online at http://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_2013_freshmenstudy_fullreportv2.pdf [accessed 18 May 2018].
15. Lesley Farmer, "Incorporating Information Literacy into Instructional Design within Pre-service Teacher Programs," in Professional Development and Workplace Learning: Concepts, Methodologies, Tools, and Applications (IGI Global, 2016), 339.
16. Harold Wenglinsky, How Teaching Matters: Bringing the Classroom Back into Discussions of Teacher Quality (Milken Family Foundation and Educational Testing Service, 2000), available online at https://www.ets.org/Media/Research/pdf/PICTEAMAT.pdf [accessed 1 August 2017].
17. Christina S. Doyle, Information Literacy in an Information Society: A Concept for the Information Age (Syracuse, N.Y.: ERIC Clearinghouse on Information & Technology, 1994), 4.
18. Council of Chief State School Officers, Interstate Teacher Assessment and Support Consortium (InTASC) Model Core Teaching Standards: A Resource for State Dialogue (2011), 14, available online at http://www.ncate.org/~/media/Files/caep/accreditation-resources/intasc-teacher-standards.pdf [accessed 18 May 2018].
19. Ibid., 8.
20. Dorothy Williams and Louisa Coles, "Teachers' Approaches to Finding and Using Research Evidence: An Information Literacy Perspective," Educational Research 49, no. 2 (2007): 185–206; Yu-Mei Wang, "Riding to the Future: An Investigation of Information Literacy Skills of Students at an Urban University as Applied to the Web Environment," International Journal on E-Learning 6, no. 4 (2007): 593–603.
21. Smith, "Secondary Teachers and Information Literacy."
22. Marlene Asselin and Ray Doiron, "Whither They Go: An Analysis of the Inclusion of School Library Programs and Services in the Preparation of Preservice Teachers in Canadian Universities," Behavioral & Social Sciences Librarian 22, no. 1 (2003): 19–32.
23. Warren F. Crouse and Kristine E. Kasbohm, "Information Literacy in Teacher Education: A Collaborative Model," Educational Forum 69, no. 1 (2005): 44–52; Thomas Scott Duke and Jennifer Diane Ward, "Preparing Information Literate Teachers: A Metasynthesis," Library & Information Science Research 31, no. 4 (2009): 247–56; Vanessa Earp, "Integrating Information Literacy into Teacher Education: A Successful Grant Project," Behavioral & Social Sciences Librarian 28, no. 4 (2009): 166–78; Jana Varlejs and Eileen Stec, "Factors Affecting Students' Information Literacy as They Transition from High School to College," School Library Research 17 (2014).
24. Elizabeth A. Lee, Brenda Reed, and Corinne Laverty, "Preservice Teachers' Knowledge of Information Literacy and Their Perceptions of the School Library Program," Behavioral & Social Sciences Librarian 31, no. 1 (2012): 3–22.
25. Marcia Stockham and Heather Collins, "Information Literacy Skills for Preservice Teachers: Do They Transfer to K–12 Classrooms?" Education Libraries 35, no. 1/2 (2012): 59–72.
Crouse and Kasbohm, “Information Literacy in Teacher Education”; Emmons et al., “Teaching Information Literacy Skills to Prepare Teachers”; Cindy Kovalik et al., “Information Literacy, Collaboration, and Teacher Education,” Communications in Information Literacy 4, no. 2 (2010): 145–69; Deborah M. Floyd, Gloria Colvin, and Yasar Bodur, “A Faculty–Librarian Col- laboration for Developing Information Literacy Skills among Preservice Teachers,” Teaching and Teacher Education 24, no. 2 (2008): 368–76. 27. Wang, “Riding to the Future.” 28. International ICT Literacy Panel, Digital Transformation: A Framework for ICT Literacy (2007): 2, available online at https://www.ets.org/Media/Tests/Information_and_Communication_ Technology_Literacy/ictreport.pdf [accessed 1 August 2017]. http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/ils_recomm.pdf http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/ils_recomm.pdf http://www.corestandards.org/read-the-standards/ http://www.corestandards.org/read-the-standards/ http://www.ala.org/aasl/sites/ala.org.aasl/files/content/guidelinesandstandards/learningstandards/AASL_LearningStandards.pdf http://www.ala.org/aasl/sites/ala.org.aasl/files/content/guidelinesandstandards/learningstandards/AASL_LearningStandards.pdf http://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_2013_freshmenstudy_fullreportv2.pdf http://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_2013_freshmenstudy_fullreportv2.pdf https://www.ets.org/Media/Research/pdf/PICTEAMAT.pdf http://www.ncate.org/~/media/Files/caep/accreditation-resources/intasc-teacher-standards.pdf https://www.ets.org/Media/Tests/Information_and_Communication_Technology_Literacy/ictreport.pdf https://www.ets.org/Media/Tests/Information_and_Communication_Technology_Literacy/ictreport.pdf Testing Future Teachers 623 29. “iSkillsTM Assessment Content,” Educational Testing Service, available online at https://www. ets.org/iskills/about/ [no longer available]. 30. “ETS Higher Education iSkills Assessment Fit with ACRL Standards,” Educational Testing Service, available online at https://www.ets.org/Media/Tests/ICT_Literacy/pdf/acrl_standards.pdf [accessed 28 July 2017]. 31. Boris Teske and Brian Etheridge, “Information and Communication Technology Literacy among First-Year Honors and Non-Honors Students: An Assessment,” Journal of the National Collegiate Honors Council 11, no. 1 (2010): 83–110; Jeanne M. Brown and Carrie A. Gaxiola, “Why Would They Try? Motivation and Motivating in Low-Stakes Information Skills Testing,” Journal of Information Literacy 4, no. 2 (2010): 23–36; Fabbi, “Fortifying the Pipeline.” 32. Brown and Gaxiola, “Why Would They Try?”; Fabbi, “Fortifying the Pipeline.” 33. Megan J. Oakleaf, “Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches,” portal: Libraries and the Academy 8, no. 3 (2008): 233–54. 34. Fabbi, “Fortifying the Pipeline.” 35. See https://www.ets.org/Media/Tests/ICT_Literacy/pdf/iSKills_Institutional_Skill_Sample_ Report.pdf [accessed 23 May 2018] for a sample of this report. 36. Williams and Coles, “Teachers’ Approaches to Finding and Using Research Evidence”; Wang, “Riding to the Future.” 37. Brown and Gaxiola, “Why Would They Try?” 38. John C. Phillips and Thomas A. Atwood, “Transferring Skills, Transferring Students: A Call to Academic Libraries,” College & Undergraduate Libraries 17, no. 4 (2010): 331–48. 39. Emily Drabinski, “Toward a Kairos of Library Instruction,” Journal of Academic Librarian- ship 40, no. 
39. Emily Drabinski, "Toward a Kairos of Library Instruction," Journal of Academic Librarianship 40, no. 5 (2014): 480–85; Nancy M. Foasberg, "From Standards to Frameworks for IL: How the ACRL Framework Addresses Critiques of the Standards," portal: Libraries and the Academy 15, no. 4 (2015): 699–717; James W. Marcum, "Rethinking Information Literacy," Library Quarterly: Information, Community, Policy 72, no. 1 (2002): 1–26; Christine Pawley, "Information Literacy: A Contradictory Coupling," Library Quarterly: Information, Community, Policy 73, no. 4 (2003): 422–52.
40. Sheila Webber and Bill Johnston, "Conceptions of Information Literacy: New Perspectives and Implications," Journal of Information Science 26, no. 6 (2000): 381–97; Bill Johnston and Sheila Webber, "Information Literacy in Higher Education: A Review and Case Study," Studies in Higher Education 28, no. 3 (2003): 335–52.
41. Foasberg, "From Standards to Frameworks," 706. See Foasberg for a thorough discussion of the criticisms of and debate around the Standards in the context of the recent transition to the ACRL Framework.
42. Foasberg, "From Standards to Frameworks," 706; Marcum, "Rethinking Information Literacy"; Benjamin R. Harris, "Communities as Necessity in Information Literacy Development: Challenging the Standards," Journal of Academic Librarianship 34, no. 3 (2008): 248–55.
43. "The iSkills™ Assessment," Educational Testing Service, available online at https://www.ets.org/iskills/about [accessed 28 July 2017].
44. ACRL, Framework.
45. Threshold Achievement Test for Information Literacy (TATIL), available online at https://thresholdachievement.com/the-test [accessed 28 July 2017].