The Evolution of Information Literacy Learning Outcomes in Interdisciplinary Undergraduate Science Courses
Melanie A. Gainey
Librarian, Biological Sciences and Biomedical Engineering
mgainey@andrew.cmu.edu
Neelam Bharti
Senior Librarian, Chemical Sciences and Engineering
nbharti@andrew.cmu.edu
Matthew R. Marsteller
Principal Librarian, Electrical & Computer Engineering, Mathematical Sciences, and Physics
matthewm@andrew.cmu.edu
Huajin Wang
Librarian, Biology and Computer Science
huajinw@cmu.edu
Sarah Young
Senior Librarian, Public Policy, Decision Sciences, and Statistics
sarahy@andrew.cmu.edu
University Libraries
Carnegie Mellon University
Pittsburgh, Pennsylvania
Michael Melville
Data Science Research Associate
Eberly Center for Teaching Excellence & Educational Innovation
Carnegie Mellon University
Pittsburgh, Pennsylvania
mmelvill@andrew.cmu.edu
Abstract
The ACRL Framework for Information Literacy presents opportunities for moving beyond ‘one-shot’ information literacy sessions and for creating a more scaffolded and embedded approach to instruction. We collaborated with faculty at Carnegie Mellon University to create Framework-inspired information literacy learning objectives for first-year and third-year science undergraduates and are continuously refining the objectives as the curriculum evolves. This article describes our learning objective design and refinement process, the challenges we encountered, and ideas for creating opportunities to embed information literacy into a curriculum. We also share our full activity lesson plans and assessment tool.
Introduction
Successful learning of information literacy skills can have a profound effect on lifelong knowledge development, whether students go on to industry or further academic study. For example, information literacy training for health workers has positively impacted patient care (Ayre et al. 2015), and engineering students are more confident about finding and evaluating information when information literacy is integrated into their curriculum (Baroutian & Kensington-Miller 2016; Phillips & Zwicky 2018). However, it can be challenging for librarians to secure opportunities for information literacy instruction in the classroom, particularly in the science and engineering disciplines (Bury 2011; Pinto 2016). The works by Bury and Pinto also point the reader to earlier work on faculty attitudes toward information literacy.
Gaining opportunities to teach science information literacy likewise remained a challenge at Carnegie Mellon University until recently, when the Mellon College of Science (MCS) began developing a new core curriculum that offered opportunities for collaboration with liaison librarians from the University Libraries, with active learning techniques as a specific request. We were invited to collaborate with MCS faculty on two courses, EUREKA! Discovery and Its Impact (hereafter, EUREKA!) and PROPEL: Preparation, Readiness, and Optimization for Professional Excellence in Life (hereafter, PROPEL). This article reviews the background literature, describes our lesson plan development and class delivery, reflects on how the ACRL Framework informed our learning objectives, presents our assessment of student learning, and discusses the refinements we made based on student and faculty feedback. Because the courses continue to change and the role of the liaison librarians has varied, we also describe how we adapted our instruction over time.
Carnegie Mellon University is a private, not-for-profit, doctoral-granting, very-high-research-activity university, with approximately 6,000 undergraduate and 7,500 graduate students. This study focuses on the undergraduate student body of MCS. Details on MCS are given below in the “Course Introduction and Lesson Design” section of the paper.
Literature Review
We reviewed the recent literature on the use of the ACRL Framework for Information Literacy (American Library Association 2015) in the development of learning outcomes, faculty-librarian collaboration, and examples of course-embedded librarianship, in the context of undergraduate science education. The following paragraphs highlight some of the recent literature addressing these topics in an effort to frame our own experience within a broader context.
Much has been written about the ACRL Framework for Information Literacy for Higher Education (hereafter, Framework) since its adoption in 2016, and the challenges in operationalizing the Framework as actionable learning outcomes, particularly in a disciplinary context. Hosier (2017) discusses the differences between threshold concepts and learning outcomes in the context of the Framework and provides a process for achieving this translation. Kuglitsch (2015) provides a useful discussion of the importance of placing the threshold concepts, or "frames," of the Framework into a disciplinary context to facilitate learning, keep students engaged, and develop a more nuanced understanding of information literacy (IL) concepts. For example, she discusses the first ‘frame’ of the Framework, “Authority is constructed and contextual,” in the context of science disciplines. Ideas related to peer review, reproducibility, and research ethics can be effectively addressed with this frame in mind.
Beyond the challenges in translating the Framework into learning outcomes, several authors have discussed the potential merits of the Framework as an effective foundation for discussing information literacy with faculty. For example, Guth et al. (2018) offer a useful recent review of the literature on faculty perceptions of information literacy. They note that the Framework is well suited for disciplinary discussions of information literacy, which may resonate more deeply with faculty than more generic approaches to IL. In their survey of faculty perceptions of the Framework, the frames were viewed as important objectives of student learning, but the use of library jargon and a lack of clarity were noted as concerns. Sloane et al. (2018) discuss their use of the Framework in curricular development and instructional design for the undergraduate sciences. They note the merits of the Framework for fostering librarian-faculty collaborations and its versatility across instructional contexts and diverse assignments.
Franzen and Bannon (2016) offer an earlier example of applying the Framework within the context of faculty-librarian collaboration in a health sciences library. They worked with faculty to construct a curriculum map tied to the ACRL Information Literacy Competency Standards for Higher Education, the predecessor of the Framework (Association of College and Research Libraries 2000). While these Standards were rescinded and replaced by the Framework, Franzen and Bannon note that the Standards aligned well with the evidence-based practice model used in research and decision-making in the health sciences. The Framework, however, served to enrich the librarians’ teaching and offered added flexibility in addressing complex concepts of information use. For example, in learning how to evaluate information, students compared trade journals to academic journals. The Standards guided them in identifying characteristics of these two types of resources, such as language formality and the number of references, while applying the Framework’s concept of Information Creation as a Process prompted more in-depth discussion of the contextual dependence of authority.
The Framework has also been useful in challenging librarians to think beyond the traditional ‘one-shot’ approach to information literacy. Teaching to the Framework necessitates a more embedded approach, as the concepts are more nuanced, transformative, and speak to a higher level of cognitive function than that required for simply seeking and finding information (see Black & Allen 2017 for more about cognitive development and the Framework). Thus, opportunities for collaborating with faculty more deeply, and integrating IL more fully into the curriculum are ever more important in achieving information literacy as students develop as scientists and scholars.
Moreover, reaching all undergraduates multiple times throughout the curriculum, in a way that allows a layered or scaffolded approach to IL education, is a perennial challenge for libraries. Examples in the literature of scaffolded approaches to IL in STEM curricula include Ferrer-Vinent (2016) and Franzen and Bannon (2016).
Methods
Course Introduction and Lesson Design
In 2015, MCS introduced a new Core Education program that combines academic skills with personal and professional development. The backbone of the curriculum is two courses, EUREKA! and PROPEL, which are required for all Biological Sciences, Chemistry, Physics, and Mathematical Sciences majors. The goal of these classes is to support the holistic development of the students into scientists at defined points throughout the curriculum.
In EUREKA!, first-year MCS students learn foundational skills that can be applied to all four science majors, such as teamwork, effective and efficient research, and community engagement. Approximately 200 students attend a lecture at the beginning of each week, followed by 12 smaller recitation sections of about 15 students later in the week. The recitation sections allow the students to review and consolidate their learning of the lecture material in a smaller and more interactive class setting. The recitation sections are a mix of all four majors and undeclared students, and thus content is largely interdisciplinary.
PROPEL, a third-year course, focuses on the interplay of science and society, with an emphasis on ethics and entrepreneurism. The class is built around a semester-long interdisciplinary theme and team project that gives the students the opportunity to explore and use a wide breadth of resources. For example, in 2018, the theme was climate change, and the final project was a grant proposal to address a science or technology issue in the local community. Students are organized by major for the recitation sections, which allows for the introduction of disciplinary content.
In the summer of 2017, we had conversations with MCS faculty that led to the idea of dedicated IL sessions taught by science librarians in the EUREKA! and PROPEL classes. We were invited to create a lecture and recitation lesson plan for each course. We delivered the lectures, but because of the large number of recitation sections in the week following each lecture, we were unable to deliver that instruction ourselves and instead created a recitation lesson plan for the teaching faculty to deliver. While this limited our direct interaction with students, it allowed us to reach a much larger number of students than we otherwise could have and created a unique opportunity for collaboration with teaching faculty.
Both EUREKA! and PROPEL initially contained many guest lectures delivered by various campus units including the University Libraries. In the most recent iteration of PROPEL in Spring of 2019, the course evolved to move away from having many guest lectures to focusing more on the grant proposal project. Accordingly, the library's role transitioned into providing guidelines and instructional materials for deliverables that were incorporated into the grant proposal, such as organizing literature and compiling the bibliography using Mendeley (https://osf.io/rxy5m/). In this way, the library's contribution tied more directly into students’ work.
The MCS curriculum was designed in consultation with the Eberly Center for Teaching Excellence & Educational Innovation (hereafter, Eberly Center), Carnegie Mellon University’s teaching and learning hub, and is shaped by the latest research on learning. The classes therefore make heavy use of active learning, interactive technology, and team-based work. MCS uses feedback from students and the Eberly Center to refine the curriculum each year, and the structure of EUREKA! and PROPEL, in particular, continues to evolve. For example, in 2019, EUREKA! will largely operate on a flipped-classroom model. The evolving and experimental nature of the courses has presented both opportunities and challenges for our lesson design and delivery.
Information Literacy Learning Objectives
In our initial conversations with the MCS faculty in the summer of 2017, we introduced the Framework, which aligned well with the scaffolded and holistic nature of the class and proved to be a useful point of reference. The full set of learning objectives and activities for each iteration of EUREKA! and PROPEL can be found in the Appendix.
We created scaffolded learning objectives inspired by the Framework, with the third-year objectives building upon those of the first year (Table 1). Here, we discuss a couple of examples.
Table 1. Scaffolded information literacy learning objectives for EUREKA! and PROPEL, mapped to ACRL frames.

ACRL Frames | EUREKA! (First Year) | PROPEL (Third Year) |
---|---|---|
Searching as Strategic Exploration; Research as Inquiry | Construct effective database search strategies; Identify and access discipline-specific scholarly databases | Locate and integrate information from a range of resource types |
Scholarship as Conversation | Describe how scholarly information is organized and discovered | Summarize the changes in scientific knowledge over time on a particular topic |
Authority Is Constructed and Contextual | Describe the difference between scholarly and popular resources | Recognize that authority can be defined differently depending on context and discipline |
Authority Is Constructed and Contextual; Information Creation as a Process | Describe the peer-review process | Critique and evaluate study design and claims |
Drawing from the Searching as Strategic Exploration Frame, one of our first-year learning objectives was to identify, access, and construct effective search strategies in disciplinary scholarly databases. We considered this to be a foundational skill that would prepare the students for more advanced and exploratory searching in the third year, so we used an exercise called Speed Databasing (Chisnell & MacGregor 2018) to give students the opportunity to use a number of databases. The corresponding learning objective in the third-year class was to locate and integrate information from a variety of resource types, including journal articles, patents, policy documents, and financial reports. We used a role-playing exercise (https://osf.io/kjbws/) in which the math students, for example, read an article on the mathematical modeling used to create the new congressional districts in Pennsylvania and were asked to advise a state senator on the pros and cons of the redistricting, using a number of information sources to back up their arguments.
We also scaffolded learning objectives related to the Authority Is Constructed and Contextual Frame. Our first-year learning objective was to describe the peer-review process and its role in creating credibility. In the corresponding Mystery Article exercise (https://osf.io/nhxc9/), students assess the credibility of an article pulled out of an envelope, with peer review being one of the criteria. The related third-year learning objective, to critique and evaluate study design and claims, was intended to foster more nuanced and critical thinking about the authority of peer review. We used the above-mentioned role-playing exercise to encourage the students to read peer-reviewed literature with a critical lens and to be aware of issues such as article retraction and reproducibility of findings. Another related third-year learning objective is to recognize that the authority of non-peer-reviewed types of information, such as market reports, patents, and policy documents, depends on context and discipline.
Results and Discussion
Information literacy is an essential skill for college students, but it is challenging to incorporate it seamlessly into the curriculum and demonstrate its importance and relevance when the course load for learning subject-specific knowledge is already heavy. The MCS Core Education program provided an excellent opportunity to identify the challenges we face and optimize teaching objectives and methods to address these challenges. Moreover, working closely with teaching faculty allowed us to evaluate student learning and make adjustments iteratively.
Process for Refining the Course: Assessment Tool
We used a two-pronged approach for refining our learning objectives: developing an effective assessment instrument and collaborating closely with faculty. Over the course of our involvement with the EUREKA! and PROPEL courses, we transitioned from subjective assessments that rely on self-reporting of prior knowledge and attitudes to an objective, counterbalanced pre- and post-assessment of learning gains that allows us to take a data-driven approach to redesigning lesson plans (see https://osf.io/vxfte/ for all assessment instruments).
In our first iteration of EUREKA!, we conducted a pre-assessment of prior knowledge, but we were unable to compare the answers to anything on the post-assessment. We did not have a pre-assessment for the first iteration of PROPEL. The post-assessments for both of these courses included self-reported prior knowledge and student attitudes on the value of the lessons.
One of our primary challenges was calibrating the content for a large number of students with differing exposure to IL, depending on their major and high school education. While observing the activities, we saw that some were too easy and failed to keep the students engaged, while others assumed too much prior knowledge. These observations varied considerably between recitation sections and the disciplines of the students, and they were supported by a large spread in the value the students ascribed to the exercises in the post-assessment. It was therefore hard to draw conclusions from the results and use them to inform our redesign of course objectives and activities. We were also concerned about the reliability of the students’ self-reported prior knowledge, since students often overstate their understanding of concepts (Dunning 2007; Ambrose et al. 2010). We thus decided to redesign our assessment for the second iteration of EUREKA! to avoid the biases associated with students’ self-reporting of knowledge and value judgments.
We collaborated with the Eberly Center to create a learning assessment that would allow us to statistically measure students’ learning gains. We used a counterbalanced assessment design, which is commonly used to control for order effects in repeated-measures experiments in psychology and social science research (von Davier et al. 2004). In this context, it controls both for differences in question difficulty between the pre- and post-assessments and for order effects. We first created two versions of the assessment, A and B, whose questions were of the same type (e.g., true/false, multiple choice) and tested the same concepts but were worded differently. We then gave half of the students Assessment A at the beginning of the lecture, followed by Assessment B at the end of the recitation section later that week; the other half took Assessment B first, followed by Assessment A. The assessments were integrated into Canvas, the university’s learning management system, and the students received immediate feedback.
The students on average scored 61.1% correct on the pre-assessment and 75% correct on the post-assessment, which corresponded to a significant learning gain of 12.9% from pre-assessment to post-assessment (Figure 1) (paired Student’s t-test; p < .001). Although 202 students participated in the class, only the 182 students who took both the pre- and post-assessments were included in the analysis. Student performance on the two versions of the assessment was similar. To control for students who merely clicked through the assessment, we removed 35 responses with suspicious timestamps and found no change in the results.
Figure 1. EUREKA! student performance and learning gain on pre- and post-assessments. Results are significant; n=182; p < .001, t (181) = 10.71, Cohen’s d = 1.04; error bars are standard errors of the mean.
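For readers who want to reproduce this kind of analysis, the calculation itself is simple. The following is a minimal sketch in Python, not our actual analysis script; the file name and column names are hypothetical placeholders for a per-student export of pre- and post-assessment percent-correct scores.

```python
# Minimal sketch of the paired pre/post analysis (illustrative only).
# Assumes a hypothetical CSV with one row per student who completed both
# assessments and columns "pre_pct" and "post_pct" with percent-correct scores.
import pandas as pd
from scipy import stats

scores = pd.read_csv("eureka_assessment_scores.csv")  # hypothetical export

pre = scores["pre_pct"]    # percent correct on the pre-assessment
post = scores["post_pct"]  # percent correct on the post-assessment
gain = post - pre          # per-student learning gain in percentage points

# Paired Student's t-test: the same students took both assessments
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired samples: mean difference over the SD of the differences
cohens_d = gain.mean() / gain.std(ddof=1)

print(f"n = {len(scores)}, mean gain = {gain.mean():.1f} points, "
      f"t({len(scores) - 1}) = {t_stat:.2f}, p = {p_value:.3g}, d = {cohens_d:.2f}")
```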
Next, we determined the learning gain associated with each learning objective (Table 2). The learning gain for each objective was calculated by averaging the learning gains for all questions that mapped onto that objective. Students had the smallest learning gains for the Identify Scholarly Information (9.7%) and Describe the Peer-review Process (10.3%) learning objectives, and larger learning gains for the following objectives: Identify the Leading Databases, Their Features, and Special Uses (13.5%); Identify Primary and Secondary Information Sources (13.8%); and Identify Peer-reviewed Literature (21.3%). We do not have learning-gain data for learning objectives that had no related assessment questions, such as Explore Database Structure and Understand Basic Search Technique and Strategy, a limitation that is discussed further below (see Table 3).
Table 2. Learning gain for each learning objective.a

Topic | Learning Objectives | Learning Objective Learning Gain (%) |
---|---|---|
Scholarly Information | Identify Scholarly Information | 9.7 |
Scholarly Information | **Identify Primary and Secondary Information Sources** | 13.8 |
Peer Review | Describe the Peer-review Process | 10.3 |
Peer Review | **Identify Peer-reviewed Literature** | 21.3 |
Databases | Explore Database Structure | N/A |
Databases | **Identify the Leading Databases, Their Features, and Special Uses** | 13.5 |
Search Strategies | Understand Basic Search Technique and Strategy | N/A |
Search Strategies | Build Advanced Search Strings | -10.3 |

a Learning objectives that had a greater than 10% learning gain are indicated in bold. The learning objective-level learning gains were calculated by averaging the learning gains for all questions that mapped onto each learning objective.
One limitation of our assessment is that we did not align the same number of assessment questions to each learning objective, with the number of questions per objective ranging from zero to three (Table 3). We also did not dedicate the same amount of lecture time to each objective, with the content and in-class activities related to Build Advanced Search Strings largely being skipped due to time. Thus, low learning gains or even a negative learning gain in the case of Build Advanced Search Strings (-10.3%) might be attributed to not dedicating enough time or active learning exercises to these concepts. Poorly worded questions can also lead to low learning gains, and we will work with the Eberly Center to optimize both the wording and the assessment alignment for the next iteration.
To further explore why some learning objectives were associated with low learning gains, we calculated the percent learning gain for each assessment question. The question-level learning gain is the difference (post minus pre) in the percentage of students who answered the question correctly on the post-assessment versus the pre-assessment. For each question, the percentage of students answering correctly was averaged across both versions of the question.
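To illustrate the calculation, the short Python sketch below (illustrative only, not our analysis script) reproduces the question-level gains and the objective-level average for the Identify Scholarly Information objective, using the percent-correct values reported in Table 3.

```python
# Illustrative calculation of question- and objective-level learning gains,
# using the percent-correct values reported in Table 3 for one objective.
import pandas as pd

questions = pd.DataFrame({
    "objective": ["Identify Scholarly Information"] * 3,
    "pre_pct_correct": [88.0, 47.8, 53.0],   # already averaged across versions A and B
    "post_pct_correct": [95.1, 64.0, 58.8],
})

# Question-level learning gain: post minus pre percent correct
questions["gain"] = questions["post_pct_correct"] - questions["pre_pct_correct"]
print(questions["gain"].round(1).tolist())       # [7.1, 16.2, 5.8]

# Objective-level learning gain (cf. Table 2): average of the question-level gains
objective_gain = questions.groupby("objective")["gain"].mean().round(1)
print(objective_gain)                            # Identify Scholarly Information: 9.7
```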
We found that in most cases, low learning gains were likely caused by a ceiling effect, whereby students scored high on the pre-assessment questions (>85%) and left little room for improvement on the post-assessment. For example, 88% of students correctly answered the following pre-assessment question: Scholarly articles are written by: (A) Experts in the field (B) Journalists (C) Editor of the journals as editorials (D) Scientific reporters. Although a high percentage of students, 95%, correctly answered the same question in the post-assessment, this corresponded to only a 7.1% learning gain due to the high prior knowledge of the students.
There were questions with a potential ceiling effect for three topics (Scholarly Information, Peer Review, and Search Strategies), suggesting that in future iterations of EUREKA! we can prioritize more advanced topics in these areas of IL. The assessment also allowed us to identify building advanced search strings as an area of weakness for the students, one that should have more time and active-learning exercises dedicated to it.
Table 3. Assessment questions, student performance, and learning gains for each learning objective.a

Learning Objective | Assessment Question (Version A) | Assessment Question (Version B) | % Correct, Pre-Assessment | % Correct, Post-Assessment | Learning Gain (%) |
---|---|---|---|---|---|
Identify Scholarly Information | Scholarly articles are written by: (A) Experts in the field (B) Journalists (C) Editor of the journals as editorials (D) Scientific reporters | Scholarly articles are typically reviewed by: (A) Community (B) Experts in the field (C) Scientific reporters (D) Editor of journal | 88 | 95.1 | 7.1 |
Identify Scholarly Information | Select the resources you can use for scholarly research (Select all that apply): (A) Patents (B) Scientific magazine articles (C) Standards (D) Editorials | What are the most common resources besides scholarly articles that can be used for research? (A) Patents (B) Standards (C) Conference papers (D) All of the above | 47.8 | 64 | **16.2** |
Identify Scholarly Information | What are two of the factors that can help you determine the credibility of an article? (A) The sponsor of the article and the date of publication (B) The author's credentials and contact information (C) The sponsoring organization and whether or not it’s an unbiased source (D) Its recency and verifiability | What are factors that can help you judge the accuracy of an article? (A) Date of publication and relevant information (B) Information relevant to your research and cited sources (C) Peer reviewed and date of publication (D) All the above | 53 | 58.8 | 5.8 |
Identify Primary and Secondary Information Sources | Which one of the following is both a secondary information resource and a scholarly resource: (A) Newspaper article citing a journal article (B) Annual report (C) Review article (D) Research article | Which one of the following information resources is a primary scholarly resource for research: (A) Author’s Diary (B) Technical reports (C) Review article (D) Research article | 51 | 64.8 | **13.8** |
Describe the Peer-review Process | A peer reviewer is typically an expert in the same field as the authors. (T/F) | A peer reviewer will evaluate the quality of the science in a paper. (T/F) | 83 | 88.5 | 5.5 |
Describe the Peer-review Process | The purpose of peer review is to: (A) Provide quality control of published literature (B) Form collaborations with others in the same field (C) Determine whether the article should be made freely available to the public | Which of the following statements about peer review is TRUE: (A) Reviewers should not evaluate whether the scientific findings in an article are novel or innovative (B) Reviewers typically evaluate articles in their own discipline (C) Peer review will catch all of the mistakes that an author might have made when writing their article | 79 | 94 | **15** |
Identify Peer-reviewed Literature | Which article type is NOT peer reviewed? (A) Articles that are published in peer-review journals (B) Review articles (C) Editorials published in a scholarly journal | Which of the following is NOT a useful question to ask when determining whether an article is peer reviewed: (A) What type of article is this? (B) Is it published in a peer-reviewed journal? (C) Was it published within the last ten years? | 42 | 53.3 | **11.3** |
Identify Peer-reviewed Literature | All peer-reviewed articles say “Peer Reviewed” under the list of authors. (T/F) | All articles in a peer-reviewed journal are peer reviewed. (T/F) | 16 | 47.3 | **31.3** |
Explore Database Structure | None | None | | | |
Identify the Leading Databases, Their Features, and Special Uses | Match the following databases to the discipline that they cover. Databases: PubMed, SciFinder, MathSciNet, Inspec, Web of Science; Disciplines: Biology, Chemistry, Math, Physics, Interdisciplinary | Which of the following databases is the most interdisciplinary? (A) Inspec (B) SciFinder (C) Web of Science (D) PubMed (E) MathSciNet | 53.9 | 82.1 | **28.2** |
Identify the Leading Databases, Their Features, and Special Uses | When searching for a specific research topic, Google Scholar compared to research databases: (A) Uses artificial intelligence to search, hence is better (B) Has fewer options for filtering search results (C) Returns more accurate research result, hence is better | When searching for a specific research topic, literature databases as compared to Google: (A) are more likely to find journal articles (B) collect information from less sources, hence are not as good (C) tend to return less results, hence not as good | 88 | 94.5 | 6.5 |
Identify the Leading Databases, Their Features, and Special Uses | Citation databases are a good tool for finding literature related to an article that you have already located. (T/F) | One advantage of research databases over Google Scholar is the ability to easily analyze your search results. (T/F) | 91 | 96.7 | 5.7 |
Understand Basic Search Technique and Strategy | None | None | | | |
Build Advanced Search Strings | The search string flu vaccine* is equivalent to flu vaccine. (T/F) | The search string (flu OR influenza) AND effectiveness is equivalent to flu OR (influenza AND effectiveness). (T/F) | 51 | 40.7 | -10.3 |

a Assessment questions that had a greater than 10% learning gain are indicated in bold. Correct answers are bolded.
Process for Refining the Course: Collaboration with Faculty
We have designed dedicated information literacy instruction for first- and third-year students, but what instruction, if any, do they receive in this area in their second year? We next wanted to better understand the curriculum as a whole for MCS students; this proved challenging because the students belong to four majors with different required coursework, and MCS is continually working to improve its core curriculum. We therefore engaged with the faculty and the Eberly Center to understand the information literacy competencies of the students in each discipline as they progress through the program. We learned that each of the four majors has a required Sophomore Colloquium that emphasizes professional development, suggesting another opportunity for embedding information literacy into the curriculum.
We are also working to improve the scaffolding of our learning objectives with those of a required online module on computing skills for incoming students, Computing@Carnegie Mellon (C@CM). While the learning objectives of C@CM and our first-year class are distinctly different, with C@CM focusing on how to use the library catalog and interdisciplinary databases, the instructors suggested that the students had a difficult time connecting the content between the two. Moreover, student feedback suggested that most first-year students had already taken C@CM at the time of our EUREKA! session and felt the material was repetitive. In response to that feedback, we designed a survey to evaluate how well students retained what they learned from C@CM and modified lecture content to naturally transition into deeper content, while building connections to what they had already learned and understood. We also now explicitly explain to students how the concepts are related and build upon each other, as well as provide real-world applications.
Faculty also shared that they observe an over-reliance on Google throughout the entire four-year curriculum. In response, we designed an active-learning exercise for the lecture, “Google versus Web of Science,” to demonstrate the value of searching the Web of Science database compared to Google. We will continue to use this iterative and collaborative approach to calibrating our learning objectives to the students’ competencies as the MCS curriculum continues to evolve.
Too much content, too little time
One of the challenges we encountered was trying to address a wide breadth of information literacy concepts, in an interdisciplinary albeit science-focused context, with only limited face time with students, a perennial problem in library instruction. This was made more difficult in EUREKA! by the fact that the instruction was not clearly tied to a graded course assignment, as discussed further below. For the EUREKA! recitations in our first year of involvement with the class, we designed four separate activities. Although the individual activities were well received, the full set was too much for students to finish in 60 minutes of class time. Moreover, because these recitation lesson plans were delivered by faculty, the faculty themselves were likely overwhelmed by having to teach so many activities on topics they may not have been fully familiar with, at least from a pedagogical perspective. The second time around, we streamlined the recitation lesson plan into one activity, “Along the Graphene Trail,” which integrated multiple learning objectives into one set of scavenger-hunt-style questions (see https://osf.io/j9hv7/). The content and expected outcomes were written with clear instructions for teaching faculty to lead the recitation. The faculty leading recitations indicated that this change was well received and that students were engaged with the assignment.
Similarly, in our second iteration of EUREKA!, we tried to pack four different topics (scholarly vs. non-scholarly articles, the peer-review process, science databases, and advanced database search strategies) into a 90-minute lecture, with at least four active-learning activities. This turned out to be too much content, especially given the large class setting. We had to rush some content and skip some important in-class active-learning activities, including Google vs. Web of Science, an activity intended to help students appreciate the power of structured searching in literature databases as compared to Google (https://osf.io/ag4x5/).
The interdisciplinary nature of MCS also made lesson delivery challenging. This is especially true for PROPEL: most students have declared a major by this stage, and a topic that interests Biology students may not interest Physics students. In the second iteration of EUREKA!, we tried to demonstrate searches in databases from each discipline, but were short on time. An additional challenge is creating recitation lesson plans for the many teaching faculty who lead them; while some of the teaching faculty are familiar with the resources covered in the lesson plans, others are not.
Incentivizing student learning without graded assignments
EUREKA! and PROPEL are required courses with minimal credit hours, and therefore instructors are not allowed to assign long assignments or quizzes outside of class. We have invested a lot of effort into maximizing our in-class time, but not being able to use assignments as an incentive can make it difficult to engage students.
To address this challenge, we integrated multiple active-learning exercises into the lecture and recitation sessions. The activities worked well during recitations, which have class sizes of around 20 students. For example, 74% of students in the first year of PROPEL reported in the attitudes post-assessment that they found the role-playing exercise useful for learning information literacy concepts. However, the same is difficult to achieve in a large class setting with 200 students. We found that Think-Pair-Share did not work well because many students did not engage in the exercise. Moreover, with exercises using DirectPoll (https://directpoll.com), where everyone can enter their answers in real time on their mobile devices, students lost interest after too many polling questions. We therefore need to be careful about the design and pacing of polling questions in future semesters.
Conclusion
We have presented here an example of collaborating with faculty to provide information literacy instruction to undergraduates in the sciences, addressing multiple learning outcomes. We used the Framework to inform learning outcome development and to support conversations with faculty related to incorporating information literacy in two required interdisciplinary lecture courses, occurring in the first and third years. Through the process of assessment and faculty collaboration, we adapted our learning outcomes to meet the needs of this continually evolving curriculum and to work toward a scaffolded learning approach.
As we anticipate continued library involvement in this program, we aim to further build on this work to improve students’ use and assessment of information as they develop as science scholars. For example, given the challenges encountered in ‘handing off’ lesson plans and materials to faculty for recitations, we plan to move toward a ‘train-the-trainer’ model in which recitation faculty and teaching assistants will receive a brief training session regarding the proposed recitation lesson. This could be delivered by a librarian, but we believe a peer-to-peer approach would be more effective with a faculty member providing the training with librarian support. This pre-recitation training will help improve consistency across recitation sections and will give faculty an opportunity to ask their own questions about the content.
In future iterations of EUREKA!, we will also take advantage of the flipped-classroom model that is increasingly being used to teach information literacy at other institutions (Rodriguez 2016; Dommett 2018; Shen 2018). In line with the continual evolution of this course, the teaching faculty have decided to move away from the guest-lecture approach, with the aim of spending most of the in-class time on hands-on activities. Thus, we are currently developing brief modules using an online learning platform that students will be required to complete prior to the lecture, allowing more class time for questions and active learning, as well as providing a deeper pre-session assessment opportunity. These modules may additionally have applications in other curricula across campus.
In addition to improved class-based assessment, we have an opportunity to carry out longitudinal assessments, since the same cohort of students is receiving instruction in both the first and third year. We plan to work with the Eberly Center to design assessment and benchmarking tools that will measure student learning across the curriculum. As discussed above, we are also seeking opportunities in the second year to reach students with information literacy training, bridging the gap between EUREKA! and PROPEL.
Finally, we found the Framework to be a useful tool in our initial conversations with faculty regarding course learning objectives and in the development of discipline-oriented scaffolded learning outcomes. The Framework has also been flexible and adaptable enough to support the evolving nature of the course and our information literacy objectives. We intend to continue to use the Framework as a foundation as we develop our online modules and other new activities and lesson plans to support the goals of the faculty as they adapt and improve the curriculum.
The experience presented in this paper highlights a variety of ways that librarians can provide course-based information literacy instruction. We encourage others to use and adapt the many active-learning activities that we developed for this course, which we provided on the Open Science Framework (https://osf.io/vxfte/), but also to consider other roles beyond the ‘sage on the stage’ and the many ways students can be reached as a result of deep collaboration with faculty.
Acknowledgements
We thank Ken Hovis for the opportunity to teach in the EUREKA! and PROPEL classes and feedback on lesson plans, Jill Chisnell and Teresa MacGregor for support in using their Speed Databasing activity, and Alicia Salaz for early support and help making connections.
References
Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., Norman, M.K. & Mayer, R.E. 2010. How Learning Works: Seven Research-Based Principles for Smart Teaching. 1st ed. San Francisco (CA): Jossey-Bass.
Ayre, S., Barbrook, J., Engel, C., Lacey, P., Phul, A., Stevenson, P. & Toft, S. 2015. Measuring the impact of information skills training: A survey of health libraries in England. Health Information and Libraries Journal 32(1): 50–60. DOI: 10.1111/hir.12079.
American Library Association 2015. Framework for Information Literacy for Higher Education. Available from http://www.ala.org/acrl/standards/ilframework.
Association of College and Research Libraries 2000. Information Literacy Competency Standards for Higher Education. Available from http://www.ala.org/acrl/standards/informationliteracycompetency.
Baroutian, S. & Kensington-Miller, B. 2016. Information literacy: The impact of a hands-on workshop for international postgraduate students. Education for Chemical Engineers 14: 16–23. DOI: 10.1016/j.ece.2015.10.001.
Black, S. & Allen, J.D. 2017. Part 3: College student development. The Reference Librarian 58(3):214–228. DOI: 10.1080/02763877.2016.1276505.
Bury, S. 2011. Faculty attitudes, perceptions and experiences of information literacy: A study across multiple disciplines at York University. Journal of Information Literacy 5(1): 45–64. DOI: 10.11645/5.1.1513.
Chisnell, J. & MacGregor, T. 2018. Speed Databasing. DOI: 10.17605/OSF.IO/K6B4H.
Dommett, E.J. 2018. Using a flipped classroom to embed information literacy skills training into academic studies. Journal of Information Literacy 12(1): 97–108. DOI: 10.11645/12.1.2349.
Dunning, D. 2007. Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself. 1st ed. New York: Taylor & Francis (Essays in Social Psychology).
Ferrer-Vinent, I.J. 2016. Programmatic and scaffolded information literacy embedded in the science curriculum. Science & Technology Libraries 35(4): 295–303. DOI: 10.1080/0194262X.2016.1214096.
Franzen, S. & Bannon, C.M. 2016. Merging information literacy and evidence-based practice in an undergraduate Health Sciences curriculum map. Communications in Information Literacy 10(2): 245–263. Available from https://eric.ed.gov/?id=EJ1125454.
Guth, L.F., Arnold, J.M., Bielat, V.E., Perez-Stable, M.A. & Meer, P.F.V. 2018. Faculty voices on the Framework: Implications for instruction and dialogue. portal: Libraries and the Academy 18(4): 693–718. DOI: 10.1353/pla.2018.0041.
Hosier, A. 2017. Creating learning outcomes from threshold concepts for information literacy instruction. College & Undergraduate Libraries 24(1): 1–13. DOI: 10.1080/10691316.2017.1246396.
Kuglitsch, R.Z. 2015. Teaching for transfer: Reconciling the Framework with disciplinary information literacy. portal: Libraries and the Academy 15(3): 457–470. DOI: 10.1353/pla.2015.0040.
Phillips, M.L. & Zwicky, D. 2018. Information literacy in engineering technology education: A case study. Journal of Engineering Technology 35(2): 48-57.
Pinto, M. 2016. Assessing disciplinary differences in faculty perceptions of information literacy competencies. Aslib Journal of Information Management 68(2): 227–247. DOI: 10.1108/AJIM-05-2015-0079.
Rodriguez, J.E. 2016. A massively flipped class: Designing and implementing active learning information literacy instruction for a large enrollment course. Reference Services Review 44(1): 4–20. DOI: 10.1108/RSR-07-2015-0033.
Shen, J. 2018. Flipping the classroom for information literacy instruction. Journal of Information Literacy 12(1): 48–67. DOI: 10.11645/12.1.2274.
Sloane, M.E., Quintel, D.F. & Groves, C. 2018. Lesson plan pilot project for Physical Science. Issues in Science and Technology Librarianship 90. DOI: 10.5062/f4765ckd.
von Davier, A.A., Holland, P.W. & Thayer, D.T. 2004. The Kernel Method of Test Equating. New York: Springer-Verlag (Statistics for Social and Behavioral Sciences). Available from https://www.springer.com/us/book/9780387019857.