title: Teaching and assessing students of information literacy in a single session—The case of the University of the West Indies Mona library
authors: Robinson, Karlene Patricia; Jones-Edman, Genevieve
date: 2021-04-03
DOI: 10.1177/09557490211063532

Assessing the performance of information literacy (IL) students can be a daunting task for librarians globally. Most IL sessions last only 1 to 2 hours, in which meaningful assessment is difficult to achieve. This mixed-methods study demonstrates how such assessment was achieved in an active learning environment through the use of Google Forms, testing both lower and higher order skills in a 2-hour session with one hundred and seventy-two foundation writing course students. The research tested a rarely examined aspect of Google Forms: the tool's effectiveness in enabling comprehensive assessment, facilitating active learning, and identifying instructional errors in an IL instruction session. The findings show that Google Forms can be used to teach and to administer a quiz using both multiple-choice and open-ended questions to assess both lower and higher order learning skills in IL. Students were able to respond actively to questions while they were being taught, and the data gathered were analyzed and used to inform future library instruction. The study also showed that Google Forms is useful not simply for administering multiple-choice quizzes at the end of teaching, but also for executing real-time assessment and supporting active learning. Because Google Forms supports the easy creation of charts and the downloading/exporting of statistics, results of assessments can be shared among librarians, faculty, and students to motivate and encourage digital pedagogy. It also allows for greater collaboration with faculty in the cooperative teaching of students in single sessions, where there is usually little opportunity for dialogue with faculty once a session ends. This case study is based on a limited number of students; thus, the findings may not be generalizable, but the methodology and some of the approaches to teaching the concepts may be replicated by other librarians.

Assessing the performance of information literacy students is extremely challenging for librarians in the Caribbean and indeed globally. To achieve this in short or one-shot information literacy sessions is especially difficult. It is therefore understandable that there is a dearth of literature about information literacy assessment. This research paper adds to that body of literature and presents a model for assessment that may be replicated regionally and internationally. The paper shows how Google Forms can be used not simply for administering multiple-choice quizzes at the end of teaching, but as a practical tool for executing real-time assessment and incorporating active learning in IL sessions. The research therefore examines a rarely studied aspect of Google Forms: its effectiveness as a tool for facilitating active learning, enabling comprehensive assessment, and identifying instructional errors, and consequently for understanding the effectiveness of information literacy instruction.
Since the inception of the Mona Information Literacy Unit (MILU) at the University of the West Indies Mona Library in 2001, it has been one of the goals of its Coordinators to engage in formative assessment of students' performance in order to refine teaching practice based on a better understanding of students' immediate levels of performance. The thinking behind this assessment is to identify whether students are learning the concepts and skills being imparted, with a view to understanding the causes of any gaps and correcting them. A secondary reason is to reduce unevenness in the quality of teaching, by ensuring that the same content and skills are taught by all instructors/librarians for the same lessons. It is also a desirable goal that librarians be able to use their knowledge of the different ways in which students learn to develop, within IL, the pedagogies needed to support instruction. Most of the teaching done by MILU consists of either single customized sessions or single stand-alone sessions, where there is hardly any follow-up to identify whether students grasped and could apply what was taught. Faculty also provide a litany of skills that they wish their students to learn in the customized sessions, so time is taken up with imparting these skills, yet there is always the gnawing question of whether the students have learnt them. Additionally, some of the classes are exceptionally large. It is against this background that the authors decided to take up the challenge of devising this test and using Google Forms to administer it while teaching. The data harvested would be used to determine whether students learnt the skills that were taught, whether librarians could identify errors made in teaching these students, and to what extent the approach could overcome the long-standing problem of not being able to accurately assess students of IL at the University. Assessment has always been a part of information literacy sessions organized by MILU, but the format has changed over the years. In the early years, assessment was primarily done through paper-based evaluation forms modelled on a general training evaluation form, which focused mainly on how the session was presented and less on the information skills learnt. Some of the questions included: "What ways could the lecture be improved?" and "What was most useful about the tutorial?" Perhaps the closest attempt at evaluating the extent of information literacy skills learnt was an evaluation form that asked students to rate the extent to which the session improved their skills at searching, citing, and understanding different types of resources. Although these forms asked individual students to identify the areas in which they would want further training, they did not ask for contact information to follow up with that training. As the teaching evolved, the method of assessment changed from paper-based to web-based and from evaluation forms to quizzes aimed at assessing the extent of information skills transference. MILU currently uses Google Forms to create quizzes for its information literacy module-based instruction. The quizzes are completed by students at the end of a two-hour one-shot session that forms part of the mandatory Critical Reading and Writing courses offered by the language department. These quizzes incorporate multiple-choice and short-answer questions designed to assess learning outcomes on keyword identification, effective search strategies, and evaluation of sources.
Active learning is incorporated into the session as students are given exercises and asked to record answers on their worksheets. Other information literacy sessions, like those offered during new student orientation, are meant to introduce students to the Library's discovery platform, UWILinC. In these sessions, active learning is incorporated through a practical exercise which asks students to search for and identify sources and their location. However, responses are not recorded and there is no opportunity for out-of-class analysis. These two main assessment methods (worksheet and quiz) have limitations that hinder effective IL assessment: students leave the class with the worksheets, so instructors are not able to truly evaluate whether learning takes place. Additionally, most of the questions in the quiz are in multiple-choice format, with little chance to understand how students think, how they communicate that understanding, and whether they can use the skills taught. In this "experiment," the aim was to create an active learning tool in the online environment which would overcome these challenges. This would offer the benefit of real-time assessment, while out-of-class analysis of answers could be done with a view to improving teaching as well as content. Essentially, the printed worksheet could be transferred to Google Forms, allowing for greater interaction in the teaching sessions and assessment of the skills taught. Google Forms would also facilitate a fuller assessment than simply using it for multiple-choice quizzes. Likewise, the study would assess aspects of IL instruction that are not usually assessed via Google Forms. The University of the West Indies Mona Library has a main library and four branches in two parishes and serves seven (7) faculties and twelve (12) professional schools that offer more than two hundred (200) programmes to over eighteen thousand (18,000) graduate, undergraduate, and continuing studies students. As part of its service to these stakeholders, the Library, situating itself within the UWI's vision, has identified information literacy as a core outcome for the ideal UWI graduate (UWI, 2016). Information literacy training is therefore carried out by MILU, which offers a variety of instruction sessions, customized training, and IL modules. Specifically, sessions include using the Library's discovery platform, citation styles, avoiding plagiarism, and IL modules embedded in foundation writing courses. The research questions guiding this study are: 1. Can Google Forms be used to actively teach and assess students simultaneously in a single IL session? 2. Can higher order search skills used in IL be assessed through the use of Google Forms? 3. Is it possible to use Google Forms to receive feedback that can assist librarians in improving their teaching?

The role of information literacy

Universities, as expressed in their various strategic plans, desire that their graduates be ready for the job market and demonstrate critical thinking skills. The University of the West Indies is no different. The 2017-2022 Triple A Strategic Plan for the University, like all its previous strategic plans, speaks to the ideal graduate being a "critical thinker…effective communicator with good interpersonal skills …IT-skilled and information literate …who will use [his/her] skills to help in the revitalization of Caribbean development" (UWI, 2016).
A well-structured information literacy (IL) programme can contribute to this, given that IL has great potential to produce these skills, which are not only institutionally desirable but also personally desirable to graduates and required in the global marketplace. According to Barrie (2007), universities are under increasing pressure to have their entry-level students demonstrate graduate-type skills and a trajectory of improvement. Students of the 21st century must be prepared to work in a world that requires a different skill set from that of past decades, and hence there is a shift toward empowering them to be critical and reflective thinkers. Thus, any programme of learning geared toward outfitting students must engage them in their learning. The focus is therefore on learning development instead of instructional development, with the learner playing a more active role; teaching/instruction will change, and with it the mode of assessment. The mode of assessment should be geared toward helping students analyze and evaluate information and make informed choices and inferences. Information literacy assists students to do this, but instructors of IL must be able to know whether their students are learning these skills. It is vital that librarians engage in assessment, as this gives credibility to their teaching, signals its importance, and improves IL as a subject (Webber and Johnston, 2003). While lecturers have a longer time with students during a semester to engage them in various types of assessment and to tailor their teaching to ensure certain skills are taught and transferable, this is hardly the case for librarians, who may teach these students only once and must therefore make that teaching experience count. The literature shows that where IL is embedded in courses, as opposed to taught in stand-alone classes, it is easier for librarians to have committed students and to execute worthwhile assessment, as they can often collaborate with lecturers and follow up with these students (Price et al., 2011). It is even better when the library is able to offer credit courses, as student learning is easier to assess in these. Augustana librarians at the University of Alberta have such IL credit courses. They use a multi-pronged approach to gather information about the impact IL is having on student learning and attitudes toward research by engaging in "performance-based tests" (Augustana Library, 2019). The librarians use pre- and post-tests and course assignments at different junctures in their IL credit courses. However, librarians who do not have that latitude may teach in embedded courses, as obtains at the UWI, where librarians have only 2 hours to impart a multiplicity of skills to large classes. It is therefore a major challenge for librarians to know whether they have impacted student learning or added any value to their students' ability to conduct effective research when their time with students is so limited and lecturers want a multiplicity of research skills to be taught. Brage and Svensson (2011) indicated that it is important that information literacy assessment methods be tied to desirable learning outcomes and guide students for the future. They also noted that educators engage in assessment to identify how an individual or group is learning and to find out what aspects of the teaching or learning can be improved.
Augustana librarians, referencing Maki (2002), noted that IL assessment of students is also motivated by "institutional curiosity" (Augustana Library, 2019). This supports Brage and Svensson (2011), in that librarians, like most academicians, want answers to questions about which students learn, what they learn, when they learn, and how well they learn. Libraries attached to institutions of higher learning use a variety of methods to evaluate students' learning, chief among them online tests (Barrie, 2007), as these provide almost instant feedback and are therefore most welcome when classes are large and numerous. The use of students' assignments to assess their writing and critical thinking skills is another method. Many have opted to use online assessment tests/quizzes, as these can gather valuable information to guide librarians in planning instruction. However, online tests come under criticism, as academics believe that online multiple-choice tests do not have the capacity to accurately assess higher order skills, though they acknowledge that such tests can at least be useful for comparing performance (Scharf et al., 2007). Brage and Svensson (2011) noted that in the past, librarians focused on using multiple-choice, fill-in-the-blank, matching questions, and other standardized tests; while these forms of assessment do test a few cognitive skills as well as knowledge, they are not effective in assessing search skills (46). The authors applaud authentic over traditional assessments, which require students to "demonstrate complex cognitive skills and competencies that realistically represent problems and situations likely to be encountered in daily life" (46). In authentic assessment, students work on real-life problems, projects, or products that really get them engaged and motivated. Students must remember information learned and use it to produce an original product based on stated criteria given in a rubric provided prior to their classes (Brage and Svensson, 2011: 47). This type of assessment often takes place in an active learning setting, and rubrics help to guide learning. Yager et al. (2013) used quiz and rubric results to assess 227 information literacy students and found that both quiz and rubric scores are useful forms of assessment, as the two were positively correlated. They concluded that there are benefits to using both when teaching in an embedded curriculum. Knight (2006), however, believes that a more authentic method of assessment is to use students' written work along with rubrics that scaffold assessment criteria, which can assist in mapping their performance and help them to improve. This, however, may be time-consuming and impractical for large semester-long classes. Yet many universities globally, especially in Australia, have circumvented this challenge and created many short online multiple-choice tests to assess IL (Price et al., 2011). Active learning can be defined as anything that "involves students in doing things and thinking about what they are doing" (Bonwell and Eison, 1991: 2). It involves the application of theory and concepts, with the learner involved in his/her learning process through techniques such as case studies, role play, informal small-group discussions, simulations, and other problem-solving exercises.
Meyer and Jones (1993) noted that not only will students develop skills and abilities through these learning opportunities, but they are able "to talk and listen, read, write, and reflect as they approach course content" (p. xi). The research on active learning shows improved academic performance and no evidence of reduced learning among students in active learning classrooms as opposed to traditional teacher-centered ones (Thomas, 2009). Thomas further indicated that active learning is an approach to teaching in which teachers get students actively involved in their learning (2009). Students are required to apply the information they learn during the time they are being instructed in the classroom and to reflect on the actions they have taken. Auster and Wylie (2006), proponents of active learning, noted that we are living at a time of intense competition among business schools; hence, faculty members face increased pressure to excel in teaching (333). Referencing Bruce (2001) and Byrne (2000), the authors noted that to gain competitive advantage, not only do their students need to excel in research, but faculty need to excel in teaching. Students expect their learning experiences to be more interactive, allowing them not only to obtain knowledge but to apply the knowledge gained. Critics of active learning believe that it can create concern, confusion, and frustration among students, but the pros seem to outweigh the cons, as this type of alternative assessment helps students improve their analytical skills and enables them to integrate what they know with what they learn. Klionsky (1998) suggests that active learning can take place not just with small, manageable groups but also with larger groups, as the onus is on the teacher to tailor the techniques to suit the group. For example, rather than planning a big break-out session with large groups, a "pair chat" can probably achieve the same outcome. Where faculty members have difficulty managing a participatory class that goes off track, Auster and Wylie (2006) suggest that "clear learning objectives" (334) are necessary to ensure course content is covered. They also suggest the use of key "provocative questions" to guide class participation. Another concern is that creative active learning takes too much effort to plan and prepare. However, the advantage is that the likelihood of students being satisfied increases, the class has a more logical flow, students are more engaged, and greater learning takes place (Auster and Wylie, 2006). Careful planning and preparation, informed by research and by the tips and guidelines of experienced practitioners, can lead to classes that are more meaningful, fun, and productive, where learning takes place. Auster and Wylie (2006) also noted that some faculty members shy away from active learning, believing that they must try a variety of different techniques in every class and therefore may not be able to cover mandatory content. However, they noted that faculty can be just as effective by making minor changes to a lecture, for example by simply getting students to discuss current events to improve their assimilation and understanding of course content. Google Forms has made it possible for IL to be conducted in an active learning setting and for librarians to teach and assess at the same time, because it facilitates feedback. It also allows teaching to be conducted in a structured way, reducing the confusion and chaos which may characterize an active learning classroom.
Google Forms is a cloud-computing application that is part of the Google Docs suite of educational products. It enables the creation, distribution, collection, and analysis of surveys and therefore facilitates the production of online quizzes that allow the design of multiple-choice and open-ended questions. Because of its cloud-computing nature, responses are available in real time and can be presented in presentation-style charts (pie charts, bar graphs, etc.) which can be easily shared. Google Forms is a suitable choice for assessment in higher education because of features such as cost effectiveness, ease of access and sharing, and ease of creation, editing, and analysis (Hsieh and Dawson, 2010). However, it is also a valuable tool for facilitating active learning and real-time assessment of students. Djenno et al. (2015) used Google Forms with the aim of incorporating more active learning into IL sessions at the University of Illinois at Chicago. They developed five questions based on identified learning outcomes and solicited in-class feedback via Google Forms. Open-ended questions were developed, such as "enter your keyword," "cut and paste the citation for the best scholarly article…" and "say why you think the article will be useful…" Djenno et al. (2015) observed that the use of Google Forms made for more engaged students, enabled real-time assessment during sessions as well as out-of-class analysis of responses, and was overall an easy and effective way to incorporate active learning. They noted that although active learning had traditionally been incorporated into IL sessions using handouts/worksheets, these were ineffective because students left them behind and, furthermore, librarians did not collect the information gathered on the worksheets. Rodriguez (2018) conducted a study using Google Forms for one-shot IL sessions and concluded that Google Forms is a valuable tool for active learning, especially during one-shot instruction. The questions for that study were developed from identified learning outcomes centered on the ACRL Framework. To evaluate whether learning outcomes were met, Rodriguez created a formula to calculate a success rate based on the answers supplied in the forms. The study revealed a high rate of success, notwithstanding that 47% of the class did not complete one or more questions. Because students had Google Suite accounts, feedback was not anonymous, so individual follow-up was possible. Rodriguez, however, cautioned that active learning through Google Forms presents challenges where assessment of the feedback is concerned, because of limited text-formatting options, which hinder the proper formatting of citations (e.g., italicizing); the lengthy time needed to read through responses when paragraph-style questions are used; and the time constraints of one-shot sessions, which can hinder completion of tasks. Rodriguez emphasizes proper planning to counter these challenges. The added value of Google Forms to the classroom has proven to be its ability to enable effective in-class and out-of-class feedback (allowing for student follow-up), its potential for creating an active learning environment, and its usefulness for assessing students' performance. Simpson (2012), in discussing the use of Google Docs in libraries, highlighted its ability to enable real-time in-class feedback to students and increase student engagement.
Although her case related an example using Google Spreadsheets for IL assessment, the highlighted features also apply to Google Forms, and she briefly related other ways in which Google Forms has been used by librarians. Haddad and Kalaani (2014) used Google Forms to solicit regular feedback in a semester-long engineering class and found that, through the feedback gathered, engineering instructors were able to adapt pedagogy and course content to improve student performance. This research is very similar, especially in aim and methodology, to the studies by Rodriguez (2018) and Djenno et al. (2015) and should add to the much-needed literature on active learning in information literacy instruction, as well as provide a Jamaican case study of the subject. Additional data on the effectiveness of IL sessions are incorporated, as individual responses are analyzed to understand students' thought processes. This study used a mixed-methods approach to assess the effectiveness of teaching and assessing IL students in a single session using the technology afforded by Google Forms. The course selected was a mandatory foundation writing course, Critical Reading and Writing, in which the Library usually has several IL sessions with students at the start of each of the two semesters. Six classes in the first semester, with a total of one hundred and seventy-eight (178) students, were used in this research. The topic selected for the lesson was "How to search UWILinC." UWILinC is the Library's discovery platform, and this is usually the first lesson students are taught because they need to be empowered to use this platform to locate information resources throughout their university life. The learning outcomes were selected next, chosen from the basic content that would give students a thorough understanding of how to use the platform well. They are: 1. Define UWILinC. 2. State the reasons why you need to sign in to UWILinC. 3. Search UWILinC to locate a book and a journal article. 4. Explain how to locate current articles in UWILinC. These outcomes were then aligned to the content that would be taught. Ten main questions were then formulated that would allow the librarians teaching the lessons to actively teach and cover the content and the learning outcomes. Three of the ten questions, that is, questions five, six, and seven, had multiple parts, bringing the total number of questions to 13, as seen in Supplementary Appendix B. These additional parts were included to help students execute the required tasks more effectively. Overall, there were a total of 20 questions: six multiple-choice questions covering demographics, including contact information so that librarians could follow up with students if necessary, and thirteen open-ended questions dealing with the content. These questions were presented as a quiz, and four librarians taught the lesson to six classes with a total of 178 students. The questions allowed the researchers to capture not only quantitative but also qualitative data from the students' responses. The Google Forms application allowed us to create a quiz with the paragraph-style answer option, as the questions required an explanatory or reflective answer. The assessment was administered via a link, customized and shortened using tinyurl.com, and projected on screen to students at the start of class and periodically while the lesson was taught (Supplementary Appendix D).
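In this study the quiz was assembled in the Google Forms web interface, but a comparable form can also be built programmatically. The sketch below is illustrative only and is not the procedure the authors used: it assumes OAuth credentials are already available and uses Google's Forms API (v1) via the Python client to create a quiz containing one hypothetical demographic multiple-choice item and one hypothetical paragraph-style content question; the question wording and file names are assumptions.

```python
# Minimal sketch: building a Forms quiz programmatically (illustrative only).
# Assumes OAuth credentials with the forms.body scope have already been obtained.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/forms.body"]
)
service = build("forms", "v1", credentials=creds)

# Create an empty form, then enable quiz mode and add items in one batch.
form = service.forms().create(body={"info": {"title": "How to search UWILinC"}}).execute()
requests = [
    # Turn on quiz settings so responses can be scored.
    {"updateSettings": {
        "settings": {"quizSettings": {"isQuiz": True}},
        "updateMask": "quizSettings.isQuiz"}},
    # Hypothetical demographic multiple-choice item.
    {"createItem": {
        "item": {"title": "Were you taught information literacy before entering the UWI?",
                 "questionItem": {"question": {"required": True,
                     "choiceQuestion": {"type": "RADIO",
                                        "options": [{"value": "Yes"}, {"value": "No"}]}}}},
        "location": {"index": 0}}},
    # Hypothetical open-ended (paragraph-style) content question.
    {"createItem": {
        "item": {"title": "When and why do you sign in to UWILinC?",
                 "questionItem": {"question": {"textQuestion": {"paragraph": True}}}},
        "location": {"index": 1}}},
]
service.forms().batchUpdate(formId=form["formId"], body={"requests": requests}).execute()

# The responder link can then be shortened (e.g., with tinyurl.com) and projected in class.
print(form["responderUri"])
```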
The students were told to go to the link and fill in demographic data as well as respond to the questions in the quiz. While the students were being taught, they were able to input their responses, and the data were gathered by Google Forms. The students could answer the questions as they were taught as well as interact with other students during the teaching sessions. They were required to submit their work at the end of their session. Accompanying the quiz given to the librarians who taught the sessions was a rubric (see Supplementary Appendix C). This was designed to assess the performance of students as well as to standardize the marking of the open-ended questions. The scripts were marked using the rubric, and the performance of students was analyzed based on the scoring standards outlined in it. The scores were placed on an Excel spreadsheet (Supplementary Appendix E) and performance analyzed against the rubric. The open responses were analyzed to understand students' thinking and possible misconceptions so these could be addressed in future teaching sessions. The librarians who were to participate in teaching were invited to a meeting where they were trained to engage with students in an active learning classroom, using the Google Forms application to both teach and assess students with the quiz. The teaching was demonstrated, and the concerns of the librarians were addressed. The number of sessions for this particular course was 26 double sessions of 2 hours each, taught by 14 librarians to a total of 240 students. However, this research was done with six librarians who taught 16 sessions to 172 students. Two librarians taught classes with 30 students instead of 15 students, as indicated in Table 1 below. In addition to being asked the 13 questions for the test, the students were asked to give their age, gender, nationality, prior schooling, and whether they were taught information literacy before they came to the UWI. The gender, age, and nationality of the students in this study were representative of the general population of UWI Mona students. As seen from Figure 1, most of them are youths or young adults, which is typical of a first-year class: 51% of the students were between ages 20 and 26, and approximately one-third of the class were under 20. The others ranged from 27 upwards. In recent years, student enrollment in higher education institutions in Jamaica has been overwhelmingly female, and this class reflected that trend: 66% identified themselves as female. Ninety-six of the students were Jamaican. Fifty-three percent of the class said they were taught IL in high school. The demographic information related to students' prior knowledge of IL was mixed, but there was no significant deviation indicating that this impacted their performance. It must be noted that the aim of the research is not to compare students' performance but rather to see whether the instrument used could assess students' performance. It was clear at the end of the first teaching session, and by the end of the entire period of teaching, that Google Forms was able to assist in teaching and assessing students. We were able to accomplish all our teaching and learning outcomes. Because we already knew that we could use multiple-choice questions to assess students, we chose open-ended questions to allow them to express themselves. Thus, we were able to move beyond the typical multiple-choice questions which Brage and Svensson (2011) noted were mainly used by librarians in assessments.
As noted, these forms of assessment do test a few cognitive skills, but they are unable to assess search skills (46). We were not only able to teach but also able to assess the outcome of student searches in one sitting or class session.

Can Google Forms be used to actively teach and assess students simultaneously in a single session?

Librarians were able to actively teach the lesson while students learned and gave their feedback as the lessons progressed. At the end of each lesson, the authors were able to capture the data on each class and note students' feedback. The Google Form not only gathered students' responses to multiple-choice questions; students were also able to write clear sentences, allowing us to capture their thoughts on certain concepts. Additionally, Google Forms enabled live viewing of responses, so librarians were able to teach and assess simultaneously and, if necessary, make adjustments during class. Figure 2 shows how Google Forms instantly tabulates responses and provides immediate results in a chart to facilitate assessment; here a sample of the responses to the open-ended question "When and why do you sign in to UWILinC?" is displayed. The authors were able to capture not only the individual responses from students but also the responses of entire groups, and to compare responses across groups. Google Forms allows results to be presented in the form of charts; however, Figure 2 shows responses when the paragraph-style response option is selected. Although these responses are not presented in chart format, instructors are able to view the live responses and assess learning. However, not all the librarians in this study made full use of the live-responses feature, instead reverting to their traditional method of walking around, observing, and individually assisting students as they completed the form. Although the librarians had a two-hour session in which to teach various skills as indicated by the learning outcomes, they were empowered to both teach and assess at the same time using Google Forms. The tests were aligned to learning outcomes, as recommended by Brage and Svensson (2011). It was also discernible from the performance of the students that the instrument facilitated both teaching and assessment simultaneously. Although the success of the instrument is not based on the performance scores of students, the results indicated that most students were trained in the use of UWILinC (see Figure 3). Based on our scoring rubric (Supplementary Appendix C), the total possible score that a student could achieve is 65 marks (13 questions multiplied by five, the maximum score for each question). Fifty-seven percent of the students scored between 33 and 61 marks, with the majority scoring between 40 and 61 marks. Additionally, 8% scored between 30 and 32, 16% scored between 20 and 29, and 19% scored 0 to 19. A score of zero indicates that the student did not attempt the question. Most of the students who scored low overall did not complete all the questions. Simultaneous teaching and assessment with Google Forms is challenging, not because of the instrument, but because it incorporates active learning. As explained, time constraints are a major issue. Some librarians reported that it was challenging to cover the content within the time, and some students were unable to complete the assessment within the class time.
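For completeness, the out-of-class tally of rubric scores described above can be scripted rather than done by hand. The following is a minimal sketch, not the authors' actual workflow, assuming a hypothetical scores.xlsx in which each row is a student and columns Q1 to Q13 hold the 0 to 5 rubric score for each question.

```python
# Minimal sketch: totalling rubric scores and binning them into the reported bands.
# Assumes a hypothetical scores.xlsx with columns Q1..Q13, each scored 0-5 per the rubric.
import pandas as pd

scores = pd.read_excel("scores.xlsx")                  # hypothetical file name
question_cols = [f"Q{i}" for i in range(1, 14)]        # 13 content questions

scores["total"] = scores[question_cols].sum(axis=1)    # maximum possible total is 65

# Bands used in the discussion above: 0-19, 20-29, 30-32, and 33 upwards.
bands = pd.cut(scores["total"],
               bins=[-1, 19, 29, 32, 65],
               labels=["0-19", "20-29", "30-32", "33-65"])
print(bands.value_counts(normalize=True).mul(100).round(1))
```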
Librarians conducting the IL sessions are normally challenged by student tardiness; as the IL classes are held in the library, away from the students' regular class location, many students show up to the library late, up to 15 minutes into class. When time becomes a factor, this seems to affect the rate of response for reflective questions. Reflective/explanatory questions like 8 and 10 had high "no response" rates (55% and 49%, respectively), though the majority of students were able to answer the first part of these questions, numbers 7 and 9, which had comparatively lower "no response" rates (24% and 20%, respectively). The findings of this study confirm those of Rodriguez (2018) and Djenno et al. (2015), as they show that Google Forms facilitates active learning in class and that its real-time feature enables instructors to simultaneously teach and assess. Like Rodriguez's study, this study found that a significant percentage of the students were unable to complete the assessment because they ran out of time. However, Auster and Wylie (2006) urge teachers not to shy away from active learning for fear of not covering mandatory content. Librarians were able to facilitate active learning and cover mandatory content because clear learning outcomes had been developed. Feedback from librarians indicates that when they were running out of time, they ensured that mandatory content was covered and that students were able to do the active learning activities, even if that meant there would be insufficient time to input answers in the Google Form. In future adaptations of this Google Form, it may be useful to reconsider the number of questions that assess critical thinking/higher order skills, as many students opted not to give explanations, especially when constrained for time.

Assessment of higher order skills

1. What are the features of Google Forms that allow it to be used as an instrument to format tests for students? 2. Can higher order skills such as search skills be assessed using Google Forms?

The students were asked several questions which allowed them to demonstrate higher order search skills that would be difficult to test with multiple-choice items. These questions required them to give reasons; for example, question 9 asked which database would be useful to their discipline and why. Google Forms' paragraph-style answer option enabled the students to write their explanations and gave the librarian insight into each student's thought process. The following are some of the responses which indicated that students understood the databases relevant to their disciplines: 1. Science Direct. Reason is because I am studying to become a nurse. 2. business source complete. because it is a business database. 3. National Library of Medicine, I am a Nursing Student which falls under faculty of medicine. 4. Emerald, it is helpful for banking and finance and economics. Another question that assessed their comprehension and application of knowledge was question 11, where they were asked to state two of the more than four ways of finding a journal article on UWILinC. This question required that they demonstrate their search skills by explaining the process. There are more than four ways of locating a journal article on UWILinC; we wanted to find out whether students were able to navigate the portal and locate these articles. The students were able to explain the multiple ways they learnt during the session, as expressed in the following responses: 1. (1). Article by subject (2). Find e-Journal. 2.
Go to find databases, filter by discipline then search. 3. Using the search all bar then filter by resource type. 4. Two ways in which you may be able to find a journal article on UWILinc are by using Articles by subject and Find e-Journals. 5. (1). Click on articles by subject drop down the box and find the discipline that is appropriate to you. (2). By clicking e-Journal. It would have been to our advantage if we had asked them to give specific articles, but time did not allow, given that we were more interested in their knowing the skills to apply to the process. Another question that required them to comprehend, evaluate, and apply their knowledge was question 13: How can you find current journal articles on UWILinC? (One written in the last 5 years). This question required that students explain not only how to find journal articles but also how they determine where and how the database can assist them in locating current articles. Notwithstanding that this process can be accomplished in multiple ways, the students gave a variety of responses which indicated that they were able to use Google Forms to express themselves, allowing us to see their feedback electronically. The following are some of their responses: 1. When searching for your information you access your filter and select Creation/Publication Date after which select the years to access your information click refine and then search. 2. Refine the search by filtering the year (2012-2018). 3. Go to advanced search, click the drop down for publication date and select last 5 years. 4. Type the keyword on UWILinc in Article by subject then on the left in the bottom corner you go to the creation/publication date then type the years and then click refine. 5. By going to search all, type keywords then go to resource type and journals. Then go to creation/publication date type in from 2012 to 2017 and select refine. Based on the answers given to the questions that required students to demonstrate higher order skills, Google Forms facilitates this type of assessment, yet the literature shows that Google Forms is popularly used for pre- and post-tests in the multiple-choice format. The biggest challenge with using Google Forms to assess higher order skills is the analysis of the responses. Unlike multiple-choice responses, paragraph-style responses are not analyzed or tabulated by Google Forms, so librarians must individually read through each one. Rodriguez (2018) also mentioned this challenge and the time-consuming nature of reading through responses, but that study, like this one, allowed for the assessment of critical thinking, which often eludes information literacy assessment.

Feedback to improve teaching skills

1. Is it possible for teachers of IL to get feedback using Google Forms that can assist in improving their teaching to better assist students?

Djenno et al. (2015) emphasized the value of librarians being able to get student feedback about IL classes and the limitations presented by traditional worksheets. Like Djenno et al. (2015) and Rodriguez (2018), this study found Google Forms to be a suitable alternative to the worksheets in its ability to facilitate student feedback that librarians can reference later to improve teaching. The study elicited rich quantitative and qualitative data that could be useful to the librarians who taught the sessions.
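Part of this out-of-class review can be automated even when paragraph-style questions are used; for instance, the "no response" rates quoted earlier can be tallied directly from the exported responses. The following is a minimal sketch, assuming a hypothetical responses.csv downloaded from Google Forms in which each column holds one question's answers; the file and column layout are assumptions, not the authors' actual export.

```python
# Minimal sketch: per-question "no response" rates from exported Google Forms data.
# Assumes a hypothetical responses.csv in which each question is one column.
import pandas as pd

responses = pd.read_csv("responses.csv")               # hypothetical export file

# Treat empty cells and whitespace-only answers as "no response".
blank = responses.apply(lambda col: col.astype(str).str.strip().isin(["", "nan"]))
no_response_rate = blank.mean().mul(100).round(1)

# Highest non-response first, so instructors see which questions were skipped most.
print(no_response_rate.sort_values(ascending=False))
```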
The form allowed us to get feedback which would not previously have been captured, because students kept the worksheets and the online quiz did not capture all the outcomes of The UWI information literacy sessions. For example, responses to Question 1 informed us that we need to spend time showing students the capability and functionality of the portal, as too few of them knew what to expect on it. Question 2 was an eye-opener, as so few knew when and why they needed to sign in to UWILinC; this is a problematic area because reference librarians regularly field queries about access to the portal off campus and on mobile devices. So that librarians could follow up with students if necessary, contact information was collected (voluntarily); this provides a suitable alternative for institutions that do not have institutional access to Google Suite. Rodriguez (2018) had also described the administrative challenge when students forgot their institutional access information; this was not a problem in the study at UWI Mona because the link to the assessment was provided to all students and Google Forms is accessible to all, whether or not they have a Google account. Ninety-six percent (96%) of students provided an email address for further contact by the librarian. Question 3 revealed that we need to improve our teaching strategy, as most students did not know the difference between "Search ALL" and "UWI Collections." We were able, through Questions 4, 5, and 6, to note whether our students knew how to find a book, what a call number was, and where in the library system they needed to go to locate the book they found. We learnt from the data that most knew how to find a book on the portal, but half of that number did not know what a call number was, because they failed to respond or gave an incorrect answer. We therefore need to spend more time on this area. Although the authors did not ask students specifically to provide feedback on teaching methods, the Forms provided feedback for us to analyze and to strategize how to improve teaching by highlighting weak areas. It is clear from the findings of this research that Google Forms can be used to actively teach and assess students simultaneously in a single session. The five librarians who engaged in teaching got responses from students that they could use to determine that interaction took place, and students were able to show what they learned and what they did not learn. The responses harvested from the research showed that it is possible, using Google Forms, for teachers of IL to get feedback that can assist in improving their teaching. Since it allowed students to give free open-ended responses, we were able to get into the minds of our students, to see through their lenses what they learned, and can now plan strategies to improve teaching. It is also clear that higher order skills, such as search skills, can be assessed using Google Forms. Google Forms is very flexible and is constantly being upgraded to provide more features to make teaching and assessment easier. The results of assessment can be easily shared among librarians, faculty, and students to motivate and encourage digital pedagogy, because Google Forms facilitates the easy export/download of data and automatically generates charts. It also allows for individual feedback to be given to students, as their scores can be shared with them almost instantly, as indicated in the literature (Barrie, 2007).
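As one illustration of near-instant individual feedback, the voluntarily supplied email addresses could be joined to the marked scores so that each student receives his or her result shortly after the session. The sketch below is a minimal illustration under assumed names (a hypothetical scores.xlsx with "email" and "total" columns); it only composes the messages and is not part of the authors' workflow.

```python
# Minimal sketch: composing individual feedback messages from the marked scores.
# Assumes a hypothetical scores.xlsx with "email" and "total" columns (total out of 65).
import pandas as pd

scores = pd.read_excel("scores.xlsx")

for _, row in scores.dropna(subset=["email"]).iterrows():
    message = (
        f"Thank you for attending the UWILinC session. "
        f"Your quiz score was {int(row['total'])} out of 65. "
        f"Contact the Mona Information Literacy Unit if you would like follow-up help."
    )
    # In practice the message would be sent by email (e.g., via the library's mail system).
    print(row["email"], "->", message)
```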
Although Google Forms is clearly advantageous compared to printed worksheets, as noted by Rodriguez (2018), analysis of responses can be time-consuming; the results of this study have shown that it is nonetheless valuable, so presenters must schedule time to review responses in order to strategize for other sessions. Additionally, open-ended questions that require students to reflect on the processes they engage in provide valuable insights that can uncover information about the use of library resources previously unknown to librarians. The methodology can be customized and used in the teaching and assessment of all types of IL sessions, be they stand-alone or embedded. Google Forms is very flexible and inexpensive and can be crafted for use in interactive sessions to teach actively. Librarians can consider utilizing the live sharing of results, as Google Forms automatically creates charts that can be easily viewed in class. This research can be used as a model for teaching and assessment. All the information and tools are presented here, either in the body or in the appendices, and the authors can be consulted for assistance in modeling this innovative approach to the teaching and assessment of IL.

Declaration of conflicting interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

ORCID: Karlene Patricia Robinson https://orcid.org/0000-0002-9206-5842

Supplemental material: Supplemental material for this article is available online.

References
Information literacy: assessment
Creating active learning in the classroom: a systematic approach
A conceptual framework for the teaching and learning of generic graduate attributes
Active learning: creating excitement in the classroom
Making a difference? Assessment of information literacy at Linköping University Library
From paper to pixels: using Google Forms for collaboration and assessment
Google Forms: a real-time formative feedback process for adaptive learning
A university's information literacy assessment program using Google Docs. Brick and Click Libraries
A Cooperative Learning Approach to Teaching Introductory Biology
Using rubrics to assess information literacy
Developing an assessment plan to learn about student learning
Promoting Active Learning: Strategies for the College Classroom
Embedding information literacy in a first-year business undergraduate course
Active learning. In: Provenzo EF (ed) Encyclopedia of the Social and Cultural Foundations of Education
Google Forms in library instruction: creating an active learning space and communicating with students
Direct assessment of information literacy using writing portfolios
Google Spreadsheets and real-time assessment: instant feedback for library instruction
UWI Triple A 2017-2022 Strategic Plan
Assessment for information literacy: vision and reality
Assessment of information literacy skills among first year students