Article
Far from a Trivial Pursuit: Assessing the Effectiveness of Games in Information Literacy Instruction
Eamon Tewell
Reference & Instruction Librarian
Long Island University, Brooklyn Campus
Brooklyn, New York, United States of America
Email: eamon.tewell@liu.edu
Katelyn Angell
Reference & Instruction Librarian
Long Island University, Brooklyn Campus
Brooklyn, New York, United States of America
Email: katelyn.angell@liu.edu
Received: 2 Aug. 2014    Accepted: 2 Feb. 2015
© 2015 Tewell and Angell. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective – To determine whether playing library-related online games during information
literacy instruction sessions improves student performance on questionnaires
pertaining to selected research practices: identifying citation types and
keyword and synonym development.
Methods – Eighty-six students in seven introductory English composition classes at a large urban
university in the northeastern United States served as participants. Each class
visited the library for library instruction twice during a given semester. In
the experimental group students received information literacy instruction that
incorporated two online games, and the control group received the same lesson
plan with the exception of a lecture in place of playing games. A six-item pre-
and posttest questionnaire was developed and administered at the outset and
conclusion of the two-session classes. The 172 individual tests were coded,
graded, and analyzed using SPSS.
Results – Paired sample t-tests within the control and experimental groups determined that there was a statistically significant difference between scores on pretests and posttests in the experimental group but not the control group.
Conclusion – Students who played the online games improved significantly more from pretest to posttest than students who received a lecture in lieu of playing online
games, suggesting that participating in games related to the instruction they
received resulted in an improved ability to select appropriate keywords and
ascertain citation formats. These findings contribute to the evidence that
online games concerning two frequently challenging research practices can be
successfully applied to library instruction sessions to improve student
comprehension of such skills.
Introduction
Information literacy instruction plays a key role in the educational mission of many academic libraries. Librarians employ a wide range of strategies to teach members of their communities about the many dimensions of information access and use. One such method draws upon games-based learning to fulfill learning outcomes and increase student engagement and motivation. In practice, games-based
learning frequently consists of librarians either creating their own games,
adapting existing games used by other libraries, or designing class sessions
using gaming principles (gamification). As opposed to traditional instruction,
games may provide students with opportunities to meaningfully engage with
classmates and the instructor, participate in hands-on activities, and learn
new skills using their preexisting knowledge as a basis.
Despite the ongoing popularity of games in
library instruction, little research has been done on whether playing games in
academic library settings may in fact translate into learning. In the present study, the authors predicted that
students in the classes that incorporated games would score higher on the
pre-/posttest assessment tool than students in the classes without games. In
contrast, the null hypothesis was that there would be no significant
differences in scores between the two groups. Using two games whose efficacy
has been previously tested by their developers, this study seeks to build on
this existing evidence and provide insight into the question of whether online
games are a preferable method of instruction compared to lectures in terms of student
comprehension of targeted concepts.
Literature Review
A
review of the literature reveals that using games for information literacy
instruction is increasing in terms of acceptance and popularity, but in many
cases assessment beyond student interest has yet to be explored. The scholarly
discourse on games as tools to improve literacy began in 2003, when Arizona
State University professor James Paul Gee published his seminal monograph on
games-based learning titled What Video
Games Have to Teach Us about Learning and Literacy. Gee expounds upon the
many ways in which games facilitate learning through his 36 Video Game Learning
Principles, including critical learning, encouraging exploration and discovery,
just-in-time learning, and utilizing active learning methods (2007). Regarding
information literacy specifically, Gumulak and Webber (2011) found that the
video game-playing activities of 28 teenagers closely corresponded to
established information literacy models.
Gaming
in libraries made a national debut at the 2005 Gaming, Learning and Libraries
Symposium, where presenters from various library settings discussed how and why
games were being used in libraries (Doshi, 2006). Since the mid-2000s a
significant amount of literature has been generated on the subject of games in
library instruction. Though gaming-related topics such as developing video game
collections and providing outreach through gaming events appear with
regularity, this review will focus on games-based learning for information
literacy instruction. It is also important to note that the educational literature contains a great number of studies on educational games, and non-library educators have incorporated games into their pedagogy for far longer than librarians. However, for the purposes of considering only the
most applicable research in terms of setting, class content, and other
contextual factors, this literature review focuses on non-digital and digital
game initiatives at academic libraries.
Non-Digital Games
Non-digital
games have been implemented at a number of college and university libraries due
to their easy-to-play nature and inherent capacity for personal engagement with
others in the class. Though the authors chose to use online, digital games for the research at hand, a brief review of the use of non-digital games helps provide additional context on games-based learning in academic libraries.
Leach and Sugarman (2005) note that the success of a library instruction game
is dependent upon several factors, including the type of game played, the
incorporation of learning outcomes, and the instructor’s flexibility. The
authors present best practices for designing games using their activity based
on the quiz-show Jeopardy! as a case study. Similarly, Walker (2008) used the Jeopardy! format
in eight one-shot sessions to reinforce concepts learned earlier in the class,
reporting that students responded positively to the game. Both articles suggest
that the game’s highly familiar format is an important factor in student
receptivity. Smith (2007) developed games such as tic-tac-toe, word searches,
and crossword puzzles that used library-based terms and concepts.
Many
non-digital games are developed in order to supplement or enhance library
orientation sessions. Because orientation is a type of information literacy instruction that typically occurs in the first semester of a student’s higher education enrolment and focuses on basic research practices, research into the use of games in these sessions provides useful related evidence to consider and build upon. Marcus and Beck (2003) compared a traditional
orientation to one that sent freshmen on a library treasure hunt that required
locating a series of clues. Using a brief post-orientation test, the authors found that the treasure hunt received more positive student feedback than the traditional orientation and offered greater educational benefits (p. 31). Thorough reviews of the many types of information literacy games,
including in-person and virtual games, have been conducted by Margino (2013)
and Smale (2011). Smale (2012) developed the internet resource evaluation game Quality Counts wherein students search
for and critically evaluate websites. Survey responses indicated that players enjoyed the game and felt that their skill levels increased (p. 140).
Digital Games
Digital
and online games to teach college and university students library skills
appeared in the literature at an early juncture with Koelewyn and Corby’s 1982
report on a computer game requiring students to use the Reader’s Guide to Periodical Literature. In the arcade-inspired
game Citation students were randomly assigned one of ten topics and then had to construct a bibliography of a predetermined number of sources as quickly as possible using the Reader’s Guide (p. 171). A great deal
has changed technologically since Koelewyn and Corby’s study, but the reasons
for incorporating digital games into instruction remain the same. While at
least one academic library has opted to modify an existing commercial videogame, tailoring its learning objectives to the library’s needs (Clyde & Thomas, 2008),
the vast majority of libraries using digital games have developed their own.
The online board game The Information
Literacy Game (Rice, 2008) was received positively by students, who played
the game by rolling a digital die and correctly answering questions to move
ahead on the board. Gallegos and Allgood (2008) describe a process that began with a board game and led to the development of an online game, an effort that ultimately demonstrated student receptivity to playing information literacy games.
Librarians
at James Madison University created two online games to serve two distinct
purposes (McCabe & Wise, 2009). Citation Tic-Tac-Toe asks players to identify
the type of a given citation while playing tic-tac-toe, and Magnetic Keyword
uses virtual refrigerator magnets to help students practise identifying
keywords. The authors assessed each game differently, using quantitative methods for Citation Tic-Tac-Toe and qualitative
methods for Magnetic Keyword, finding that in both cases students had increased
their skill levels (p. 13). Armstrong and Georgas (2006) developed and assessed
an interactive tutorial titled “Doing Research” and discerned a statistically
significant difference in university student skills using a pre- and posttest
questionnaire. Smith and Baker (2011) describe the impetus and development of
two online games at Utah Valley University. The authors surveyed 52 students, who responded positively to the games’ informative and entertaining nature (p. 638).
Mary
Broussard (2010), a prominent researcher in games-based learning, created the
online game Secret Agents in the Library as
an alternative to a traditional library orientation. Students work in teams to answer a series of questions that require using the library’s website and locating materials in the stacks. Additionally, Broussard (2012) reviewed 17
online library games and analyzed the traits of successful games, offering six
suggestions for libraries seeking to develop their own digital games. Most recently, Broussard (2014) made a case for games as tools for conducting formative assessment in the classroom, arguing that games and the assessment of student learning during a session share significant similarities.
The
literature demonstrates that librarians have considered it worthwhile to
incorporate games for the purposes of library orientations, engagement in
one-shot sessions, practising specific library skills, and more. Because a wide
variety of games exist in terms of format and objectives, generalizing research
findings is challenging. The vast majority of researchers measured student receptivity to a particular game instead of whether playing a game contributed
to student learning. Furthermore, reviewing the literature of games in library
instruction presented difficulties in that digital games have a lifespan that
can be as brief as one semester. BiblioBouts, one of the most promising research-oriented games in terms of gameplay and adaptability by other institutions, is no longer available because its four-year grant funding came to an end (University of Michigan School of Information, 2012), though the BiblioBouts team completed a book on designing effective online information literacy games (Markey, Leeder & Rieh, 2014). Gaming expectations and
technologies change rapidly, and as such it is difficult to determine which
games are being used or are available. After a review of the literature, the
authors were prepared to select the games most appropriate to their setting and
learning outcomes.
Methodology
Research Design and Participants
The
study was a quasi-experiment, as the requests for library instruction by
teaching faculty at Long Island University did not permit random assignment of
the university’s undergraduate population. The specific design was two groups / nonrandom selection / pretest-posttest. Pretest/posttest models are
commonly employed by educational researchers to investigate effects of a
particular treatment on learning (Freed, Hess & Ryan, 2002).
Eighty-six
students enrolled in introductory English composition classes at a large, urban
university in the northeast served as participants. The sampling technique
employed was convenience sampling, a type of nonprobability sampling frequently used in research involving college students. The participants were drawn from seven English
classes in total. Professors of these classes contacted the library of their
own accord to request instruction for their students. All seven classes visited
the library for group information literacy instruction (ILI) classes at two
points during the semester. The researchers were the sole ILI instructors
included in this study.
Participant
ages ranged from 16 to 40, with an average age of 19. Thirty participants
identified as male and 56 participants identified as female. Participants were
divided into two groups prior to instruction: a control group of 43 students
and an experimental group of 43 students.
Instruments and Procedure
Before
beginning the experiment the researchers needed to secure Institutional Review Board (IRB) approval. The researchers were granted an exemption from formal
review as this study qualified as “research conducted in established or
commonly accepted educational settings, involving normal educational practices”
(Long Island University, n.d.).
The
researchers informed their coordinator of instruction that they would like to
teach seven sessions of English composition classes, and were thus assigned all
classes requested by faculty desiring two ILI sessions. Three of the sessions were taught in fall 2013 and four were taught in spring 2014. The seven classes
were divided into two groups prior to the instruction: the control group and
the experimental group. One researcher taught four classes in the experimental
condition and the other researcher taught three classes in the control
condition. There was a total of 43 students in the experimental classes and 43
students in the control classes. Each researcher selected the classes which fit
best into his or her schedule. Students in the control group would not play any
educational games, while students in the experimental group played a keyword
development game in the first ILI session and a citing game in the second ILI
session. The sessions were all one hour and fifteen minutes long and there was
an average of three weeks between the first and second sessions.
Lesson
plans were created for first and second sessions of both the control and the
experimental classes. The lesson plans were identical with the exception that
students in the experimental condition played a game (see Appendix A for a
detailed lesson plan). Apart from the games, the researchers collaboratively
developed all classroom materials utilized in this study. At the very beginning
of the first session each student was administered a six-question multiple-choice paper pretest developed collaboratively by the two researchers
and adapted from Beile’s Test of Information Literacy for Education (Beile
O’Neil, 2005). Students were given five minutes to complete the quiz, and all
participants finished on time. This instrument assessed their knowledge of
basic keyword development and citing skills (see Appendix B).
Both
groups of students were then given a presentation on basic keyword development
and database strategy skills. Afterwards the experimental groups were asked to
play a freely available game called Doing Research, created by librarians at
the University of Illinois at Chicago and available at: www.uic.edu/depts/lib/reference/services/tutorials/DoingResearch.shtml (Armstrong & Georgas, 2006). Players are
presented with a topic, the representation of women in film, and asked to
choose certain keywords that represent the topic before moving forward. In the
next step several synonyms for the terms “women” and “film” must be selected.
Students were allowed fifteen minutes to play the game. For both groups, the first session concluded with an activity in which students explored a research paper topic and located one article in an academic database.
For
the second session, both classes began with a presentation on citing in both
MLA and APA formats. Librarians then gave students a demonstration of ProQuest
Databases. The experimental group subsequently played a game created by James
Madison University librarians called Citation Tic-Tac-Toe, available at: www.lib.jmu.edu/tictactoe/
(McCabe & Wise, 2009). Citation Tic-Tac-Toe asks players to correctly identify the format of a given citation, such as an article, book chapter, or website. Students were given ten minutes to play the
game. Next, both groups were provided with a worksheet that entailed locating an article on their research paper topic and documenting it in APA and MLA styles. Before the second session ended students were given a
posttest, which presented the same questions as the pretest in a different order to discourage memorization. Therefore, the independent variable in
this project was the online games, while the dependent variable was the
measures of achievement on the assessment tool.
Once
all of the classes were taught the pretests and posttests were graded by the
researchers. A standard 100-point percentage scale was employed, with each of the six questions worth 17 percentage points (rounded up from 16.66).
If students skipped a question the item was automatically counted as incorrect.
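To make the arithmetic concrete, the scoring rule maps the number of correct answers onto a 100-point scale. The short Python sketch below shows one plausible reading of that rule; the function name and the rounding of the final score are assumptions for illustration, as the tests were graded by hand and the exact rounding procedure is not specified.

    def grade_test(answers, key):
        """Score a six-item quiz on a 100-point scale.

        Skipped questions (None) count as incorrect, matching the study's
        grading rule; each correct answer is worth roughly 16.66 points,
        so possible scores are 0, 17, 33, 50, 67, 83, and 100.
        """
        correct = sum(1 for given, expected in zip(answers, key)
                      if given is not None and given == expected)
        return round(correct * 100 / 6)

    # Example: four of six answers correct, one question skipped.
    print(grade_test(["a", "b", None, "d", "a", "b"],
                     ["a", "b", "d", "d", "a", "c"]))  # prints 67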
Data Collection and Analysis
All
pretests and posttests were coded using a simple numerical coding system.
Although all of the tests were anonymous, this system was used to keep track of the artifacts. Participants in the experimental group received a number ranging from 1 to 43 and participants in the control group received a number ranging from 44 to 86. The pretests and posttests were then coded accordingly. Statistical
analysis was used to determine if there was any significant difference between
scores on the pretests and posttests in both groups. A one-tailed paired
(dependent) t-test was chosen to analyze the data. Descriptive statistics were
also generated to ascertain group means and standard deviations. These
statistics provide average scores on the pretests and posttests in the
experimental and control groups. Individual pre- and posttest scores were not
compared, as the researchers focused on assessment at the class (group) level.
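The analysis itself was run in SPSS. As a minimal sketch of the equivalent computation, the one-tailed paired t-test and group descriptives can be reproduced in Python with SciPy (version 1.6 or later, for the alternative parameter). The scores below are hypothetical, since the study’s raw data are not published.

    import numpy as np
    from scipy import stats

    # Hypothetical pretest and posttest percentage scores for one group;
    # each index is the same student measured twice.
    pre = np.array([50, 67, 33, 83, 50, 67, 17, 50])
    post = np.array([67, 67, 50, 100, 67, 83, 33, 50])

    # One-tailed paired (dependent) t-test: is the mean of (pre - post)
    # below zero, i.e., did scores improve from pretest to posttest?
    t, p = stats.ttest_rel(pre, post, alternative="less")
    print(f"t = {t:.3f}, one-tailed p = {p:.3f}")

    # Descriptive statistics of the kind reported in Table 2.
    for label, scores in (("pretest", pre), ("posttest", post)):
        print(f"{label}: mean = {scores.mean():.2f}, "
              f"SD = {scores.std(ddof=1):.2f}")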
Results
Paired sample t-tests within the control and experimental groups determined that there was a statistically significant difference between scores on pretests and posttests in the experimental (games) condition: t(42) = -3.056, p = 0.002. There was not a significant
difference between scores on pretests and posttests in the control (no games)
condition: t(42) = -.506, p = 0.308. Table 1 provides the full statistical breakdown of the t-tests’ output.
Additionally,
descriptive statistics for the scores on the pretests and posttests in both the
experimental and control groups were calculated (see Table 2).
Although
both conditions saw students improve their scores over time, the experimental
group experienced a much larger improvement, as scores improved by around two
percentage points in the control condition and around ten points in the
experimental condition. The standard deviations were very similar, with the
greatest deviation occurring in the pretest experimental condition and the
lowest deviation occurring in the posttest control condition.
Table 1
Output for Paired Samples t-test

Pair   | Condition                    | Mean    | Std. Dev. | Std. Error Mean | t      | df | Sig. (1-tailed)
Pair 1 | Pre No Games - Post No Games | -2.326  | 30.138    | 4.596           | -.506  | 42 | .308
Pair 2 | Pre Games - Post Games       | -10.488 | 22.508    | 3.432           | -3.056 | 42 | .002
Table 2
Means for Pretest and Posttest Scores in Games and No Games Conditions

Pair   | Condition         | Mean  | N  | Std. Deviation | Std. Error Mean
Pair 1 | Pretest No Games  | 60.30 | 43 | 24.657         | 3.760
Pair 1 | Posttest No Games | 62.63 | 43 | 23.815         | 3.632
Pair 2 | Pretest Games     | 58.33 | 43 | 25.735         | 3.924
Pair 2 | Posttest Games    | 68.81 | 43 | 25.057         | 3.821
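As a quick check on Table 1, each t value is the mean paired difference divided by its standard error (the standard deviation of the differences divided by √n). For the games condition, 22.508 / √43 ≈ 3.432 and -10.488 / 3.432 ≈ -3.056; for the no-games condition, 30.138 / √43 ≈ 4.596 and -2.326 / 4.596 ≈ -.506, matching the reported values.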
Discussion
Statistical analysis revealed that the null hypothesis, which proposed there would not be a significant difference between test scores in the experimental and control groups, can confidently be rejected. The alternative hypothesis, which predicted that students taught with games would outperform students in a control group on a library skills test, was supported. These findings suggest
that the trend within academic librarianship of incorporating games into
instruction has not been in vain; rather, the present study offers educators
evidence that games may have the potential to positively impact information
literacy skill development.
Currently,
there is very little research within the LIS literature employing a two-group pretest/posttest design to assess the effectiveness of games. McCabe and Wise
(2009) are an exception, as they piloted their game Citation Tic-Tac-Toe with
both a control and experimental group. Similar to the present study, McCabe and
Wise learned that students who played the game performed better on a posttest
than students in a control group who took an online citation tutorial instead.
When combined with the findings of the present study there now exists
increasing evidence that games can enhance the development of information
literacy skills, most demonstrably of citing.
Two
additional empirical articles mentioned in the literature review support the
findings that games can increase information literacy knowledge. Armstrong and
Georgas (2006), creators of the Doing Research tutorial used in the present
study, found that students scored significantly higher on a posttest following
participation in this game than on a pretest. Although the lack of a control
group prevented valuable comparative opportunities, the students in Armstrong and Georgas’s project fared similarly to the students in the present study’s experimental group. Both initiatives demonstrated the
ability of interactive computer activities to boost scores on information
literacy tests.
Marcus
and Beck (2003) conducted an innovative study which compared the learning
outcomes and attitudes of first-year students in two different ILI formats: a self-guided treasure hunt orientation and a traditional library tour. The
treasure hunt can certainly be considered an educational game, as students
adventured around the facility completing interactive library-related tasks and
were awarded prizes. All students were given a library skills multiple-choice
quiz following the treasure hunt, and statistical analysis showed that students
in the treasure hunt (experimental) group performed better than students in the
traditional tour (control) group.
What all of these studies share is empirical evidence that games can play a part in helping students sharpen their IL skills.
The positive statistical results support greater inclusion of games in active learning pedagogies within the academic library classroom, as well as the allocation of additional time and money for the development of educational games.
Limitations and Future Directions
Despite
the concerted effort of the researchers to control variables in the
quasi-experiment there are several limitations deserving of attention,
including: researcher assignment to classes; students receiving insufficient time to complete the questionnaires; and the potential for students to gain skills independent of library instruction between sessions. First, instead of
assigning one researcher to teach all of the games classes and the other
researcher all of the control classes, a future study would entail both
researchers teaching both types of classes. This measure would maximize the potential of the treatment (games) to affect learning and minimize possible confounding influences from the individualized instruction techniques of the two researchers.
Another
limitation of this project is the potential for participants to have
experienced procedural bias. In brief, this bias occurs when participants are
given an instrument to complete in a set time limit under close supervision of
the researcher(s). In this study students were administered the pretests and
posttests with the knowledge that they had five minutes to fill out each
questionnaire. Some participants could have felt pressured and rushed through
the questions, making mistakes that might have been prevented by allowing them
additional time. A small body of psychological research spanning nearly fifty
years indicates the negative impact that timed tests can have on some
individuals. Morris and Liebert (1969) empirically demonstrated that college
students who showed high levels of worry on a questionnaire performed worse on
an intelligence test than both high-worry students in an untimed condition and
low-worry students. Many years later Onwuegbuzie and Daley (1996) conducted a
study which measured the performance of graduate students on a statistics
examination in both timed and untimed conditions. Analysis revealed that on
average students in the untimed conditions received higher scores. Another
study focused on a community college population, noting that untimed tests can
be particularly beneficial to older and nontraditional students (Hodges &
Kennedy, 2004).
A third limitation worth noting is that the passage of time between completion of the pretests and posttests in both groups could have introduced an extraneous time-related variable. Students did not return to the library for the second session until at least two weeks after the first; therefore, during this
time they ostensibly could have gained some information literacy skills outside
of the classes taught by the researchers. For example, a student could have
visited the reference desk for keyword development or citation help, or
consulted with a librarian for a one-on-one tutorial. Therefore, it is a
possibility that some students scored higher on the posttests than the pretests
not because of the incorporation of games into instruction (i.e. the
treatment), but because they improved their research skills in other ways
during the period between the two sessions.
Future
research could adopt a methodology similar to the study at hand by examining
the educational impact of games-based teaching interventions using pre- and
posttests, but might do so using a longitudinal analysis conducted over the
course of multiple academic years or with the addition of a qualitative measure
to expand upon the dimensions of the evidence being presented. Additionally,
the wide variety of game formats and their different educational capacities
should be considered, including medium (in-person, digital, and hybrid) and
duration (from part of a standalone instruction session to integration throughout
a semester-long course). Evaluating the effects of information literacy
gameplay when practised individually versus in small groups would be another beneficial avenue of inquiry and would contribute much-needed evidence to the area of games and learning in the context of library instruction.
Conclusion
The
results of this study suggest that, when implemented in information literacy
instruction sessions, brief online games addressing two common research
processes—identifying keywords and synonyms in addition to categorizing
citation types—can be successfully utilized to improve student comprehension of
these skills. The instruction incorporating games was compared with instruction featuring additional lecture, the latter being a type of teaching that can be considered
“traditional” information literacy instruction. These games represent a modest
change to the content addressed in the instructors’ ILI sessions, and as such
might easily be adopted by other librarians interested in using participatory,
game-driven methods to encourage engagement with information literacy
practices. The effective use of games will vary according to student backgrounds, desired learning outcomes, and other classroom factors, but in the appropriate circumstances games-based learning may enhance student engagement and learning with regard to instructional content.
An
additional advantage to games-based learning, noted by several researchers but
outside of this study’s scope, is the role of gameplay in affective elements
that contribute to learning, such as student enjoyment of the session and
intrinsic motivation. The authors have found anecdotally, in their experiences as instructors, that engagement and motivation can improve greatly when games are part of student learning experiences. It is the
authors’ hope that this research adds to the evidence base concerning the
efficacy of games in the library classroom, and will encourage additional
research and reflection on games-based learning and other popular teaching
methods to ensure that our practices as information literacy instructors are grounded in effective pedagogy and, in turn, in instruction that places learners first and foremost.
References
Armstrong,
A., & Georgas, H. (2006). Using interactive technology to teach information
literacy concepts to undergraduate students. Reference Services Review, 34(4), 491-497. Retrieved from http://www.emeraldinsight.com/loi/rsr
Beile O’Neil, P. (2005). Development and validation of the Beile Test
of Information Literacy for Education (Doctoral dissertation). University
of Central Florida, Orlando, FL.
Broussard, M. J. (2010). Secret agents in
the library: Integrating virtual and physical games in a small academic
library. College & Undergraduate Libraries, 17(1), 20-30. http://dx.doi.org/10.1080/10691310903584759
Broussard, M. J. (2012). Digital games in
academic libraries: A review of games and suggested best practices. Reference Services Review, 40(1), 75-89.
Retrieved from http://www.emeraldinsight.com/loi/rsr
Broussard, M. J. (2014). Using games to make
formative assessment fun in an academic library. Journal of Academic Librarianship, 40(1), 35-42. http://dx.doi.org/10.1016/j.acalib.2012.12.001
Clyde, J., & Thomas, C. (2008).
Building an information literacy first-person shooter. Reference Services Review, 36(4),
366-380. Retrieved from http://www.emeraldinsight.com/loi/rsr
Doshi, A. (2006). How gaming could improve
information literacy. Computers in Libraries, 26(5),
14-17. Retrieved from http://www.infotoday.com/cilmag/default.shtml
Freed, M., Hess, R. K., & Ryan, J. M.
(2002). The educator’s desk reference: A
sourcebook of educational information and research. (2nd ed.)
Lanham, MD: Rowman & Littlefield Publishers.
Gallegos, B., & Allgood, T. (2008).
The Fletcher Library game project. In A. Harris & S. E. Rice (Eds.), Gaming in academic libraries: Collections,
marketing and information literacy (pp. 149-163). Chicago IL: Association
of College and Research Libraries.
Gee, J. P. (2007). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.
Gumulak, S., & Webber, S. (2011).
Playing video games: Learning and information literacy. Aslib Proceedings, 63(2/3),
241-255. Retrieved from http://www.emeraldinsight.com/loi/ap
Hodges, D. Z., & Kennedy, N. H.
(2004). Editor's choice: Post-testing in developmental education: A success
story. Community College Review, 32(3),
35-42. http://dx.doi.org/10.1177/009155210403200303
Koelewyn, A. C., & Corby, K. (1982). Citation: A library instruction computer game. RQ, 22(2), 171-174.
Leach, G. J., & Sugarman, T. S. (2005). Play to win! Using games in library instruction to enhance student learning. Research Strategies, 20(3), 191-203. http://dx.doi.org/10.1016/j.resstr.2006.05.002
Long Island University. (n.d.). Institutional Review Board application for exempt category review. Retrieved February 15, 2015 from http://www.liu.edu/~/media/Files/AcademicAffairs/SponResearch/Forms/UC_HumanSubjects-Exempt-0413.ashx
Marcus, S., & Beck, S. (2003). A
library adventure: Comparing a treasure hunt with a traditional freshman
orientation tour. College & Research Libraries, 64(1), 23-44. http://dx.doi.org/10.5860/crl.64.1.23
Margino, M. (2013). Revitalizing
traditional information literacy instruction: Exploring games in academic
libraries. Public Services Quarterly, 9(4),
333-341. http://dx.doi.org/10.1080/15228959.2013.842417
Markey, K., Leeder, C., & Rieh, S. Y.
(2014). Designing online information
literacy games that students want to play. Lanham, MD: Rowman &
Littlefield.
Martin, J., & Ewing, R. (2008). Power
up! Using digital gaming techniques to enhance library instruction. Internet Reference Services Quarterly, 13(2-3),
209-225.
McCabe, J., & Wise, S. (2009). It’s
all fun and games until someone learns something: Assessing the learning
outcomes of two educational games. Evidence
Based Library and Information Practice, 4(4), 6-23. Retrieved from http://ejournals.library.ualberta.ca/index.php/EBLIP/index
Morris, L., & Liebert, R. (1969). Effects of anxiety on timed and untimed intelligence tests: Another look. Journal of Consulting and Clinical Psychology, 33(2), 240-244. http://dx.doi.org/10.1037/h0027164
Onwuegbuzie, A. J., & Daley, C. E.
(1996). The relative contributions of examination-taking coping strategies and
study coping strategies to test anxiety: A concurrent analysis. Cognitive Therapy and Research, 20(3),
287-303. Retrieved from http://link.springer.com/journal/10608
Rice, S. E. (2008). Education on a
shoestring: Creating an online information literacy game. In A. Harris & S.
E. Rice (Eds.), Gaming in academic
libraries: Collections, marketing and information literacy (pp. 175-188).
Chicago IL: Association of College and Research Libraries.
Smale, M. A. (2011). Learning through
quests and contests: Games in information literacy instruction. Journal of Library Innovation, 2(2),
36-55. Retrieved from http://www.libraryinnovation.org/index
Smale, M. A. (2012). Get in the game:
Developing an information literacy classroom game. Journal of Library Innovation, 3(1), 126-147. Retrieved from http://www.libraryinnovation.org/index
Smith, A.-L., & Baker, L. (2011).
Getting a clue: Creating student detectives and dragon slayers in your library.
Reference Services Review, 39(4),
628-642. Retrieved from http://www.emeraldinsight.com/loi/rsr
Smith, F. A. (2007). Games for teaching
information literacy skills. Library Philosophy & Practice, 9(2),
1-12. Retrieved from http://digitalcommons.unl.edu/libphilprac/
University of Michigan School of
Information. (2012). About the BiblioBouts Project. BiblioBouts Project. Retrieved February 15, 2015 from http://bibliobouts.si.umich.edu/BiblioBoutsAbout.html
Walker, B. E. (2008). This is Jeopardy! An
exciting approach to learning in library instruction. Reference Services Review, 36(4), 381-388. Retrieved from http://www.emeraldinsight.com/loi/rsr
Appendix A
Lesson Plans
Lesson Plans for
Session #1
Experimental Group
1. Introduction and overview of class content (5 minutes)
2. Students take pretest (5 minutes)
3. Prezi presentation on keyword development and topic formulation (10 minutes)
4. Students play keyword game (15 minutes)
5. Demonstrate Gale Virtual Reference Library and Points of View Reference Center (15 minutes)
6. Students complete keyword worksheet activity (25 minutes)
Control Group
1. Introduction and overview of class content (5 minutes)
2. Students take pretest (5 minutes)
3. Prezi presentation on keyword development and topic formulation (10 minutes)
4. Brief lecture on keyword selection (15 minutes)
5. Demonstrate Gale Virtual Reference Library and Points of View Reference Center (15 minutes)
6. Students complete keyword worksheet activity (25 minutes)
Lesson Plans for
Session #2
Experimental Group
1. Introduction and overview of class content (5 minutes)
2. Prezi presentation on citing in APA and MLA formats (15 minutes)
3. Students play Citation Tic-Tac-Toe (10 minutes)
4. Demonstrate ProQuest Databases (10 minutes)
5. Students complete citation and database searching worksheet activity (25 minutes)
6. Students take posttest (5 minutes)
7. Concluding remarks (5 minutes)
Control Group
1. Introduction and overview of class content (5 minutes)
2. Prezi presentation on citing in APA and MLA formats (15 minutes)
3. Brief lecture on citation styles (10 minutes)
4. Demonstrate ProQuest Databases (10 minutes)
5. Students complete citation and database searching activity (25 minutes)
6. Students take posttest (5 minutes)
7. Concluding remarks (5 minutes)
Appendix B
Assessment Quiz
1. Using the citation below, what does the item in bold text represent?
Szajnberg, N. (2012). Zombies, Vampires, Werewolves: An Adolescent's Developmental System for the Undead and Their Ambivalent Dependence on the Living, and Technical Implications. Psychoanalytic Review, 99(6), 897-910. doi:10.1521/prev.2012.99.6.897
a. Article Title
b. Volume
c. Author
d. Journal Title
2. You have a class assignment to investigate Americans’ attitudes towards the Iraq War. A keyword search in the library catalog on “Iraq War” returns over 700 items. Which of the following steps would give you the best search results?
a. change search to “What are some of the most popular American attitudes on the Iraq War?”
b. add “American attitudes” to your search
c. search by Author using the same keywords
d. search by Title using the same keywords
3. Which is the article title in the following MLA citation?
Bray, Kate. “A Week in the Life of Jay-Z.” The Independent [London] 25 Sept. 2009: 20. ProQuest Databases. Web. 10 Sept. 2013.
a. The Independent
b. ProQuest Databases
c. There is no title provided
d. A Week in the Life of Jay-Z
4. Select the keywords that best represent synonyms for the concept “college students.”
a. colleges, universities, community colleges
b. millennials, generation Y, generation X
c. graduate students, freshmen, sophomores
d. midterms, finals, break
5. The following citation is for:
Orians, Gordon, and Gene Christman. A Comparative Study of the Behavior of Red-Winged, Tricolored, and Yellow-Headed Blackbirds. Berkeley: University of California Press, 1968. Print.
a. a book
b. a chapter in a book
c. a journal article
d. none of the above
6. Select the set of keywords that would provide the best search results for the following question:
What incentives do people have to use Facebook or other social media?
a. Facebook, Twitter, Instagram
b. Facebook, social media, motivation
c. Facebook, psychology, friends
d. incentives, choices, motives