Research Article
Exploring the Impact of Individualized Research
Consultations Using Pre and Posttesting in an Academic Library: A Mixed Methods
Study
Lindsey Sikora
Research Librarian
Health Sciences Library
University of Ottawa
Ottawa, Ontario, Canada
Email: lindsey.sikora@uottawa.ca
Karine Fournier
Research Librarian
Health Sciences Library
University of Ottawa
Ottawa, Ontario, Canada
Email: karine.fournier@uottawa.ca
Jamie Rebner
Master of Human Kinetics
University of Ottawa
Ottawa, Ontario, Canada
Email: jrebn066@uottawa.ca
Received: 30 Aug. 2018 Accepted: 29 Jan. 2019
2019 Sikora, Fournier, and Rebner. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
DOI: 10.18438/eblip29500
Abstract
Objective – Academic
librarians consistently offer individualized help to students and researchers.
Few studies have empirically examined the impact of individualized research
consultations (IRCs). For many librarians, IRCs are an integral part of their
teaching repertoire. However, without evidence of an IRC’s effectiveness or value, one might ask whether it is worth the time and effort invested. Our study
explored the impact of IRCs on students' search techniques and self-perceived
confidence levels. We attempted to answer the following questions: 1) Do IRCs
improve students’ information searching techniques, including the proper use of
keywords and/or subject headings, the accurate use of Boolean operators, and
the appropriate selection of specialized resources/databases? 2) Do IRCs
influence students’ confidence level in performing effective search strategies?
Methods – Our study used a
mixed-methods approach. Our participants were students from the Faculties of
Health Sciences and Medicine at the University of Ottawa, completing an
undergraduate or graduate degree, and undertaking a research or thesis project.
Participants were invited to complete two questionnaires, one before and one
after meeting with a librarian. The questionnaires consisted of open-ended and
multiple choice questions, which assessed students' search techniques, their self-perceived proficiency with those techniques, and their confidence levels. A rubric was used to score students' answers to the open-ended questions, and self-reflective questions were coded and analyzed for content using the software QSR NVivo.
Results – Twenty-nine
completed pre and posttests were gathered from February to September 2016.
After coding the answers using the rubric, two paired-samples t-tests were conducted. The first t-test showed that the improvement in students’ ability to use appropriate keywords was approaching statistical significance. The second t-test showed a statistically significant increase in students’ ability to use appropriate search strings from the pretest to the posttest. A third paired-samples t-test measured students’ confidence levels before and after the appointment, and a statistically significant increase in confidence was found.
Conclusion – Out of three
paired t-tests performed, two showed a statistically significant difference
from the pretest to the posttest, with one t-test approaching statistical
significance. The analysis of our qualitative results also supports the statement that IRCs have a real, positive impact on students’ search techniques and their confidence levels. Future research may explore specific techniques to improve search strategies across various disciplines, ways to improve confidence levels, and the viewpoint of librarians.
Introduction
In
the current digital age, a university student’s challenge is not finding
information, but rather locating the appropriate, validated, and trustworthy
information required. Librarians support students in this challenge in various
ways, including in-class instruction, specialized workshops, and reference
desk assistance. More specifically, individualized research consultations
(IRCs) between librarians and students have been increasing, with librarians
spending less time at the reference desk. This shift in service appears to be a
trend in many academic libraries. For the purpose of this study, IRCs were
defined as scheduled appointments that aim to help students with
their research projects, including, but not limited to, the literature review
process.
In a scoping review, Fournier and Sikora (2015) found
that though IRCs have been taking place for decades, the impact of these
meetings on a student’s information literacy (IL) skills is challenging to
measure. The authors reviewed 20 articles for assessment methods, with the
following techniques identified: 1) usage statistics; 2) surveys; 3) objective
quantitative methods. While many libraries use statistics and surveys for
assessment purposes, only three articles examined using objective quantitative
methods as a measure of the impact of IRCs on IL skills (Fournier & Sikora,
2015). It is extremely difficult to evaluate an IRC service objectively (Schobert,
1982). However, this does not mean it should not be
attempted. The three studies trying to measure this impact used different
approaches. Donegan, Domas, and Deosdade (1989) sought to compare the impact of group instruction with that of term paper counselling, while Erickson and Warner (1998) examined whether receiving an individual tutorial, versus no tutorial, affected assessment outcomes. Neither study was able to demonstrate a statistically significant difference in the impact of IRCs on students’ IL skills. Reinsfelder (2012) found a statistically significant difference in his study, which investigated IL skills using citation analysis to compare students’ draft and final papers in a course, concluding that it offered “some quantitative evidence demonstrating the positive impact of individual research consultation” (p. 263).
As there is a paucity of literature on objective quantitative methods for evaluating the impact of IRCs on students’ IL skills after meeting with a librarian, we sought to present a new method: pre and posttests that examine students’ database searching skills, with a rubric to analyze their search strategies.
Literature Review
It
is well known that interactions occurring at the traditional library reference
desk are declining (Association of Research Libraries, 2015). However, the
demand for librarians to offer more personalized, in-depth services to students
and faculty has remained stable, or even risen (Covert-Vail &
Collard, 2012). These services often involve a librarian’s comprehensive
knowledge of resources and strategies tailored to locate the appropriate
information. IRCs can serve as one way to connect students to librarians with
such expertise.
User
surveys and feedback forms have provided librarians with comments from
students, illustrating the usefulness of IRCs (Butler & Byrd, 2016).
Researchers have discussed the benefits IRCs can provide for students, such as
the “overwhelming usefulness” students often reported following a one-on-one
meeting with a librarian (Butler & Byrd, 2016), the opportunity
to aid in developing students’ problem-solving skills (Fields,
2006), the overall positive patron experience with academic library research
consultations (Rogers & Carrier, 2017), or the increase in
goodwill between libraries and faculty members that extends beyond the library
environment (Handler, Lackey, & Vaughan, 2009). While these interactions carry positive connotations and encourage ongoing relationships between librarians and students, they are subjective in nature and do not provide an objective method for analyzing a student’s success in developing future research skills.
Over
the last several decades, few researchers have attempted to assess IRCs
quantitatively, as it is challenging to prove their effectiveness in quantitative terms. Reasons for these challenges vary: there is no standard instrument for evaluating IRCs, the topics of IRCs can be difficult to compare, and librarians conduct their IRCs in various ways. Nevertheless, researchers have tried to surmount this challenge by
utilizing different quantitative approaches. Bergen and MacAdam (1985)
analyzed the number and type of students (male vs. female, freshmen to seniors,
in various departments) who used a voluntary one-on-one instruction service. In
1989, Donegan et al.
used objective quantitative methods such as post-instruction testing by
creating a multiple choice test that was given to students immediately
following an instruction session. Reinsfelder (2012) and
Sokoloff and Simmons (2015) examined IRCs using citation analysis
within the management and business fields. Reinsfelder evaluated the quality of
citations used in undergraduate papers, before and after meeting with students
individually, whereas Sokoloff and Simmons created an IL rubric, adapted from
the Association of American Colleges and Universities rubric, to analyze the
performance standards of their group of students. However, no researchers have
specifically assessed the impact of IRCs in the health sciences and medicine
fields.
ACRL Framework for Information Literacy for Higher Education
The design of our pre/posttest questionnaire, as well as our rubric for assessing the students’ search strategies, was informed by the sixth concept of the new ACRL Framework for Information Literacy for Higher Education: Searching as Strategic Exploration (Association of College and Research Libraries, 2016).
This framework states that searching for
information is often nonlinear and iterative, requiring the evaluation of a
range of information sources and the mental flexibility to pursue alternate
avenues as new understanding develops. It goes on to state that because the searching process is complex and often daunting for students, meeting with a librarian helps them become more advanced searchers, able to “search more broadly and deeply to determine the most appropriate information within the project scope” (p. 9). We hypothesize that by matching students’
information needs and search strategies to the appropriate search tools, such
as specialized bibliographic databases, we are able to help them design and
refine their search strategies as necessary, based on their search results.
Pre and Posttesting Methodology
As
previously stated, studies dedicated to the quantitative assessment of IRCs are
scarce, and even fewer using a pre and posttest methodology have been found in
the literature. In light of this gap, we reviewed the literature evaluating
group instructions using a pre and posttest method, focusing on their
methodology and test design, in order to prepare our questionnaires.
Many
studies use multiple choice questions as their pretest and posttest design to
assess IL skills. Multiple choice questions have been used to assess one-shot
sessions (Bryan & Karshmer, 2013), credit-bearing IL courses (Goebel, Neff, & Mandeville, 2007), and library instruction classes (Chiarella, Khadem, Brown, & Wrobel, 2014; Ivanitskaya, DuFord, Craig, & Casey, 2008). They have also been used to compare online and face-to-face library instruction, whether one-shot face-to-face sessions (Mery, Newby, & Peng, 2012) or face-to-face workshops (Shaffer, 2011). Understandably, multiple choice questions provide quantifiable data to
assess students’ IL skills, making multiple choice the evaluation method of
choice in many studies. However, other types of assessment techniques have
appeared in the literature. Open-ended questions have been used for pre and
posttesting to capture students’ understanding of IL concepts (Cook
& Walsh, 2012; Gross & Latham, 2013; Wakimoto, 2010).
Further,
pre and posttesting methodologies have been found to be successful outside of
the library literature. Shivaraju, Manu, Vinaya, and Savkar (2017) evaluated learning from didactic lectures among medical students through a questionnaire-based pre and posttest evaluation technique. They analyzed how aware students were of pharmacology concepts before the lecture, and evaluated the students’ learning of key concepts afterward. Their results showed that students’ understanding improved following the lecture, as the testing sharpened their focus during it, which improved their overall performance in pharmacology. Similar findings have been corroborated at other medical schools using this methodology in medical education (Cramer & Mahoney, 2001; Muthukumar, D’cruz, & Anandarajan, 2013).
Self-Efficacy Theory
Bandura’s (1977, 1997) self-efficacy theory informed the design of our questionnaires. More specifically, we sought out research applying self-efficacy theory in a library setting. The term self-efficacy “refers to a person’s belief in his or her own capability to perform specific activities or tasks” (Ren, 2000, p. 323). Ren (2000) tested students before and
after library instruction on the following qualities: their self-perceived
search performance, their attitude about acquiring search skills, and their
emotions while completing an assignment. The author concluded that in order
“for self-efficacy to increase, students must have adequate searching practice,
experience learning accomplishments and not be overwhelmed with negative
emotions such as confusion and frustration” (Ren, 2000, p. 327). Serap
Kurbanoglu (2003) explored the relationship between university
students’ IL and their self-efficacy beliefs. The author concluded that more
research needs to be conducted to better understand how self-efficacy beliefs
affect individuals’ information problem solving behaviours and lifelong
learning activities.
Aims
For
our project, we issued a pre and posttest questionnaire, evaluating students’
searching techniques in medical databases such as Medline (via Ovid), before
and after meeting with a librarian. We also wanted to gain insight into their
self-perceived ability to search the databases by measuring their self-efficacy.
We then assessed their search strategies with a rubric we designed (Table 1).
Research Questions
a) Do IRCs improve students’ searching techniques,
including the proper use of keywords and/or subject headings, the accurate use
of Boolean operators, and the appropriate selection of specialized
resources/databases?
b) Do IRCs influence students’ confidence levels in
performing effective search strategies?
Objectives
Our
study’s primary goal was to evaluate the impact that IRCs have on students’
search techniques and their confidence levels, with the following objectives:
a) Assessing students’ search techniques before and after they meet individually with a librarian.
b) Discovering what factors influenced students’ self-perceived proficiency with search techniques and their self-perceived confidence in those techniques.
c) Determining if an IRC influences students’ confidence levels in performing effective search strategies.
d) Exploring students’ expectations and their satisfaction levels with IRCs.
Methods
Population
The University of Ottawa enrolls over 40,000 students (University of Ottawa, n.d.). There are 4,500 students within the departments of the
Faculty of Health Sciences, which include nursing, rehabilitation, nutrition,
human kinetics, and interdisciplinary health sciences. The Faculty of Medicine
includes the School of Medicine, postgraduate students, epidemiology and public
health, population health and bench science programs, totaling 2,250 students.
Participants included a convenience sample of University of Ottawa students who
were completing an undergraduate or graduate degree in the Faculties of Health
Sciences or Medicine, and also undertaking a research or thesis project.
Data Collection
In order to assess the impact of IRCs on students’
searching techniques, a mixed-methods approach was used. Pre and posttesting were employed, and ethics approval was received from the University of Ottawa Office of Research Ethics and Integrity (file number H12-14-03).
The
first round of data collection took place in 2015, but without a monetary
incentive, very few participants completed the posttest (n = 9). Additional
academic disciplines were also involved in this round of data collection
including management, social sciences, arts, and humanities. We found that the
topics and resources covered in IRCs can fluctuate greatly between disciplines.
For that reason, we decided that the second round of data collection would be
concentrated on a more homogenous group: health sciences and medicine. This
approach would allow better comparisons between students. The first round of data collection acted as a pilot, allowing a review of the questionnaires, with several questions adjusted to increase clarity. Data from the first round are not included in the results listed below.
The
second round of data collection took place from February to September 2016.
Even with a monetary incentive, it was challenging to recruit participants (n =
29). The pre and posttest questionnaires can be found in Appendix B.
In addition to the authors, two other librarians
employed by the University of Ottawa were included in the second round of data
collection. When a student contacted a librarian for assistance, a recruitment
email was sent to the student, which contained a brief description of the
study, and the links to the consent form and the pretest questionnaire. At the
end of the first questionnaire, participants were asked if they wished to
complete the second questionnaire (posttest). If their answer was affirmative,
the online survey system (FluidSurveys) would send them an invitation one week after the first questionnaire was completed. Using this method of recruitment allowed complete anonymity for the participants; that is, none of the librarians providing IRCs, including the present study’s authors, knew whether the students they were helping had participated in the study. This anonymity helped to reduce bias, in the sense that librarians would not change their approach or their attitudes toward students depending on whether they participated in the study. Librarians were asked to use a Search
Strategy Worksheet (see Appendix A) with every student they met during an IRC
for the duration of the study, whether they were participating in the study or
not. This worksheet is frequently used during regular IRCs at this library,
outside the scope of this study; therefore, no training of the librarians was required.
The
questionnaires consisted of open-ended and self-reflective questions (see
Appendix B for the pre and posttest full questionnaires). The open-ended
questions assessed students’ search techniques, specifically their choice of
keywords, synonyms, subject headings, and the creation of a search string with
the appropriate use of Boolean operators. The self-reflective questions
assessed students’ self-perceived proficiency with search techniques, their
confidence level in their search techniques, and their expectations of (before)
or their satisfaction with (after) the IRC. To preserve anonymity, once data
collection was complete, students’ personal information was removed and
replaced with an anonymous identifying number (e.g., “student 1”) in both
questionnaires.
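The open-ended questions targeted a specific structure: synonyms within a concept combined with OR, and the concept groups combined with AND (e.g., (marine OR ocean) AND (biology OR science), the example format cited in the Results below). As a minimal illustration of that structure, and not an instrument used in the study, the following Python sketch composes such a search string from concept groups:

```python
# Minimal sketch (illustration only): composing a Boolean search string
# of the form assessed by the rubric in Table 1.

def build_search_string(concepts):
    """Join synonyms within each concept with OR, then join concepts with AND."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in concepts]
    return " AND ".join(groups)

# Hypothetical concept groups matching the example format in the Results:
concepts = [["marine", "ocean"], ["biology", "science"]]
print(build_search_string(concepts))
# -> (marine OR ocean) AND (biology OR science)
```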
Rubrics serve as multi-purpose scoring tools for assessing student performance.
While rubric development can stop after the performance criteria have been
identified and performance levels established (Wolf & Stevens,
2007),
more comprehensive rubrics include another step in which each of the cells in
the matrix contains a description of the performance at that level. We created
a rubric to code open-ended questions on search techniques (Table 1), capturing
details to assess the appropriate use of keywords and the search strategy. The
rubric scoring was completed by one of the study’s authors (JR).
Table 1
Rubric Used to Assess the Pre and Posttest Results for the Appropriate Use of Keywords and the Search Strategy
Requirement | Insufficient (0) | Acceptable (1) | Superior (2)
Uses appropriate keywords | No keywords provided, or keywords provided have very little connection to the research question or topic and are too broad. No use of synonyms. | Keywords provided are connected to the research question or topic, but not all subjects are covered. Keywords are somewhat focused and not too broad. Synonyms used, if applicable. Very little or no use of subject headings (optional). | Keywords provided are connected to the research question or topic and all subjects are covered. Keywords are well focused. Appropriate use of synonyms, if applicable. Appropriate use of subject headings (optional).
Builds appropriate search string | No search string provided. | Search string provided with some errors or missing elements (e.g., not all keywords are present; mistakes in the use of Boolean operators). | Search string provided with no errors and all elements present (all keywords are present; no errors in the use of Boolean operators).
Results
Our
sample size was small, with only 29 completed pre and posttests. Pre and
posttest self-reflective, open-ended answers were coded and analyzed with the
use of the software QSR NVivo. Multiple choice and Likert scale questions were
analyzed using SPSS. Results are presented following the study’s outlined
objectives.
The first objective was to assess students’ search techniques
before and after they met individually with a librarian. To do so, we asked
participants to provide their list of keywords and search strings before
meeting with a librarian, if they had already done some searching by themselves
(e.g., (marine OR ocean) AND (biology OR science)). To assess if their keyword
and search string selection were accurate and appropriate, we asked
participants to state their research topic or question. We were then able to
use our rubric (Table 1) to code their answers. Two paired-samples t-tests were
conducted to evaluate the impact that a consultation with a librarian had on
students’ ability to appropriately use keywords and build search strategies.
The first t-test showed that the improvement in students’ ability to use appropriate keywords from the pretest (M = 1.00, SD = .66) to the posttest (M = 1.34, SD = .72), t (28) = -1.98, p > .05 (two-tailed), was approaching statistical significance. The mean
increase in score was .345 with a 95% confidence interval ranging from -.70 to
.01. The eta squared statistic (.12) indicated a large effect size. The second
t-test showed a statistically significant increase in the students’ ability to
use appropriate search strategies from the pretest (M = .21, SD = .41) to the
posttest (M = .76, SD = .79), t (28) = -3.59, p = .001 (two-tailed). The mean increase in score
was .55 with a 95% confidence interval ranging from -.87 to -.24. The eta
squared statistic (.32) indicated a large effect size.
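For readers wishing to reproduce this type of analysis (our own analyses were run in SPSS), the following Python sketch shows how a paired-samples t-test, the 95% confidence interval of the mean difference, and an eta squared effect size (computed as t² / (t² + df)) could be obtained. The score arrays are placeholders, not our data:

```python
# Sketch of a paired-samples t-test with eta squared effect size.
# The arrays below are placeholder rubric scores, NOT the study's data;
# the study's analyses were performed in SPSS.
import numpy as np
from scipy import stats

pre = np.array([1, 0, 1, 2, 0, 1, 1, 0, 1, 2])
post = np.array([1, 1, 2, 2, 1, 2, 1, 1, 2, 2])

t, p = stats.ttest_rel(pre, post)      # two-tailed paired-samples t-test
df = len(pre) - 1
eta_squared = t**2 / (t**2 + df)       # eta squared = t^2 / (t^2 + df)

diff = pre - post                      # pre - post, as reported above
ci_low, ci_high = stats.t.interval(0.95, df, loc=diff.mean(), scale=stats.sem(diff))

print(f"t({df}) = {t:.2f}, p = {p:.3f}, eta^2 = {eta_squared:.2f}, "
      f"95% CI of difference = ({ci_low:.2f}, {ci_high:.2f})")
```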
The
second objective was to discover factors that influenced students’
self-perceived search techniques proficiency and confidence level. We asked
participants which factors influenced their confidence level before and after
the IRC. Before the appointment, both positive and negative factors were stated
in almost equal measure, with positive factors rated slightly higher. Negative
factors were grouped by the following themes:
1) lack of available research
2) research topic difficulty
3) lack of prior knowledge
4) difficulty using databases
Positive
factors were categorized under the themes:
1) prior knowledge
2) help from other people (colleagues, supervisors)
After
the appointment, the factors that influenced students’ confidence level were
almost all positive, and were grouped under the following themes:
1) new or prior knowledge
2) support from others
3) strength of research question or search string
There
were no statistically significant differences found between any of the themes
presented.
Our
third objective was to determine if the IRC influenced students’ confidence
levels in performing effective search strategies. To address this objective, we measured students’ confidence levels before and after the appointment. We asked participants how confident they were in finding relevant sources of information, using a scale from 1 (“not confident at all”) to 10 (“very confident”). In the
pretest, the mean was 5.85 (Table 2), and in the posttest, the mean was 7.24
(Table 3).
We
performed a paired-samples t-test to evaluate the impact that meeting with a
librarian had on students’ confidence with regard to finding relevant sources
of information. There was a statistically significant increase in confidence
level from the pretest (M = 5.93, SD = 1.46) to the posttest (M = 7.24, SD = 1.46), t (28) =
-4.34, p < .001 (two-tailed). The mean increase in confidence was 1.31 with
a 95% confidence interval ranging from -1.93 to -.69. The eta squared statistic
(.40) indicated a large effect size.
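These summary statistics are internally consistent: eta squared can be recovered as t² / (t² + df), and the confidence interval from the mean increase and the t value. The short check below, our own verification rather than part of the analysis, uses only the numbers reported above:

```python
# Consistency check using only the reported summary statistics.
from scipy import stats

t, df, mean_increase = -4.34, 28, 1.31      # values reported above

eta_squared = t**2 / (t**2 + df)            # ~0.40, as reported
se = mean_increase / abs(t)                 # standard error of the mean difference
half_width = stats.t.ppf(0.975, df) * se    # ~2.048 * 0.30
ci = (-mean_increase - half_width, -mean_increase + half_width)

print(f"eta^2 = {eta_squared:.2f}, 95% CI (pre - post) = ({ci[0]:.2f}, {ci[1]:.2f})")
# eta^2 = 0.40, 95% CI (pre - post) = (-1.93, -0.69)
```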
We
also asked participants in the posttest if the appointment with a librarian
influenced their confidence level, and 96.6% of respondents said “yes” (Table
4). When asked to describe how the appointment with a librarian changed their
confidence level, participants provided positive comments, which we compiled
under three main themes:
1) finding useful resources
2) learning how to properly search databases
3) learning how to execute a search strategy
Furthermore, students were asked how the appointment with a librarian might have influenced various elements of their research project. Sixty-three percent of participants mentioned that the appointment with a librarian influenced their keyword selection (Table 5).
Table
6 illustrates that 45% of students mentioned that the IRC influenced their
search strategy, while 10% mentioned that it did not. However, it should be noted that 45% of respondents did not have a search strategy before meeting with a librarian, which is why they answered “does not apply” to that question
(Table 6).
Table 2
Pretest Confidence Level on a Scale from 1 to 10

Question | Minimum | Maximum | Mean
On a scale from 1 to 10, where 1 represents “not confident at all” and 10 represents “very confident”, how confident are you with finding relevant sources of information? | 3 | 9 | 5.85
Table 3
Posttest Confidence Level on a Scale from 1 to 10

Question | Minimum | Maximum | Mean
On a scale from 1 to 10, where 1 represents “not confident at all” and 10 represents “very confident”, how confident are you with finding relevant sources of information? | 4 | 10 | 7.24
Table 4
Appointment with Librarian Influenced Students’ Confidence in their Search Techniques

Response | Frequency | Percentage (%)
Yes | 28 | 96.6
No | 1 | 3.4
Total | 29 | 100
Table 5
Students’ Keyword Selection had Changed after Meeting with a Librarian

Response | Frequency | Percentage (%)
Yes | 17 | 63
No | 10 | 37
Total | 27 | 100
Table 6
Students’ Search Strategy had Changed after the Appointment with a Librarian

Response | Frequency | Percentage (%)
Yes | 13 | 44.8
No | 3 | 10.4
Does not apply | 13 | 44.8
Total | 29 | 100
The last objective explored students’ expectations and their satisfaction levels with the IRC. Students were asked whether their expectations were met after meeting with a librarian, using a Likert scale from 1 to 10 where 1 represented “expectations not met at all” and 10 represented “exceeded expectations”; 86.1% of respondents answered 7 or higher. We also asked participants to
describe how their expectations were or were not met. Participants’ answers
were grouped into three themes:
1) My expectations were met since I learned the appropriate resources and information-seeking knowledge.
2) My expectations were met because I learned how to search properly.
3) My expectations were not met because the appointment time was used to teach me how to use the resources rather than to find all available information.
Discussion
Our
study demonstrated that students who met with a librarian for an IRC improved their
search strategies. Although the pre/posttest questionnaire did not indicate a statistically significant difference in the students’ ability to use appropriate keywords, there was a statistically significant increase in the students’ ability to use appropriate search strategies overall.
These strategies may include the choice of keywords, synonyms, subject
headings, and the creation of a search string with the appropriate use of
Boolean operators. This indicates that while individual keywords still pose a
challenge for students, their overall strategies for searching have
holistically improved.
Additionally,
there was a statistically significant increase in the students’ confidence with
regard to retrieving relevant sources of information, after having met with a
librarian. The analysis of our qualitative results also supported the positive
impact that IRCs have on students’ search techniques, as participants indicated
that their expectations were met as they learned how to search properly, and
how to use the appropriate resources.
Although participants’ confidence levels
significantly increased after meeting with a librarian, we noted that 12 out of
29 respondents indicated a confidence level of 7 or higher prior to the
appointment with a librarian, and mainly stated “prior knowledge” as a factor
influencing their confidence level. Prior knowledge may include previous
searching experience for another research paper or with a particular database,
or familiarity when searching for their specific research topic. It could then
be inferred that many participants had a high self-perceived confidence in
their own search techniques prior to meeting with a librarian. As Maddux and
Volkmann (2010) stated: “people who maintain strong self-efficacy
beliefs during self-regulatory efforts are […] more likely to persevere” (p.
317). In other words, to self-regulate (the process by which people control their thoughts, feelings, and behaviours), one must believe in one’s own capability to perform the task at hand. The students had likely completed previous searches and felt confident leading up to their meeting with the librarian.
Another possible reason for this high confidence
level could be the information-seeking behaviours exhibited by the digital
generation. Keshavarz,
Esmaeili Givi, and Vafaeian (2016) studied IL self-efficacy in graduate
students and found that a high degree of their self-efficacy stemmed from their
confidence levels, as well as their motivation and proficiency. Their results
are consistent with what we discovered. However, once students met with a librarian, they learned how to use new resources they had not previously considered, along with new search techniques they did not previously possess (Keshavarz et al., 2016). With the plethora of scientific
literature easily retrievable from the Internet, many students might think they
are self-sufficient, or do not require professional help, but once they learn
what specialized databases and strong search strategies can provide, they
appreciate the new knowledge they have acquired after meeting with a librarian.
Our study is unique, as it is one of the first to quantitatively examine student improvement with search strategies in the health sciences. While our methods were not validated (i.e., the rubric), future research can build on them to create validated, reliable methods. Our approach may also demonstrate a quantitative return on investment (ROI) for libraries, showing the role that librarians play in student learning; however, this would require further research. Librarians often must defend their impact in a research environment quantitatively, and this may be one manner in which it could be measured.
Limitations
Our study is not without limitations. Firstly, there
were only 29 completed questionnaires for the pre and posttesting period. A higher response rate would have strengthened the results. Also, the sample of students was a convenience sample and therefore not representative of the student population.
Secondly, assessing individualized consultations is
challenging, as the field of study involved or the type of sources needed are
dependent on the research question. As such, individualized consultations are
not identical. Therefore, attempting to compare them is challenging due to the different variables each consultation involves. We tried to limit
the variability as much as possible by limiting the fields of study (only
health sciences and medicine), and requesting participants be involved with a
research project.
Thirdly, our rubric was not validated. It is true
that rubrics can positively contribute to student learning and program improvement by making the learning target clearer, guiding instructional design and delivery, and making the assessment process more accurate and fair (Wolf & Stevens, 2007). However, without piloting and assessing the
rubric properly, in order to adapt it as needed, the validity of the process can
be questioned. Although we did pilot our rubric during our first round of data
collection with all disciplines, performing a second pilot with our more
targeted audience of only health sciences and medicine students would have been
beneficial.
Future studies on this topic should include
qualitative data from interviews conducted with librarians to examine their
perceptions of an effective IRC. As well, specific focus groups with students
may also alert librarians to challenges and barriers that were not originally
anticipated. Additional research involving IRCs is certainly needed, and future
studies could examine the similarities and differences between disciplines in
order to adequately meet the unique needs of students in those fields.
Conclusion
With the study’s limitations
in mind, we can affirm that, overall, IRCs have a positive impact on students’
search techniques and their confidence levels. Library services are rapidly changing, and assistance to students takes
many forms. In-person, one-on-one tailored help is tremendously appreciated by students and should be retained among the services offered to them.
Anecdotally, the Health Sciences Library at the University of Ottawa has seen
an increase in IRCs provided at a distance via Skype. This could be an
additional method to continue offering this dedicated individualized assistance
to students going forward.
Conflict of Interest
Two
grants aided in the execution of our research project. We received $600 from
the University of Ottawa Library Research Grant, which was used to provide a
$10 incentive for completing the pretest, and another $10 incentive for the
posttest. The full amount received was used for incentives. We believe that the incentive helped gather enough completed posttests, as many students completed the first questionnaire but would not otherwise have completed the second. We received a second grant of $1,500 from the CARL Research in Librarianship Grant. Funds from that grant were used to pay for a research assistant (JR), who was instrumental in our data analysis.
References
Association of College and Research Libraries. (2016). Framework for information literacy for higher education. Chicago, IL: Author. Retrieved from http://www.ala.org/acrl/standards/ilframework

Association of Research Libraries. (2015). Service trends in ARL libraries, 1991-2015. Retrieved from https://www.arl.org/focus-areas/statistics-assessment/statistical-trends#.XIkvuihKi70
Bandura, A.
(1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215.
Bandura, A.
(1997). Self-efficacy: The exercise of
control. New York, NY: W.H. Freeman.
Bergen, K.,
& MacAdam, B. (1985). One-on-one: Term paper assistance programs. RQ, 24(3),
333-340. Retrieved from https://www.jstor.org/stable/pdf/25827393.pdf
Bryan, J. E.,
& Karshmer, E. (2013). Assessment in the one-shot session: Using pre-and
post-tests to measure innovative instructional strategies among first-year
students. College & Research
Libraries, 74(6), 574–586. https://doi.org/10.5860/crl12-369
Butler, K.,
& Byrd, J. (2016). Research consultation assessment: Perceptions of
students and librarians. The Journal of
Academic Librarianship, 42(1),
83-86. https://doi.org/10.1016/j.acalib.2015.10.011
Chiarella, D.,
Khadem, T. M., Brown, J. E., & Wrobel, M. J. (2014). Information literacy
skills retention over the first professional year of pharmacy school. Medical Reference Services Quarterly, 33(3), 302–312. https://doi.org/10.1080/02763869.2014.925693
Cook, P., &
Walsh, M. B. (2012). Collaboration and problem-based learning. Communications in Information Literacy, 6(1), 59–72. Retrieved from https://pdxscholar.library.pdx.edu/comminfolit/vol6/iss1/6/
Covert-Vail, L.,
& Collard, S. (2012). New roles for
new times: Research library services for graduate students. Retrieved from
the Association of Research Libraries website: http://www.arl.org/storage/documents/publications/nrnt-grad-roles-20dec12.pdf
Cramer, J. S., & Mahoney, M. C. (2001). Introducing evidence based medicine to the journal club, using a structured pre and post test: A cohort study. BMC Medical Education, 1(6). https://doi.org/10.1186/1472-6920-1-6
Donegan, P. M.,
Domas, R. E., & Deosdade, J. R. (1989). The comparable effects of term
paper counseling and group instruction sessions. College & Research Libraries, 50(2), 195–202. https://doi.org/10.5860/crl_50_02_195
Erickson, S. S.,
& Warner, E. R. (1998). The impact of an individual tutorial session on
MEDLINE use among obstetrics and gynaecology residents in an academic training
programme: A randomized trial. Medical
Education, 32(3), 269-273. https://doi.org/10.1046/j.1365-2923.1998.00229.x
Fields, A. M.
(2006). Ill‐structured problems and the reference consultation: The librarian's
role in developing student expertise.
Reference Services Review, 34(3),
405–420. https://doi.org/10.1108/00907320610701554
Fournier, K.,
& Sikora, L. (2015). Individualized research consultations in academic
libraries: A scoping review of practice and evaluation methods. Evidence Based Library & Information
Practice, 10(4), 247-267.
https://doi.org/10.18438/B8ZC7W
Goebel, N.,
Neff, P., & Mandeville, A. (2007). Assessment within the Augustana model of
undergraduate discipline-specific information literacy credit courses. Public Services Quarterly, 3(1–2), 165–189. https://doi.org/10.1300/J295v03n01_09
Gross, M., &
Latham, D. (2013). Addressing below proficient information literacy skills:
Evaluating the efficacy of an evidence-based educational intervention. Library & Information Science Research,
35(3), 181–190. https://doi.org/10.1016/j.lisr.2013.03.001
Handler, L., Lackey, M., & Vaughan, K. (2009). “Hidden treasures”: Librarian office hours in three health sciences schools. Medical Reference Services Quarterly, 29(4), 336-350. https://doi.org/10.1080/02763860903249076
Ivanitskaya, L.,
DuFord, S., Craig, M., & Casey, A. M. (2008). How does a pre-assessment of
off-campus students’ information literacy affect the effectiveness of library
instruction? Journal of Library
Administration, 48(3-4), 509–525.
https://doi.org/10.1080/01930820802289649
Keshavarz, H.,
Esmaeili Givi, M., & Vafaeian, A. (2016). Students’ sense of self-efficacy
in searching information from the Web: A PLS approach. Webology, 13(2), 16-31.
Retrieved from http://eprints.rclis.org/32203/
Maddux, J. E.,
& Volkmann, J. (2010). Self-efficacy. In R. H. Hoyle (Ed.), Handbook of personality and self-regulation
(pp. 315–331). Oxford: Wiley-Blackwell.
Mery, Y., Newby,
J., & Peng, K. (2012). Why one-shot information literacy sessions are not
the future of instruction: A case for online credit courses. College & Research Libraries, 73(4), 366-377. https://doi.org/10.5860/crl-271
Muthukumar, S.,
D’cruz, S. M., & Anandarajan, B. (2013). Introduction of pre-test and
post-test enhances attentiveness to physiology lectures: Students’ perceptions
in an Indian medical college. International
Journal of Biomedical Advanced Research, 4(5), 341-344. Retrieved from https://ssjournals.com/index.php/ijbar/article/view/319
Reinsfelder, T.
L. (2012). Citation analysis as a tool to measure the impact of individual
research consultations. College & Research
Libraries, 73(3), 263–277. https://doi.org/10.5860/crl-261
Ren, W.-H.
(2000). Library instruction and college student self-efficacy in electronic
information searching. The Journal of
Academic Librarianship, 26(5),
323–328. https://doi.org/10.1016/S0099-1333(00)00138-5
Rogers, E.,
& Carrier, H. S. (2017). A qualitative investigation of patrons’
experiences with academic library research consultations. Reference Services Review, 45(1),
18–37. https://doi.org/10.1108/RSR-04-2016-0029
Serap
Kurbanoglu, S. (2003). Self‐efficacy: A concept closely linked to information
literacy and lifelong learning. Journal
of Documentation, 59(6), 635–646.
https://doi.org/10.1108/00220410310506295
Schobert, T. (1982). Term-paper counseling: Individualized bibliographic instruction. RQ, 22(2), 146-151. Retrieved from https://www.jstor.org/stable/pdf/25826898.pdf

Shaffer, B. A. (2011). Graduate student library research skills: Is online instruction effective? Journal of Library & Information Services in Distance Learning, 5(1–2), 35–55. https://doi.org/10.1080/1533290X.2011.570546

Shivaraju, P. T., Manu, G., Vinaya, M., & Savkar, M. K. (2017). Evaluating the effectiveness of pre- and post-test model of learning in a medical school. National Journal of Physiology, Pharmacy and Pharmacology, 7(9), 947-951. https://doi.org/10.5455/njppp.2017.7.0412802052017
Sokoloff, J.,
& Simmons, R. (2015). Evaluating citation analysis as a measurement of
business librarian consultation impact. Journal
of Business and Finance Librarianship, 20(3),
159–171. https://doi.org/10.1080/08963568.2015.1046783
University of
Ottawa. (n.d.). Enrolment. In Institutional research and planning. Retrieved 20
Feb. 2018 from https://www.uottawa.ca/institutional-research-planning/resources/facts-figures/fact-book/enrolment
Wakimoto, D. K.
(2010). Information literacy instruction assessment and improvement through
evidence based practice: A mixed method study. Evidence Based Library & Information Practice, 5(1), 82–92. https://doi.org/10.18438/B80616
Wolf, K., &
Stevens, E. (2007). The role of rubrics in advancing and assessing student
learning. Journal of Effective Teaching,
7(1), 3–14. Retrieved from https://files.eric.ed.gov/fulltext/EJ1055646.pdf
Appendix A

Search Strategy Worksheet
Search Statement / Topic

1. Search Question or Topic: ________________________________

2. Major Concepts (list as many as you need): 1. ______ 2. ______ 3. ______ 4. ______

3. Search Terms: a grid headed "Concept 1 AND Concept 2 AND Concept 3 AND Concept 4", with blank rows beneath each concept for its synonyms; synonyms within a column are joined with OR, and the columns are joined with AND.
Resources / databases to use: ________________________________

Source: Riedling, A. M. (2002). Learning to learn. New York, NY: Neal-Schuman Publishers.
Appendix B
Pre and Posttest Questionnaires
Assessing Individualized Research Consultations – Pretest Questionnaire
Response options (the question text is not preserved in this version): Yes / No; Yes / No; None / Some sources / Many sources; Male / Female / There isn’t an option that applies to me; 19 years old or less / 20 to 25 years old / 26 to 30 years old / 31 years old and over; Undergraduate degree / Graduate degree; 1st year / 2nd year / 3rd year / 4th year / Other; Post graduate certificate / Master degree / Doctoral degree / Other; Faculty of Health Sciences / Faculty of Medicine; Yes / No; Yes / No.
Assessing Individualized Research Consultations – Posttest Questionnaire
Response options (the question text is not preserved in this version): Not at all / Slightly modified / Modified completely; Yes / No; Yes / No / Does not apply; Yes / No.