Research Article

 

A Systematic Review of Information Literacy Programs in Higher Education: Effects of Face-to-Face, Online, and Blended Formats on Student Skills and Views

 

Alison L. Weightman

Director, Specialist Unit for Review Evidence (SURE)

University Library Service

Cardiff University

Cardiff, United Kingdom

Email:  WeightmanAL@cardiff.ac.uk

 

Damian J. J. Farnell

Lecturer in Medical Statistics

School of Dentistry

Cardiff University

Cardiff, United Kingdom

Email: FarnellD@cardiff.ac.uk

 

Delyth Morris

Subject Librarian

University Library Service

Cardiff University

Cardiff, United Kingdom

Email: MorrisD13@cardiff.ac.uk

 

Heather Strange

Research Associate

SE Wales Trials Unit

Cardiff University, United Kingdom

Email: StrangeHR1@cardiff.ac.uk

 

Gillian Hallam

Information Literacy Project Manager

University of Queensland

Brisbane, Australia

Email: g.hallam@library.uq.edu.au

 

Received: 8 Feb. 2017      Accepted: 2 Aug. 2017   

 

 

© 2017 Weightman, Farnell, Morris, Strange, and Hallam. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

Abstract

 

Objective – Evidence from systematic reviews a decade ago suggested that face-to-face and online methods to provide information literacy training in universities were equally effective in terms of skills learnt, but there was a lack of robust comparative research. The objectives of this review were (1) to update these findings with the inclusion of more recent primary research; (2) to further enhance the summary of existing evidence by including studies of blended formats (with components of both online and face-to-face teaching) compared to single format education; and (3) to explore student views on the various formats employed.

 

Methods – The authors searched seven databases and used a range of supplementary search methods to identify comparative research studies, dated January 1995 to October 2016, exploring skill outcomes for students enrolled in higher education programs. There were 33 studies included, of which 19 also contained comparative data on student views. Where feasible, meta-analyses were carried out to provide summary estimates of skills development, and a thematic analysis was completed to identify student views across the different formats.

 

Results – A large majority of studies (27 of 33; 82%) found no statistically significant difference between formats in skills outcomes for students. Of 13 studies that could be included in a meta-analysis, the standardized mean difference (SMD) between skill test results for face-to-face versus online formats was -0.01 (95% confidence interval -0.28 to 0.26). Of ten studies comparing blended to single delivery format, seven (70%) found no statistically significant difference between formats, and the remaining studies had mixed outcomes. From the limited evidence available across all studies, there is a potential dichotomy between outcomes measured via skill test and assignment (course work), which is worthy of further investigation. The thematic analysis of student views found no preference in relation to format on a range of measures in 14 of 19 studies (74%). The remainder identified that students perceived advantages and disadvantages for each format but had no overall preference.

 

Conclusions – There is compelling evidence that information literacy training is effective and well received across a range of delivery formats. Further research looking at blended versus single format methods, and the time implications for each, as well as comparing assignment to skill test outcomes would be valuable. Future studies should adopt a methodologically robust design (such as the randomized controlled trial) with a large student population and validated outcome measures.

 


 


Introduction

 

The provision of information literacy (IL) education for students is an established and valued role within university libraries. There are many definitions of IL but this can be broadly described as, “knowing when and why you need information, where to find it, and how to evaluate, use and communicate it in an ethical manner” (CILIP, 2017). IL training has been shown to result in an increase in student skills and understanding compared to no instruction (Koufogiannakis & Wiebe, 2006; Weightman, Farnell, Morris & Strange, 2015).

 

Around a decade ago, two systematic reviews of IL interventions in higher education looked at the specific question of online versus face-to-face instruction in academic libraries (Koufogiannakis & Wiebe, 2006; Zhang, Watson & Banfield, 2007). Both reviews concluded that online provision was as effective as face-to-face training in terms of skills learned but noted the lack of robust comparative studies.

 

Since the reviews were published, further studies of ‘taught’ student IL provision comparing traditional versus online delivery have been completed, including studies looking at blended (with components of both online and face-to-face teaching) compared to single format delivery. There are suggestions from the library setting of theoretical benefits to a blended approach (such as the ‘flipped classroom’, where students study online in advance of the face-to-face session), particularly for the more technical and practical skills involved in information literacy (Arnold-Garza, 2014). The potential benefits of blended teaching include the effective use of class time, more active learning, accommodation of individual learning styles, and speed (Arnold-Garza, 2014). Such techniques are increasingly being used across academic settings, suggesting that these will become the ‘new traditional model[s]’ (Brown, 2016).

 

A recent meta-analysis of 45 studies of online and face-to-face learning across the education and subject spectrum, from secondary to higher education, concluded that students in online learning conditions performed modestly better than those receiving face-to-face instruction. However, this analysis indicated a significant difference only for the blended versus face-to-face condition, not the online versus face-to-face condition (Means, Toyama, Murphy & Baki, 2013). The authors noted that blended formats tended to involve additional learning time and resources, which could explain the findings. A further systematic review and meta-analysis of 44 studies exploring knowledge acquisition in health education (Liu et al., 2016) concluded that blended learning was more effective than, or at least as effective as, single format learning, but that the result should be treated with caution given the huge variation between studies.

 

We could not identify any review level evidence from the IL literature on blended versus other learning formats with similar curricula/contact times and ‘hard’ outcomes such as skills acquisition. Neither was there a systematic summary of student views on the different formats.

 

Thus, the aims of this research study were to carry out an up-to-date systematic review of research into IL programs in higher education to:

 

(i) confirm or refute the findings of the earlier reviews in terms of the relative effectiveness of traditional (face-to-face) and online (web- or computer-based) educational provision by the inclusion of more recent studies;

(ii) expand the scope of the review to include comparative studies of blended versus single format delivery; and

(iii) systematically explore the views of research participants from each study on their perceptions of the differing formats.

 

Methods

 

We undertook a systematic review of controlled studies to summarize the findings of comparative research studies using both quantitative and qualitative methods. We extracted data on student skills as assessed after exposure to each delivery format and completed a thematic analysis of student views identified within the research.

 

Studies were identified via a comprehensive search for published and unpublished papers comparing face-to-face and online information literacy programs using database searching and supplementary search methods. 

 

Search strategy 

 

We searched seven relevant databases for formally published research and ‘grey literature’ in higher education or libraries in October 2016: British Education Index; ERIC; ProQuest Dissertations and Theses (Index to Theses); Librarians’ Information Literacy Annual Conference (LILAC) Abstracts; Library, Information Science & Technology Abstracts (LISTA); LOEX Conference Abstracts; Open Grey; Scopus.

 

Text words and phrases were identified from the authors’ knowledge of the subject area and the existing known literature. Text mining for common words and phrases using the free software TerMine (National Centre for Text Mining, 2012) was also used to identify the most relevant terms for text word searching. The software was applied to the titles and abstracts of a set of 42 papers that explored information literacy education taught to students in universities. A set of search terms and associated subject headings was developed for LISTA (Table 1) and then adapted for each database.

 

We sought recent studies (from January 1995 onwards) to ensure relevance to modern, higher-speed internet architecture and the wide-scale adoption of database searching in libraries.

 

In addition, the extensive use of supplementary search methods increased the sensitivity of the search (i.e., the ability to identify the vast majority of relevant papers). These methods included reference list follow-up, unpicking of related systematic reviews for primary research studies, citation tracking (via Scopus and Google Scholar), expert contact, and hand searching of the 2016 editions of a number of journals: College and Research Libraries; Communications in Information Literacy; Evidence Based Library and Information Practice; Health Information & Libraries Journal; Journal of Academic Librarianship; Journal of Information Literacy; Journal of the Medical Library Association; portal: Libraries & the Academy.

 

 

Table 1

Search Terms for LISTA

S1 AND S2 AND S3 (1995-2016)

S3 TI (Test score OR learning outcome OR effective* OR student performance OR control group OR randomised OR pretest OR pre-test OR posttest OR post-test OR randomized OR trial OR controlled OR efficacy OR impact OR evaluat*) OR AB (Test score OR learning outcome OR effective* OR student performance OR control group OR randomised OR pretest OR pre-test OR posttest OR post-test OR randomized OR trial OR controlled OR efficacy OR impact OR evaluat*) 

S2 (DE "College Students" OR DE "College Freshmen" OR DE "College Seniors" OR DE "College Transfer Students" OR DE "First Generation College Students" OR DE "Graduate Students" OR DE "In State Students" OR DE "On Campus Students" OR DE "Out of State Students" OR DE "Preservice Teachers" OR DE "Two Year College Students" OR DE "Undergraduate Students" ) OR ( TI ( College student* OR freshman OR first-year OR undergrad* OR freshmen OR sophomore* OR universit* OR higher education OR academic OR taught postgraduate*) OR AB ( College student* OR freshman OR first-year OR undergrad* OR freshmen OR sophomore* OR universit* OR higher education OR academic OR taught postgraduate*) ) 

S1 DE "Information Literacy" OR TI ( (Information litera* OR library instruct* OR library skill* OR acrl il standard OR information competen* OR bibliographic instruct* OR library research OR il concept OR instruction librarian) OR ((Research skill* OR electronic information OR information retrieval OR ebm skill OR electronic resource* OR instructional method OR user train* OR user education OR literacy instruct* OR hands-on instruction OR research strateg* OR evidence-based OR print workbook OR instructional format OR social medi* learning OR online tutor*) AND librar*) )

AB: Word(s) in the abstract; DE: Descriptor (assigned by indexer); S: Set of terms; TI: Word(s) in the title; *= truncation term.

 

 


 

Inclusion and exclusion criteria

 

The criteria for selection of studies are provided in Table 2. The training had to be described as information literacy or library skills, with a statement that equivalent content was covered within each format to avoid any potential for bias as a result of differing curricula. 

 

Study selection

 

After removing duplicates and clearly irrelevant citations (unrelated to library-based training), study selection at both title/abstract and full-text stages was undertaken independently by two authors. Any disagreements at either stage were resolved by recourse to a third reviewer.

 

Quality assessment and data extraction 


Two authors independently appraised each included study using criteria specifically developed for educational interventions. We used the Glasgow checklist for educational interventions (Morrison, Sullivan, Murray & Jolly, 1999), adapted to include the questions from the ReLIANT checklist for library based educational interventions (Koufogiannakis, Booth & Brettle, 2005). A quality commentary for each paper was agreed by discussion and these commentaries, along with summary data from each study on skill related outcomes and any student views, were extracted by one author and checked by another. The study detail, including the IL content of each intervention, was summarized in the detailed data extraction table (see Appendix) with summary data provided in Table 3.

 

Data synthesis 


We carried out a synthesis of the findings across the body of evidence on skills outcomes and student views.

 

We combined the study findings for skills outcomes by meta-analysis when studies provided means, sample sizes, and standard deviations for the outcomes. Meta-analysis forms a pooled result by taking a weighted average of the outcomes from each study. For fixed-effects meta-analysis, the results of each study are “weighted” by the inverse of the variance (i.e., one over the squared standard error) of the difference in means for each study when forming this average. Thus, those studies that are more accurate (often those studies with larger sample sizes) make a greater contribution to the result. A similar weighting occurs for random-effects meta-analysis, except that heterogeneity (in variances and effect sizes) is also accounted for in the weighting process. The included studies used different types of tests (and thus had different maximum possible test scores), so a standardized mean difference (SMD = difference in means divided by the pooled standard deviation) was employed.
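Expressed in symbols (a sketch in our own notation of the standard inverse-variance approach, not a formula reproduced from the included studies): for study i with group means \bar{x}_{1i} and \bar{x}_{2i} and pooled standard deviation s_i,

\[
d_i = \frac{\bar{x}_{1i} - \bar{x}_{2i}}{s_i}, \qquad
w_i = \frac{1}{\operatorname{SE}(d_i)^2}, \qquad
\hat{d}_{\mathrm{FE}} = \frac{\sum_i w_i d_i}{\sum_i w_i},
\]

so studies with smaller standard errors (typically those with larger samples) receive proportionally greater weight in the pooled estimate.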

 

 

Table 2
Inclusion/Exclusion Criteria

Population

·         Undergraduates and postgraduates enrolled in higher education coursework programs

Intervention

·         An information literacy intervention comparing face-to-face and online delivery formats with a formal assessment of student skills (via a test, diagnostic essay, or end-of-course exam)

Comparators

1.       Face-to-face

2.       Online

3.       ‘Blended’ (with face-to-face and online components)

Outcomes

Primary outcome

·         Change in information literacy skills

Secondary outcomes

·         Student views on the educational format(s)

Limits

Studies published since January 1995

Types of evidence included

Randomized and non-randomized controlled studies

Exclusions

·         Sessions for research postgraduates, unless as part of a formal ‘taught’ program, such as a research methods course

·         Sessions for professional trainees, not based at the University (e.g. junior health professionals based in hospital or primary care sites) 

·         Comparisons involving differing face-to-face formats only, or differing online formats only

·         Different curricula for each learning format

·         Students not from the same cohort (e.g. different year groups for different formats)

 

 

 


A forest plot (Lewis & Clarke, 2001) shows both the results of each individual study and the pooled results of meta-analysis. The pooled results are identified by the diamonds within the forest plot, where the middle of the diamond gives the pooled point estimate for the SMD and its edges give the associated 95% confidence interval (CI). For specific studies, the point estimate of the SMD is indicated by the central symbol and the associated 95% CI for the SMD is indicated by the horizontal line. An overall meta-analysis that included all studies, irrespective of subgroup, was carried out using standard statistical software (STATA V13). When the number of studies included in a meta-analysis was large enough (i.e., equal to or greater than about 10 studies), any evidence of bias was assessed by funnel plots and by Egger’s and Begg’s tests of small sample size effects.
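For reference, Egger’s test is conventionally formulated as a regression of each study’s standardized effect on its precision (a sketch in our notation; the formula is not reproduced in the original methods):

\[
\frac{d_i}{\operatorname{SE}(d_i)} = \beta_0 + \beta_1 \frac{1}{\operatorname{SE}(d_i)} + \varepsilon_i,
\]

where an intercept \beta_0 significantly different from zero suggests funnel plot asymmetry, and hence possible small-study effects such as publication bias.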

 

Heterogeneity was assessed by I² scores and P < 0.05 from a chi-squared test of heterogeneity before deciding whether to carry out a random-effects or fixed-effects meta-analysis. Random-effects meta-analysis takes into account both the variability within each individual study (shown by the confidence intervals for each study) and the variability between the different studies (i.e., variability of the point estimates of the SMD). This approach tends to lead to larger confidence intervals than fixed-effects meta-analysis, which includes only the variability within each individual study.
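The quantities behind this decision can be sketched as follows (the common DerSimonian-Laird formulation; the paper does not state which estimator the STATA routine used). With k studies and fixed-effects weights w_i, Cochran’s heterogeneity statistic and the I² score are

\[
Q = \sum_i w_i \left(d_i - \hat{d}_{\mathrm{FE}}\right)^2, \qquad
I^2 = \max\!\left(0, \frac{Q - (k - 1)}{Q}\right) \times 100\%,
\]

and random-effects pooling replaces w_i with w_i^{*} = 1 / (\operatorname{SE}(d_i)^2 + \hat{\tau}^2), where \hat{\tau}^2 is the estimated between-study variance.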

 

We also carried out a thematic analysis of information on student views, where available within the comparative studies, using methods described by Braun and Clarke (2006) to generate descriptive themes. Initially, each paper was examined line by line by two authors independently. Codes (features of the opinions expressed) were assigned to relevant sentences and paragraphs. These codes were then organized, via discussion, into related areas to construct descriptive themes that best reflected students’ views on the different teaching formats. All data on student views from each paper were then imported into NVivo 10 software (QSR International Pty Ltd., 2012) for analysis.

 

Results

 

Of 5,313 records identified via the various search strategies employed (Figure 1), 33 studies met the inclusion criteria by providing a direct comparison between face-to-face, online, or blended IL education, and these studies were included in the review. Summary data from all studies are provided in Table 3. Detailed information on study characteristics and the results of skills assessments is available (see Appendix).

 

Study Quality

 

Of the 33 studies, 11 were randomized controlled trials (Brettle & Raynor, 2013; Churkovich & Oughtred, 2002; Goates et al., 2016; Greer et al., 2016; Koenig & Novotny, 2001; Kraemer et al., 2007; Lechner, 2007; Schilling, 2012; Shaffer, 2011; Swain et al., unpub; Vander Meer & Rike, 1996), whereas the remaining studies were (non-randomized) controlled before and after studies. 

 

The vast majority of research was carried out in the U.S. (26 studies; 79%). Of the remaining seven studies, three were based in the U.K. (Brettle & Raynor, 2013; Walton & Hepworth, 2012; Swain et al., 2015 unpub.), two in Australia (Churkovich & Oughtred, 2002; Salisbury & Ellis, 2003), one in Canada (Bordignon et al., 2016) and one in the Czech Republic (Kratochvil, 2014).

 

The 11 studies that used a randomized controlled design were less prone to bias since the study design increased the likelihood that the student groups were well matched.  However, most of the studies had some methodological limitations (Table 3). 

 

Of the 33 studies, 25 did not pilot or validate the test instrument. Only two studies carried out formal validity testing (Brettle & Raynor, 2013; Mery et al., 2012a) with a further five piloting the test before use (Bordignon et al. 2016; Burhanna et al., 2008; Churkovich & Oughtred, 2002; Kratochvil, 2014; Swain et al., 2015 unpub.). Finally, one study used a predetermined rubric for marking (Goates et al., 2016).

 

Of the 33 studies, 17 included mean IL test scores with standard deviations and could be included in the meta-analyses (Alexander & Smith, 2001; Anderson & May, 2010; Beile & Boote, 2005; Brettle & Raynor, 2013; Churkovich & Oughtred, 2002; Germain, Jacobson & Kaczor, 2000; Goates, Nelson & Frost, 2016; Greer, Hess & Kraemer, 2016; Lantzy, 2016; Mery, Newby & Peng, 2012a; Shaffer, 2011; Silk, Perrault, Ladenson & Nazione, 2015; Swain, Weightman, Farnell & Mogg unpub.; Vander Meer & Rike, 1996; Walton & Hepworth, 2012; Wilcox Brooks, 2014).

 

The results from the studies were heterogeneous (i.e., effect sizes or variances varied considerably), so a random-effects meta-analysis was used. A sensitivity analysis was carried out in order to study the effects of heterogeneity, which here was driven by just one or two ‘outlying’ studies in each comparison. These studies were systematically removed from the meta-analyses. This process did not greatly change the overall results of the meta-analysis: effect sizes and associated 95% confidence intervals remained broadly constant, and the statistical significance (or not) of all two-group comparisons remained unchanged. Clearly, though, caution should be exercised when interpreting pooled results of meta-analysis when heterogeneity is high.
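A minimal sketch of this kind of leave-one-out sensitivity check (illustrative Python rather than the authors’ STATA analysis; the SMDs and standard errors below are placeholders, not data from the included studies):

import numpy as np

def random_effects_pool(d, se):
    # DerSimonian-Laird random-effects pooled SMD and 95% CI
    w = 1.0 / se**2                          # inverse-variance (fixed-effect) weights
    d_fe = np.sum(w * d) / np.sum(w)         # fixed-effect pooled estimate
    q = np.sum(w * (d - d_fe)**2)            # Cochran's Q
    k = len(d)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    d_re = np.sum(w_re * d) / np.sum(w_re)   # pooled SMD
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return d_re, d_re - 1.96 * se_re, d_re + 1.96 * se_re

# Placeholder per-study SMDs and standard errors (hypothetical values)
d = np.array([0.10, -0.25, 0.05, 0.30])
se = np.array([0.15, 0.20, 0.12, 0.25])

print("All studies:", random_effects_pool(d, se))
for i in range(len(d)):                      # remove each study in turn
    keep = np.arange(len(d)) != i
    print(f"Without study {i + 1}:", random_effects_pool(d[keep], se[keep]))

If the pooled SMD and its confidence interval remain broadly stable as each study is dropped in turn, the conclusion is robust to individual outlying studies.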

 

 

 


Figure 1
Flow diagram (‘n’ indicates the number of studies).

 


 

 

Of the 33 studies, 22 provided data on participants’ views (Alexander & Smith, 2001; Beile & Boote, 2005; Burhanna, Eschedor Voelker & Gedeon, 2008; Byerley, 2005; Churkovich & Oughtred, 2002; Gall, 2014; Goates et al., 2016; Holman, 2000; Kaplowitz & Contini, 1998; Koenig & Novotny, 2001; Kraemer, Lombardo & Lepkowski, 2007; Lantzy, 2016; Nichols, Shaffer & Shockey, 2003; Nichols Hess, 2014; Schilling, 2012; Shaffer, 2011; Silk et al., 2015; Silver & Nickel, 2007; Swain et al., unpub; Vander Meer & Rike, 1996; Wilcox Brooks, 2014; Wilhite, 2004). In all cases this information related to views expressed by students rather than the library staff delivering the interventions (Table 3).

 

 

Table 3
Summary of Included Studies

Study details

Population and Setting

Methods

Outcomes: Skills

Outcomes: Views

 

Limitations

First author and year:

Alexander 2001

 

Study Design:

CBA, posttest only

 

Delivered by: Graduate student (FtF); Course coordinator (online)

 

Setting:

Western Kentucky University, U.S.

 

Participants:

88 undergraduates on Library Media course

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

14x 1h course (face-to-face) vs. self-paced (online)

Neutral

No pretest. Mean scores posttest for skill levels: 82.6 (traditional) and 85 (online).

 

Follow-up period: N/S

 

Favoured online

Preference for the online course in terms of:

·         perceived benefits/effectiveness of course (p<0.05)

·         comfort in doing library research (p<0.01).

Researcher was both teacher and investigator. Students self-selected for online course. No pretest. No piloting or validation of test. No information on participant loss.

 

First author and year:

Anderson 2010

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of North Texas, U.S.

 

Participants:

103 undergraduates on Introduction to Communication course

Interventions:

(1)     Face-to-face

(2)     Blended

(3)     Online

 

Hours of contact time:

Entire course: 3 x 50 minute sessions

Neutral

Skills increased with no significant differences between formats (p>0.1) other than research assignment (persuasive presentation) scores higher for online (p=0.000).

 

Follow-up period: 5 weeks

 

-

Teaching content, student characteristics & treatment may have varied between groups. No information on characteristics. No validation of tests. Pretest scores high so difficult to assess any benefit.

First author and year:

Beile 2005

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of Central Florida, U.S.

 

Participants:

49 postgraduates on research methods course

Interventions:

(1)     Face-to-face

(2)     Blended

(3)     Online

 

Hours of contact time:

FtF 70 min. Online ~80 min

 

Neutral

Skills increased with no significant differences between formats.

 

Follow-up period: N/S

 

Neutral

Confidence/self-efficacy levels increased in all groups with no significant differences between formats.

 

Teaching content, student characteristics & treatment may have varied between groups. No information on characteristics. No validation of tests. Response rates varied.

First author and year:

Bordignon 2016

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

Seneca College, Toronto, Canada

 

Participants:

110 undergraduates on foundation English composition course

 

Interventions:

(1)     Online videos

(2)     FtF

 

Hours of contact time:

Not stated

Neutral

Skills increased in both formats with no clear differences between them.

 

 

Follow-up period:

Immediately post-training

-

No information on student characteristics. Participation was optional and students self-selected.  MCQs changed for the two groups. No overall test results.

First author and year:

Brettle 2013

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

University of Salford, U.K.

 

Participants:

77 undergraduate nursing students

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

1 hour

Neutral

Skills increased (p=0.001) with no significant differences between formats (p=0.263).

 

Follow-up period: 1 month

-

 

Loss of participants was explained but only 71% completion and no intention to treat analysis.

 

 

First author and year:

Burhanna 2008

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

Kent State University, Ohio U.S.

 

Participants:

313 undergraduates on orientation program

 

Interventions:

Library tour

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

0.5h

 

No pretest.

Neutral

Greater understanding of library services in online group (92% compared with 82.6%; no significance levels) although no difference in knowledge gained.

Follow-up period: N/S

Neutral

The majority of students in both formats agreed that

·         The course was effective/beneficial

and they were

·         Comfortable in asking for help from library staff

·         More comfortable in doing library research

·         More likely to use the library

 

 

 

Students self-selected type of course, and whether they participated in survey. Over half of in-person participants selected by instructor. No pretest. No validation of test.

 

First author and year:

Byerley 2005

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of Colorado, U.S.

 

Participants:

141 undergraduates in English 141 course

 

 

Interventions:

(1)     Face-to-face

(2)     Blended – FtF with online

(3)     Online

 

Hours of contact time:

Not stated

Neutral

Skills increased slightly in each group. The mean score for the blended group was significantly different from the FtF although not the online group.

 

Follow-up period: ~8 weeks

Unclear

No useable data – views of online groups only were sought.

FtF course introduced three databases while online course introduced only one. Different numbers for each format and no information on characteristics. Test not piloted or validated.

First author and year:

Churkovich 2002

 

Study Design:

cRCT

 

 

Delivered by: Librarian

Setting:

Deakin University, Geelong, Australia

 

Participants:

174 undergraduate sociology students

Interventions:

(1)     Face-to-face

(2)     Blended

(3)     Online

 

Hours of contact time:

Unclear

Favoured face-to-face

Skills increased in each group with a greater improvement in FtF compared to other formats (statistically significant).

 

Follow-up period: N/S

Favoured face-to-face

There was no difference in confidence/self-efficacy levels of the FtF and blended classes although a significant improvement in both compared to the online only course.

There was a clear preference for the class compared to the online course with 14/15 positive comments versus 3/9 positive comments.

 

Group sizes and student origins varied and no information on characteristics. Test trialed although only with secondary students & comments from academic staff. No data on statistical significance.

First author and year:

Gall 2014

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of Iowa, U.S.

 

Participants:

27 postgraduates in social work on campus (numbers off campus unclear)

Interventions:

Library induction

(1)     Face-to-face

(2)     Online

(3)     No instruction

 

Hours of contact time:

FtF 50 mins. Online self-paced

Neutral

Skills increased in each group although no significant differences between groups.

 

Follow-up period: N/S

 

Favoured online?

Online orientation ‘seemed to’ increase confidence/self-efficacy in choosing databases (awareness of library resources).

Small sample size. No useable posttests for no instruction (off campus) group. No information on characteristics.  Loss of participants not discussed. Test not piloted or validated. No confidence intervals or statistical tests.

First author and year:

Germain 2000

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

University at Albany, New York, U.S.

 

Participants:

303 undergraduates on general education program

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF 55 mins

Online 15-55 mins

 

Neutral

Skills increased in each group with no differences between formats.

 

Follow-up period: 1.5 to 6 weeks

 

-

 

Numbers varied between groups and no information on student characteristics. Tests not validated.

First author and year:

Goates 2016

 

Study Design:

RCT

 

Delivered by: Librarian

 

Setting:

Brigham Young University, Utah, U.S.

 

Participants:

122 undergraduates (primarily life sciences) on advanced writing course.

 

Interventions:

(1)     Face-to-face

(2)     Blended

 

Hours of contact time:

50 mins

No pretest

Favoured face-to-face

Assignment scores (a rubric graded search strategy) were higher for students receiving FtF format (p<0.01)

 

Follow-up period:

Immediately after training

Neutral

Positive comments on perceived effectiveness of skills development similar for both formats

 

Randomization method not described. No information on student characteristics.

First author and year:

Greer 2016

Linked to Kraemer 2007

 

Study Design:

cRCT

 

Delivered by: Librarian

 

Setting:

Oakland University, Michigan, U.S.

 

Participants:

257 undergraduates on writing & rhetoric course

Interventions:

(1) Online

(2) Blended

 

Hours of contact time:

Online self-paced?

Blended self-paced? plus 1h instruction

No pretest

Neutral

The exam scores of the two groups were nearly identical.

 

Follow-up period:

Unstated but short-term

-

No information on student characteristics or drop outs.  Test not validated.

 

 

First author and year:

Holman 2000

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of North Carolina at Chapel Hill, U.S.

 

Participants:

125 undergraduates on English Composition and Rhetoric course

Interventions:

(1)     Face-to-face

(2)     Online (CAI)

(3)     No instruction

 

Hours of contact time:

FtF: 40 or 60 mins. CAI 30 - 45 mins

Neutral

Skills increased in each group with no statistically significant difference between formats.

 

Follow-up period: N/S

 

Neutral

No perceived differences in effectiveness/benefits.  Pace of online course and clarity of FtF course preferred.

Low completion rate online. Length/intensity of formats varied. Posttest timing varied.  Groups were different sizes and minimal information on characteristics. No piloting or validation of test.

First author and year:

Kaplowitz 1998

 

Study Design:

CBA

 

Delivered by: Teaching assistants

 

Setting:

UCLA, U.S.

 

Participants:

423 biology undergraduates

 

Interventions:

(1) Face-to-face (lecture)

(2) Online (CAI)

 

Hours of contact time:

50 minutes (lecture), 45-60 minutes (CAI)

Neutral

Skills increased in each group with no differences between formats.

 

Follow-up period: ~12 months

 

Unclear

No useable data – views of online group only were sought.

No information on group characteristics. No content info/validation of test. Only those completing pre/posttests evaluated. No confidence intervals or p values.

First author and year:

Koenig 2001

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

University of Illinois at Chicago, U.S.

 

Participants:

Undergraduates (number unstated) on a communication course

 

 

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF unclear

Online 50 mins

 

Neutral

Skills increased in each group with no differences between formats.

 

Follow-up period: N/S (‘end of module’)

 

Neutral

Confidence/self-efficacy increased in both groups although no difference between groups.

Information lacking on timing/mode of FtF session. Students self-selected for format. Tests not validated.  Drop outs noted although numbers on the course not stated.

First author and year:

Kraemer 2007

 

Linked to Greer 2016

Study Design:

cRCT

 

Delivered by: Librarian

Setting:

Oakland University, Michigan, U.S.

 

Participants:

224 undergraduates on Rhetoric composition class

 

Interventions:

(1)     Face-to-face

(2)     Blended online plus FtF

(3)     Online (WebCT)

Hours of contact time:

FtF 3h. Blended self-paced plus 2h. Online self-paced

Favoured blended
Skills increased in each group (p<0.0001) with a significantly greater pre-post improvement in the blended compared to the online only group (p=0.023).

Follow-up period: N/S

 

 

Neutral
Similar levels of satisfaction (perceived effectiveness/benefits) across groups.

High pretest scores (~70%) limited value of test scores. Lack of information on student characteristics. Test not piloted or validated.

 

First author and year:

Kratochvil 2014

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

Masaryk University, Czech Republic

 

Participants:

251 Medicine undergraduates & postgraduates

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

Unclear. Possibly 3x2.5h sessions for FtF

Unclear
Skills increased in each group although unclear if any differences between groups.

Follow-up period: N/S

 

-

 

Unsuitable question construction in test and not validated. Different student groups for each format. No information on numbers or characteristics. Could have been major differences in treatment.

First author and year:

Lantzy 2016

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

California State University, U.S.

 

Participants:

64 undergraduates in a kinesiology course

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

1.25 hours

Neutral

Both groups showed highly significant pre-post test score increases (p<0.0001) but there were no significant differences between groups.

 

Follow-up period:

Immediately after training

Neutral
No significant differences across formats in views re:

·         confidence/self-efficacy

·         clarity of presentation

·         responsiveness of instructor

No information on student characteristics. Tests were not piloted or validated.

 

First author and year:

Lechner 2007

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

Richard Stockton College of New Jersey, U.S.

 

Participants:

27 occupational/physical therapy postgraduates

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:
Not stated. Online probably self-paced.

Favoured face-to-face
% change pre to post = 8.1% for the online group and 18.1% for the FtF group.

Follow-up period: N/S (probably same day)

 

 

 

 

 

-

Different sized groups and no information on characteristics. Only 63% completed both tests. Much higher pretest scores in online group. No confidence intervals or p values.

First author and year:

Mery 2012a, 2012b

 

Study Design:

CBA

 

Delivered by: FtF: Tutor (1); Librarian (2);

Online: Librarian

 

 

Setting:

University of Arizona, U.S.

 

Participants:

660 undergraduates on English compositional course

Interventions:

(1)     Face-to-face (tutor)

(2)     Face-to-face (librarian)

(3)     Online

 

Hours of contact time:
FtF 50 mins. Online over 10 weeks

Favoured online

Skills increased significantly in the FtF librarian and online groups but not in the tutor group. The online group performed better than FtF groups in both skills test (Mery 2012a) and assignment scores (bibliography quality) (Mery 2012b).

Follow-up period: N/S

-

Content and delivery varied between formats. No student characteristics and some selection by instructors. Much larger online group (570 students compared to circa 30 in other groups). No discussion of participant loss.

 

First author and year:

Nichols 2003

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

State University of New York (SUNY), U.S.

 

Participants:

64 undergraduates on English composition course

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF 50 mins. Online unclear

 

Neutral

Skills increased slightly in each group although no difference between groups.

 

Follow-up period: N/S

 

Neutral

No differences between groups re:

·         perceived benefits/effectiveness

·         satisfaction

·         confidence levels

·         preference for format

 

No information on the characteristics of each group. Test not described or validated.  No information on loss of participants.

 

First author and year:

Nichols Hess 2014

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

Oakland University, Rochester, U.S.

 

Participants:

31 undergraduate sociology students

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF not stated

Online self paced

Neutral

Skills increased in each group with no difference between groups.

 

Follow-up period:

Up to two months

Neutral

No significant differences between formats in:

·         Comfort in asking for help

·         Using library resources

Students receiving FtF instruction valued the personal connection and responsiveness of instructor.

Those receiving online instruction valued the convenience and ability to repeat sections.

Very little methodological information. Different numbers in each group and no information on student characteristics. Test not piloted or validated. Only completers analyzed. Not possible to assess statistical significance of results.

 

First author and year:

Orme 2004

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

Indiana University, U.S.

 

Participants:

128 business undergraduates

 

Interventions:

(1)     Face-to-face

(2)     Blended online (TILT) plus FtF

(3)     Online only (TILT)

 

Hours of contact time:

Unstated

 

Neutral

No pretest. No statistically significant difference between groups.

 

Follow-up period: ~10 weeks (next semester)

 

-

Only students designated as ‘successful’ (passing TILT quizzes or seminar) were included in the study. Exact content, length and intensity of teaching for each cohort not clear. Test not validated. No pretest.

First author and year:

Salisbury 2003

 

Study Design:

CBA

 

Delivered by: Information specialist

Setting:

University of Melbourne, Australia

 

Participants:

282 history/film undergraduates

 

Interventions:

(1)     Face-to-face (lecture)

(2)     Face-to-face (hands on)

(3)     Online

 

Hours of contact time:

1 hour

 

Neutral

Skills increased in each group although no clear differences between groups.

 

Follow-up period:  N/S

 

-

No detail on content, length or intensity of each mode of delivery. No student characteristics. No validation of test. No confidence intervals or p values.

First author and year:

Schilling 2012

 

Study Design:

RCT

 

Delivered by: Librarian

 

Setting:

Indiana University, U.S.

 

Participants:

128 medical undergraduates

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

1.5 h

Neutral

No statistically significant difference between groups in MEDLINE searching score.

 

Follow-up period:

Two weeks for skills test:

15 weeks for attitudes survey

 

No pretest

Neutral

No significant differences between formats in terms of:

·         Perceived effectiveness

·         Likelihood of using library (more)

No information on student characteristics. No validation of test. No confidence intervals with results.

 

First author and year:

Shaffer 2011

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

University of New York at Oswego, U.S.

 

Participants:

59 postgraduates on a research methods course

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

~2 hours

Neutral

Skills increased in each group although no difference between groups.

 

Follow-up period: N/S

 

Favoured face-to-face*

The FtF group had higher satisfaction scores on the 5-point Likert scale (4.03 vs. 3.41).

 

Tests were not validated. *Online group experienced technical difficulties.

 

First author and year:

Silk 2015

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

Midwestern University, U.S.

 

Participants:

232 undergraduates on an organization communication course

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

1 hour

 

Neutral

Skills increased in each group with no significant difference between groups. The online group was more successful in finding research articles (87.4% vs. 78.0%, p=0.063).

 

Follow-up period: 4 weeks

Neutral

No significant differences in:

·         Confidence/self-efficacy

·         engagement/dynamism of instruction.

 

No information on student characteristics. Tests not piloted or validated. Only those who completed post and delayed posttest were included - ca 50% attrition in FtF and 59% in online.

First author and year:

Silver 2007

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of South Florida, U.S.

 

Participants:

295 psychology undergraduates

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF Not stated. Online self-paced (allowed one week)

Neutral

No pretest. No posttest difference between groups.

 

Follow-up period: N/S

 

Unclear

Marginally greater number in online group saying they were more confident or much more confident after instruction (88.4% vs. 78.3% for FtF).  

Students allowed to self-select group. Student characteristics varied (and different year groups were used). Test was not validated. No pretest.

First author and year:

Swain 2015

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

Cardiff University, U.K.

 

Participants:

58 dental undergraduates

 

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

FtF 50 mins. Online:  Self-paced within 50 min slot

Neutral

Skills increased in each group although no significant difference between groups.

 

Follow-up period: 5 days

 

Neutral

Overall no significant differences in

·         comfort in asking for library assistance

·         preference for format other than a tendency to favour the format allocated.

 

Limited information on characteristics. Test was piloted although not validated. Only 58 students attended training but 60 claimed training received at posttest.

 

First author and year:

Vander Meer 1996

 

Study Design:

RCT

 

Delivered by: Librarian

Setting:

Western Michigan University, U.S.

 

Participants:

186 undergraduates on high school/University transition course

Interventions:

(1)     Face-to-face

(2)     Online

 

Hours of contact time:

Not stated.

Neutral

Skills increased in each group although no significant difference between groups at the p<0.05 level.

 

Follow-up period: ~10 weeks (end of semester)

Neutral

No difference in perceived:

·         Confidence/self-efficacy

·         Clarity

·         Interest

Online group perceived greater enjoyment (p=0.05)

All students had access to tutorial. Test not piloted or validated. Only 53% completion of posttest. No characteristics although large samples with similar baseline skill and survey results.

First author and year:

Walton 2012

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

Staffordshire University, U.K.

 

Participants:

35 sport and exercise undergraduates

 

Interventions:

(1)     Blended

(2)     Intermediate: FtF plus access to online materials

(3)     Face-to-face

 

Hours of contact time:

Blended 4x50 mins

Others 50 mins

 

No pretest

Neutral

Students in the blended group made greater use of evaluative criteria than those in the intermediate or FtF groups but this was not statistically significant.

 

Follow-up period:

Not stated, possibly at end of 5 week intervention period

-

Groups different sizes and no student characteristics.  Assessors not blinded to group. Evaluation criteria not validated. Small sample size.  Four times as much contact time for the blended vs. FtF and intermediate formats.

First author and year:

Wilcox Brooks 2014

 

Study Design:

CBA

 

Delivered by: Librarian

 

Setting:

Northern Kentucky University, U.S.

 

Participants:

38 undergraduates in advanced composition courses

Interventions:

(1)     Blended

(2)     Face-to-face

 

Hours of contact time:

Not stated

Neutral

No significant differences between groups in bibliographic analysis of final course paper.

 

Follow-up period:

Not stated

Unclear

No useable data – views of the blended group only were sought.

Hours of contact time not stated. No information on student characteristics. Outcome measures not piloted or validated

 

First author and year:

Wilhite 2004

 

Study Design:

CBA

 

Delivered by: Librarian

Setting:

University of Oklahoma, U.S.

 

Participants:

44 business undergraduates

Interventions:

(1)     Face-to-face

(2)     Online

(3)     No instruction

 

Hours of contact time:

Not stated. 45 min video

Neutral

Skills increased in each group when compared to control (p=0.010) although no significant difference between intervention groups (p=0.75).

 

Follow-up period: N/S

 

Favoured face-to-face

General preference for FtF with higher scores from FtF group for

  • Satisfaction
  • Clarity
  • Length of course

Slightly different numbers in groups and pretest scores are very different suggesting characteristics varied across groups. Test not piloted or validated. Issues for online group.

CBA:  Controlled before and after study; cRCT: Cluster randomized controlled trial; FtF: Face-to-Face; N/S: Not stated; RCT: Randomized controlled trial

Shaded rows are papers included in the meta-analysis.

 

Skills

 

Of the 33 studies, 8 did not include a pretest (Alexander & Smith, 2001; Burhanna et al., 2008; Goates et al., 2016; Greer et al., 2016; Orme, 2004; Schilling, 2012; Silver & Nickel, 2007; Walton & Hepworth, 2012). The remaining 25 studies all noted an increase in skills from pretest to posttest across delivery formats.

 

A total of 12 studies could be included in a meta-analysis, which indicated that a significant increase in skills occurred from pre- to posttest. The overall result from meta-analysis for the SMD change was 1.02 (95% Confidence Interval [CI]: 0.75 to 1.29) for face-to-face delivery (Figure 2) and 0.92 (95% CI: 0.57 to 1.26) for online delivery (Figure 3).

 

Overall, and as suggested by the pre- to post-results, there was compelling evidence that skills acquired through IL teaching are comparable for face-to-face and online delivery methods. Of the 33 studies, 27 (82%) reported that there was no statistically significant difference in skills learned via face-to-face and online delivery formats. For one study the results were unclear because of analysis weaknesses (Kratochvil, 2014), one favoured online delivery (Mery et al., 2012a), three favoured face-to-face delivery (Churkovich & Oughtred, 2002; Goates et al., 2016; Lechner, 2007), and one favoured the blended delivery option (Kraemer et al., 2007).

 

 

Figure 2
Change in information literacy skills pre- to post-instruction face-to-face.

 

 

Figure 3
Change in information literacy skills pre- to post-instruction online.

 

 

 


For the 13 studies that could be included in a meta-analysis, the SMD (95% CI) for face-to-face compared to online instruction was -0.01 (-0.28 to 0.26) (Figure 4).

 

There were not enough data to assess whether skills outcomes varied with contact time, time to follow-up, deliverer (librarian or non-librarian), or study design. However, there appeared to be no obvious associations from looking at the data.

 

Findings were mixed for the ten studies that included a blended delivery arm (Anderson & May, 2010; Beile & Boote, 2005; Byerley, 2005; Churkovich & Oughtred, 2002; Goates et al., 2016; Greer et al., 2016; Kraemer et al., 2007; Orme, 2004; Walton & Hepworth, 2012; Wilcox Brooks, 2014), although seven of these studies (70%) found no statistically significant difference between blended and other formats in terms of test or assignment outcomes. Of the ten, one study (Byerley, 2005) noted that the blended method provided greater skill development than the face-to-face provision, although this was not significant compared to online provision. Another study (Goates et al., 2016) noted higher posttest scores for students receiving a face-to-face versus blended format (p<0.01). A further study (Kraemer et al., 2007) found a significantly greater pre-post improvement in the blended learning compared to the online learning group.

 

 

 


Figure 4
Comparison of information literacy skills for face-to-face vs. online instruction.

 

 

 


For those studies that could be included in a meta-analysis, there was no statistically significant difference between blended and single format training in terms of skills learnt. The SMDs comparing blended to online and to face-to-face instruction were 0.15 (95% CI, -0.03 to 0.34; 4 studies) and 0.36 (-0.03 to 0.75; 3 studies) respectively (Figure 5).

 

Based on the studies that could be included in a meta-analysis, single format training appeared to be more effective than blended training when skills were measured via a specific assignment, such as a piece of persuasive presentation research (Anderson & May, 2010) or a rubric-graded search strategy (Goates et al., 2016) (Figure 5). Three further studies looked at specific assignments: two via bibliography assessment within a piece of course work (Mery et al., 2012b; Wilcox Brooks, 2014) and one by a search strategy assessment (Schilling, 2012). Mery et al. (2012b) observed a statistically significant improvement in the online compared to the face-to-face group, but the other two studies found no difference between face-to-face and blended (Wilcox Brooks, 2014) or online versus face-to-face groups (Schilling, 2012). No conclusions can be based on this limited evidence.

 

 

Figure 5
Comparison of information literacy skills for online or face-to-face instruction versus blended instruction.

 

 

Student views


Overall there was evidence that students felt that the different delivery methods had their advantages and disadvantages. However, the findings are mixed with no clear preference for one method over another. Of the 22 studies gathering information on student views, 3 collected data from students exposed to the online (Byerley, 2005; Kaplowitz & Contini, 1998) or blended (Wilcox Brooks, 2014) training only.

 

From the 19 studies gathering views on both types of format, 14 (74%) found that students expressed no preference at all in relation to format (Table 3). In the five studies finding variations in student views between formats, two studies found that the online course was favoured in terms of perceived benefits, attitudes to the course, and comfort in carrying out library research (Alexander & Smith, 2001) or increased self-efficacy (a belief in one’s ability to succeed) in choosing databases to search (Gall, 2014). Three studies identified a preference for face-to-face delivery in terms of greater confidence following training (Churkovich & Oughtred, 2002; Shaffer, 2011) or higher satisfaction in general and around the clarity and length of training (Wilhite, 2004). The online group experienced technical difficulties in the studies by Shaffer (2011) and Wilhite (2004). Findings from the themes identified in intervention studies analyzing student views on face-to-face versus online formats are summarized in Figure 6. Where the findings for a particular measure are neutral, this shows that there was no clear preference from students concerning the online and face-to-face formats. 

 

There were not enough data to guide conclusions concerning perceptions of blended versus single format delivery. However, from three studies comparing all three types of format, two found that the views of students across formats were neutral (Beile & Boote, 2005; Kraemer et al., 2007), while one noted a preference for the face-to-face format in terms of confidence/self-efficacy (Churkovich & Oughtred, 2002). A study comparing face-to-face and blended formats found no differences in perceived skills (Goates et al., 2016).

 

Study Design Features


The interventions in 30 of the 33 studies were delivered by librarians. Face-to-face teaching was delivered by graduate students (Alexander & Smith, 2001) or teaching assistants (Kaplowitz & Contini, 1998) in two studies. There was no difference in skills between the face-to-face and online groups at posttest in either study. Only the study by Alexander and Smith (2001) included comparative information on student views, and they found a preference for the online option. Mery et al. (2012a) provided the only direct comparison between the deliverers of the intervention, with two face-to-face groups: one trained by librarians and the other by course tutors. The researchers found that skills increased significantly in the librarian and online groups, but not in the tutor group.

 

Of 21 studies providing information on face-to-face contact time, the typical time period was 50-60 minutes (12 studies; see Table 3). The longest contact time was in the study by Alexander and Smith (2001), where graduate students delivered 14 one-hour sessions. The results for the skills test (posttest only) were neutral, but students voiced a preference for the online training. The shortest contact time was 0.5 hours (Burhanna et al., 2008), where the researchers reported a trend towards greater skills development in the online group but no difference in student views.

 

 

Figure 6
Analysis of student views on face-to-face versus online formats [numbers of studies].

 

 

Only 14 studies provided information on the follow-up period between training and the skills test; the follow-up periods ranged from immediately post-training to approximately 12 months (see Table 3). There was no statistically significant difference between the two formats in terms of skills retained in 13 of these studies. There was a statistically significant improvement in the face-to-face group in Goates et al. (2016), where skills were measured immediately post-training.

 

For the 11 randomized controlled trials, 7 studies (64%) found no difference in skills between the formats tested (Brettle & Raynor, 2013; Greer et al., 2016; Koenig & Novotny, 2001; Schilling, 2012; Shaffer, 2011; Swain et al., unpub; Vander Meer & Rike, 1996), 3 favoured face-to-face training (Churkovich & Oughtred, 2002; Goates et al., 2016; Lechner, 2007) and 1 favoured the blended approach (Kraemer et al., 2007).  

 

Of the 11 randomized controlled trials, 8 explored student views, with 2 favouring the face-to-face format (Churkovich & Oughtred, 2002; Shaffer, 2011) and 6 (75%) with neutral findings (Goates et al., 2016; Koenig & Novotny, 2001; Kraemer et al., 2007; Schilling, 2012; Swain et al., unpub; Vander Meer & Rike, 1996).

 

Discussion

 

Despite the methodological shortcomings of many of the studies included in this review, there is consistent evidence across the body of comparative studies that:

 

•        Face-to-face (traditional) teaching strongly increases information literacy (IL) skills when assessed immediately pre- and post-teaching.

•        Online (web-based) teaching strongly increases IL skills when assessed immediately pre- and post-teaching.

•        The increase in skills as a result of teaching is broadly comparable for face-to-face and online teaching methods.

•        Students do not express a clear preference for one format over another, although they perceive some differences in the delivery methods (and advantages and disadvantages of each).

 

The findings from our review of student skills are in keeping with a systematic review evaluating the impact of online or blended versus face-to-face learning of clinical skills in undergraduate nurse education (McCutcheon, Lohan, Traynor & Martin, 2015). On the basis of 19 published papers, the authors concluded that online teaching of clinical skills was no less effective than traditional means.

 

Definitive evidence on the effectiveness of blended learning methods compared to single format teaching is limited, although it appears that test score outcomes for single and blended format teaching are similar. The potential difference between outcomes as measured by assignment performance and by test performance is intriguing and worthy of further study. Test scores and assignment scores might be seen as measuring, respectively, the cognitive (factual knowledge) and behavioural (skills needed to complete a task) aspects of information literacy.

 

While the majority of the studies with a potentially more reliable methodology (i.e., the 11 randomized controlled trials) demonstrated neutral findings, four favoured face-to-face or blended approaches. However, many of these trials also had some methodological shortcomings.

 

Across the full body of the 33 studies reviewed here, it seems that the choice of format can be left to the educator. Given the increasing use of online and blended formats for IL teaching, evident both from personal experience and from the published literature, this confirmation is welcome. Both the student context (e.g., campus-based or distance learners) and cohort sizes are likely to be decisive factors. Blended learning is perceived by academic staff as more time-consuming (Brown, 2016), although we could not find any empirical evidence to confirm or refute this perception; nor did we identify any studies comparing preparation time for single format face-to-face versus online sessions.

 

One development opportunity for the online context is the personalized online learning environment using adaptive learning software (Nguyen, 2015). This is an exciting, as yet unexplored, prospect for enhancing student learning in the increasingly online arena of information searching.

 

Limitations

 

Although this review is based on an extensive search for published and unpublished research studies, the authors cannot guarantee that all relevant studies were identified. The quality of the included studies is moderate at best: only 11 adopted the randomized controlled trial design, which should minimize the potential for bias, and only 7 piloted or validated the skills tests used. Heterogeneity across studies was high, so the meta-analysis results should be interpreted with caution. There is also relatively little evidence from outside the U.S.
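
For context on these caveats, the summary statistics involved follow standard meta-analytic definitions. The sketch below assumes the conventional Cohen’s d and Higgins I² formulations; the exact variants used in any given analysis (e.g., a small-sample correction such as Hedges’ g) may differ. The standardized mean difference scales the between-group difference in mean test scores by the pooled standard deviation:

\[ \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}} \]

and heterogeneity is conventionally quantified by the \(I^2\) statistic, derived from Cochran’s \(Q\) across \(k\) studies:

\[ I^2 = \max\!\left(0,\ \frac{Q - (k - 1)}{Q}\right) \times 100\% \]

with values above roughly 75% generally interpreted as high heterogeneity.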

 

Conclusions and Implications for Practice

 

The body of research evidence suggests that information literacy training is equally effective, and well received, across a range of delivery methods. The format can vary to suit the requirements of the student population and the educational situation. In light of these findings, we are confident in moving towards greater use of online options in our institutions, particularly for routine IL sessions such as library orientations for new students and for access by individuals at ‘point of need’. 

 

Future comparative studies should aim to minimize the potential for bias, perhaps by adopting a randomized controlled design, and should employ large populations and validated test instruments. More high quality research comparing blended and single format delivery methods will be valuable, along with work to unravel the potential dichotomy between outcomes from specific assignments (marked coursework) and outcomes from IL skills tests. Further research into the time and resource implications for educators of delivering teaching via these different methods would also be useful. 

 

Once these studies have been completed, it should be possible to provide clearer guidance to educators, perhaps along the lines of a ‘decision aid’ to guide the choice of teaching format for particular contexts and student groups.

 

Acknowledgements

 

The authors acknowledge, with thanks, the following specialists for reading and commenting on two drafts of this manuscript: Alison Brettle (Professor in Health Information and Evidence Based Practice, University of Salford, U.K.), Cecily Gilbert (Research Librarian, Barwon Health Library, Victoria, Australia) and Erica Swain (Subject Librarian, Cardiff University, U.K.).

References

 

Alexander, L.B. & Smith, R.C. (2001). Research findings of a library skills instruction web course. Portal: Libraries and the Academy, 1(3), 309-328. https://doi.org/10.1353/pla.2001.0033

 

Anderson, K. & May, F.A. (2010). Does the method of instruction matter? An experimental examination of information literacy instruction in the online, blended, and face-to-face classrooms. Journal of Academic Librarianship, 36(6), 495-500. https://doi.org/10.1016/j.acalib.2010.08.005

 

Arnold-Garza, S. (2014). The flipped classroom teaching model and its use for information literacy instruction. Communications in Information Literacy, 8(1), 7-22. http://files.eric.ed.gov/fulltext/EJ1089137.pdf

 

Beile, P.M. & Boote, D.N. (2005). Does the medium matter? A comparison of a web-based tutorial with face-to-face library instruction on education students’ self-efficacy levels and learning outcomes. Research Strategies, 20, 57-68. https://doi.org/10.1016/j.resstr.2005.07.002

 

Bordignon, M., Strachan, G., Peters, J., Muller, J., Otis, A., Georgievski, A., & Tamin, R. (2016). Assessment of online information literacy learning objects for first year community college English composition. Evidence Based Library and Information Practice, 11(2), 50-57. https://doi.org/10.18438/b8t922

 

Braun, V. & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101. https://doi.org/10.1191/1478088706qp063oa

 

Brettle, A. & Raynor, M. (2013). Developing information literacy skills in pre-registration nurses: An experimental study of teaching methods. Nurse Education Today, 33(2), 103-109. https://doi.org/10.1016/j.nedt.2011.12.003

 

Brown, M.G. (2016). Blended instructional practice: A review of the empirical literature on instructors’ adoption and use of online tools in face-to-face teaching. Internet and Higher Education, 31, 1-10. https://doi.org/10.1016/j.iheduc.2016.05.001

 

Burhanna, K.J., Eschedor Voelker, T.J. & Gedeon, J.A. (2008). Virtually the same: Comparing the effectiveness of online versus in-person library tours. Public Services Quarterly, 4(4), 317-338. https://doi.org/10.1080/15228950802461616

 

Byerley, S.L. (2005). Library instruction: Online or in the classroom? Academic Exchange, 9(4), 193-197. https://www.questia.com/library/journal/1G1-142636415/library-instruction-online-or-in-the-classroom

 

Churkovich, M. & Oughtred, C. (2002). Can an online tutorial pass the test for library instruction? An evaluation and comparison of library skills instruction methods for first year students at Deakin University. Australian Academic & Research Libraries, 33, 25-38. https://doi.org/10.1080/00048623.2002.10755177

 

CILIP: The Library and Information Association. (2017). Information literacy [Web page]. https://www.cilip.org.uk/research/topics/information-literacy

 

Gall, D. (2014). Facing off: Comparing an in-person library orientation lecture with an asynchronous online library orientation. Journal of Library & Information Services in Distance Learning, 8(3/4), 275-287. https://doi.org/10.1080/1533290x.2014.945873

 

Germain, C.A., Jacobson, T.E. & Kaczor, S.A. (2000). A comparison of the effectiveness of presentation formats for instruction: Teaching first-year students. College & Research Libraries, 61(1), 65-72. https://doi.org/10.5860/crl.61.1.65

 

Goates, M.C., Nelson, G.M. & Frost, M. (2016). Search strategy development in a flipped library classroom: A student-focused assessment. College & Research Libraries, anticipated publication date 1 May 2017.

 

Greer, K., Hess, A.N. & Kraemer, E.W. (2016). The librarian leading the machine: A reassessment of library instruction methods. College & Research Libraries, 77(3), 286-301. https://doi.org/10.5860/crl.77.3.286

 

Holman, L. (2000). A comparison of computer-assisted instruction and classroom bibliographic instruction. Reference & User Services Quarterly, 40(1), 53-60. https://www.jstor.org/stable/20863900

 

Kaplowitz, J. & Contini, J. (1998). Computer-assisted instruction: Is it an option for bibliographic instruction in large undergraduate survey classes? College & Research Libraries, 59(1), 19-27. https://doi.org/10.5860/crl.59.1.19

 

Koenig, M. & Novotny, E. (2001). On-line course integrated library instruction modules as an alternative delivery method. In B.I. Dewey (Ed.), Library user education (Chapter 26, pp. 200-208). Lanham, Maryland: Scarecrow Press.

 

Koufogiannakis, D., Booth, A. & Brettle, A. (2005). ReLIANT: Reader's guide to the literature on interventions addressing the need for education and training. Library and Information Research, 30(94), 8. http://eprints.rclis.org/8082/1/RELIANT__final_.pdf

 

Koufogiannakis, D. & Wiebe, N. (2006). Effective methods for teaching information literacy skills to undergraduate students: A systematic review and meta-analysis. Evidence Based Library and Information Practice, 1(3), 3-43. https://doi.org/10.18438/b8ms3d

 

Kraemer, E.W., Lombardo, S.V. & Lepkowski, F.J. (2007). The librarian, the machine, or a little of both: A comparative study of three information literacy pedagogies at Oakland University. College & Research Libraries, 68(4), 330-342. https://doi.org/10.5860/crl.68.4.330

 

Kratochvil, J. (2014). Measuring the impact of information literacy e-learning and in-class courses via pre-tests and post-test at the Faculty of Medicine, Masaryk University. Mefanet Journal, 2(2), 41-50. https://is.muni.cz/repo/1214193/en/Kratochvil/Measuring-the-impact-of-information-literacy-e-learning-and-in-class-courses-via-pre-tests-and-post-test-at-the-Faculty-of-Medicine-Masaryk-University?lang=en

 

Lantzy, T. (2016). Health literacy education: The impact of synchronous instruction. Reference Services Review, 44(2), 100-121. https://doi.org/10.1108/rsr-02-2016-0007

 

Lechner, D.L. (2007). Graduate student research instruction: Testing an interactive web-based library tutorial for a health sciences database. Research Strategies, 20, 469-481. https://doi.org/10.1016/j.resstr.2006.12.017

 

Lewis, S. & Clarke, M. (2001). Forest plots: Trying to see the wood and the trees. BMJ, 322(7300), 1479-1480. https://doi.org/10.1136/bmj.322.7300.1479

 

Light, R.J. & Pillemer, D.B. (1984). Summing up: The science of reviewing research. Cambridge, Massachusetts: Harvard University Press.

 

Liu, Q., Peng, W., Zhang, F., Hu, R., Li, Y. & Yan, W. (2016). The effectiveness of blended learning in health professions: Systematic review and meta-analysis. Journal of Medical Internet Research, 18(1), e2. https://doi.org/10.2196/jmir.4807

 

McCutcheon, K., Lohan, M., Traynor, M. & Martin, D. (2015). A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education. Journal of Advanced Nursing, 71(2), 255-270. https://doi.org/10.1111/jan.12509

 

Means, B., Toyama, Y., Murphy, R. & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115, 1-47. https://www.sri.com/sites/default/files/publications/effectiveness_of_online_and_blended_learning.pdf

 

Mery, Y., Newby, J. & Peng, K. (2012a). Why one-shot information literacy sessions are not the future of instruction: A case for online credit courses. College & Research Libraries, 73(4), 366-377. https://doi.org/10.5860/crl-271

 

Mery, Y., Newby, J. & Peng, K. (2012b). Performance-based assessment in an online course: Comparing different types of information literacy instruction. Portal: Libraries and the Academy, 12(3), 283-298. https://doi.org/10.1353/pla.2012.0029

 

Morrison, J.M., Sullivan, F., Murray, E. & Jolly, B. (1999). Evidence-based education: Development of an instrument to critically appraise reports of educational interventions. Medical Education, 33(12), 890-893. https://doi.org/10.1046/j.1365-2923.1999.00479.x

 

National Centre for Text Mining. (2012). TerMine web demonstration [Web page]. http://www.nactem.ac.uk/software/termine/

 

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning & Teaching, 11(2), 309-319. http://jolt.merlot.org/Vol11no2/Nguyen_0615.pdf

 

Nichols, J., Shaffer, B. & Shockey, K. (2003). Changing the face of instruction: Is online or in-class more effective? College & Research Libraries, 64(5), 378-388. https://doi.org/10.5860/crl.64.5.378

 

Nichols Hess, A.N. (2014). Online and face-to-face library instruction: Assessing the impact on upper-level sociology undergraduates. Behavioral & Social Sciences Librarian, 33(3), 132-147. https://doi.org/10.1080/01639269.2014.934122

 

Orme, W.A. (2004). A study of the residual impact of the Texas Information Literacy Tutorial on the information-seeking ability of first year college students. College & Research Libraries, 65(3), 205-215. https://doi.org/10.5860/crl.65.3.205

 

QSR International Pty Ltd. (2012). NVivo 10 [Computer software]. http://www.qsrinternational.com/nvivo-support/downloads

 

Salisbury, F. & Ellis, J. (2003). Online and face-to-face: Evaluating methods for teaching information literacy skills to undergraduate arts students. Library Review, 52(5), 209-217. https://doi.org/10.1108/00242530310476715

 

Schilling, K. (2012). The efficacy of elearning for information-retrieval skills in medical education. Paper presented at the European Conference on e-Learning, October 2012.

 

Shaffer, B.A. (2011). Graduate student library research skills: Is online instruction effective? Journal of Library & Information Services in Distance Learning, 5(1/2), 35-55. https://doi.org/10.1080/1533290x.2011.570546

 

Silk, K.J., Perrault, E.K., Ladenson, S. & Nazione, S.A. (2015). The effectiveness of online versus in-person library instruction on finding empirical communication research. The Journal of Academic Librarianship, 41, 149-154. https://doi.org/10.1016/j.acalib.2014.12.007

 

Silver, S.L. & Nickel, L.T. (2007). Are online tutorials effective? A comparison of online and classroom library instruction methods. Research Strategies, 20(4), 389-396. https://doi.org/10.1016/j.resstr.2006.12.012

 

Swain, E., Weightman, A.L., Farnell, D.J.J. & Mogg, R. (2016). An experimental study of online versus face-to-face student induction at a university library: Both formats are equally effective and well received. Unpublished.

 

Vander Meer, P.F. & Rike, G.E. (1996). Multimedia: Meeting the demand for user education with a self-instructional tutorial. Research Strategies, 14(3), 145-158. https://www.learntechlib.org/p/82581

 

Walton, G. & Hepworth, M. (2012). Using assignment data to analyse a blended information literacy intervention: A quantitative approach. Journal of Librarianship & Information Science, 45(1), 53-63. https://doi.org/10.1177/0961000611434999

 

Weightman, A.L., Farnell, D.J.J., Morris, D. & Strange, H. (2015). Information literacy teaching in universities – A systematic review of evaluation studies. Preliminary findings for online versus traditional methods. [Poster] Eighth Evidence Based Library & Information Practice Conference (EBLIP8), Brisbane, Queensland, Australia, 6-8 July 2015.

 

Wilcox Brooks, A.W. (2014). Information literacy and the flipped classroom: Examining the impact of a one-shot flipped class on student learning and perceptions. Communications in Information Literacy, 8(2), 225-235. http://files.eric.ed.gov/fulltext/EJ1089274.pdf

 

Wilhite, J.M. (2004). Internet versus live: Assessment of government documents bibliographic instruction. Journal of Government Information, 30(5/6), 561-574. https://doi.org/10.1016/j.jgi.2004.10.002

 

Zhang, L., Watson, E.M. & Banfield, L. (2007). The efficacy of computer-assisted instruction versus face-to-face instruction in academic libraries: A systematic review. Journal of Academic Librarianship, 33(4), 478-484. https://doi.org/10.1016/j.acalib.2007.03.006

 

 


Appendix

 

Additional file: Evidence Table: Effectiveness