Research Article

 

Information Literacy Skills of First-Year Library and Information Science Graduate Students: An Exploratory Study

 

Andrea Hebert

Human Sciences, Education, and Distance Learning Librarian

LSU Libraries

Louisiana State University

Baton Rouge, Louisiana, United States of America

Email: ahebert@lsu.edu

 

Received: 7 Feb. 2018     Accepted: 17 July 2018

 

 

© 2018 Hebert. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

DOI: 10.18438/eblip29404

 

 

Abstract

 

Objective – This cross-sectional, descriptive study seeks to address a gap in knowledge of both information literacy (IL) self-efficacy and IL skills of students entering Louisiana State University’s Master of Library and Information Science (MLIS) program.

 

Methods – An online survey testing both IL self-efficacy and skills was administered through Qualtrics. The survey instrument used items from existing instruments (Beile, 2007; Michalak & Rysavy, 2016) and was distributed to two cohorts of incoming students; the first cohort entered the MLIS program in fall 2017, and the second entered in spring 2018.

 

Results – Data varied between cohorts and between survey instruments for both IL self-efficacy and skills; however, bivariate analysis indicated a moderate positive correlation between overall IL self-efficacy and demonstrated IL skill scores in both the fall 2017 and spring 2018 cohorts.

 

Conclusion – The study indicates a need for a larger, multi-institutional study using a rigorously validated instrument to gather data and make generalizable inferences about the IL self-efficacy and skills of incoming LIS graduate students.

 

Introduction

 

Students enrolled in U.S. library and information science (LIS) graduate programs are an understudied population in LIS literature. Most articles focus on LIS curricula and teaching methodologies; very few published studies examine the fundamental information literacy (IL) skill set of students entering library school. Because of this lack of data, research and instruction services librarians who work with LIS graduate students cannot accurately anticipate these students’ information needs and IL proficiencies, making it a challenge to provide support and instruction.

 

LIS students in the United States are a heterogeneous mix. LIS graduate programs pull students from a wide range of undergraduate majors (Taylor, Perry, Barton, & Spencer, 2010), and approximately 49% of students enrolled in American Library Association (ALA) accredited master’s programs in the United States are 30 years of age or older (Albertson, Spetka, & Snow, 2015, Table II-8-c-2-ALA), suggesting that many are returning to academia after professional employment. The varied academic and professional backgrounds of LIS graduate students make it hard to predict what IL skills incoming students may possess. At Louisiana State University (LSU), it is not uncommon to encounter new LIS graduate students who cannot look up a book in an OPAC, cannot distinguish a citation for a journal article from one for a monograph, and are unfamiliar with peer review, but librarians who work with LIS graduate students need more than anecdotal information about these students to serve them efficiently and effectively.

 

Likewise, understanding students’ IL self-efficacy can guide librarians in their outreach and instruction to this population. Students with low self-efficacy need additional encouragement and guidance (Tang & Tseng, 2013). However, if students’ self-efficacy is higher than their actual skill level, students may be unaware of their weaknesses and may be unlikely to seek help (Gross & Latham, 2012). Librarians may need to promote their expertise and services more heavily not only to students with low self-efficacy but also to those students who have high levels of IL self-efficacy but lower levels of demonstrated IL skills.

 

Literature Review

 

Bandura defines perceived self-efficacy “as people’s judgments of their capabilities to organize and execute courses of action required to attain designated types of performances” (1986, p. 391). People with positive self-efficacy beliefs are more likely to engage in activities that improve actual competencies, but Bandura (1986) is careful to note that misjudgments of self-efficacy (overestimating or underestimating one’s talents) can have a negative impact. People who underestimate their self-efficacy often limit themselves and underperform because of self-doubt, while those who greatly overestimate their abilities expose themselves to frustration and failure (Bandura, 1986).

 

Definitions of IL vary widely and continue to evolve. The Association of College and Research Libraries’ Framework for Information Literacy for Higher Education (2016, para. 5) stresses its more conceptual aspects: “Information literacy is the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning.” For the purpose of this study, IL will refer to a narrower, more traditional, and concrete definition—the ability “to recognize when information is needed and . . . the ability to locate, evaluate, and use effectively the needed information” (American Library Association, 1989, para. 3).

 

There is a growing body of literature focused on IL self-efficacy and its relationship to demonstrated IL skills. A systematic review of the literature revealed that out of 53 studies, 41 clearly or partially indicated that students overestimated their IL skills (Mahmood, 2016). The review included studies dealing with high school, undergraduate, graduate, and professional students, but only 4 of the 53 studies focused solely on graduate students (Mahmood, 2016). Mahmood (2016) found that 83% of the studies focusing on undergraduates indicated that undergraduate students frequently overestimate their IL skills. The results of studies dealing with graduate students were less conclusive. Boucher, Davies, Glen, Dalziel, and Chandler (2009) noted that graduate students both under- and overestimated their skill levels, and those who rated their skills highest often had the lowest performance scores. Likewise, Jackson (2013) found that although some graduate students accurately predicted their skill levels, others overestimated them; in short, there was no clear correlation. Other studies indicated a weak positive correlation (Robertson & Felicilda-Reynaldo, 2015) or mixed results (Perrett, 2004). Mahmood’s (2016) review covered the years from 1986 to 2015, but in two recent articles, international graduate students in business were found to overestimate their IL skills (Michalak & Rysavy, 2016; Michalak, Rysavy, & Wessel, 2017).

 

These contradictory findings echo the conflicting research about graduate student IL as a whole. Some librarians believe that graduate students’ need for IL instruction exceeds that of undergraduates because of the intensive research required by many graduate programs (Crosetto, Wilkenfeld, & Runnestrand, 2007). Catalano’s (2010) research indicates that graduate students are generally able to evaluate information but lack advanced search skills; other research indicates that graduate students actually have sophisticated IL skills (Green, 2010). A study asking graduate students to rate their feelings of engagement, affirmation, and puzzlement during an information literacy instruction session revealed conflicting responses, pointing to a wide range of abilities and competencies among graduate students (Saunders, Severyn, Freundlich, Piroli, & Shaw-Munderback, 2016). Even when graduate students are aware that they need research help, they are hesitant to approach librarians (Harrington, 2009; Sadler & Given, 2007).

 

Few articles address LIS graduate students’ IL self-efficacy or IL skills, but there is a suggestion that LIS graduate students have high IL self-efficacy but lower than expected performance. Several studies point to LIS students having positive IL self-efficacy (Kurbanoglu, 2003; Pinto, Fernandez-Ramos, Sanchez, & Meneses, 2013; Saunders et al., 2015). Kurbanoglu’s (2003) study of undergraduates enrolled in an Information Management program (Hacettepe University, Ankara, Turkey) suggests that although there is a slight increase in IL self-efficacy between students’ first and second years, there is little gain in succeeding years. Although Kurbanoglu’s (2003) study is valuable, it is not longitudinal—different students were tested over the course of four years. The differences in self-efficacy for each program year could be due to the group of students tested instead of an actual increase in self-efficacy. There is evidence that LIS graduate students have limited IL skills, including difficulty formulating Boolean search queries (Conway, 2011; Islam & Tsuji, 2010). In fact, a study of students entering Curtin University’s (Perth, Australia) Information Studies graduate program found that “33% of postgraduates were unable to identify a citation as indicating a journal article; 59% were unable to select the best method of searching for a specific journal article; 48% were unaware of how to find a book chapter using a library catalogue; and 33% were unable to identify the Boolean operator ‘AND’ as a means to narrow a search” (Conway, 2011, pp. 130–131).

 

A study of LIS students in 18 countries revealed that although students were confident in their search skills, the students’ self-reported information behaviors and attitudes raised “some concerns as to whether LIS students are moving beyond the general population in their location, search, evaluation, and use of resources” (Saunders et al., 2015, p. S94). Doctoral students enrolled in information science programs in Spain, Cuba, and Mexico generally ranked their IL knowledge as high, but the authors of the study commented, “Although the results of the self-assessments are encouraging, the authors of this article, as a result of their extensive experience in training doctoral students and directing doctoral dissertations, believe that the real world reality is not, however, as encouraging” (Pinto, Fernandez-Ramos, Sanchez, & Meneses, 2013, p. 151). Although this discrepancy has been noted, there are no direct measurements to confirm it.

 

Aims

 

This paper describes an exploratory study to address this gap in knowledge by gathering data to answer the following questions:

 

1.       What level of information literacy self-efficacy do first-year MLIS students have?

2.       What information literacy skills do first-year MLIS students demonstrate?

3.       Is there a relationship between first-year MLIS students’ perceived and demonstrated information literacy skills?

 

Methods

 

The study used an online survey to determine first-semester MLIS students’ levels of self-efficacy and to test their IL skills. Each IL skill was keyed to a self-efficacy belief, allowing the author to compare discrete beliefs and skills. The author submitted an application for exemption to LSU’s Institutional Review Board (IRB) to use the survey in fall 2017. The IRB chair reviewed the application for this project (LSU IRB# E10534) and determined that the project did not require a formal review.

The author repeated the study in spring 2018 for additional data collection using a different instrument and a streamlined distribution method. The author submitted another IRB exemption application, which reflected the use of a new instrument, the change in distribution method, and an updated consent script. The IRB exemption was granted (LSU IRB# E10817).

 

Study Population and Sampling Design

 

The study included students entering the MLIS degree program at LSU’s School of Library and Information Science (SLIS). Participants had to be enrolled in their first semester of the MLIS degree program and could have no more than 3 graduate-level credits in LIS.

 

Fall 2017

 

In fall 2017, 42 students eligible for the study entered the MLIS program at LSU (B. Antie, personal communication, Sept. 5, 2017); because the study population was small (N = 42), the author used a census survey instead of a sample survey. The study relied on a voluntary response, but the director of the SLIS program encouraged students to complete the survey. Respondents who completed the survey received an Amazon.com eGift code worth $5.00 as an incentive after the survey closed; the incentives were funded by LSU Libraries.

 

Qualtrics recorded 61 survey attempts. The author determined that 35 of the 61 responses were from students who met the inclusion criteria; data from ineligible students were deleted. Of the 35 eligible responses, 3 respondents had taken the survey twice. In these cases, the author deduplicated the responses using the following criteria:

 

1.       Retain the attempt that is most complete (fewest skipped questions).

2.       If both attempts are complete, keep the first attempt and delete the second.

This left 32 valid responses—32 of the 42 eligible students responded to the survey, for a response rate of 76%.
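For readers who wish to reproduce this step, the two retention rules map directly onto a short script. The sketch below is a minimal illustration in Python with pandas, not the author’s actual workflow; the column names (respondent_id, start_date, and the q1/q2 item columns) are hypothetical stand-ins for a Qualtrics export.

import pandas as pd

# Hypothetical attempts: one row per attempt, one column per survey item,
# NaN where a question was skipped.
attempts = pd.DataFrame({
    "respondent_id": ["r1", "r1", "r2"],
    "start_date": pd.to_datetime(["2017-08-21", "2017-08-23", "2017-08-22"]),
    "q1": [3, 4, 5],
    "q2": [None, 4, 2],
})

item_cols = ["q1", "q2"]
# Rule 1: prefer the attempt with the fewest skipped questions.
attempts["n_skipped"] = attempts[item_cols].isna().sum(axis=1)
# Rule 2: among equally complete attempts, keep the earliest one.
deduped = (attempts
           .sort_values(["respondent_id", "n_skipped", "start_date"])
           .drop_duplicates(subset="respondent_id", keep="first"))
print(deduped)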

 

Spring 2018

 

The spring 2018 study used the same inclusion and exclusion criteria as the fall 2017 study. On the first day of classes (January 10, 2018), the author obtained a list of the 30 incoming MLIS students and their university email addresses from the Office of the University Registrar (B. Antie, personal communication, January 10, 2018). The study once again used a census survey. The study relied on a voluntary response, and no incentive was offered for participating in the survey. Qualtrics recorded 23 survey attempts; 22 students indicated that they met the inclusion criteria, and 3 attempts were incomplete. After the data from the one ineligible student and from the incomplete surveys were deleted, 19 valid responses remained, for a response rate of 65.5% (19 of the 29 eligible students).

 

Study Design

 

The survey was created in Qualtrics, a Web-based survey platform. Five graduate assistants at LSU Libraries took the survey to ensure the survey’s functionality and provide an estimated completion time.

 

On the first day of the fall 2017 semester (August 21, 2017), SLIS’s Administrative Coordinator of Academic Services emailed students a link to the survey along with a short introduction explaining the purpose of the study. Three reminders followed the initial distribution on August 21, and the survey closed on September 6.

 

The spring 2018 survey was distributed through Qualtrics on January 11, 2018, to the university email addresses of newly enrolled SLIS MLIS students (N = 30). The author sent email reminders through Qualtrics, and the survey closed on January 25, 2018.

 

Data Collection Instruments

 

Respondent data were collected through Qualtrics. A statement containing information required by LSU’s IRB prefaced both surveys.

 

Fall 2017

 

In fall 2017, the instrument consisted of 4 questions to measure IL self-efficacy, 18 questions to measure specific IL skills, and 5 demographic questions. The author gained permission to use questions 2, 3, 4, and 6 of Michalak and Rysavy’s (2016) Students’ Perceptions of Their Information Literacy Skills Questionnaire (SPIL-Q) (M. Rysavy, personal communication, June 16, 2017). SPIL-Q measures perceived IL self-efficacy with a 5-point Likert scale. Although there are other well-known and validated IL self-efficacy instruments, in particular Kurbanoglu, Akkoyunlu, and Umay’s Information Literacy Self-Efficacy Scale (ILSES) (2006), SPIL-Q allows users to rate their self-efficacy with just six questions, allowing the author to keep the survey brief. Questions 2, 3, 4, and 6 measure locating information, accessing information, evaluating information, and citing, respectively. This modified SPIL-Q will be referred to as M-SPIL-Q for clarity in this paper.

 

The author adapted questions from the Information Literacy Assessment for Education (ILAS-ED) to measure IL skills. ILAS-ED, also known as B-TILED (Beile, 2007), assesses basic IL skills with multiple-choice questions. During ILAS-ED’s development, the instrument demonstrated reasonable reliability and validity (Beile, 2005, 2007). It is freely available and has been used in several IL studies (Alfonzo & Batson, 2014; Batarelo Kokić & Novosel, 2014; Cannon, 2007; Catalano & Phillips, 2016; Jesse, 2012; Magliaro, 2011; Robertson & Felicilda-Reynaldo, 2015; Tewell & Angell, 2015). Although the instrument was developed in 2005, the terminology used in the questions is still current.

 

ILAS-ED consists of 35 questions. Questions 1 and 2 deal with general self-efficacy, questions 3 through 6 deal with students’ library instruction history, questions 7 through 28 test IL skills, and questions 29 through 35 collect demographic data. For the purpose of this study, the author excluded questions 1 and 2 because self-efficacy was measured with more granularity by M-SPIL-Q. Question 3 was omitted as irrelevant to SLIS’s online students because it dealt with attending “a tour or physical orientation of the library” (Beile, 2007, p. 19). The study also omitted questions 4 through 6, which concerned receiving instruction in the library, in the classroom, and one-on-one; without contextual information about how long ago the instruction took place, by whom or at which institution it was given, or what it covered, these data would have provided limited insight into the impact of the instruction on IL self-efficacy or skills.

 

Demographic data about age, ethnicity/race, and gender were collected with the intent of identifying patterns (see Appendix A for demographic questions), but preliminary analysis of data about age, gender, and race/ethnicity provided little insight. The survey also solicited information about the highest degree obtained and the number of years since respondents received their most recent degree. The results section details the collected demographic data.

 

Because ILAS-ED was designed for students in education programs, some of the original ILAS-ED questions were modified for use with LIS students with the consent of ILAS-ED’s author (P. Beile, personal communication, June 20, 2017). (Appendix B presents the modified questions along with the corresponding ILAS-ED question number.) The modified form of the ILAS-ED will be referred to as M-ILAS-ED.

 

Spring 2018

 

Although the data from the M-SPIL-Q and M-ILAS-ED instruments gave the author valuable insights, the fall 2017 assessment measured some aspects of IL with multiple questions, while others were measured with only a few; for example, seven questions were used to assess the ability of students to access information, but only two questions were used to assess students’ citation skills (see Table 1).

 

When the study was repeated for additional data collection in spring 2018, the author chose to use Michalak and Rysavy’s (2016) unmodified SPIL-Q to measure self-efficacy and their Information Literacy Assessment (ILA) instrument to measure IL skills. Michalak and Rysavy granted the author permission (R. Michalak, personal communication, Dec. 7, 2017) to use the unmodified SPIL-Q instrument and a minimally modified version of their ILA instrument (2016). The SPIL-Q and ILA instruments were developed together, so each question on the ILA corresponds to a SPIL-Q item, and each IL skill was measured by the same number of questions.

 

There were only two modifications to the ILA instrument. Module 1, question 5 was modified to reflect LSU’s name and library’s name. Module 2, question 8 was changed from “Materials in the Hirons Library are organized . . .” to “Materials in most major university libraries in the United States are organized . . .” to make the question applicable to U.S. universities in general. The survey used the same demographic questions used in the fall 2017 study; the results section reports the collected demographic data.

 

Table 1
Fall 2017 IL Self-Efficacy Beliefs Keyed to M-ILAS-ED Questions

M-SPIL-Q Self-Efficacy Belief    Corresponding M-ILAS-ED Questions
Locating Information             Questions 8, 9, 10, 12, 20
Accessing Information            Questions 11, 13, 14, 15, 16, 17, 18
Evaluating Information           Questions 7, 19, 21, 23
Citing                           Questions 24, 25

 

Data Analysis Techniques

 

The author transferred the data collected in Qualtrics to SPSS. Each item from M-ILAS-ED was keyed to one of the four self-efficacy beliefs (locate, access, evaluate, and cite) measured by the four questions from M-SPIL-Q, allowing individual skills to be measured against self-efficacy beliefs for possible correlations. Table 1 provides a breakdown of skills keyed to questions.

 

The total M-SPIL-Q and M-ILAS-ED scores were used to calculate the Pearson correlation coefficient to determine whether there was a possible correlation between students’ IL perceived self-efficacy and demonstrated IL skills.
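To make the procedure concrete, the sketch below shows how the Table 1 keying and the total-score correlation could be computed in Python with pandas and SciPy rather than SPSS. It is illustrative only: the data frame is a hypothetical three-student example, and the column names (ilas_q7 through ilas_q25 holding 0/1 item scores, spilq_total holding summed Likert ratings) are invented for the sketch.

import pandas as pd
from scipy.stats import pearsonr

# Table 1 mapping of M-ILAS-ED items to M-SPIL-Q self-efficacy beliefs.
belief_items = {
    "locate":   [8, 9, 10, 12, 20],
    "access":   [11, 13, 14, 15, 16, 17, 18],
    "evaluate": [7, 19, 21, 23],
    "cite":     [24, 25],
}

# Hypothetical scored responses for three students: one 0/1 column per
# M-ILAS-ED item, plus each student's summed M-SPIL-Q Likert ratings.
all_items = sorted(i for items in belief_items.values() for i in items)
df = pd.DataFrame({f"ilas_q{i}": [1, 0, 1] for i in all_items})
df["spilq_total"] = [16, 12, 19]

# One demonstrated-skill subscore per belief, plus an overall total.
for belief, items in belief_items.items():
    df[f"skill_{belief}"] = df[[f"ilas_q{i}" for i in items]].sum(axis=1)
df["ilased_total"] = df[[f"ilas_q{i}" for i in all_items]].sum(axis=1)

# Correlation between total self-efficacy and total demonstrated skill.
r, p = pearsonr(df["spilq_total"], df["ilased_total"])
print(f"Pearson r = {r:.3f}, p = {p:.3f}")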

 

The same procedure was followed in spring 2018, this time using the SPIL-Q and ILA scores. Again, each question in the ILA was keyed to a self-efficacy belief (developing a topic, locating information, accessing information, evaluating information, writing, and citing) in SPIL-Q. The author keyed the questions as described by Michalak and Rysavy (2016).

 

Results

 

Fall 2017

 

Of the 32 respondents to the gender question, the majority were female (n = 24, 75%); the remainder were male (n = 7, 22%) or preferred not to answer (n = 1, 3%). Of the 32 respondents to the race question, 20 (63%) were white, 5 (16%) were black or African American, 2 (6%) identified themselves as Hispanic of any race, 3 (9%) identified themselves as two or more races, and 2 (6%) preferred not to answer. Of the 19 respondents to the age question, 2 (11%) were between 20 and 24 years of age, 7 (37%) were between 25 and 29, 2 (11%) were between 30 and 34, 2 (11%) were between 35 and 39, 3 (16%) were between 40 and 44, and 3 (16%) were between 45 and 49.

 

The highest degree obtained by respondents was a bachelor’s degree (n = 23, 72%), followed by a master’s degree (n = 8, 25%), and a doctoral degree (n = 1, 3%). Twenty respondents (63%) had earned their degree within the last 5 years, 6 (19%) had earned their most recent degree within the last 6 to 10 years, 5 (16%) within the last 11 to 15 years, and 1 (3%) within the last 16 to 20 years.

 

Table 2 reports the mean M-SPIL-Q and M-ILAS-ED scores. The highest possible M-SPIL-Q score was 20; the highest possible M-ILAS-ED score was 18.

 

Tables 3 and 4 summarize the mean scores of each of the four areas (locating information, accessing information, evaluating information, and citing) tested by M-SPIL-Q and M-ILAS-ED.

 

The author used SPSS to calculate Pearson’s r to determine whether a correlation existed between M-SPIL-Q scores and M-ILAS-ED scores (Table 5). The results indicate a moderate positive correlation that is statistically significant (p < .005), meaning that as self-efficacy scores increased, so did IL skill scores.

 

 

Table 2
Fall 2017 M-SPIL-Q and M-ILAS-ED Scores^a

Instrument    n     Minimum    Maximum    Mean           SD
M-SPIL-Q      32    8          20         15.37 (77%)    3.28
M-ILAS-ED     32    6          15         10.88 (60%)    2.34

^a Numbers are rounded to the nearest hundredth; percentages are rounded to the nearest percent.

 

 

Table 3
Fall 2017 M-SPIL-Q Scores (n = 32) by Subcategory^a

M-SPIL-Q Subcategory    Minimum    Maximum    Mean          SD
Locate                  1          5          3.81 (76%)    1.00
Access                  1          5          3.56 (71%)    1.05
Evaluate                1          5          4.03 (81%)    .90
Cite                    1          5          3.97 (79%)    1.03

^a Numbers are rounded to the nearest hundredth; percentages are rounded to the nearest percent.

 

 

Table 4
Fall 2017 M-ILAS-ED Scores (n = 32) by Subcategory^a

M-ILAS-ED Subcategory^b    Minimum    Maximum    Mean          SD
Locate                     1          4          2.88 (58%)    .94
Access                     1          7          3.91 (56%)    1.45
Evaluate                   0          4          2.28 (57%)    .99
Cite                       0          2          1.81 (91%)    .47

^a Numbers are rounded to the nearest hundredth; percentages are rounded to the nearest percent.
^b The total possible points for each subsection of the M-ILAS-ED are as follows: Locate 5, Access 7, Evaluate 4, Cite 2.

 

 

Table 5
Correlation between M-ILAS-ED and M-SPIL-Q Scores

                                      M-SPIL-Q    M-ILAS-ED
M-SPIL-Q     Pearson Correlation      1           .561^a
             Sig. (2-tailed)                      .001
             n                        32          32
M-ILAS-ED    Pearson Correlation      .561^a      1
             Sig. (2-tailed)          .001
             n                        32          32

^a Correlation is significant at the 0.01 level (2-tailed).

 

Spring 2018

 

Of the 19 respondents, 14 (74%) were female and 5 (26%) were male. The majority of respondents were white (n =13, 68%). Three respondents (16%) identified themselves as Black or African American, 1 respondent (5%) identified as “Hispanic of any race,” 1 respondent (5%) selected American Indian or Alaskan Native, and 1 respondent (5%) preferred not to answer. The largest percentage of respondents (n = 7, 37%) were between 25 and 29 years of age, 2 respondents (11%) were between 20 and 24 years of age, 2 respondents (11%) were between 30 and 34 years of age, 2 respondents (11%) were between 35 and 39 years of age, 3 respondents (16%) were between 40 and 44 years of age, and 3 respondents (16%) were between 45 and 49 years of age.

 

Of the 19 respondents, 9 (47%) held a bachelor’s as their highest degree, 8 (42%) held a master’s degree, and 2 (11%) held a doctoral degree. Over half of respondents (n = 10, 53%) earned their most recent degree within the last 5 years, almost a third (n = 6, 32%) within the last 6 to 10 years, 2 respondents (11%) within 11 to 15 years, and 1 respondent (5%) 21 years ago or more.

 

Table 6 reports the mean SPIL-Q and ILA scores. The highest possible SPIL-Q score was 30; the highest possible ILA score was 60.

 

 

Table 6
Spring 2018 SPIL-Q and ILA Scores^a

Instrument      n     Minimum    Maximum    Mean           SD
SPIL-Q Total    19    12         30         24.53 (82%)    5.23
ILA Total       19    37.33      56.49      49.59 (83%)    5.42

^a All numbers are rounded to the nearest hundredth.

 

Mean scores were calculated in each of the six tested areas for both the SPIL-Q and ILA (Tables 7 and 8).

 

The author used SPSS to calculate Pearson’s r to determine if a correlation existed between SPIL-Q and ILA scores (Table 9). The results indicate a moderate positive correlation that is statistically significant (p < .005).


 

Table 7
Spring 2018 SPIL-Q Scores (n = 19) by Subcategory^a

SPIL-Q Subcategory    Minimum    Maximum    Mean          SD
Develop a Topic       2          5          4.05 (81%)    1.08
Locate                1          5          4.05 (81%)    1.18
Access                2          5          4.05 (81%)    .91
Evaluate              2          5          4.16 (83%)    1.02
Write                 2          5          4.11 (82%)    .94
Cite                  2          5          4.11 (82%)    .99

^a Numbers are rounded to the nearest hundredth; percentages are rounded to the nearest percent.

 

 

Table 8
Spring 2018 ILA Scores (n = 19) by Subcategory^a

ILA Subcategory      Minimum    Maximum    Mean          SD
Develop a Topic      5          10         8.26 (83%)    1.52
Locate               5          10         7.53 (75%)    1.12
Access               2.84       10         7.68 (77%)    2.04
Evaluate             5.5        10         7.87 (79%)    1.25
Write                6.5        10         9.26 (93%)    .96
Cite                 7          10         9.00 (90%)    1.00

^a Numbers are rounded to the nearest hundredth; percentages are rounded to the nearest percent.

 

 

Table 9
Correlation between SPIL-Q and ILA Scores

                                 SPIL-Q     ILA
SPIL-Q    Pearson Correlation    1          .668^a
          Sig. (2-tailed)                   .002
          n                      19         19
ILA       Pearson Correlation    .668^a     1
          Sig. (2-tailed)        .002
          n                      19         19

^a Correlation is significant at the 0.01 level (2-tailed).

 

 

Discussion

Research Question 1: Self-Efficacy

 

Michalak and Rysavy (2016) defined students who felt they had adequate skills in an area as those who selected 4 or 5 (agree or strongly agree). In fall 2017, the only mean score above 4 was for evaluating information, although the mean scores for citing information (3.97) and locating information (3.81) were close to this cutoff. The mean score for students’ confidence in accessing information (3.56) suggests more ambivalence.
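Under this definition, the share of students who felt adequately skilled in an area is simply the proportion of ratings at 4 or above. A minimal sketch of that computation, assuming a hypothetical frame with one 5-point Likert column per M-SPIL-Q area:

import pandas as pd

# Hypothetical 5-point Likert ratings, one column per M-SPIL-Q area.
ratings = pd.DataFrame({
    "locate":   [4, 3, 5, 4],
    "access":   [3, 3, 4, 2],
    "evaluate": [5, 4, 4, 5],
    "cite":     [4, 5, 3, 4],
})

# "Adequate" per Michalak and Rysavy (2016): a rating of 4 or 5.
adequate_share = (ratings >= 4).mean()  # column-wise proportion
print(adequate_share.round(2))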

 

The students in the spring 2018 cohort were more confident; the mean score for each area was above 4.0, indicating that most students felt they had adequate skills in all six areas.

 

The majority of students in both cohorts felt that their skills were adequate, supporting the findings of Pinto, Fernandez-Ramos, Sanchez, and Meneses (2013) and Saunders et al. (2015) that LIS students have positive IL self-efficacy.

 

Research Question 2: Demonstrated IL Skills

 

In fall 2017, the mean M-ILAS-ED scores in the four tested IL skill areas showed that students performed best in citing, followed by locating, evaluating, and accessing information.

 

Mean scores present a general overview of skills, but item level analysis gives granular insight into the specific skills of the incoming students and indicates specific weaknesses. In fall 2017, at least half of the respondents incorrectly answered 7 of 18 questions. The seven items and percentage of students answering incorrectly follow:

 

·         72% were unable to identify the best source to locate a brief history and summary of a topic (ILAS-ED, question 8).

·         78% were unable to identify options offered in advanced search interfaces (ILAS-ED, question 11).

·         50% were unable to identify the best place to find recent scholarly articles in a particular subject (ILAS-ED, question 13).

·         62% were unable to select the best set of synonyms and terms related to a concept (ILAS-ED, question 15).

·         66% were unable to identify a citation for a chapter in a book (ILAS-ED, question 19).

·         59% were unable to select the best way to locate a journal article using the library’s catalog (ILAS-ED, question 20).

·         59% were unable to determine the reliability of a story on the Internet (ILAS-ED, question 23).

Five of the questions on which 50% or fewer respondents answered correctly had been modified (questions 8, 13, 15, 20, and 23 on the original ILAS-ED). Although the changes to the questions were minor (see Appendix B), the possibility of poor adaptation may have contributed to the respondents’ lower performance. Despite the modifications, the findings point to a gap in knowledge of some fundamental skills used in locating, accessing, and evaluating information.
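The item-level figures above are proportions of incorrect answers per question. A sketch of that computation, assuming hypothetical 0/1 item-score columns (1 = correct):

import pandas as pd

# Hypothetical item scores for four respondents on three questions.
scores = pd.DataFrame({
    "q8":  [0, 0, 1, 0],
    "q11": [0, 1, 0, 0],
    "q13": [1, 0, 1, 0],
})

# Percentage of respondents answering each item incorrectly.
pct_incorrect = (1 - scores.mean()) * 100
# Items missed by at least half the cohort, weakest first.
weak_items = pct_incorrect[pct_incorrect >= 50].sort_values(ascending=False)
print(weak_items.round(0))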

 

In spring 2018, the mean scores of entering MLIS students in developing a topic; locating, accessing, and evaluating information; writing; and citing as measured by the ILA instrument were all 75% or above (see Table 8). However, looking at the results on the item level highlights weaknesses:

 

·         53% of respondents did not identify that information in a library is selected through a review process as the best description of what distinguishes the information in the library from information on the Web (module 2, question 1).

·         47% of respondents were unable to identify the Library of Congress Classification system as that most often used in major U.S. universities (module 2, question 8).

·         58% of respondents did not know how to search for different endings of a word by using truncation (module 3, question 3).

·         58% of respondents were unable to identify the least important action in evaluating a resource when writing about the history of a topic (module 4, question 2).

·         37% of respondents indicated that not every website needs to be evaluated before using information found on it (module 4, question 10).

 

In both fall 2017 and spring 2018, a high percentage of respondents demonstrated a lack of basic knowledge and skills. These findings are similar to Conway’s (2011) experience with LIS graduate students. Librarians who serve LIS graduate students cannot assume that incoming students possess skills and knowledge that are considered fundamental in the LIS discipline. Librarians should keep this gap in mind when constructing LibGuides and other resources for LIS graduate students. If librarians are providing one-shot instruction or embedding in an LIS graduate course, they may want to consider administering a pretest before designing instruction and activities so they can address gaps in knowledge and skills.

 

Research Question 3: Correlation

 

In the first phase of the study (fall 2017), there was a moderate positive correlation between M-SPIL-Q and M-ILAS-ED scores (r = .561, p < .005). There was also a moderate positive correlation (r = .668, p < .005) between SPIL-Q and ILA scores in spring 2018, again indicating a possible positive relationship between perceived IL self-efficacy and actual IL skills. These moderate positive correlations echo the correlation between IL self-efficacy and skills found in Robertson and Felicilda-Reynaldo’s (2015) study of graduate nursing students.

 

Although there is a positive correlation between IL self-efficacy and skills, there are indications of discrepancies between perceived and actual IL skills. This study reveals specific examples of students misjudging their skill level. For example, in fall 2017, the mean M-SPIL-Q score of 15.37 (SD 3.28) indicates that students were confident about their IL skills, but the mean M-ILAS-ED skill score of 10.88 (SD 2.34) out of a possible 18 indicates a considerably lower skill level (see Table 2). In addition, in fall 2017, 84% of respondents rated their ability to evaluate information as adequate, but the mean score for demonstrated ability to evaluate information was 2.28/4.0 (57%) (see Table 4).
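One way to surface such discrepancies at the individual level is to put both instruments on a common percentage scale and examine the difference. The sketch below is illustrative only; the column names are hypothetical, and the maximum scores (20 for M-SPIL-Q, 18 for M-ILAS-ED) are taken from the fall 2017 instruments.

import pandas as pd

# Hypothetical total scores for three students.
df = pd.DataFrame({
    "spilq_total":  [18, 12, 16],   # out of 20
    "ilased_total": [10, 11, 15],   # out of 18
})

# Normalize each instrument to a 0-100 scale and compare.
df["self_efficacy_pct"] = df["spilq_total"] / 20 * 100
df["skill_pct"] = df["ilased_total"] / 18 * 100
# A positive gap means confidence exceeds demonstrated skill.
df["gap"] = df["self_efficacy_pct"] - df["skill_pct"]
print(df.round(1))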

 

These discrepancies confirm the concern that LIS graduate students overestimate their IL skills put forth by Pinto, Fernandez-Ramos, Sanchez, and Meneses (2013), which was based on personal observations, and by Saunders et al. (2015), which was based on the self-reported information behaviors of LIS graduate students. In both cases, the researchers did not have data about demonstrated IL skills. When working with individual LIS graduate students, practitioners should remember that a student’s skill level may not measure up to the student’s confidence; librarians should probe to identify the student’s actual competence or knowledge instead of relying on the student’s self-reported understanding and ability.

 

Research Limitations

 

The study’s most significant limitation is its small, self-selected sample size. A larger sample of students from multiple MLIS programs across the country would yield more reliable data and generalizable results.

 

The use of two different instruments in fall 2017 and spring 2018 introduced additional limitations. Although using two different instruments gave the author insight into which test might be more suitable for large-scale use, it prevented the author from establishing a clear baseline of IL self-efficacy and skills for MLIS students, and although general trends could be identified, results between the two cohorts could not be directly compared.

 

The timing of the survey in fall 2017 was problematic. To collect students’ answers before they were exposed to IL instruction in graduate LIS classes, the survey needed to be distributed at the beginning of the semester; however, the beginning of the semester coincided with the catastrophic destruction caused by Hurricane Harvey. Although most of Hurricane Harvey’s destruction was in Texas, portions of Louisiana also experienced flooding. SLIS’s MLIS program is an online degree program, and students are scattered across the United States; however, many of them live in Louisiana. Some eligible students may have been affected directly, and those living in other areas of the state may have had family in devastated areas. There is no way to measure the effect that Hurricane Harvey had on the response rate or on respondents’ performance, and it is impossible to quantify the emotional impact of the storm on those it affected either directly or indirectly. Completing an optional survey would have been a low priority for affected students.

 

Funding limited the measurement instruments available for use. Many standardized measurement instruments for IL with rigorous testing for reliability and validity, such as the Research Readiness Self-Assessment (RRSA) (Ivanitskaya, Laus, & Casey, 2004) and the Standardized Assessment of Information Literacy Skills (SAILS) (Radcliff, Oakleaf, & Van Hoeck, 2014) are fee-based (Sparks, Katz, & Beile, 2016). With additional funding, the study could be repeated with an established instrument across multiple LIS graduate programs.

 

Future Considerations

 

Repeating the study across multiple institutions would yield a larger sample size that could help librarians target specific groups for outreach. For example, in fall 2017, students having completed their most recent degree in the last five years (n = 20) had the lowest mean IL score, 10.70 (SD 2.54), and students between 20 and 24 years of age (n = 6) had the lowest mean IL score of all age groups. This finding suggests that outreach and instruction efforts should focus on younger students and more recent graduates, but the results apply only to this small cohort of students at a single university. A large sample size that includes students from different institutions would make analysis of the data on age, highest degree earned, and years since most recent degree useful for librarians planning outreach to incoming LIS graduate students.

 

Additional demographic questions could reveal useful insights into student needs. For example, questions about previous areas of study could indicate whether students beginning MLIS programs with degrees in particular subjects enter with higher or lower IL skills. Questions about library work experience could give insight into its impact on IL. Working in a library is often cited as a motivation to enroll in an LIS graduate program (Ard et al., 2006; Kim, Chiu, Sin, & Robbins, 2007; Taylor, Perry, Barton, & Spencer, 2010). Data could substantiate or refute the assumption that students with library work experience score higher in both self-efficacy and demonstrated IL skills than students with no history of working in a library. The number of online MLIS students has also grown rapidly. In the 2003–2004 academic year, approximately 67% of LIS programs responding to the ALISE survey reported offering internet or web-based classes (Saye, 2008); by 2013–2014, 96% of programs responding to the survey reported offering online courses (Albertson, Spetka, & Snow, 2015). Research suggests that online MLIS students have a unique profile (Oguz, Chu, & Chow, 2015), so it is possible that the scores of online students could differ from those of face-to-face students.

 

Although LIS graduate students have been reported to consult librarians more frequently than graduate students in other programs (Tracy & Searing, 2014), LIS graduate students in the United States are still more likely to consult with their instructors and classmates than with librarians (Saunders et al., 2015). Tracy and Searing’s (2014) survey study on LIS graduate students as library users found that LIS students “need to learn search strategies and resources as much as other graduate students” (p. 377). LIS library liaisons can use the data collected from assessments of skill and self-efficacy to guide their outreach efforts to the areas of greatest weakness, especially if self-efficacy exceeds assessed skills.

 

Results could be used in collaborations between LIS professors and LIS librarian liaisons to address gaps in knowledge in a systematic way, such as the program described by Lamb (2017) at the Department of Library Science at Indiana University at Indianapolis. In this program, students are given diagnostic pretests that are used to prescribe a series of self-paced tutorials designed to address the varying degrees of technological proficiency of incoming LIS students (Lamb, 2017). Students who score 85% or above on a pretest are exempt from completing the corresponding tutorial, so students only need to complete the tutorials for skills in which they are not deemed proficient (Lamb, 2017).

 

There are indications that LIS faculty are aware that some incoming LIS graduate students lack foundational IL skills (Lamb, 2017; Pinto, Fernandez-Ramos, Sanchez, & Meneses, 2013). This suggests an additional opportunity for research comparing how LIS professors rate LIS graduate students’ information literacy proficiency with how LIS graduate students rate their own skill level.

 

Conclusion

 

This exploratory, cross-sectional, descriptive study measured both the IL self-efficacy and the demonstrated IL skills of students entering an MLIS program. The collected data suggest that a moderate positive correlation exists between IL self-efficacy and skills.

This study also tests the feasibility of a larger, multi-institution study that would fill a gap in the literature about LIS graduate students and provide other librarians who support these students with data to inform their instruction and outreach plans. This study may also be the first part of a longitudinal study of how MLIS students’ IL self-efficacy and skills develop as students progress through their graduate program.

 

References

 

Albertson, D., Spetka, K., & Snow, K. (2015). ALISE library and information science education statistical report 2015. Seattle, WA. Retrieved from http://www.alise.org/assets/documents/statistical_reports/2015/alise_2015_statistical_report.pdf

 

Alfonzo, P., & Batson, J. (2014). Utilizing a co-teaching model to enhance digital literacy instruction for doctoral students. International Journal of Doctoral Studies, 9, 61–71. Retrieved from https://pdfs.semanticscholar.org/ddc5/79e2dcbf825d7a11e122615a89a27646f482.pdf

 

American Library Association. (1989). Presidential Committee on information literacy: Final report. Chicago, IL. Retrieved from http://www.ala.org/acrl/publications/whitepapers/presidential

 

Ard, A., Clemmons, S., Morgan, N., Sessions, P., Spencer, B., Tidwell, T., & West, P. J. (2006). Why library and information science? Reference & User Services Quarterly, 45(3), 236–248. Retrieved from https://www.jstor.org/stable/20864520

 

Association of College and Research Libraries. (2016). Framework for information literacy for higher education. Retrieved from http://www.ala.org/acrl/standards/ilframework

 

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191–215. Retrieved from https://doi.org/10.1037/0033-295X.84.2.191

 

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, N.J.: Prentice-Hall.

 

Batarelo Kokić, I., & Novosel, V. (2014). The ball is in your court: Information literacy self-efficacy and information literacy competence relation. In S. Kurbanoglu, S. Špiranec, E. Grassian, D. Mizrachi, & R. Catts (Eds.), Information literacy: Lifelong learning and digital citizenship in the 21st century (pp. 512–520). https://doi.org/10.1007/978-3-319-14136-7_54

 

Beile, P. M. (2005). Development and validation of the Beile Test of Information Literacy for Education (B-TILED). University of Central Florida. Retrieved from http://works.bepress.com/penny-beile/18/

 

Beile, P. M. (2007). The ILAS-ED: A standards-based instrument for assessing pre-service teachers’ information literacy levels. In Society for Information Technology & Teacher Education International Conference (Vol. 2007, pp. 1–27). Retrieved from http://eprints.rclis.org/16928/

 

Boucher, C., Dalziel, K., Davies, M., Glen, S., & Chandler, J. (2009). Are postgraduates ready for research? Poster presented at the Librarians Information Literacy Annual Conference (LILAC). Cardiff, UK. Retrieved from http://rrsa.cmich.edu/documents/RRSA_poster_LILAC_2009.pdf

 

Cannon, T. (2007). Closing the digital divide: An assessment of urban graduate teacher education students’ knowledge of information literacy and their readiness to integrate information literacy into their teaching. University of San Francisco. Retrieved from https://repository.usfca.edu/diss/252

 

Catalano, A. (2010). Using ACRL standards to assess the information literacy of graduate students in an education program. Evidence Based Library & Information Practice, 5(4), 21–38. https://doi.org/10.18438/B8V62B

 

Catalano, A., & Phillips, S. R. (2016). Information literacy and retention: A case study of the value of the library. Evidence Based Library and Information Practice, 11(4), 2–13. https://doi.org/10.18438/B82K7W

 

Conway, K. (2011). How prepared are students for postgraduate study? A comparison of the information literacy skills of commencing undergraduate and postgraduate Information Studies students at Curtin University. Australian Academic & Research Libraries, 42(2), 121–135. https://doi.org/10.1080/00048623.2011.10722218

 

Crosetto, A., Wilkenfeld, P., & Runnestrand, D. (2007). Responding to the needs of our graduate students: A pilot information literacy course in graduate education. In T. Jacobson & T. Mackey (Eds.), Information literacy collaborations that work (pp. 41–56). New York, NY: Neal-Shuman Publishers.

 

Green, R. (2010). Information illiteracy: Examining our assumptions. Journal of Academic Librarianship, 36(4), 313–319. https://doi.org/10.1016/j.acalib.2010.05.005

 

Gross, M., & Latham, D. (2012). What’s skill got to do with it?: Information literacy skills and self-views of ability among first-year college students. Journal of the American Society for Information Science & Technology, 63(3), 574–583. https://doi.org/10.1002/asi.21681

 

Harrington, M. R. (2009). Information literacy and research-intensive graduate students: Enhancing the role of research librarians. Behavioral & Social Sciences Librarian, 28(4), 179–201. https://doi.org/10.1080/01639260903272778

 

Islam, M. A., & Tsuji, K. (2010). Assessing information literacy competency of Information Science and Library Management graduate students of Dhaka University. IFLA Journal, 36(4), 300–316. https://doi.org/10.1177/0340035210388243

 

Ivanitskaya, L., Laus, R., & Casey, A. M. (2004). Research Readiness Self-Assessment: Assessing students' research skills and attitudes. Journal of Library Administration, 41(1/2), 167–183. https://doi.org/10.1300/J111v41n01_13

 

Jackson, C. (2013). Confidence as an indicator of research students’ abilities in information literacy: A mismatch. Journal of Information Literacy, 7(2), 149–152. https://doi.org/10.11645/7.2.1848

 

Jesse, S. (2012). Subject specific information literacy curriculum and assessment. Christian Librarian, 55(1), 2–16. Retrieved from http://digitalcommons.georgefox.edu/cgi/viewcontent.cgi?article=1477&context=tcl

 

Kim, K.-S., Chiu, M.-H., Sin, S.-C. J., & Robbins, L. (2007). Recruiting a diverse workforce for academic/research librarianship: Career decisions of subject specialists and librarians of color. College & Research Libraries, 68(6), 533–552. https://doi.org/10.5860/crl.68.6.533

 

Kurbanoglu, S. (2003). Self-efficacy: A concept closely linked to information literacy and lifelong learning. Journal of Documentation, 59(6), 635–646. https://doi.org/10.1108/00220410310506295

 

Kurbanoglu, S., Akkoyunlu, B., & Umay, A. (2006). Developing the information literacy self-efficacy scale. Journal of Documentation, 62(6), 730–743. https://doi.org/10.1108/00220410610714949

 

Lamb, A. (2017). Debunking the librarian “gene”: Designing online information literacy instruction for incoming library science students. Journal of Education for Library & Information Science, 58(1), 15–26. Retrieved from https://eric.ed.gov/?id=EJ1150595

 

Magliaro, J. (2011). Comparing information literacy needs of graduate students in selected graduate programs through the Technology Acceptance Model and Affordance Theory. University of Windsor. Retrieved from https://scholar.uwindsor.ca/etd/424

 

Mahmood, K. (2016). Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger Effect. Communications in Information Literacy, 10(2), 199–213. Retrieved from https://doaj.org/article/cf4e25b16ec2420c9168033977cdcd3d

 

Michalak, R., & Rysavy, M. D. T. (2016). Information literacy in 2015: International graduate business students’ perceptions of information literacy skills compared to test-assessed skills. Journal of Business & Finance Librarianship, 21(2), 152–174. https://doi.org/10.1080/08963568.2016.1145787

 

Michalak, R., Rysavy, M. D. T., & Wessel, A. (2017). Students’ perceptions of their information literacy skills: The confidence gap between male and female international graduate students. Journal of Academic Librarianship, 43(2), 100–104. https://doi.org/10.1016/j.acalib.2017.02.003

 

Oguz, F., Chu, C. M., & Chow, A. S. (2015). Studying online: Student motivations and experiences in ALA accredited LIS programs. Journal of Education for Library & Information Science, 56(3), 213–231. Retrieved from https://files.eric.ed.gov/fulltext/EJ1074636.pdf

 

Perrett, V. (2004). Graduate information literacy skills: The 2003 ANU skills audit. Australian Library Journal, 53(2), 161–171. https://doi.org/10.1080/00049670.2004.10721622

 

Pinto, M., Fernandez-Ramos, A., Sanchez, G., & Meneses, G. (2013). Information competence of doctoral students in information science in Spain and Latin America: A self-assessment. Journal of Academic Librarianship, 39(2), 144–154. https://doi.org/10.1016/j.acalib.2012.08.006

 

Radcliff, C., Oakleaf, M., & Van Hoeck, M. (2014). So what? The results and impact of a decade of IMLS-funded information literacy assessments. In S. Durso, S. Hiller, M. Kyrillidou, & A. Pappalarado (Eds.), Proceedings of the 2014 Library Assessment Conference: Building effective, sustainable, practical assessment (pp. 801–809). Seattle, WA: Association of Research Libraries. Retrieved from http://old.libraryassessment.org/bm~doc/24radcliffpanel.pdf

 

Robertson, D. S., & Felicilda-Reynaldo, R. F. D. (2015). Evaluation of graduate nursing students’ information literacy self-efficacy and applied skills. Journal of Nursing Education, 54(3, Suppl), S26–30. https://doi.org/10.3928/01484834-20150218-03

 

Sadler, E. (Bess), & Given, L. M. (2007). Affordance theory: A framework for graduate students’ information behavior. Journal of Documentation, 63(1), 115–141. https://doi.org/10.1108/00220410710723911

 

Saunders, L., Kurbanoglu, S., Boustany, J., Dogan, G., Becker, P., Blumer, E., … Terra, A. L. (2015). Information behaviors and information literacy skills of LIS students: An international perspective. Journal of Education for Library & Information Science, 56, S80–S99. Retrieved from https://eric.ed.gov/?id=EJ1073544 

 

Saunders, L., Severyn, J., Freundlich, S., Piroli, V., & Shaw-Munderback, J. (2016). Assessing graduate level information literacy instruction with critical incident questionnaires. Journal of Academic Librarianship, 42(6), 655–663. https://doi.org/10.1016/j.acalib.2016.08.008

 

Saye, J. D. (2008). Library and information science education statistical report 2005. Chicago, IL.

 

Sparks, J. R., Katz, I. R., & Beile, P. M. (2016). Assessing digital information literacy in higher education: A review of existing frameworks and assessments with recommendations for next-generation assessment. Research report. ETS RR-16-32. ETS Research Report Series. Princeton, NJ: ETS Research Report Series. https://doi.org/10.1002/ets2.12118

 

Tang, Y., & Tseng, H. W. (2013). Distance learners’ self-efficacy and information literacy skills. Journal of Academic Librarianship, 39(6), 517–521. https://doi.org/10.1016/j.acalib.2013.08.008

 

Taylor, S. D., Perry, R. A., Barton, J. L., & Spencer, B. (2010). A follow-up study of the factors shaping the career choices of library school students at the University of Alabama. Reference & User Services Quarterly, 50(1), 35–47. Retrieved from http://www.jstor.org/stable/20865334

 

Tewell, E., & Angell, K. (2015). Far from a trivial pursuit: Assessing the effectiveness of games in information literacy instruction. Evidence Based Library & Information Practice, 10(1), 20–33. https://doi.org/10.18438/B8B60X

 

Tracy, D. G., & Searing, S. E. (2014). LIS graduate students as library users: A survey study. Journal of Academic Librarianship, 40(3/4), 367–378. https://doi.org/10.1016/j.acalib.2014.05.004

 

Appendix A

Demographic Questions

To which gender do you most identify? (radio button)

Female

Male

Non-binary/third gender

Prefer to self-describe _____

Prefer not to answer

 

Age (drop down)

19 or less

20-24

25-29

30-34

35-39

40-44

45-49

50-54

55 or over

 

Race/ethnicity (drop down)

Hispanic of any Race

American Indian or Alaskan Native

Asian

Black or African American

Native Hawaiian or Pacific Islander

White

Two or More Races

International

Race or Ethnicity Unknown

Prefer not to answer

 

Years since obtaining your most recent degree (radio button)

5 or less

6-10

11-15

16-20

21 or more

 

Highest degree earned (radio button)

Bachelors

Masters

Doctorate

Appendix B

Questions Modified from ILAS-ED

 

Question 7

Which of the following characteristics best indicates scholarly research?

a.       available in an academic library

b.       indexed by an academic database

c.        reviewed by experts for publication

d.       written by university faculty

 

Question 8

You are unfamiliar with the topic of the whole language movement, so you decide to read a brief history and summary about it. Which of the following sources would be best?

a.       a book on the topic, such as Perspectives on whole language learning: A case study

b.       a general encyclopedia, such as Encyclopedia Britannica

c.        an article on the topic, such as “Whole language in the classroom: A student teacher’s perspective”

d.       an education encyclopedia, such as Encyclopedia of Education

 

Question 10

You are looking for a peer-reviewed article about the librarian’s role in open education resources and textbook affordability efforts. The most appropriate place to look is:

a.       a library & information science database

b.       Wikipedia

c.        a news resources database

d.       both (a) and (c)

 

Question 12

Research studies in library and information science are generally first communicated through:

a.       books published by library associations

b.       library science encyclopedia entries

c.        newsletters of library associations

d.       professional conferences and journal articles

 

Question 13

You have been assigned to write a short class paper on effective library instruction techniques. Your professor indicated three recent scholarly sources would be sufficient. Which strategy is best to locate items?

a.    search a general academic database and a library and information science database for journal articles

b.    search a library and information science database for journal articles

c.     search the library catalog for books

d.    search the library catalog for encyclopedias

 

Question 14

Select the set of search terms that best represent the main concepts in the following:

What are the benefits associated with library use for low-income students?

a.    library use, benefits, low-income students

b.    library use, benefits, students

c.     library use, low income, students

d.    library, low-income students, use

 

Question 15

Select the set that best represents synonyms and related terms for the concept “college students.”

a.    colleges, universities, community colleges…

b.    Millennials, students, undergraduates…

c.     graduate students, freshmen, sophomores...

d.    university, adult learners, educational attendees...

 

Question 16

While researching library patrons, you find that they are also sometimes called “library customers” or “library clients.” You decide to look for information on the subject in a database that indexes library science literature. To save time you write a search statement that includes all three terms. Which of the following is the best example to use when you have fairly synonymous terms and it does not matter which of the terms is found in the record?

a.    patrons and customers and clients

b.    patrons or customers or clients

c.     patrons, customers and clients

d.    patrons, customers or clients

 

Question 18

You have a class assignment to investigate how summer reading programs impact student achievement. A keyword search in an academic database on “summer reading programs” has returned over 600 items. To narrow your search, which of the following steps would you next perform?

a.    add “impact” as a keyword

b.    add “student achievement” as a keyword

c.     limit search results by date

d.    limit search results by publication type

 

Question 20

Your professor suggested you read a particular article and gave you the following citation:

Thomas, W., & Shouse, D. (2014). This is not a dumpsite: The problem of evaluating gift books. Library Collections, Acquisitions & Technical Services, 38(3-4), 63-69.

Which of the following would you type into the library’s catalog to locate the actual article?

a.    author search: Thomas

b.    journal title search: Library Collections, Acquisitions & Technical Services

c.     journal title search: This is not a dumpsite: The problem of evaluating gift books

d.    subject search: gift books

 

Question 21

The following item was retrieved from a database search. What kind of source is it?

Title: The Effect of Library Instruction Learning Environments on Self-Efficacy Levels and Learning Outcomes of Graduate Students in Education

Author(s): Beile, Penny

Publication Year: 2002

Abstract: The purpose of the study was to examine the effectiveness of three learning environments: (1) campus-based students who attended a classroom library instruction session; (2) campus-based students who completed a Web-based library tutorial; and (3) distance students who completed a Web-based library tutorial on library skills self-efficacy levels and learning outcomes among graduate students of education.

Notes: Presented at the Annual Meeting of the American Educational Research Association (New Orleans, LA, April 1-5, 2002)

Number of Pages: 8

Accession Number: ED453084

a. a book

b. a book chapter

c. a conference paper

d. a journal article

 

Question 23

While researching the U.S. legislative system, you find the following story on the Internet:

Congress Launches National Congress-Awareness Week WASHINGTON, DC—Hoping to counter ignorance of the national legislative body among U.S. citizens, congressional leaders named the first week in August National Congress Awareness Week. “This special week is designed to call attention to America’s very important federal lawmaking body,” Speaker of the House Dennis Hastert said. The festivities will kick off with a 10-mile Walk for Congress Awareness. The item is from a newspaper Web site, which states it is “America’s Finest News Source.”

 

Given this, the following action is in order:

a.    you can use the story as it’s obviously from a reputable news source

b.    you decide to investigate the reputation of the publisher by looking at their Web site

c.     you decide to investigate the reputation of the publisher by looking at other Web sites

d.    you should not use the story because Web information is not always trustworthy

 

Question 24

Based on the following paragraph, which sentence should be cited?

(1) Libraries were once quiet spaces reserved for readers. (2) As libraries increased their community programming, they began to shift to the more social (and unquiet) places with which we are familiar today. (3) Many libraries try to preserve some aspects of their quiet past while continuing to offer engaging programming. (4) The public seems to want this as well; in a Pew research poll, 61% of Americans said that they believe libraries should have completely separate locations or spaces for quiet and social activities.

a.    1

b.    2

c.     3

d.    4