Issues in Science and Technology Librarianship
Fall 2015
DOI:10.5062/F4M32SS9

[Refereed]

Impact of a Library Instruction Session on Bibliographies of Organic Chemistry Students

John Kromer
Physical Sciences Librarian
Brown University Library
Providence, Rhode Island
john_kromer@brown.edu

Abstract

Students in Chemistry 254: Organic Chemistry for Majors were required to write a paper about an organic name reaction. Before turning in this assignment, students had the option of attending a one-hour library instruction session covering SciFinder, sources for spectra, ACS Style, and print resources about organic name reactions. Twenty-five students attended library sessions while 28 did not. Bibliographies were collected and graded for all students. Comparisons were made between students who attended a session and those who did not, on criteria such as use of scholarly sources, proper citation of articles and spectra, and correct use of ACS Style. Students who attended an instruction session scored, on average, 1.4 points (out of 10) higher on their bibliographies than those who did not. Other significant differences were found in students' use of scholarly sources, use and citation of appropriate spectra, and citation of the article in which the name reaction was originally published.

Introduction

Librarians frequently extol the virtues of instruction sessions to faculty members, asserting that students will produce better papers, cite better sources, and better understand the scholarly process. However, librarians rarely have more than anecdotal evidence or student feedback to back up these claims. This paper describes an attempt to quantify, through citation analysis, the improvement in students' papers following attendance at a library instruction session.

Literature Review

Several articles have explored the use of citation analysis to quantify the impact of library instruction. Perhaps the earliest example is Kirk (1971), who compared the scores that general biology students received on their bibliographies depending on which of two types of library instruction they received: "lecture-demonstration" or "guided exercise." Gratch (1985) gave a more thorough treatment of the subject, both reviewing existing attempts at quantifying library instruction and offering suggestions for designing an appropriate evaluation method. Many of her points remain relevant, such as making sure that the evaluation technique matches the research question. For example, if a researcher is interested in search patterns, then evaluating papers or bibliographies alone will not reveal anything about a student's search process. If the researcher is trying to infer cause and effect between library instruction and student improvement, then a control group is necessary to reduce variability. These and many of her other points have been incorporated into this study where possible.

Several other significant studies inform this one, beginning with King and Ory (1981), who compared three groups: students who received no library instruction, students who received library instruction from teaching assistants, and students who received library instruction from librarians. Their use of a control group is important, in that it created a baseline against which to compare the librarians' impact. However, their focus was on library use (number of sources, number of databases used, and types of materials used), and they collected their data through surveys rather than citation analysis. Davis and Cohen (2001) conducted a longitudinal study of economics students, evaluating to what extent students were using and citing the web in their research papers. Like King and Ory, they were interested in the quantity and types of materials used, and they used the students' bibliographies to determine this. They did not use a control group, as they were interested in whether students were citing web sites more often, not in the impact of librarian instruction. Dykeman and King (1983) conducted the experiment most similar to this one, in which one class section received instruction from a librarian and from the writing center while a control section did not. In addition, they conducted a survey to determine demographics and the resources used in completing term papers.

Background

Assignment Description

Students were required to write a paper describing the history, significance, current uses, and acceptance of an organic name reaction. The name reaction the students chose had to start with the same letter of the alphabet as their last name. Most students came to the library instruction session with a name reaction already selected. The students were also required to submit the mechanism of the reaction and spectra (mass spectrometry, infrared, and 13C and 1H NMR where available) for a chemical commonly used in their chosen name reaction. For their bibliographies, students were only permitted to use journal articles as references, and their bibliographies were to be completed using proper ACS Style.

Study Population

Students involved in this study were all chemistry majors enrolled in Chemistry 254: Organic Chemistry for Majors, meaning that the students were primarily second-year college students.

Library Instruction

In order to conduct the research required for this assignment, students were encouraged to attend a library instruction session. The session covered basic library resources (LibGuides, librarian contact information, hours of the reference desk, etc.), resources dedicated to name reactions (name reaction books such as Li (2003) and The Merck Index), and SciFinder for the purposes of finding sources and spectra. Students were shown how to use Explore References to find information on the uses and significance of their reaction, including the "as entered" and "containing the concept" distinctions when searching and some of the Refine features for narrowing searches by document type and language. Students were also shown Chemical Structure and Substance Identifier searches to find their compound of interest; once on the Substance Detail page, they could access the spectra for their compound. The session also served to introduce the students to the librarian, in the hope that they would know whom to contact with questions throughout their college careers.

Methods

Students were allowed to choose whether or not they attended a library instruction session. Those who chose not to attend served as a control group, so that the impact of library instruction could be better quantified.

Upon completion and submission of their assignments, students' papers were anonymized and forwarded to the librarian. The librarian did not know whether or not a student had attended a library instruction session. The librarian graded the bibliographies on the requirements set forth in the assignment: using scholarly journal articles (no web sites, books, or news articles), using and citing spectra from verified sources, and using ACS Style for their citations.

Results

Twenty-five students in the class chose to attend a library instruction session (attendees), while 28 chose not to attend (non-attendees).

Scholarly Sources

Attendees cited an average of 8.9 sources in their papers (not counting citations to spectra), of which 96% were scholarly sources (students were only permitted to use journal articles for this assignment, which is what is meant by "scholarly sources" in this article). Non-attendees cited an average of 9.8 sources, but only 81% of these were scholarly. The majority of the non-scholarly sources were references to web sites. See Table 1 for a summary; note that the table reports the average of each student's individual proportion of scholarly citations, which differs slightly from these overall percentages. Where presented, all statistical significance calculations were done using a one-tailed t-test; a sketch of one such calculation appears after Table 1.

Table 1. Comparison of attendees' and non-attendees' use of scholarly (journal article) sources.a

Group | Average Number of Sources | Average Number of Scholarly (Journal) Sources | Average Proportion of Scholarly Citationsb
Session Attendees (n=25) | 8.9 | 8.5 | 0.94 ± 0.06
Session Non-Attendees (n=28) | 9.8 | 8.0 | 0.80 ± 0.23

a Citations to spectra were excluded from calculations. Values for one standard deviation are included for the proportion of scholarly citations.
b Statistically significant difference (p = 0.0019)
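
The article does not spell out how the one-tailed t-test was applied. As an illustrative sketch only, the summary statistics in Table 1 can be run through a one-tailed Welch's (unequal-variance) t-test; the means, standard deviations, and group sizes come from the table, while the choice of Welch's test is an assumption, not a documented detail of the original analysis.

# A minimal sketch, assuming a one-tailed Welch's t-test on the
# per-student proportions of scholarly citations from Table 1.
# This is a reconstruction for illustration, not the author's
# original calculation.
from scipy import stats

result = stats.ttest_ind_from_stats(
    mean1=0.94, std1=0.06, nobs1=25,   # session attendees
    mean2=0.80, std2=0.23, nobs2=28,   # session non-attendees
    equal_var=False,                   # Welch's test (assumption)
    alternative="greater",             # one-tailed: attendees > non-attendees
)
print(f"t = {result.statistic:.2f}, one-tailed p = {result.pvalue:.4f}")
# With these rounded inputs, the p-value lands near the reported 0.0019.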

Citing Spectra

Of the 53 students in the course, 13 did not include spectra in their papers: 4 attendees (16%) and 9 non-attendees (32%). This difference was statistically significant, with p = 0.030.

Of the 21 attendees who included spectra, 18 (86%) used and cited spectra from a verified source (journal articles or spectra found through SciFinder). Of the 19 non-attendees who included spectra, only 6 (32%) used and cited spectra from a verified source. This difference was also statistically significant, with p = 0.0054. See Table 2 for a summary.


Table 2. Comparison of students' citations to spectra.

Group | Total | Number Who Included Spectra in Their Papersc | Number Who Used Spectra from Appropriate Sourcesd
Attendees | 25 | 21 | 18
Non-Attendees | 28 | 19 | 6

c Statistically significant difference (p = 0.0301)
d Statistically significant difference (p = 0.0054)
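
The article again reports one-tailed t-tests for these comparisons but does not explain how a t-test was applied to count data, and the reported p-values could not be reconstructed from the counts alone. As a hedged alternative check only, the 2x2 counts from Table 2 can be run through Fisher's exact test; this is a different technique from the article's, so its p-values are not expected to match the reported 0.0301 and 0.0054.

# A minimal sketch using Fisher's exact test on the Table 2 counts.
# This is an alternative to the article's unspecified one-tailed
# t-test procedure, so these p-values need not match those reported.
from scipy import stats

# Included spectra: attendees 21 of 25, non-attendees 19 of 28
_, p_included = stats.fisher_exact([[21, 4], [19, 9]], alternative="greater")

# Verified spectra source, among those who included spectra:
# attendees 18 of 21, non-attendees 6 of 19
_, p_verified = stats.fisher_exact([[18, 3], [6, 13]], alternative="greater")

print(f"included spectra: one-sided p = {p_included:.4f}")
print(f"verified source:  one-sided p = {p_verified:.4f}")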

ACS Style

Attendees and non-attendees alike struggled with ACS Style. Common errors were inclusion of article titles (75% of all students, although ACS Style does permit article titles as an option), missing formatting such as bolding and italics (60%), inconsistent use or complete absence of journal abbreviations (57%), and simply choosing a different citation style altogether (21%). Even papers that had some completely correct citations often had other citations with a wide array of errors.

Bibliography Score

Students who attended a library instruction session averaged a score of 8.2 (out of 10), with a standard deviation of 1.0, on their bibliographies, while students who did not attend averaged 6.8 with a standard deviation of 1.6. This difference in bibliography scores is statistically significant (p = 0.00013).
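
As a rough check, this headline comparison can be recomputed from the summary statistics just given. The sketch below assumes a one-tailed Welch's t-test, with the Welch-Satterthwaite degrees of freedom written out explicitly; the article reports p = 0.00013, and these rounded inputs produce a value of the same order rather than an exact match.

# A minimal sketch recomputing the bibliography-score comparison from
# the reported summary statistics, assuming a one-tailed Welch's t-test.
import math
from scipy import stats

m1, s1, n1 = 8.2, 1.0, 25   # attendees: mean, SD, group size
m2, s2, n2 = 6.8, 1.6, 28   # non-attendees

# Welch's t statistic and Welch-Satterthwaite degrees of freedom
se = math.sqrt(s1**2 / n1 + s2**2 / n2)
t = (m1 - m2) / se
df = (s1**2 / n1 + s2**2 / n2) ** 2 / (
    (s1**2 / n1) ** 2 / (n1 - 1) + (s2**2 / n2) ** 2 / (n2 - 1)
)
p = stats.t.sf(t, df)  # upper-tail (one-sided) probability
print(f"t = {t:.2f}, df = {df:.1f}, one-tailed p = {p:.5f}")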

Breadth of Coverage

While the focus of this study was to determine whether the quality of students' bibliographies improved through library instruction, as measured by their use of appropriate resources, the breadth of materials used also seemed worth studying. In particular, the average year cited, range of years cited, standard deviation of years cited, and number of unique journals cited were compared between attendees and non-attendees. In addition, it was considered likely that students who attended a library instruction session would be more likely to find and cite the original article (the article in which the name reaction was first described). Because citations to these original articles, many of which date from as early as the 1850s, would skew the range, average, and standard deviation of students' citation years, these measures were calculated both with and without the original article's citation year included.

Twenty (80%) of the students who attended a library instruction session cited the original article in their papers, compared to 17 (61%) of those who did not attend a session. This difference fell just short of significance at the 95% confidence level, with p = 0.063. No other significant differences were found in the average year cited, standard deviation of years cited, or range of years cited, with or without the original citation.

Table 3. Measures of breadth of coverage.

Group | Total | Number Citing Original Articlee | Average Year Citedf (With / Without Original) | Average Standard Deviation of Years Citedf (With / Without Original) | Average Range of Years Citedf (With / Without Original)
Attendees | 25 | 20 | 1991.9 / 1997.7 | 24.2 / 8.2 | 67.0 / 57.8
Non-Attendees | 28 | 17 | 1990.8 / 1995.4 | 21.9 / 12.9 | 42.9 / 38.8

e Not statistically significant at the 95% level (p = 0.063)
f Not statistically significant
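
The breadth measures in Table 3 are straightforward descriptive statistics. As an illustration only, the sketch below computes them for a single bibliography; the list of citation years and the original-article year are hypothetical, invented purely for the example.

# A minimal sketch of the Table 3 breadth measures for one student's
# bibliography. The citation years below are hypothetical.
import statistics

years = [1872, 1995, 1998, 2001, 2004, 2008, 2010, 2012]  # hypothetical
original_year = 1872  # hypothetical year of the original name-reaction article

def breadth(ys):
    """Return (average year, standard deviation, range) for a list of years."""
    return statistics.mean(ys), statistics.stdev(ys), max(ys) - min(ys)

with_original = breadth(years)
without_original = breadth([y for y in years if y != original_year])

for label, (avg, sd, rng) in [("with original", with_original),
                              ("without original", without_original)]:
    print(f"{label}: average year {avg:.1f}, SD {sd:.1f}, range {rng}")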

Discussion

Students' attendance at a library instruction session clearly had an impact on their bibliographies. If nothing else, the increase in students' bibliography grades would likely be enough to show some faculty members the value of library instruction. Many aspects of the bibliography rubric focused on ACS Style, such as correct formatting, correct use of journal abbreviations, and omitting the article title, but neither attendees nor non-attendees had much success in following ACS Style. A higher proportion of the bibliography grade was allotted to using appropriate sources: in the assignment, the students were told explicitly that only journal articles would be acceptable as resources. The one exception was that each student was allowed to use one book, as it is often very difficult to find the original citation for an organic name reaction without consulting a book specifically dedicated to name reactions. Web sites were completely forbidden, as the course instructor wanted the assignment to immerse students in the scholarly chemical literature. While the use of books and web sites could not be prevented while students were conducting their research, citations to these sources resulted in a lower bibliography score.

One point of contention about the use of web sites arises with SDBS (Spectral DataBase System for Organic Compounds) when citing spectra. Students are often directed to SDBS in their courses, as it is an easily accessible source of the most common spectrum types for a wide variety of organic compounds. However, because the instructor viewed this assignment as a way to introduce students to the scientific literature (explicitly journals), students' citations to SDBS were considered "inappropriate" in the grading of their assignments. Some leniency was given to students citing SDBS in comparison to those who did not include or cite spectra at all.

Many previous citation analyses have found that student bibliographies have higher currency (more recently published papers) when students have attended a library instruction session (Oppenheim & Smith 2001; Heller-Ross 2003; Smith 2003). This study did not find the same result: attendees and non-attendees alike had an average citation year in the 1990s, and there was no statistically significant difference between the two groups' average years.

Several factors may contribute to this lack of currency. First, as this is a historical assignment, some of the articles are expected to be older, particularly the original article and any subsequent paper on the mechanism of the reaction. Further, it was surmised that students who did not attend a library instruction session would be unlikely to use SciFinder, opting instead for Google Scholar or some other database with which they were more familiar. A brief analysis of whether the full-text articles cited by non-attendees could be found in Google Scholar suggested that students did not use Google Scholar exclusively for their research, and perhaps did not use it at all.

Some students also indicated that their teaching assistants had shown them SciFinder earlier in the semester. Considering SciFinder defaults to sort by accession number, which correlates to the date that the article was indexed, it seemed that students using SciFinder would find more current articles in their searches. This is particularly likely, considering that users are unlikely to look past the first page of search results (Jansen et al. 1998; Silverstein et al. 1999; Spink et al. 2001). Taking these points into consideration, it seems likely that some of the non-attendees also used SciFinder, albeit not for finding spectra.

Limitations

Students self-selected whether or not they would attend a library instruction session. It is possible that all of the students who attended were simply ambitious students who wanted as high a grade as possible, which would skew the results. They could just as easily have been students who were struggling in the class and hoped that doing very well on this paper would improve their overall grade. Perhaps the two possibilities counterbalanced each other, but because overall paper grades and course grades could not be obtained, no definitive answer can be given. In a similar vein, it is unknown what percentage of the final grade this paper was worth, despite several requests to the instructor for this information, so some students may not have put much work into the paper if it would have little impact on their grade.

Three different library instruction sessions were offered, primarily to accommodate scheduling conflicts, but also because not enough students attended the first two sessions for statistics to be calculated with any certainty. Two sessions were offered more than a week before the assignment was due, and a total of seven students attended them. The third and final session was offered two days before the assignment was due, and 18 students attended. Most likely, students were not yet thinking about the assignment when the first two sessions were held but were by the time of the third. This may also speak to the "motivated student" theory, in that the seven students who attended the first two sessions may in fact have been more motivated to do well on this assignment, for whatever reason.

The other reason for pointing out that three different sessions were offered is that there was likely some variability among them, so students who attended different sessions may have come away with slightly different lessons. No comparison was attempted between the students who attended each of the three sessions.

This study only represents the assignments of one group of students from one year of organic chemistry. While broad implications or assertions may be limited, this cohort nonetheless represents a complete population, as it comprises every sophomore chemistry major at the University.

Future Directions

This study represented one year's sophomore chemistry majors. It would be useful to compare these students with a subsequent year's students, or with more students over several future years. To reduce the confounding variable of motivation, it would be valuable to ask students to create a "pre-bibliography" at the beginning of the semester, offer a library instruction session to the entire class, and then evaluate their final bibliographies for this assignment. Comparing students' overall assignment grades or course grades with their bibliography scores would also provide clearer evidence of the value of these library sessions, but neither was available for this study. It could also be interesting to examine these students' knowledge retention over time by comparing their sophomore bibliographies with their senior capstone papers.

Another use of this data would be to evaluate the costs associated with the assignment. Libraries must constantly justify their budgets, so being able to show what it costs to support a single assignment in a 200-level class would likely be of interest.

Conclusion

Student bibliographies clearly benefited from library instruction. A bibliography score increase of 1.4 points out of 10 is a substantial, statistically significant improvement. It is further hoped that the students improved the overall quality of their papers because they found better sources, and that they learned techniques and information that will help them in future courses. Studies of more students' papers can lead to a better understanding of their information literacy needs, but even this small sample points to areas where library instruction helps students and areas where it can be improved. In particular, students who attended a library session used a higher proportion of scholarly articles and cited spectra from verified sources more often, but future library instruction sessions clearly need a stronger emphasis on the importance and structure of ACS Style.

Note

Author was a librarian at Miami University, Oxford, OH, when the study was conducted. He has since moved to Brown University.

References

Davis, P. M. & Cohen, S. A. 2001. The effect of the web on undergraduate citation behavior 1996-1999. Journal of the American Society for Information Science and Technology. 52(4):309-314.

Dykeman, A. & King, B. 1983. Term paper analysis: a proposal for evaluating bibliographic instruction. Research Strategies. 1(1):14-21.

Gratch, B. 1985. Toward a methodology for evaluating research paper bibliographies. Research Strategies. 3(4):170-177.

Heller-Ross, H. 2003. Assessing outcomes with nursing research assignments and citation analysis of student bibliographies. The Reference Librarian. 37(77):121-140.

Jansen, B. J., Spink, A., Bateman, J. & Saracevic, T. 1998. Real life information retrieval: a study of user queries on the web. SIGIR Forum. 32(1):5-17.

King, D.N. & Ory, J.C. 1981. Effects of library instruction on student research: a case study. College & Research Libraries. 42(1):31-41.

Kirk, T. 1971. A comparison of two methods of library instruction for students in introductory biology. College & Research Libraries. 32(6):465-474.

Li, J.J. 2003. Name Reactions. 2nd ed. New York: Springer.

The Merck Index. 2001. 13th ed. Whitehouse Station, NJ: Merck & Co.

Oppenheim, C. & Smith, R. 2001. Student citation practices in an information science department. Education for Information. 19:299-323.

SciFinder. Columbus, OH: Chemical Abstracts Service.

Silverstein, C., Marais, H., Henzinger, M. & Moricz, M. 1999. Analysis of a very large web search engine query log. SIGIR Forum. 33(1):6-12.

Smith, E.T. 2003. Assessing collection usefulness: an investigation of library ownership of the resources graduate students use. College & Research Libraries. 64(5):344-355.

Spink, A., Wolfram, D., Jansen, M.B.J. & Saracevic, T. 2001. Searching the web: the public and their queries. Journal of the American Society for Information Science and Technology. 52(3):226-234.


This work is licensed under a Creative Commons Attribution 4.0 International License.