They Found It—Now Do They Bother? An Analysis of First-Year Synthesis

Michael J. Carlozzi*

This paper presents assessment data from a first-year writing library partnership to examine the relationship between student source use and written synthesis. It finds that first-year students could locate peer-reviewed, scholarly sources but that these sources were poorly integrated in their arguments—if they were used at all. In contrast, it finds that students attempted to synthesize their in-class reading material, suggesting that students “tack on” outside sources. Ultimately, this paper argues that librarians may want to consider shifting their instructional focus from traditional one-shot sessions to other solutions recommended by the literature.

Introduction

Information literacy (IL) knows few bounds, covering everything within libraries from mutable technology to the art of research. Academic librarians embrace these challenges, educating diverse student populations in many informational contexts.1 The current state of library instruction is such that virtually all academic libraries in the United States make IL a “critical part of their mission,”2 with the fourth most common librarian-related duty being “training/instruction,” found in almost 40 percent of sampled library job advertisements.3

Yet, despite its prominence within academic library departments, IL instruction largely remains relegated to “universally lamented” one-shot sessions.4 Although some librarians teach for-credit courses,5 most library instruction occurs through course-integrated delivery (such as one-shot or embedded sessions), what Barbara Fister labels “the most common approach libraries take, and the easiest to implement.”6 In such sessions, librarians often teach resource discovery skills (for example, database searching) as well as other IL proficiencies such as source evaluation. However, Tom Eland argues that, given the limited format of these sessions, librarians are “unable to teach anything but basic skills.”7

Perhaps unsurprisingly, then, the assessment of student work with regard to IL rarely happens. One meta-analysis on assessment measures identified nine methods used by academic librarians to assess IL gains, with only about 7 percent of studies reviewing student essays at all; assessment of student work tends to analyze bibliographies.8 Of more recent data (2016) taken from the Association of College and Research Libraries’ (ACRL) Action Team Projects, 20 percent of the participating campuses appeared to analyze some manner of student work outside of bibliographies.9 This is problematic since sources have value insofar as they complement larger projects; as Stephanie Rosenblatt argues, “the bibliography is not the end product … the creation of a research paper, project or presentation, i.e. the end product, is usually seen as a means of demonstrating the acquisition of competencies set by the requirements of the relevant discipline.”10

This essay adds to the study of IL by assessing two levels of first-year students’ source integration skills.

* Michael J. Carlozzi is Library Director of the Wareham Free Library in Massachusetts, e-mail: carlotsee@gmail.com. ©2018 Michael J. Carlozzi, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC
The students received IL instruction from academic librarians and, as part of assignment requirements, were asked by their classroom instructors to synthesize peer-reviewed, scholarly material into their final essays. Specifically, this project analyzes how well students succeeded in 1) finding outside sources that matched assignment parameters and 2) synthesizing sources—both class readings and those found independently. Although previous studies have connected outside sources to student work, this study extends the research by examining how students synthesized their class readings as well. In sum, students could locate peer-reviewed, scholarly sources but struggled to synthesize both that material and their class readings. Yet, unlike with their class readings, students appeared to make little attempt to synthesize outside material, thereby complicating the priorities of instruction librarians.

Literature Review

IL has value beyond library science. After all, gains from the typical one-shot IL session—which usually targets academic databases—become obsolete when students graduate and lose their institutional database access. Given the overwhelming amount of information available online today, an ability to understand, evaluate, and apply information has become critical. IL consequently finds broad appeal throughout the academic community; the Association of American Colleges and Universities labels it an “essential learning outcome,”11 and a recent survey of academic faculty found overwhelming support for IL instruction.12 Nor is the value of IL limited to the university classroom; a recent study found that many K–12 students could not recognize “fake news,” which concerned the researchers: “democracy is threatened by the ease at which disinformation about civic issues is allowed to spread and flourish.”13 And the study of information-literate workers has shown that employers value “moderate information evaluation skills.”14

Somewhat surprisingly, then, many faculty members do not directly teach IL (to be fair, they must cover considerable ground in their own disciplines).15 William Badke describes the typical faculty approach to IL as one where “students do research, on their own, out there, with minimal help and lots of condemnation when they do it badly.”16 Indeed, the well-documented shortcomings of library one-shot sessions persist in no small part because of the teaching faculty’s preference that students find and independently negotiate research. Data appear to support Badke’s impression; the 2010 Project Information Literacy Study found that, despite the “seismic changes in the way that information is now created and delivered, 83% of instructors’ handouts for research assignments… called for the standard research paper.”17 Some teaching faculty have lamented this lackadaisical approach to IL, such as compositionist Doug Brent, who thanks librarians for the work they do “to clean up our messes when we create ill-conceived research assignments.”18 Similarly, writing professor Rolf Norgaard laid the blame partially on his own field for viewing librarians and their instruction as a “quick field trip, the scavenger hunt, the generic stand-alone tutorial, or the dreary research paper.”19 Traditional IL instruction, then, has filled a need identified by teaching faculty.

Certainly some library research has observed successful outcomes.
Shao and Purpur, for example, identified a positive correlation between IL and student writing skills and course performance.20 Bowles-Terry found a similar correlation between participation in IL sessions and grade point average,21 as did Wong and Cmor.22

But these studies are correlational. Students who succeed on IL tests (or who attend IL sessions) are more likely to succeed on their written work and in their courses; observational studies notoriously struggle to establish causation. When researchers transition from analyzing correlations and bibliographies to student work, they often fail to detect transfer. Kohl and Wilson, for example, found “no statistically significant relationship between [assignment] grades and the bibliography ratings [assigned by librarians].”23 In other words, student work did not appear to benefit from source acquisition, although this analysis is limited by grades as the outcome measure. An alternative and plausible interpretation may be that faculty members assigned little value to source acquisition; Scharf et al. drew this inference when their statistical models suggested that “the concept of information literacy is not yet a significant factor used in grading by individual writing instructors.”24 Back in 1993, Barbara Fister found that the overall literature suggested “no positive correlation between students’ ability to find and evaluate texts and their ability to write effective research papers. Indeed, some research suggests there is a negative relationship.”25 Of the literature Fister cites, one source has particular relevance here: a small study by Vince Toilers, in which students who received library instruction performed worse on their research projects than those without it, possibly because they “simply found more and didn’t know what to do with it.”

The idea that students “found more and didn’t know what to do with it” has been somewhat explored since Fister’s and Kohl’s dated publications. Scharf et al. analyzed 100 writing portfolios from senior-level undergraduates to find that students scored 6.05 out of 12.00 on “integration” of sources (in other words, synthesis), the lowest score for all categories measured.26 Luetkenhaus, Borrelli, and Johnson examined first-year essays in a librarian-faculty collaboration course to find significant improvements in bibliographic and citation skills but not argument building and source analysis.27 Campuses participating in the ACRL’s Action Projects drew similar conclusions.
Arcadia University, for example, found that students “only minimally conceptualized research as a part of the writing process … [suggesting] that students are not transferring [information literacy knowledge] to first year writing courses,”28 and the College of the Holy Cross found that students “may find better resources, but don’t know how to use them.”29 Similarly, Rosenblatt analyzed essays from a small set of upper-division students to find that 85 percent met or exceeded “their professor’s bibliographic requirements” but that half “provided little or no evidence that they derived any benefits from the literature they were required to consult.”30 This is a longstanding concern for Rosenblatt who, alongside Margy MacMillan, asked bluntly: “What good is teaching students how to find scholarly resources if they can’t read them?”31

MacMillan and Rosenblatt argue that “at least part of the problem [with poorly integrated student source use] [is] reading.”32 Essentially, they argue that, because lower-division students struggle to read advanced outside material (such as discipline-specific peer-reviewed sources), these students make little effort to integrate or synthesize that material into their assignments. Although noting that “reading” is not an explicitly defined ACRL standard, MacMillan and Rosenblatt argue that librarians should not “simply throw their hands up like many other faculty and continue to complain about how students don’t appreciate/use/cite the wealth of resources we lay before them in IL sessions.”33 Librarians, they argue, are “best equipped” to manage advanced outside material, as they are often asked to read outside of their primary discipline, similar to many students, giving them “strategies … to pass on to students.”34

This paper seeks to explore that argument. First, how well can students retrieve outside material appropriate to the assignment or class? This is traditionally a core concern for academic librarians, as the vast majority of assessment measures identified by Walsh focused on source retrieval.35 Second, to what extent does the acquisition of this material relate to student work in the course?

Method

Data were provided by an IRB-approved project in which the author belonged to a research team piloting an embedded librarianship program at a Northeastern four-year public university. The project was funded through a campus multidisciplinary seed grant. In the project, the English department partnered with the university’s library, as the two had already been collaborating on first-year writing IL instruction (one-shot sessions) and wanted to pilot another instructional model. Embedded librarianship appeared to be a promising change from one-shot sessions.36

The project aimed to compare student performance on various outcomes between two groups: students who received one-shot library sessions and students who received multiple library visitations throughout the semester. In the year leading up to this project, the university fortuitously hired several full-time lecturers to teach first-year writing. This allowed the research team to control for instruction by assigning five of these full-time lecturers both a control (one-shot) and an experimental (embedded) class of English 102, a required first-year writing course. Curriculum was also controlled because those instructors who agreed to participate in the project taught the departmental syllabus; course assignments and the course calendar were standardized.
In total, ten classes of English 102 participated in the study.

The project assessed student IL skills through various measures, but the outcome of relevance here concerns written synthesis. Alongside analysis and academic contribution, synthesis is one of three primary components in the university’s first-year writing program. Synthesis, however, is the skill the English department has observed that students struggle with most. Therefore, synthesis became the first-year writing program’s central task, essentially forming the core of the curriculum.

The final English 102 assignment was a “contributing to the conversation” essay. Students were asked to synthesize class readings into “camps” concerning the central research question of “what makes good writing?” Class readings were a mix of popular, canonical, and scholarly work; 8 of the 14 total readings were scholarly. The departmental syllabus called for considerable class time to be spent covering all of these readings. For the contribution essay, students were asked to find at least one outside peer-reviewed source to synthesize into one of their camps.

The research team had hypothesized that students who received embedded IL instruction would develop stronger written synthesis because they might be able to retrieve more relevant sources and might also have more time with those sources (that is to say, they would not wait until “the eleventh hour” to start researching). However, no significant differences were found between the control and experimental groups with regard to written synthesis. This appears not so much a failing with the project itself as with this specific hypothesis; in retrospect, it seems inappropriate to assume that an embedded librarianship project, without any focus on reading complex material or enhancing writing skills, would have affected written synthesis any differently from the standard one-shot model. Regardless, because no significant differences were found, all essays have been combined into one data set for purposes of the current analysis.

To score the students’ written synthesis, the research project leader, who was also the Director of First-Year English, helped develop a rubric with which to score essays. The rubric was a simplified version (in other words, focused exclusively on source use and synthesis) of the one the English department used for its ongoing, yearly first-year writing assessment (see appendix for the rubric). Essays were scored by the Director of First-Year English and a former writing instructor at the university. To ensure reliability of findings, a norming session was conducted prior to scoring; as the assessment of synthesis was a central focus of the first-year writing program, the scorers were well grounded in the features of first-year synthesis. For borderline cases, the scorers collaborated to reach consensus.

The scorers first identified a sample of eligible synthesis papers. Starting with the 150 papers collected by the research team, they observed whether papers had obtained outside sources. Within this subset of papers (130), they assigned each essay a “Synthesis Score” from 0 to 3, with “0” equaling “no synthesis,” 1 equaling “emerging synthesis,” 2 equaling “proficient synthesis,” and 3 equaling “accomplished synthesis” (see appendix for examples of these scores). To be examined further, essays must have included both a class reading and an outside article (N = 110).
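To make this winnowing concrete, the sketch below walks through the two inclusion steps just described. It is purely illustrative: the record structure and field names are hypothetical, and it is not the project’s actual scoring code.

```python
# Illustrative sketch of the sample-selection steps; hypothetical fields, not the study's data.
from dataclasses import dataclass

@dataclass
class Essay:
    uses_outside_source: bool   # cites at least one source found independently
    uses_class_reading: bool    # cites at least one assigned course reading
    synthesis_score: int = 0    # 0 = none, 1 = emerging, 2 = proficient, 3 = accomplished

def select_eligible(essays):
    """Apply the two inclusion criteria described above."""
    has_outside = [e for e in essays if e.uses_outside_source]    # 150 -> 130 in this study
    has_both = [e for e in has_outside if e.uses_class_reading]   # 130 -> 110 in this study
    return has_both
```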
While the contribution essay assignment required students to find scholarly, peer-reviewed sources, some students acquired outside readings that did not satisfy this requirement (19%); these essays were still included in the final data set (110). As students often synthesized multiple sources in one paper, the scorers had to choose which material to assess. The scorers recorded the outside source the students synthesized most successfully (that is, the one that scored the highest); a student’s individual Synthesis Score, then, reflected the student’s most successful attempt. Selecting a class reading to score was more problematic. Choosing the first reading could positively bias results, since students might be more likely to try harder earlier in their papers; the last source would presumably hold the inverse concern. Therefore, the second class reading mentioned in the essay was scored.

Results and Discussion

Students in this sample were able to retrieve peer-reviewed, scholarly sources; 81 percent of all eligible students used at least one such source. The library intervention appeared to achieve one of its primary goals: to facilitate the acquisition of peer-reviewed, scholarly material, a task requiring an understanding of what constitutes academic material as well as an ability to navigate the library’s databases. This is somewhat notable.

However, students could not synthesize outside material, averaging 0.62 on a 3-point scale with 52.7 percent scoring zero; in other words, roughly half of the students made no discernible effort to synthesize their research. This project’s data, then, support findings of previous research that students have trouble synthesizing outside research.

But students also struggled to synthesize their class readings. Although students scored considerably better synthesizing class readings than outside sources (1.27 to 0.57), these averages are misleading; the discrepancy resulted mostly from zeroes. Students received far fewer zeroes when synthesizing class readings than outside sources (6.4% to 52.7%). This suggests that many students made little or no effort to synthesize outside material, unlike with their class readings.

Table 1 presents the Synthesis Scores for all outside sources and class readings. Note that this table aggregates academic and nonacademic class readings. However, scorers did not notice either a pattern of students struggling to synthesize certain kinds of class material (such as academic articles) or a preponderance of nonacademic source selection; in fact, one of the course readings many students analyzed was Mike Rose’s “Rigid Rules, Inflexible Plans, and the Stifling of Language: A Cognitivist Analysis of Writer’s Block.” Some students likely chose to analyze academic articles because, given their complexity, they often received considerable class coverage; the departmental syllabus used in this project indicated significant time spent on the course’s denser academic texts.

Synthesis Scores were then tested for statistical significance. Table 2 divides outside sources into two categories: scholarly, peer-reviewed articles and nonscholarly articles. To determine whether any of these differences were significant, an analysis of variance (ANOVA) was conducted. By the conventional threshold for statistical significance (P values below 5 percent), the ANOVA indicated a statistically significant difference: F(2, 217) = 30.96, P < 0.001.
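For readers who wish to run the same kind of comparison on their own assessment data, the sketch below shows how a one-way ANOVA followed by Tukey’s HSD could be computed in Python with SciPy and statsmodels. It is a minimal illustration under assumed inputs: the score arrays are hypothetical placeholders, not this study’s data, and the study does not report which software was actually used.

```python
# Illustrative only: hypothetical per-essay synthesis scores (0-3), not the study's data.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

scholarly = np.array([0, 0, 1, 0, 2, 1, 0, 1, 0, 0])        # outside, peer-reviewed
nonscholarly = np.array([0, 0, 1, 0, 0, 1, 0, 0, 1, 0])     # outside, not peer-reviewed
class_readings = np.array([1, 1, 2, 1, 1, 2, 1, 3, 1, 1])   # assigned course texts

# Omnibus test: is there any difference among the three group means?
f_stat, p_value = stats.f_oneway(scholarly, nonscholarly, class_readings)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Post-hoc test: which specific pairs of groups differ?
scores = np.concatenate([scholarly, nonscholarly, class_readings])
groups = (["scholarly"] * len(scholarly)
          + ["nonscholarly"] * len(nonscholarly)
          + ["class reading"] * len(class_readings))
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))
```

Note that statsmodels reports pairwise mean differences and adjusted p-values rather than the Q statistics shown in table 3; the substantive conclusions are read the same way.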
This result means that the difference in scores was unlikely to have occurred by chance; if the null hypothesis were true (that is to say, there is no difference between groups), then the probability (P value) of obtaining this result, or one more extreme, would be less than 0.001 (that is, 0.1 percent). As an omnibus test, however, ANOVA cannot locate specific significant differences; it can only show that a significant difference existed somewhere among the three groups. Thus the ANOVA was followed with a common post-hoc test, Tukey’s HSD (Honest Significant Difference), to identify significant differences among the three pairings (see table 3). Tukey’s test compares each of the pairings to locate significant differences. The Q statistic (like the F statistic in the ANOVA) is the test statistic generated by these comparisons; the larger its value, the more unlikely it is to have occurred by chance.

TABLE 1
Synthesis Score Mean and Percentage of Essays Scoring Zero (N = 110)
Category               Synthesis Score (0–3) Mean   % Scoring Zero
Outside Source (All)   0.57                         52.7%
Class Readings         1.27                         6.4%

TABLE 2
Synthesis Score Mean and Percentage of Essays Scoring Zero (N = 110) for Scholarly, Peer-Reviewed Articles, Nonscholarly Outside Articles, and Class Readings
Category                           N     Synthesis Score (0–3) Mean   % Scoring 0
Scholarly, Peer-Reviewed Outside   89    0.62                         50.6%
Nonscholarly Outside               21    0.38                         61.9%
Class Readings                     110   1.27                         6.4%

TABLE 3
Tukey’s HSD Results: The Variable “Class Readings” Differs Significantly from Both Groups
Pair                              Tukey’s HSD Q Statistic   Tukey’s HSD P-value
Nonscholarly vs. Scholarly        2.06                      0.32
Nonscholarly vs. Class Readings   7.89                      0.001*
Scholarly vs. Class Readings      9.68                      0.001*
*Significant at the P < 0.01 Level

The largest gap (in terms of Q statistic) was between “scholarly” sources and “class readings.” Class readings also differed significantly from “nonscholarly” sources. The slight difference in means between scholarly and nonscholarly outside sources was nonsignificant (Tukey’s HSD P = 0.32).

Some researchers have argued that competent written synthesis exceeds the abilities of first-year students.37 Synthesis, after all, is a difficult cognitive task with which many first-year students lack experience. Because so many students struggled with synthesis at this university, it had become the first-year writing program’s primary focus. The data here lend some credence to this argument, since students also struggled to synthesize class material—sources that likely received significant in-class attention. The vast majority of class syntheses received scores of 1 (80%), indicating an ability to build only superficial connections between sources (in other words, naming a connection but leaving it undeveloped).

The difference in means stems from the fact that many students did not bother to synthesize outside material; the score of 1.27 for class readings, which more than doubles that of 0.62 from outside sources, results largely from zeroes. These data, then, reinforce the impression that outside sources become “tacked on” to fulfill assignment requirements. In fact, 65 percent of student essays mentioned their outside source only once; class readings, by contrast, were mentioned only once 18 percent of the time. This leads to a situation described by Brent, whose students “learned how to find information in the library and how to document it….
But their research papers, by and large, remained hollow imitations of research, collections of information gleaned from sources with little synthesis, evaluation, or original thought.”38 When students encounter such readings in class, guidance and structured engagement foster at least rudimentary synthesis moves. But students “tacked on” material they located independently, whether scholarly or not. In response to MacMillan and Rosenblatt’s question—they’ve found it; can they read it?—the answer appears to be, at least for many students, that they “don’t bother.”

Conclusions, Limitations, and Future Research

This paper compared the synthesis of class readings with that of outside material. It found that students located assignment-appropriate outside sources, satisfying a primary IL interest. But it also found that students struggled to synthesize both sets of material while making little effort to synthesize outside sources.

It bears emphasis that this study cannot answer why students failed to synthesize material. While future research could probe that question, this project’s data require interpretation. One plausible explanation comes from MacMillan and Rosenblatt, who suggest that students struggle to understand/read difficult outside material and, as a result, do not try to integrate it. Indeed, educational psychologists have found that text difficulty supports engagement when students can choose their own reading material—but such choice does not often happen in college classrooms.39 It appears logical, then, that some first-year students “give up” on incorporating advanced outside material, especially since even juniors and seniors struggle to process such work.40

In this study, however, students who obtained presumably more comprehensible nonscholarly material also failed to incorporate it, although these students may have stood to perform worse on their writing tasks because they had flouted an assignment requirement (obtain a peer-reviewed source). Notwithstanding that caveat, this observation leads to an alternative explanation, relating to the instructor’s engagement. That many instructors do not relate IL to their course goals has been an ongoing concern;41 some researchers have even argued that librarians should publish in journals outside of librarianship to convince teaching faculty of information literacy’s value.42 Instructors in this study may not have taken the outside source requirement seriously, permitting students to “tack on” outside material without a significant grade penalty. Students therefore may have prioritized those readings that instructors valued (in other words, class readings). That so many students received zeroes also supports this interpretation; while students attempted to synthesize class readings, they made no attempt to do so with their outside sources, suggesting that they had not been directed to do otherwise—during the drafting process, for instance. Corroborating this interpretation, Lowe et al. found that only through assignment collaboration with teaching faculty were students’ IL skills meaningfully improved by a library intervention.43 Isolating IL instruction from assignments appears problematic.

This study’s overall relevance for librarians, then, involves their role in course-integrated instruction.
That so many students essentially ignored outside sources calls into question the value of librarians pursuing course-integrated instructional models, which appear to facilitate what Eland calls “basic skills.” Eland argued that the “challenge for academic librarians is to alter our traditional paradigm, which sees us as supporters of the teaching and research mission, to a new paradigm that sees us as active teachers and researchers, and as ‘the experts’ in information literacy.”44 This argument is not new; back in 1928, Charles Shaw, librarian at Swarthmore College, remarked that the “haphazard, unscientific teaching librarians now undertake must be scrapped” in favor of bibliography departments and for-credit courses.45 After nearly a century, this argument remains relevant, as most library instruction still occurs in a course-integrated environment.

Current instruction methods struggle to demonstrate conclusive success; scholarship in library science lacks consensus on the value of course-integrated library instruction, with some studies reporting success and others disappointment.46 Because studies also present wildly different outcomes and measures, meta-analysis of a particular intervention’s effectiveness (such as embedded librarianship vs. one-shot sessions) becomes difficult. These mixed results suggest that positive findings rely less on the investigated model’s utility than they do on factors specific to the study (in other words, design, assessment measures, faculty investment, and luck). Fister finds these results disheartening: “While many faculty are enthusiastic partners with librarians, it’s hard to point to evidence that these partnerships made a difference [on student learning].”47

Yet, despite these mixed results, it does not follow that librarians should altogether abandon IL instruction. Information literacy has considerable value, especially in an age of “alternative facts” and “fake news.” It should be taught, and it should be done in a meaningful, generalizable manner for all the reasons identified in this paper’s literature review. MacMillan and Rosenblatt make a persuasive case that librarians must concern themselves with students’ reading abilities and higher-order skills. As Rosenblatt asks, “shouldn’t … instruction librarians, be concerned about students’ abilities to use the information they have discovered?”48 At the very minimum, MacMillan and Rosenblatt argue, librarians “can raise awareness about the difficulties students face,” while trying to understand how they can be assisted campuswide.49 MacMillan and Rosenblatt are hardly alone; as one example, Dianne VanderPol and Emily Swanson state that “too often librarians suggest that it is the role of the college or university’s teaching faculty to help students become critical readers, adept at synthesizing information sources, and capable of recognizing interrelationships among concepts-outcomes.”50

The most resource-intensive solution involves librarians teaching for-credit courses. For-credit courses naturally facilitate more complex instruction and assessment, allowing librarians an opportunity to develop students’ higher-order skills and to cover IL concepts beyond resource discovery. Unfortunately, such an initiative likely rests outside of the library’s control.
As Fister notes, it can “be difficult to gain institutional They Found It 667 support, both to list such a course in the catalog and to staff it with instructors.”51 Many librarians might like to teach for-credit courses but their administrators will not allow it; Nadine Cohen et al. surveyed almost 700 academic libraries to find that only 19 percent taught for-credit courses.52 And it should be mentioned that teaching for-credit courses carries many assumptions other than university buy-in, not the least of which are adequate teacher training and ongoing support. The viability of this sug- gestion, then, relies on institutional climate and other significant educational factors. Librarians unable or unwilling to teach their own courses, then, can still try alterna- tive teaching methods and vary their instructional content. Of course, many academic librarians have already shifted their instructional focus away from basic resource discovery, even in one-shot sessions. As early as the 1980s, Kohl and Wilson called for library instruction that “begins with the student’s research question rather than the library tool.”53 And, much more recently, Bowles-Terry and Donovan “served notice on the one-shot,” arguing in favor of developing alternative instructional methods.54 As they argued, the one-shot model at its worst “marginalizes the pedagogical expertise of librarians whose efforts could result in more sustainable and influential educational initiatives, given the space, time, and support to move toward new roles.”55 Many librarians have offered promising alternatives. Rosenblatt’s own practice provides a useful example: “Time previously spent on ensuring that students practice keyword searching will in future be allocated to modeling the synthesis of disparate sources and to dissecting the work of experts to see the purpose the literature serves in scholarly work.”56 Such a project might helpfully ally with other academic departments since synthesis modeling requires subject-matter expertise. Bronshteyn and Baladad, as an- other example, propose that librarians use paraphrasing exercises in their instructional sessions to help students demonstrate that they understand the acquired material.57 Jen Green, exasperated with traditional one-shot sessions, borrowed a concept from landscape planning called “desire lines” (in other words, paths naturally created by actual people) to emphasize useful but familiar search engines such as Google Scholar. Rather than tell students where to go, Green recommends instead learning their natural pathfinding.58 And Holliday and Rogers argued for an altogether different approach to IL, framing it as “learning about” subjects rather than “simply locating information,” requiring a new vocabulary, training, and general philosophy.59 These solutions ap- proach the problems of course-integrated instruction from novel directions. Will they necessarily work better than traditional methods? One cannot say for sure, especially as some occur in the limiting one-shot session, but piloted work has the advantage of not having been demonstrated as ineffective. Or, in the vernacular: it’s worth a shot. Current methods of teaching IL, in place for roughly twenty years among American institutions of higher learning, cannot claim wild success. While this paper’s findings may not generalize beyond first-year students, this paper’s analysis shows that first-year students attempted to synthesize in-class readings but not outside material despite ap- parent assignment requirements. 
It questions the value of continuing course-integrated instruction models without, at the very least, “buy-in” from teaching faculty. It also suggests that librarians should pursue different methods of instruction depending on their institutional climate. Rather than continuing methods that have not shown much success, helping students negotiate complex material seems a promising reorientation.

Appendix

This rubric was used to score each essay and was based on the English department’s first-year English assessment rubric. An ongoing interest in the English department has been to define synthesis claims.

0, “no synthesis”: Student made no attempt to synthesize the outside source: in other words, explaining a source in isolation or using a source as evidence for a student’s own individual claims and not as part of a synthesis camp.

1, “emerging synthesis”: Student built a superficial connection between the outside source and the synthesis camp, naming a connection but leaving it undeveloped (for example, “Like Adichie, Marco Caracciolo contributes an argument that proves authors have a personal impact on readers”).

2, “proficient synthesis”: Student built a specific connection among authors, although the connection still requires the reader to connect the dots: for example, “This camp believes that writers are not writing about anything important. Currey argues that e-mails taking over letters is hindering the quality of writing because writers don’t think about what they’re saying. Nehring presents the lack of reading by people today being due to writers not talking about important subjects that would be worth reading. Prato also complains that the largest problem facing the news industry is sloppy writing by reporters that no one wants to read.”

3, “accomplished synthesis”: Student built a specifically named and fully supported connection; example: “This Creative Camp, instead of paying attention to audience, sees writing as a way to create ideas and be creative. Mason Currey argues how writing is an outlet that gives the writer the ability to create new ideas without any limitations. Currey views letter writing as a way of ‘easing in and out of a state of mind,’ which permits the writer to create more meaningful and ‘in-depth work’ (Currey). The idea is that letter writing is what writing should be. Similarly, Flower and Hayes argue that authors should free write, and in so doing, build on previous ideas through creation: ‘this act of creating ideas, not finding them, is at the heart of significant writing’ (22). A similar stance is found in Lou LaBrant’s work, who believes that good writing allows the writer to focus on expression, writing without any limits. In all of these authors, writers should not be restricted by any rules, and are truly able to convey the thoughts that they have—writing is basically a way to create and to solve problems, not so much to reach an audience.” This excerpt connects the outside source to the synthesized perspective and then gives this set of authors a concrete description of shared values.60

Notes

1. Loanne Snavely and Natasha Cooper, “The Information Literacy Debate,” Journal of Academic Librarianship 23, no. 1 (1997): 9–14.
2. Barbara Fister, “The Library’s Role in Learning: Information Literacy Revisited,” Library Issues: Briefings for Faculty and Administrators 33, no. 4 (2013): 2.
3. “Emerging Career Trends for Information Professionals,” available online at http://ischool.sjsu.edu/sites/default/files/content_pdf/career_trends.pdf [accessed 2 December 2016].
4. Eugene Engeldinger, “Teaching Only the Essentials: The Thirty-Minute Stand,” Reference Services Review 16, no. 4 (1988): 47–50.
5. Jean Marie Cook, “A Library Credit Course and Student Success Rates: A Longitudinal Study,” College & Research Libraries 75, no. 3 (2014): 272–83; Tom Eland, “A Curriculum-Integrated Approach to Information Literacy,” Information Literacy Handbook (Chicago: ACRL, 2008); Rui Wang, “The Lasting Impact of a Library Credit Course,” portal: Libraries and the Academy 6, no. 1 (2006): 79–92.
6. Fister, “The Library’s Role in Learning,” 3.
7. Eland, “A Curriculum-Integrated Approach to Information Literacy,” 108.
8. Andrew Walsh, “Information Literacy Assessment: Where Do We Start?” Journal of Librarianship and Information Science 41, no. 1 (2009): 19–28.
9. Association of College and Research Libraries, “Assessment in Action,” ACRL, available online at https://apply.ala.org/aia/public [accessed 9 September 2017]. Institutions that did not clearly explain their methodologies were excluded.
10. Stephanie Rosenblatt, “They Can Find It but They Don’t Know What to Do with It: Describing the Use of Scholarly Literature by Undergraduate Students,” Journal of Information Literacy 4, no. 2 (2010): 52.
11. Association of American Colleges & Universities, “Essential Learning Outcomes,” available online at https://www.aacu.org/leap/essential-learning-outcomes [accessed 8 September 2017].
12. Meredith Schwartz, “Closing the Gap in Librarian, Faculty Views of Academic Libraries,” Library Journal, available online at http://lj.libraryjournal.com/2015/09/academic-libraries/closing-gap-librarian-faculty-views-research/ [accessed 8 September 2017].
13. Sam Wineburg, Sarah McGrew, Joel Breakstone, and Teresa Ortega, “Evaluating Information: The Cornerstone of Civic Online Reasoning,” Stanford Digital Repository (2016): 5, available online at http://purl.stanford.edu/fv751yt5934 [accessed 8 September 2017].
14. Dale Cyphert and Stanley Lyle, “Employer Expectations of Information Literacy: Identifying the Skills Gap,” in Information Literacy: Research and Collaboration across Disciplines, eds. Barbara D’Angelo, Sandra Jamieson, Barry Maid, and Janice Walker (Boulder: University Press of Colorado, 2017), 68.
15. Brian Jackson, Margy MacMillan, and Michelle Sinotte, “Great Expectations: Results from a Faculty Survey of Students’ Information Literacy Proficiency,” Proceedings of the IATUL Conference, available online at http://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=2036&context=iatul [accessed 10 December 2016]; Linda Adler-Kassner and Heidi Estrem, “Reading Practices in the Writing Classroom,” WPA: Writing Program Administration 31, no. 1/2 (2007): 35–47.
16. William Badke, “Teaching Information Cultures,” Online Searcher 37, no. 2 (2013): 69.
17. Alison Head, “Project Information Literacy: What Can Be Learned about the Information-Seeking Behavior of Today’s College Students?” available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2013/papers/Head_Project.pdf [accessed 4 December 2016].
18. Doug Brent, “The Research Paper and Why We Should Care,” WPA: Writing Program Administration 37, no. 1 (2013): 43.
19. Rolf Norgaard, “Writing Information Literacy: Contributions to a Concept,” Reference & User Services Quarterly 43, no. 2 (2003): 124.
20. Xiaorong Shao and Geraldine Purpur, “Effects of Information Literacy Skills on Student Writing and Course Performance,” Journal of Academic Librarianship 42, no. 6 (2016): 670–78.
21. Melissa Bowles-Terry, “Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program,” Evidence Based Library and Information Practice 7, no. 1 (2012).
22. Rebekah Shun Han Wong and Dianne Cmor, “Measuring Associations between Library Instruction and Graduation GPA,” College & Research Libraries 72, no. 5 (2011): 464–73.
23. David Kohl and Lizabeth Wilson, “Effectiveness of Course-Integrated Bibliographic Instruction in Improving Coursework,” RQ 27, no. 2 (1986): 210.
24. Davida Scharf, Norbert Elliot, Heather Huey, Vladimir Briller, and Kamal Joshi, “Direct Assessment of Information Literacy Using Writing Portfolios,” Journal of Academic Librarianship 33, no. 4 (2007): 469.
25. Barbara Fister, “Teaching the Rhetorical Dimensions of Research,” Research Strategies 11, no. 4 (1993): 211.
26. Scharf et al., “Direct Assessment of Information Literacy.”
27. Holly Luetkenhaus, Steve Borrelli, and Corey Johnson, “First Year Course Programmatic Assessment: Final Essay Information Literacy Analysis,” Reference and User Services Association 55, no. 1 (2015): 49–60.
28. Larissa Gordon and Daniel Schall, “Reflective Writing & The Research Process,” available online at https://apply.ala.org/attachments/26621 [accessed 10 September 2017].
29. Alicia Hansen, Jennifer Whalen, Denise Bell, and Stephanie Yuhl, “First Year Students and Source Selection: Assessing Personal Research Sessions in Montserrat,” available online at https://ala-ppo-apply-attachments.s3.amazonaws.com/answerable/attachment/file/26778/AiA-Poster-Final_resize.jpg?AWSAccessKeyId=AKIAJO4KSYKHL77IOFJQ&Signature=gICasCj51DbTPMyeusuTdp%2BTFH0%3D&Expires=1505178006 [accessed 10 September 2017].
30. Rosenblatt, “They Can Find It but They Don’t Know What to Do with It,” 60.
31. Margy MacMillan and Stephanie Rosenblatt, “They’ve Found It. Can They Read It? Adding Academic Reading Strategies to Your IL Toolkit,” ACRL 2015, available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2015/MacMillan_Rosenblatt.pdf [accessed 1 December 2016].
32. MacMillan and Rosenblatt, “They’ve Found It. Can They Read It?” 757.
33. Ibid., 759.
34. Ibid., 760.
35. Walsh, “Information Literacy Assessment: Where Do We Start?”
36. For a full description of the research project and the other outcomes, see Alexis Teagarden and Michael Carlozzi, “Time Enough?: Experimental Findings on Embedded Librarianship,” WPA: Writing Program Administration 41, no. 1 (2017): 12–32. In that project, we observed the gulf in Synthesis Scores between class readings and outside sources but did not pursue further analysis.
37. Justin Young and Charlie Potter, “The Problem of Academic Discourse: Assessing the Role of Academic Literacies in Reading Across the K–16 Curriculum,” Across the Disciplines 10, no. 4 (2013), available online at https://wac.colostate.edu/atd/reading/young_potter.cfm [accessed 10 December 2016].
38. Brent, “The Research Paper and Why We Should Care.”
39. Sara Fulmer, Sidney D’Mello, Amber Strain, and Art Graesser, “Interest-based Text Preference Moderates the Effect of Text Difficulty on Engagement and Learning,” Contemporary Educational Psychology 41 (2015): 98–110.
40. Margy MacMillan and Allison MacKenzie, “Strategies for Integrating Information Literacy and Academic Literacy: Helping Undergraduate Students Make the Most of Scholarly Articles,” Library Management 33, nos. 8/9 (2012): 525–35.
41. Scharf et al., “Direct Assessment of Information Literacy.”
42. Colleen Boff and Kristin Johnson, “The Library and First-Year Experience Courses: A Nationwide Study,” Reference Services Review 30, no. 4 (2002): 277–87; Robert Tomaszewski, Karen MacDonald, and Sonia Poulin, “Publishing in Discipline-Specific Non-Library Journals for Promoting Information Literacy,” Journal of Academic Librarianship 39, no. 4 (2012): 321–29.
43. M. Sara Lowe, Char Booth, Sean Stone, and Natalie Tagge, “Impacting Information Literacy Learning in First-Year Seminars: A Rubric-Based Evaluation,” portal: Libraries and the Academy 15, no. 3 (2015): 489–512.
44. Eland, “A Curriculum-Integrated Approach to Information Literacy,” 112.
45. Mary Salony, “The History of Bibliographic Instruction,” Reference Librarian 24, nos. 51/52 (1995): 37.
46. Susan Gardner Archambault, “Library Instruction for Freshman English: A Multi-Year Assessment of Student Learning,” Evidence Based Library and Information Practice 6, no. 4 (2011); Teagarden and Carlozzi, “Time Enough?”; Smiti Gandhi, “Faculty-Librarian Collaboration to Assess the Effectiveness of a Five-Session Library Instruction Model,” Community & Junior College Libraries 12, no. 4 (2004): 15–48; Timothy Daugherty and Elizabeth Carter, “Assessment of Outcome-Focused Library Instruction in Psychology,” Journal of Instructional Psychology 24, no. 1 (1997): 29–33; Amrita Dhawan and Ching-Jung Chen, “Library Instruction for First-Year Students,” Reference Services Review 42, no. 3 (2014): 414–32; Jon Hufford, “What Are They Learning? Pre- and Post-Assessment Surveys for LIBR 1100, Introduction to Library Research,” College & Research Libraries 71, no. 2 (2010): 139–58; Kristina Howard, Thomas Nicholas, Tish Hayes, and Christopher Appelt, “Evaluating One-Shot Library Sessions: Impact on the Quality and Diversity of Student Source Use,” Community & Junior College Libraries 20, no. 1/2 (2014): 27–38.
47. Fister, “The Library’s Role in Learning,” 3.
48. Rosenblatt, “They Can Find It but They Don’t Know What to Do with It,” 60.
49. MacMillan and Rosenblatt, “They’ve Found It. Can They Read It?” 760.
50. Dianne VanderPol and Emily Swanson, “Rethinking Roles: Librarians and Faculty Collaborate to Develop Students’ Information Literacy,” Journal of Library Innovation 4, no. 2 (2013): 135.
51. Fister, “The Library’s Role in Learning,” 4.
52. Nadine Cohen, Liz Holdsworth, John M. Prechtel, Jill Newby, Yvonne Mery, Jeanne Pfander, and Laurie Eagleson, “A Survey of Information Literacy Credit Courses in US Academic Libraries: Prevalence and Characteristics,” Reference Services Review 44, no. 4 (2016): 564–82.
53. Kohl and Wilson, “Effectiveness of Course-Integrated Bibliographic Instruction.”
54. Melissa Bowles-Terry and Carrie Donovan, “Serving Notice on the One-Shot: Changing Roles for Instruction Librarians,” International Information & Library Review 48, no. 2 (2016).
55. Bowles-Terry and Donovan, “Serving Notice on the One-Shot.”
56. Rosenblatt, “They Can Find It but They Don’t Know What to Do with It,” 60.
57. Karen Bronshteyn and Rita Baladad, “Librarians as Writing Instructors: Using Paraphrasing Exercises to Teach Beginning Information Literacy Students,” Journal of Academic Librarianship 32, no. 5 (2006): 533–36.
58. Jen Green, “Library Instruction for First-Year Students: Following the Students’ Path,” College & Research Libraries News 75, no. 5 (2014): 266.
59. Wendy Holliday and Jim Rogers, “Talking about Information Literacy: The Mediating Role of Discourse in a College Writing Classroom,” portal: Libraries and the Academy 13, no. 3 (2013): 268.
60. Examples transcribed from Teagarden and Carlozzi, “Time Enough?”