Editorial

Marc Truitt

I doubt that many of the Blog People are in the habit of sustained reading of complex texts.
—Michael Gorman, 2005

So, three-plus years after the fact, why am I opening with Michael Gorman's unfortunate characterization of those he labeled "Blog People"? I have no interest in reopening this debate, honestly! But the problem with generalizations, however unfair, is that at their heart there is just enough substance to make them "stick"—to give them a grain or two of credibility. Gorman's words struck a chord in me, one that existed before his charge and continues to resonate to this day.

The substance in Gorman's words had little to do with these "Blog People" as such; rather, my interest was piqued by the implications of his remark about how we all deal with "complex texts" and the "sustained reading" of the same. In a time of wide availability of full-text electronic articles, it has become easy and tempting to cherry-pick the odd phrase here or there without studying the work as a whole. How has scholarship in particular been changed by the ease with which we can reduce works to snippets without having considered their overall context?

I'm not arguing that scholarly research and writing haven't always been at least in part about finding the perfect juicy quotation around which we then weave our own theses. Many of us well recall the boxes of 3×5" citation and 5×8" quotation files that we or our patrons laboriously assembled through weeks, months, and years of detailed research. But if the style of compiling these files that I witnessed (and indeed practiced) is any guide, their existence was the product of precisely that "sustained reading of complex texts" of which Gorman spoke. My vague, nagging sense is that what is changing is this style of approaching whole texts. I wondered then how much scholarly research today is driven by keyword searches of digitized texts that essentially produce "virtual quotation files" without our ever having had to struggle with their context in the whole of the original source.

Fast forward three years. Lately, several articles touching on our changing ways of interacting with resources have appeared in both scholarly and popular venues, and these have served to underline my sense that we are missing something because of our growing lack of engagement with whole texts. Writing in the July/August issue of The Atlantic Monthly, Nicholas Carr asks, "Is Google Making Us Stupid?" Drawing an analogy to the scene in the film 2001: A Space Odyssey in which astronaut Dave Bowman disables the supercomputer HAL's memory circuits, Carr says:

I can feel it, too. Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going—so far as I can tell—but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text.
The deep reading that used to come naturally has become a struggle.1

Carr goes on to explain that "what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski."2

Carr's nagging fear found similar expression among some tech-savvy participants of library online forums; one of the more interesting comments appeared on the Web4Lib electronic discussion list. In a discussion of the article, Tim Spalding of LibraryThing observed that he himself had experienced what he dubbed "the Google effect" and noted:

Something is lost. . . . Human culture often advances by externalizing pieces of our mental life—writing externalizes memory, calculators externalize arithmetic, maps, and now GPS, externalize way-finding, etc. Each shift changes the culture. And each shift comes with a cost. Nobody memorizes texts anymore, nobody knows the times tables past ten or twelve and nobody can find their way home from the stars and the side of the tree the moss grows on.3

Meanwhile, another article appeared on a closely related topic, this time in the journal Science. James A. Evans observed that, because "scientists and scholars tend to search electronically and follow hyperlinks rather than browse or peruse," the easy availability of electronic resources was resulting in an "ironic change" for scientific scholarship, in that

as more journal issues came online, the articles referenced tended to be more recent, fewer journals and articles were cited, and more of those citations were to fewer journals and articles. The forced browsing of print archives may have stretched scientists and scholars to anchor findings deeply into past and present scholarship. Searching online is more efficient and following hyperlinks quickly puts researchers in touch with prevailing opinion, but this may accelerate consensus and narrow the range of findings and ideas built upon.4

Evans's research highlights an additional irony: an unintended benefit to the scholarly process in the paper-based world was "poor indexing," since it encouraged browsing through less relevant, older, or more marginal literature. This browsing had the effect of "facilitat[ing] broader comparisons and led researchers into the past. Modern graduate education parallels this shift in publication—shorter in years, more specialized in scope, culminating less frequently in a true dissertation than an album of articles."5

What is one to make of all this? At the outset, I wish to state clearly that I am not some sort of anti-e-text Luddite. Electronic texts are a fact of life, and they are becoming more so every day. Even though they are in their infancy as a medium, they have already transformed the landscape of bibliographic access. My interest is not with the tool, but with the manner in which we are using it.

I began by suggesting that I share with Gorman a concern about how we increasingly engage with "complex texts" today. Unlike him, though, my concern is not limited to the so-called Blog People (whoever they may be); it includes all of us.
With the explosion in easily accessible electronic texts, our ideas and habits concerning interaction with these texts are changing, sometimes in unintended ways. In a recent informal survey of my colleagues at work, I asked, "Have you ever read an e-book (not just a journal article) from (virtual) cover to (virtual) cover?" For those whose answer was affirmative, I also asked, "How many such books have you read in their entirety?" Out of twenty-odd responses, three individuals answered that yes, they had had occasion to read an entire e-book (for a total of six books among the three "yes" respondents, which seemed surprisingly high to me). Of greater interest, though, were those who chose to question the premise of the survey, arguing that people don't "read" e-books the way they read paper ones. It does make one wonder, then, how Amazon thinks it possesses a viable business model in the Kindle e-book reader, for which it currently lists an astounding 140,000+ available e-books. Clearly, some e-books are being read as whole texts, by some people, for some purposes. But I suspect that's another story.6

Carr and Evans use slightly differing imagery to describe a similar phenomenon. Carr closes with a reference back to the death of 2001's HAL, saying, "As we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence."7 Evans, on the other hand, compares contemporary scientific researchers to Newton and Darwin, each of whom produced works that "not only were engaged in current debates, but wove their propositions into conversation with astronomers, geometers, and naturalists from centuries past." Twenty-first-century scientists and scholars, by contrast, are able because of readily available electronic resources "to frame and publish their arguments more efficiently, [but] they weave them into a more focused—and more narrow—past and present."8 Perhaps the most succinct statement, though, comes from LibraryThing's Tim Spalding, who summarized the problem thus: "We advance by becoming dumber."9

An ITAL research and publishing opportunity for an inquisitive and enterprising scholar, perhaps? I'd welcome the manuscript!

Shameless Plugs Department. By the time you read this, we at ITAL will have launched our new blog, ITALica (http://ital-ica.blogspot.com). ITALica addresses a need we on the ITAL editorial board have long sensed: a venue for "letters to the editor," updates to articles, supplementary materials we can't work into the journal—you name it. One of the most important features of ITALica will be a forum for readers' conversations with our authors: we'll ask authors to host and monitor discussion for a period of time after publication so that you'll have a chance to interact with them.

ITALica is currently a pilot project. For our first issue we will have begun with a discussion hosted by Jennifer Bowen, whose article "Metadata to Support Next-Generation Library Resource Discovery: Lessons from the eXtensible Catalog, Phase I" was published in the June 2008 issue of ITAL. For our second ITALica, we plan to expand coverage and discussion to include all articles and other features in the September issue you now have in hand. ITALica is sure to become a stimulating supplement to and forum for topics originating in ITAL. We look forward to seeing you there!

Marc Truitt (marc.truitt@ualberta.ca) is Associate Director, Bibliographic and Information Technology Services, University of Alberta Libraries, Edmonton, Alberta, Canada, and Editor of ITAL.

References and Notes

Extract. Michael Gorman, "Revenge of the Blog People!" Library Journal (Feb. 15, 2005), www.libraryjournal.com/article/CA502009.html (accessed July 21, 2008).
1. Nicholas Carr, "Is Google Making Us Stupid?" The Atlantic Monthly 301 (July/Aug. 2008), www.theatlantic.com/doc/200807/google (accessed July 23, 2008).
2. Ibid.
3. Tim Spalding, "Re: 'Is Google Making Us Stupid? What the Internet is Doing to Our Brains,'" Web4Lib discussion list post, June 19, 2008, http://article.gmane.org/gmane.education.web4lib/12349 (accessed July 24, 2008).
4. James A. Evans, "Electronic Publication and the Narrowing of Science and Scholarship," Science (July 18, 2008), www.sciencemag.org/cgi/content/full/321/5887/395 (accessed July 24, 2008). Emphasis added.
5. Ibid.
6. As of 5:30 p.m. (EST), July 24, 2008, Amazon's website listed 145,591 "Kindle books," www.amazon.com/s/qid=1216934603/ref=sr_hi?ie=UTF8&rs=154606011&bbn=154606011&rh=n%3A154606011&page=1.
7. Carr, "Is Google Making Us Stupid?"
8. Evans, "Electronic Publication and the Narrowing of Science and Scholarship."
9. Spalding, "Re: 'Is Google Making Us Stupid?'"