Learning Analytics and the Academic Library: Professional Ethics Commitments at a Crossroads

Kyle M.L. Jones and Dorothea Salo*

In this paper, the authors address learning analytics and the ways academic libraries are beginning to participate in wider institutional learning analytics initiatives. Since there are moral issues associated with learning analytics, the authors consider how data mining practices run counter to ethical principles in the American Library Association's "Code of Ethics." Specifically, the authors address how learning analytics implicates professional commitments to promote intellectual freedom; protect patron privacy and confidentiality; and balance intellectual property interests between library users, their institution, and content creators and vendors. The authors recommend that librarians should embed their ethical positions in technological designs, practices, and governance mechanisms.

* Kyle M.L. Jones is an Assistant Professor in the School of Informatics and Computing, Department of Library and Information Science at Indiana University–Indianapolis (IUPUI); e-mail: kmlj@iupui.edu. Dorothea Salo is a Faculty Associate in the Information School, University of Wisconsin-Madison; e-mail: salo@wisc.edu. ©2018 Kyle M.L. Jones and Dorothea Salo, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC. doi:10.5860/crl.79.3.304

"We cannot allow our ethical principles to stand as mere monoliths…. That our values pose ethical dilemmas for us in practice is a good thing and should be welcomed by us: from debating such dilemmas comes strength of mission."—David McMenemy1

Introduction

Higher education institutions (HEIs) are aggregating and analyzing student data, information, and the digital trails they leave in information systems to better understand student behaviors. Among other things, these data-driven insights hold the potential to cast light into the black box of learning and student life. The primary aim is to use data analytics to help HEIs address urgent concerns related to low retention rates and extended time-to-degree measures, which can then improve the efficiency and effectiveness of institutional practices.

Educational data mining employs Big Data methods; in doing so, it also elicits Big Data's band of problems. Big Data uncovers personally identifiable information and flows that information to a variety of different actors. But it is problematic that these new flows are opaque to the individuals they represent, yet they hold significant influence in those individuals' lives. Rising to the top are complicated and troubling issues about informational privacy, transparency, personal autonomy, and other questions of what is morally acceptable. The question, then, is how to ethically pursue the benefits of educational data mining while accounting for the potential harms.
Ethical practice is often supported by codified principles, and researchers have constructed broad codes of ethics to minimize harms to students.2 Similarly, a few HEIs have developed campus-wide policies.3 While institutions may feel justified and morally at ease in pursuing the benefits of educational data mining, their interests and practices may lead to "unintended consequences and perverse incentives" to analyze sensitive student data in ways that harm students and conflict with ethical commitments made by professional communities, such as librarians.4

In this paper, we address learning analytics (LA)—a form of educational data mining—and the ways academic libraries are beginning to create capacity for LA or participate in wider institutional LA initiatives. Since there are moral issues associated with LA, we consider how data mining practices run counter to ethical principles in the American Library Association's Code of Ethics. Specifically, we address how LA implicates professional commitments to promote intellectual freedom; protect patron privacy and confidentiality; and balance intellectual property interests among library users, their institution, and content creators and vendors.

We make three arguments aligned to these three ethical commitments. First, it is plausible that LA negatively affects the conditions necessary for the free pursuit and dissemination of ideas by tracking and influencing intellectual behaviors. Second, LA is by design a surveillance technology, and it can opaquely pry into students' privacy without empowering student choice over information flows. Third, the ability to surveil students also allows institutions and content vendors to expand the use of digital rights management technologies; moreover, it creates conditions where institutions and content vendors may use student data as a bartering chip in contract negotiations.

We end the paper by positing that librarians have a professional responsibility to reflect on and advocate for their ethical positions when participating in LA. If LA matures further, librarians' ethical commitments may be challenged by institutional actors who do not share their professional values. We recommend that librarians advocate for their ethical positions within and outside the boundaries of their institution, participate in data governance practices to embed their values in information flows, and work closely with policy makers to design policies in ways that consider their professional ethics.

Big Data and Learning Analytics

An Understanding of Big Data

Learning analytics (LA) is a type of Big Data practice. After boyd and Crawford, we capitalize "Big Data" to highlight it as a cultural, technological, and scholarly phenomenon, a "paradigm rather than a particular technology."5 Defining Big Data as a sociotechnical phenomenon fruitfully moves the conversation away from technocentric and myopic definitions of Big Data.6

Big Data practices differ significantly from the flows of information currently considered the norm in scholarly research and assessment.
Current research practices, often called "small data," involve actors developing and curating limited datasets to answer preset and particular questions.7 Smaller datasets and related practices generally respect informational norms, or expectations of information flow, because actors seek consent for data gathering beforehand and use the data as means toward explicitly agreed-upon and respected ends.8

The Big Data ethos, however, motivates actors to develop boundless datasets to be used in ways that are often unanticipated at time of collection. Individuals, organizations, and institutions implementing Big Data initiatives argue that taking an "n = all" approach and aggregating all the data they can get will optimize their data-driven projects.9 Immense datasets, made possible by aggregating and intertwining disparate and diverse sources of data, no longer require painstaking curation of scientific, statistically powerful samples. Instead, data scientists can conduct "fishing expeditions … to look hard for patterns and report any comparisons that happen to be statistically significant."10 (A brief simulation below illustrates how easily such unchecked comparisons produce spurious "significant" findings.)

Learning Analytics as a Big Data Practice

LA is commonly defined as the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."11 To maximize LA's potential, HEIs implementing LA are working to "de-silo"12 so-called "static data" about and created by students stored in departmental offices across campus by building up institution-wide data warehouses13 to capture "fluid data."14 Shacklock explains the difference between static and fluid data this way:

Static data is that which is collected, recorded and stored by institutions [including] student records, staff data, financial data and estates data. Fluid data …is generated through the increasingly digital way a student interacts with their university, such as card swipe data from access-controlled campus buildings, log-ins to the virtual learning environment (VLE) and e-books or online journal downloads.15

Tightly guarded information and technically inaccessible data hinder LA's efficacy. Ubiquitous campus educational, professional, and personal information systems store a significant amount of data and information created by or about students. Data captures begin even before a student applies for admission and extend to alumni status, enabling colleges and universities to probe student behaviors as a snapshot or over time.16 Students are easily accessible data subjects, and HEIs can avoid seeking student consent for data collection, retaining and analyzing student data with impunity because the data is generated as part of the educational process.17 Emerging troves of student data can be enlarged by capturing the metadata, known as "data exhaust"18 or "digital footprints,"19 students leave as they interact with information systems.

Motivations Driving LA

HEIs deploy LA to understand student behaviors within learning contexts and optimize learning environments. Examining student data may directly or only tangentially benefit students, however, or may benefit only the institution itself.
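To make the "fishing expedition" concern described earlier concrete, the following toy Python simulation (our own illustration, not drawn from any study cited here) tests many purely random "behavioral" variables against an equally random outcome. No real relationships exist in the data, yet roughly five percent of the comparisons still come out "statistically significant" at the conventional p < 0.05 threshold.

```python
# Toy simulation of "data dredging": with enough comparisons, some purely
# random variables will correlate "significantly" with an outcome by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_students = 500       # hypothetical students
n_variables = 200      # hypothetical behavioral variables, all pure noise

outcome = rng.normal(size=n_students)                    # e.g., a GPA-like score
behaviors = rng.normal(size=(n_variables, n_students))   # unrelated "usage" data

false_positives = 0
for i in range(n_variables):
    r, p = stats.pearsonr(behaviors[i], outcome)
    if p < 0.05:        # conventional significance threshold
        false_positives += 1

print(f"{false_positives} of {n_variables} random variables appear "
      f"'significant' at p < 0.05")   # expect roughly 5% (about 10) by chance
```

The same dynamic applies when institutional data warehouses are trawled for correlations between student behaviors and outcomes: without corrections for multiple comparisons, some of the reported patterns will be artifacts of chance.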
Goals tangential to student achievement include resource optimization and revenue increases.20 HEIs often believe that LA will unveil hidden cost savings and streamline operational practices.21 Improved fiscal management may enable HEIs to defend themselves against accountability calls and measures due to ongoing tuition increases and the rise in student loan debt.22

Beyond these business intelligence aspects, LA initiatives focus on directly improving student learning experiences.23 Advocates of LA argue that analyzing the emerging trove of student data will finally unpack the "black box" of learning, enabling better comprehension of inputs and outputs relevant to successful learning experiences.24 Specifically, proponents of data-driven education argue that datifying the learning experience will create actionable insights into student success and failure, and predictive algorithms built on large datasets will provide HEIs the tools to determine students' future educational progress.25

Applications of Learning Analytics

Because data on prospective students is valuable to HEIs well before enrollment, admissions departments are developing troves of data and analytic capacities in novel ways. Some of this information comes from standard sources, such as ACT and SAT scores, which admissions departments purchase en masse to microtarget prospective students.26 These same admissions officers are also mining data from social media,27 looking at how prospective students interact with their institution's website,28 and consulting with outside analytics companies.29 HEIs argue that the more they know about their potential incoming class, the more informed and effective their admissions decisions will be.30 Comprehensive data about prospective students can help HEIs understand the level of a given student's interest, the probability that she will apply and enroll, and whether past personal information and academic performance predict her chances of future success and graduation while enrolled.

Once students enroll, institutions gather data on "engagement," hoping to retain them. Whether a student is participating in campus life has historically been difficult to measure, but campus mobile applications can log interactions to track what students are interested in and need help with, or whether they have lost contact with campus.31 HEIs can also examine RFID-embedded student IDs and swipe card activity to learn what campus services and events they are participating in and when.32 Similarly, campuses such as Northern Arizona University have moved to automate attendance tracking via RFID technologies, arguing that attendance correlates with higher academic achievement.33 Some universities automatically notify a student's advisor when an attendance system records multiple absences.34 Other systems, such as Class120, use mobile applications' geolocation capabilities to create "digital fences" around classroom areas. When students are not inside the fence when the class is in session, the system marks them absent and can alert institutional representatives—or even the students' parents.35

Advising is a key variable in student academic achievement,36 so matching students with the most appropriate academic path is often a driving goal for LA initiatives. HEIs have turned to eAdvising systems that use Big Data tools and techniques to analyze student profiles, predict a student's chances of success, and suggest interventions to keep them on track to graduation.
Systems like Degree Compass consider a student's selected major and his or her achievement in past courses, comparing that achievement with similar peers.37 The system then nudges students to enroll in particular classes that "will best help undergraduates move through their programs of study most successfully—and most expeditiously."38 Other systems penalize students for not making adequate progress on their degrees or enrolling in courses not recommended to them.39 Repercussions include forcing students to meet with advisors or choose a new major.

Learning management systems (LMSs), where instructors facilitate online learning experiences for distance and on-campus students alike, often include relatively mature LA technology. These spaces capture metadata related to system ingress and egress and student interactions with peers and learning objects, in addition to content that students create and grades they earn. LA tools like Course Signals or Desire2Learn's Insights measure student progress and peer engagement via social network graphs, and from these data try to identify at-risk students (a simple illustration of this kind of flagging appears below). Purdue University's institutional research on their implementation of Course Signals found that it improved retention, graduation rates, and grades,40 though some have called into question the statistical validity of the university's findings.41

Some institutions and researchers seek to augment learner data in LMSs with biometric and other sensor data from fitness trackers. Oral Roberts University now pressures its incoming classes to purchase Fitbits, arguing that measuring student movement is part and parcel of fulfilling the institution's mission of educating the mind as well as the body and spirit.42 Step and heart-rate data from the Fitbits are automatically sent to the LMS and graded, and students' grades are lowered if they opt out.43 Similarly, the Bill and Melinda Gates Foundation has funded $1.4 million in research to build and develop a so-called "engagement pedometer," which uses galvanic skin response sensors to inform instructors in real time "which kids are tuned in and which are zoned out."44 In the same vein, emerging research is adapting facial recognition software to manage attendance-taking in large classrooms and determine how engaged students are in lessons by analyzing facial movements.45 Other research used passive sensors from mobile phones to infer social and study behaviors by analyzing conversations and GPS-plotted movements.46 The data were then correlated with grade point averages to determine what kind of behaviors related to higher grade outcomes.

Analytics from emerging flows of student information are usually accessible to institutional actors (such as instructors, advisors, and administrators) only, though some proponents of LA argue that enabling students to view their data dashboards could reap benefits. Couched in terms of the quantified-self movement, researchers argue that visualizing learning progress and behaviors can improve student decisions, motivate them, and enhance their autonomy as well as enabling them to set goals and see how behavioral changes affect learning outcomes.47 LA tools like Purdue University's "Pattern" mobile application are beginning to integrate graphs and metrics to inform students of their progress and how they compare with their peers. To date, the level of access a student has to data and analytics about herself is still low, but access by institutional actors is high.
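As a rough illustration of how tools in the Course Signals mold turn logged behavior into an "at-risk" label, consider the sketch below. It is a hypothetical example of our own: the field names, thresholds, and weights are invented for demonstration and do not reproduce Course Signals' or any vendor's actual model.

```python
# Illustrative (hypothetical) early-warning flag built from LMS metadata.
# Thresholds and weights are invented; real systems use proprietary models
# that are not reproduced here.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    student_id: str
    logins_last_14_days: int
    resources_viewed: int
    discussion_posts: int
    current_grade_pct: float

def risk_flag(a: StudentActivity) -> str:
    """Return a coarse traffic-light label from simple activity counts."""
    score = 0
    if a.logins_last_14_days < 3:
        score += 2          # rarely logs in
    if a.resources_viewed < 5:
        score += 1          # little engagement with course materials
    if a.discussion_posts == 0:
        score += 1          # no peer interaction recorded
    if a.current_grade_pct < 70:
        score += 2          # struggling on graded work
    return "red" if score >= 4 else "yellow" if score >= 2 else "green"

example = StudentActivity("s001", logins_last_14_days=1, resources_viewed=2,
                          discussion_posts=0, current_grade_pct=64.0)
print(example.student_id, risk_flag(example))   # -> s001 red
```

Even this toy version makes the surveillance point plain: producing the flag requires logging and retaining a running behavioral profile for every student.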
Academic Libraries, Learning Analytics, and Ethical Challenges

Academic Libraries and Learning Analytics

Academic libraries have rigorously evaluated their services and collections with quantitative and qualitative methods for decades. More recently, some libraries have built LA capacity to analyze student behaviors in library information systems, often to attest to a library's value with respect to student learning outcomes.48 Research projects have begun to consider the relationship between library services and grades, retention, and achievement, representing a "significant turn" in assessment and evaluation.49 The shift from studying student experience in libraries to student achievement reflects contemporary pressures on HEIs, according to the Association of College and Research Libraries.50 Its report stresses that libraries are under increased scrutiny by institutional accreditors and other stakeholders to prove that library practices are aligned with institutional needs and enable positive learning outcomes.

Duderstadt argues that "the university library may be the most important observation post for studying how students really learn," such that joining datasets within and outside of the library will be key to discovering the impact of library services and spaces on student learning.51 While libraries and library researchers have pursued data mining for niche projects,52 larger-scale, Big Data-style mining has proven difficult until libraries began working with other campus departments to develop data warehouses and analytic capacity.53 For instance, Jantti and Cox joined "as many datasets as ethically, politically, and technically possible," mirroring the Big Data ethos motivating many LA initiatives.54

The Huddersfield University (HU) Library Impact Data Project (LIDP) sought a correlation between library activity data and degree attainment.55 Researchers analyzed circulation data from the university's integrated library system, eResource access, and student entries into the library. They removed student identifiers and minimized datasets, which maximized analyses of demographic and academic trends (for instance, to identify correlations among undergraduates in humanities departments) while nominally protecting against student privacy invasions.

The University of Minnesota–Twin Cities (UM-TC) also researched the correlation between library usage and student outcomes such as retention and grade point average.56 UM-TC aggregated usage data from "as many different library services and resources as possible," including: circulation loans and renewals; interlibrary loan requests; database, e-journal, e-book, and website usage; instruction activity such as workshop registrations; and online reference transactions.57 Whereas HU stripped identifying characteristics from their datasets, UM-TC kept them intact at the "broad activity" level without tracking granular digital footprints. For example, UM-TC tracked a student's entry into a database, but not specific queries; they collected checkout data, but without item titles and metadata.
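The sketch below illustrates, in hypothetical terms, the kind of data minimization the HU and UM-TC projects describe: a detailed circulation log is reduced to pseudonymous, "broad activity" counts before analysis. The record structure, field names, and salt value are invented for the example and do not represent either project's actual pipeline.

```python
# Hypothetical illustration of minimizing a circulation log before analysis:
# pseudonymize the student ID and keep only broad activity counts, discarding
# item titles and other granular "digital footprint" fields.
import hashlib

raw_events = [
    {"student_id": "900123", "item_title": "Being and Time", "action": "checkout"},
    {"student_id": "900123", "item_title": "Calculus I", "action": "checkout"},
    {"student_id": "900456", "item_title": "Organic Chemistry", "action": "checkout"},
]

def pseudonym(student_id: str, salt: str = "library-salt") -> str:
    """One-way pseudonym so records can be linked without exposing the raw ID."""
    return hashlib.sha256((salt + student_id).encode()).hexdigest()[:12]

minimized = {}
for event in raw_events:
    key = pseudonym(event["student_id"])
    # Keep only a broad activity count; drop titles and other details entirely.
    minimized[key] = minimized.get(key, 0) + 1

print(minimized)   # e.g., {'<pseudonym>': 2, '<pseudonym>': 1}
```

Note that salted hashing of this kind is pseudonymization rather than anonymization: records can still be re-identified if the pseudonyms are later joined with identified datasets, which is precisely the kind of joining many LA initiatives pursue.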
The library LA literature cites the University of Wollongong (UoW) for pioneering projects in electronic resource tracking.58 Rubel and Zhang explain how academic libraries use authentication technologies, namely Internet protocol (IP) address filtering and proxy servers, to serve licensed content.59 Students identify themselves to authentication systems by logging in; once logged in, they leave traceable digital exhaust that reveals their behavior to libraries, campus information technology departments, and even publishers. UoW's "Library Cube" is "a dataset that joins usage of [UoW library] resources with enterprise systems containing student demographic data and academic performance using a unique identifier, the student number."60 The Cube tracks library-related data also, such as the total number of student item loans and electronic usage information from proxy server logs, including access points, timestamps, duration data, and the specific electronic resources with which students engage. System limitations reduced the authors' ability to glean more information about student item loans, though they leave unclear what they would have preferred to analyze. In a recent update, Jantti notes that library data now tells instructors whether students use library resources, assuming that students are at academic risk when resource usage is low.61 Jantti reports that "[t]he act of intervention has been positive. When students have been contacted by their lecturer to query their limited use of Library [sic] resources, uptake was immediate."62

E-books also enable data mining of student behavior. Not only do e-book providers commonly track what individuals read and when, but they also analyze the data to provide insight into how students read and their comprehension rate.63 Moreover, e-book add-on services include user profiles that track reader interests and preferences.64 Combined with student demographic data and participatory content from e-book spaces (such as collaborative annotations and resource sharing), e-book activity data may be used in LA research.65

Like wider institutional LA projects, academic libraries seek data from their physical environments via so-called "location intelligence" technologies.66 The Measure the Future team is building technical capacity for physical-space surveillance, using computer vision techniques with open source hardware and software to record and analyze user movements. The project's leaders argue that sensor data will "allow librarians to make strategic decisions that create more efficient and effective experiences for their patrons."67 Academic libraries are also refining gate count statistics by capturing swipe-card and RFID data from student IDs.
For instance, Kent State University kept detailed demographic and timestamp information obtained from card swipes, which students were required to provide after the library went to a 24-hour open schedule.68 Similarly, Georgia State University captured student information from card swipes, correlating ingress and egress at specific points in the library with student residential status and grade point average, among other data points.69 Other libraries, such as the University of Oklahoma, have explored using beacon sensors and iOS devices to push customized content and just-in-time information to users' mobile devices while tracking foot traffic.70 Walsh argues that analyzing sensor data about identifiable students moves libraries toward a more informed understanding of how students interact with and learn in these spaces.71

Ethical Problems with LA Participation

It is worthwhile for libraries to positively influence student learning outcomes and develop tighter ties to institutional goals. It is also beneficial for libraries to seek insight into their value, especially given that library expenditures continue to increase at significant rates while libraries receive a smaller and smaller share of their institution's overall budget.72 Despite its potential benefits, LA is intertwined with moral problems given the sensitivity of emerging datasets and the creation of new information flows.73 These initiatives increasingly develop student surveillance systems, otherwise known as dataveillance systems.74 Such systems allow for the creation of comprehensive digital dossiers that institutional actors can use to make consequential decisions about students that risk their autonomy.75 Institutions are purposefully aggregating massive amounts of student data without limiting "data dredging" for correlations that could favor values and interests of powerful actors while disenfranchising students.76 Also important to note is that the correlation studies on which LA advocates build their argument do not always account for or protect against false positives in the statistics; thus, data dredging-based correlation studies may create misleading and harmful paths of action.

Contemporary scholarship is interrogating the social, technical, and policy-related issues associated with LA practices.77 Little has yet been written, however, concerning the specific challenges facing libraries participating in LA initiatives. A notable exception is Showers' Library Analytics and Metrics book.78 Citing Hellman, Showers argues that academic libraries have been struggling with problems—privacy and otherwise—associated with emerging data and information flows since social media matured and data analytics became more accessible.79 Naturally, libraries want to use these data flows to their benefit and to that of their users, but at what cost? "Libraries," Showers writes, "may also be undermining some of the values they have traditionally held so dear" by pursuing analytical insights and creating data-driven services.80

Ethical Codes

In addition to pursuing answers to questions regarding legal compliance, LA advocates inside libraries and throughout higher education are seeking ethical guidance. Thus far, LA's problems are so novel and complex that many have more questions than answers.
Some useful ethical guidance has developed out of the two Asilomar conventions on "learning research in higher education" and "student data and records in the digital era."81 Similarly, Jisc's "Code of Practice for Learning Analytics" sets out fundamental principles for institutions to deploy LA "responsibly, appropriately and effectively."82 However, overarching codes such as Jisc's have weaknesses. These codes generally address LA's potential harms without considering contextual "norms and established values."83 These contexts are bound by "canonical activities, roles, relationships, power structures, norms (or rules), and internal values (goals, ends, purposes)," according to Nissenbaum.84 So, ethical codes written to address LA issues at the macro level (such as for higher education writ large) fail to account for the nuances of meso-level contexts (such as particular institutions) or even micro-level contexts (such as libraries), much less particular populations those departments serve. For instance, broad ethics codes are not typically tuned to the needs of vulnerable student populations (examples: first-in-family, poor, of color, with disabilities). Ethical codes tuned for the needs of more tightly bound contexts will often be able to address the conflicts and compatibilities with LA, whereas general codes may not be able to address specific friction points with existing norms and values. The American Library Association's (ALA) Code of Ethics is one such fine-tuned code that can directly address issues with LA.

The Code of Ethics, which has been consistently reviewed and amended since 1939, guides academic librarians in the United States. In a preface to its eight principles, the code reads: "Ethical dilemmas occur when values are in conflict. The [ALA] Code of Ethics states the values to which we are committed, and embodies the ethical responsibilities of the profession in this changing information environment."85 The ALA recognizes that the Code is a "framework" for ethical decision making, not a series of commandments, and that its principles "cannot and do not dictate conduct to cover particular situations." Research on ethical conflicts in the library and information science literature has shown that the Code of Ethics only serves as a starting point for navigating ethical issues.86 But even though the Code of Ethics does not provide an action pathway for specific circumstances, it still informs librarians of the profession's values in ways that more general codes cannot.

We see several conflicts with the Code of Ethics when libraries provide data to, help with data analysis for, or conduct LA initiatives themselves. LA does not conflict with all eight principles in the Code of Ethics, but the practice clearly implicates the following three principles (following their numbering in the Code of Ethics):

II. We uphold the principles of intellectual freedom and resist all efforts to censor library resources.

III. We protect each library user's right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted.

IV. We respect intellectual property rights and advocate balance between the interests of information users and rights holders.

We take each ethical conflict in turn below in relationship to actual and reasonable hypothetical practices.
Ethical Frictions with Learning Analytics

Intellectual Freedom and Censorship

Modern academic librarianship staunchly advocates for intellectual freedom, defined as "the right of every individual to both seek and receive information from all points of view without restriction [that is to say, censorship]."87 This focus on receiving information, however, is only one aspect of intellectual freedom. Dresang adds that intellectual freedom requires particular conditions that do not restrict thoughts, beliefs, and association with others.88 And Johnson contends that intellectual freedom also enables an individual to create information, not just consume it.89 Summarized, intellectual freedom concerns which information one can access, how one engages with that information in association with others, and the degree to which one may create new information.

LA compromises intellectual freedom when institutional actors, system designers, and algorithms limit opportunities to engage in the creation and consumption of intellectual materials. So-called "nudging" techniques direct students toward particular learning resources or to change their behaviors.90 What connects nudging in learning environments to intellectual freedom concerns is the ability of an instructor to assess and penalize students for not responding to the nudge. Subsequently, the student may decide always and only to pursue materials suggested by LA systems, replacing personal intellectual values with those pushed by his or her institution.91

Consider the following example. Professor Gutierrez uses LA to track his students' interactions with library resources. He uses the system's features to nudge students away from some resources and toward others he believes are trustworthy and valuable for student learning. Surely, the professor has an instructional responsibility to lead students to engage with materials that will yield positive learning outcomes and experiences. Yet, it is just as plausible that the professor could "whitelist" some texts and "blacklist" others based on personal preference or political persuasion—not instructional merit—using LA to nudge students away from these texts. Students would not be provided the opportunity to engage with these materials, or Professor Gutierrez could penalize them for doing so once the system alerts him of his students' behaviors. Either way, tracking and nudging supported by LA limits the degree to which his students can pursue intellectual materials according to their own interests.

Third-party learning environments that track student behaviors also present intellectual freedom issues. Educational platforms that provide students access to packaged articles, e-books, and other resources via library-vendor relationships may suppress access to specific materials to promote their contractual goods. In this case, students have less intellectual freedom in that surveilling actors and systems can carefully craft, via rewards, disciplinary processes, and censorship practices, the environment and the materials students encounter; in so doing, they limit a student's opportunity to "seek and receive" information. When students depart their institutions and lose access to vendor-controlled walled gardens, they are liable not to know how to seek appropriate information elsewhere.

Ethical concerns also arise around tracking student social networks within and outside LMSs.
Desire2Learn, a commonly used LMS, can map social networks as they develop in course sites to provide instructors insights into who is communicating with whom, when, and how often. Instructors and advisors could use such maps to route students to communicate with particular peers. While it may be beneficial to redirect student communications to include students on the fringes of the network or even match students who have not connected, individuals may just as easily manipulate the network and do harm. For instance, instructors may direct students away from peers who present opinions in opposition to their own, or individuals they deem troublesome. Research has uncovered how institutions are already correlating student presence in campus physical spaces such as libraries via WiFi access logs.92 Using such analytics to try to direct student relationships (for instance, to nudge students to interact with higher-achieving peers) would be egregious, but institutions might believe this surveillance-based practice may bolster student achievement. Associated intellectual freedom issues are less about the liberty to interact with and consume materials, and more about the social conditions necessary to exchange ideas freely among peers. When institutions use social network data to manipulate interactions in academic spaces, they damage the free and organic exchange of ideas.

Privacy and Confidentiality

LA naturally invokes privacy issues and concerns about confidentiality of personal information. Student use of materials (such as books, articles, and other materials) may be recorded, analyzed, shared with a variety of actors, and used to intervene in student learning and life choices. These practices in turn damage intellectual freedom. While not explicitly developed in the Code of Ethics, the concern over intellectual freedom and user privacy and confidentiality refers to theories of "intellectual privacy" developed by Richards.93 Intellectual privacy is the protected sphere in which we can cognitively function and safely develop speech without intrusion.

Librarians commonly help instructors develop LMS-hosted courses by embedding resources—articles, links, and the like—for students to access. This service was benign and welcome in the past. Today, however, LA technology tracks when students click on resources linked from or embedded within course sites, as well as how long students read (that is to say, view) the resource, which enables actors with the right privileges to keep a detailed audit of these activities and, consequently, judge students' behaviors.

LA data are often inaccurate because they cannot discern when students are actively reading and interacting with a resource based on clickstream data alone. It may be that students open a PDF in the LMS only to walk away to socialize with peers in the hall of their dormitory or purchase a cup of coffee; alternately, students could simply print off the PDF. Furthermore, it could also be that students track down the reading using a citation instead of clicking through to the reading in the LMS. In any of these three situations, the system would calculate student engagement scores incorrectly. As LA matures and the technology moves away from clickstream data, it could get more accurate engagement data by capturing mouse movements or eye movements via computer cameras.
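A minimal sketch shows just how coarse clickstream-based "engagement" measures are. The example below is hypothetical (invented identifiers and a deliberately naive calculation, not any vendor's actual method); the comments mark the situations described above that it silently misclassifies.

```python
# Hypothetical "engagement" estimate from clickstream timestamps alone.
# It cannot tell active reading from an open-but-idle document, and it scores
# zero for a student who printed the PDF or found the reading elsewhere.
from datetime import datetime

clicks = [
    # (student_id, resource, time the link was opened)
    ("s001", "week3_reading.pdf", datetime(2018, 2, 5, 10, 0)),
    ("s001", "week3_quiz", datetime(2018, 2, 5, 11, 30)),
    ("s002", "week3_reading.pdf", datetime(2018, 2, 5, 9, 15)),
]

def minutes_on_resource(events, student_id, resource):
    """Naively credit the gap between opening a resource and the next click."""
    times = sorted(t for s, _, t in events if s == student_id)
    opened = [t for s, r, t in events if s == student_id and r == resource]
    total = 0.0
    for t in opened:
        later = [u for u in times if u > t]
        if later:
            total += (later[0] - t).total_seconds() / 60
    return total

# s001 is credited with 90 "minutes of reading" even if the PDF sat open unread;
# a student who printed the article or used the citation directly scores 0.
print(minutes_on_resource(clicks, "s001", "week3_reading.pdf"))
```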
We have seen in our own experience how instructors are increasingly relying on these inadequate data as proxies for attendance and to assess student engagement, which aligns with insights from Kruse and Pongsajapan.94 Besides being unfairly graded, the harm for students is that they may shift the intellectual resources they pursue toward those included in the surveillance net and used to grade their progress, eliminating other options. Moreover, they may look upon their instructors, librarians, and others with access to their data with "coded suspicion,"95 which can lead to distrustful relationships and negatively impact "intellectual risk-taking."96

UoW's Library Cube analytics project tracked student use of library resources, noting how that usage immediately increased when instructors informed students that their usage was low. Uptake responding to an intervention may seem to be a benefit of library LA projects, but this finding does not consider potential adverse psychosocial effects. Without empirical data to say otherwise, it is just as plausible that students responded with fear that what they thought to be personal, private information (that is, which resources they accessed) would be used to their disadvantage. In response, they increased their library usage to regain favor with their instructor, but not because they believed it would intellectually benefit them. It is equally disconcerting that some students may come to believe there is some "correct" threshold of library usage or "right" materials they should access. We expect these initiatives to increase library anxiety and decrease trust toward librarians among students, especially urban students of color who have historically been the target of school surveillance.97

Students may chill their intellectual behaviors due to privacy invasion from librarians participating in LA. We should be concerned not only about how this negatively impacts individual student lives, but also how chilling can impact campus and classroom communities. Chilling brought about by LA can harm the sharing of viewpoints and discourage the pushing of intellectual boundaries, both of which often require learner communities to engage in free, unrestricted speech and inquiry.98

Intellectual Property Rights

The nascent value that LA attaches to student data could significantly affect intellectual property (IP) negotiations between digital content vendors and HEIs. These IP concerns interact with digital rights management (DRM) requirements (such as protecting copyrights) attached to library-vendor contracts, and surface how informational and algorithmic products derived from student data could become trade secrets or marketable products. Since access to data is enticing for vendors and institutions alike, librarians will have to navigate ethical quandaries around capturing and providing interaction data about student access to library resources with vendors.

Content vendors track student usage of their digital materials, nominally to protect against illegal access and sharing. In response, the ALA recently proposed guidelines addressing increased tracking:

[M]ost e-book and digital content vendors collect and use library patron data for a variety of reasons, including digital rights management, consumer analytics, and user personalization.
Libraries and vendors must work together to ensure that the contracts and licenses governing the provision and use of digital information reflect library ethics, policies, and legal obligations concerning user privacy and confidentiality.99

The statement highlights a growing tension between digital content providers wishing to track access and librarians who advocate for user protections against tracking. As Rubel notes, however, resolving the "trade-offs between patron privacy and access" to digital resources has proved challenging.100

The Harvard Business Review (HBR) controversially limited use of their articles in digital course packs and electronic reserves by inserting licensing language into an EBSCO package contract that included HBR.101 HBR held that its license—as enacted through EBSCO—trumped any fair-use claims to link to or copy articles for student access outside EBSCO's system. Such licensing language and restricted user access funnels users to vendor-managed systems that track users. Lambert, Parker, and Bashir's research into popular digital content vendors found that vendors often collect, analyze, and share personally identifiable information with third parties, even though public policies stated otherwise.102 And Rubel and Zhang's investigation into 42 unique licensing agreements uncovered the wide spectrum of data collection and sharing protections in existence (or in some cases lacking altogether).103 This suggests that libraries are participating in a "Faustian bargain" whereby vendors and vendor systems provide access to valuable resources at the cost of degrading student privacy and weakening librarians' professional ethics commitments.104

The prospect of vendors or other actors seeking access to student data creates another IP concern: trade secrets. Institutional advantages gained through LA are believed to enable HEIs to position themselves strategically in a "competitive and dynamic" marketplace.105 To give student data or derived findings away could diminish the value gained. Therefore, it is plausible that HEIs are likely to protect their data assets as trade secrets, gaining revenue therefrom through the sale or license of specially curated databases and informational reports, analytic workflows, or custom-made algorithms.

Efforts by HEIs to maximize trade-secret revenue may force librarians into data mining they find unsavory. Early library LA research suggests important correlations between library usage data and student learning outcomes, which HEIs may seek to understand at more granular levels. Doing so, however, will require librarians to monitor student behaviors in physical and digital spaces to determine which behaviors correlate with higher academic achievement or social integration into campus life. HEIs may also pressure libraries to negotiate for licensing contracts that require vendors to contribute student behavior data from their systems to institutional LA data warehouses.

Recommendations

Embedded Values

ALA's Code of Ethics and the values that inform it give professionals strength and direction during times of sociotechnical change. Although the technological momentum behind Big Data seems to legitimize similar practices in many corners of society, LA is still an open system that many actors—including librarians—can influence. The question, then, is how librarians should react to the development of LA initiatives.
We believe librarians should proactively embed their professional values in LA practices and technologies through advocacy. Emerging sociotechnical systems often demand political resistance, especially when powerful parties are positioned to be the primary beneficiaries.106 As LA matures, the values that librarianship supports risk damage from institutional actors who do not share the same ethical perspective and values. Though library values cannot overrule other departments and offices supported by other professional ethics codes, librarian inactivity will cause professional values to be ignored or suppressed.

It is not our aim herein to suggest principles or strict policy recommendations around LA, though work in this area is ongoing. The National Information Standards Organization's work on user privacy and library, publisher, and vendor systems provides useful policy recommendations, as does Asher and Hinchliffe's work specifically related to LA and libraries.107 Instead, we advance recommendations for librarian participation in the evolution of LA initiatives on their respective campuses. None of the following recommendations preclude librarians from pursuing assessment studies, but they do require a higher level of ethical practice should librarians seek to study student behaviors as evinced in data-based systems.

We recommend first that librarians explain and advance the profession's ethics while participating in dialogue around LA initiatives with their institutional and professional peers. Second, librarians can help build campus-wide data governance mechanisms to oversee potentially harmful information flows. Finally, librarians should develop library-specific information policies as well as participate in building institution-wide information policy to govern the ethical use of LA technology. We take each of these recommendations in turn in the following sections.

Advocacy and Constructive Dialogue

It is the responsibility of librarians to advocate for library values and ethical positions by participating in conversations about and design of LA systems at their institutions and within the profession. Wearing librarianship's "ethical standards as badges of honour [sic]" without acting on them risks eroding those standards.108 Librarians must communicate why their values are important, how their ethical perspective conflicts with LA practices, and the ways in which LA initiatives harm students, especially with respect to intellectual freedom and privacy.

Librarians can also serve as ethical exemplars for other HEI actors. Taking the ethical high ground puts pressure on institutional peers to make explicit commitments about data security, governance, and appropriate use, and to justify their data ethics. For instance, several libraries have consciously employed consent-based research using ethnographic methods to demonstrate library value and develop an improved understanding of how library services and collections interact with learning outcomes.109 To leverage their principled actions, librarians need to get a prominent seat at the tables where LA conversations occur. Librarians should talk about their past successes and discuss how they can gain actionable insights without imperiling student intellectual freedom and privacy by trawling through student data.

It is also important for librarians to extend these conversations outside their institutions into their professional circles.
Library professionals need to share their successes and failures with LA as well as how they navigated hard conversations concerning ethics. Librarians should be ethically introspective and openly reflect about the justifications they use to push forward or pull back from LA. Librarians have a duty to, among other things, participate in ALA committees; present at local, state, and international conferences; and publish in professional and scholarly outlets to share their experiences. It is only through these conversations that the profession will find its ethical bearings with respect to LA.

Data Governance

Developing data warehouses to account for emerging information flows and determining the conditions under which actors should have access to data and information involves significant governance challenges.110 Institutional data governance needs naturally overlap with data management services for which librarians have been building capacity during the last few years.111 HEIs would benefit from including librarians in data governance talks because of librarians' technical and conceptual expertise. More important, however, adding librarians to data governance teams provides another opportunity for library professionals to challenge ethically suspect data flows and analytics while shaping governance practices to protect intellectual freedom and privacy.

Information Policy

Librarianship's values would ideally be embedded in library and institutional policies as "normative propositions" that drive data and LA practices, in so doing protecting the interests of students and libraries.112 To accomplish this, librarians need to advocate for the profession's values with institutional policy-makers when they design internal policy documents, memoranda of understanding with partner institutions, and legally binding contracts with third-party service providers. Moreover, such policies may influence the design of LA systems, as those systems would need to respect policy regulations before institutions enter into contracts with vendors.

This might improve librarians' position in licensing discussions with content vendors as well. If librarians are proactive participants in policy construction, they will have the powerful backing of their institutions during these sometimes contentious negotiations. They could, for instance, require vendors to disclose behavioral data they gather, how long they keep it, and what they do with it. Moreover, they could negotiate licenses that minimize the data gathered as well as its retention period, and enable HEI oversight of vendor data practices via license agreements.

Conclusion

Though pursuing LA may lead to good outcomes for students and their institutions, higher education and the library profession still face an ethical crossroads. LA practices present significant conflicts with the ALA's Code of Ethics with respect to intellectual privacy, intellectual freedom, and intellectual property rights. We recommend that librarians respond by strategically embedding their values in LA through actively participating in the conversations, governance structures, and policies that ultimately shape the use of the technology on their respective campuses. Information ethicists, empirical researchers, and practitioners alike must deeply consider the ways in which these data-driven practices may pervade and erode professional values.

Appendix A.
Abbreviations ALA = American Library Association DRM = Digital Rights Management HBR = Harvard Business Review HEI = Higher education institution HU = Huddersfield University ILS = Integrated library system IP = Intellectual property LA = Learning analytics LMS = Learning management system UM-TC = University of Minnesota-Twin Cities UoW = University of Wollongong Notes 1. David McMenemy, “Rights to Privacy and Freedom of Expression in Public Libraries: Squaring the Circle” (paper presented at the annual meeting of the International Federation of Library Associations and Institutions, USA, 2016). 2. Niall Sclater, “Code of Practice for Learning Analytics” (2015), available online at https:// www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf [accessed 26 February 2018]. 3. Charles Sturt University, “CSU Learning Analytics Code of Practice” (2015), available online at www.csu.edu.au/__data/assets/pdf_file/0007/2160484/2016_CSU_LearningAnalyticsCodePrac- tice.pdf [accessed 26 February 2018]; Open University, “Policy on Ethical Use of Student Data for Learning Analytics” (2014), available online at https://www.open.ac.uk/students/charter/ essential-documents/ethical-use-student-data-learning-analytics-policy [accessed 26 February 2018]. 4. Victor M.H. Borden, “The Accountability/Improvement Paradox,” Inside Higher Ed (Apr. 30, 2010), available online at https://www.insidehighered.com/views/2010/04/30/borden [accessed 26 February 2018]; Megan Oakleaf, “The Value of Academic Libraries: A Comprehensive Research Review and Report” (2010), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/ issues/value/val_report.pdf [accessed 26 February 2018]. 5. danah boyd and Kate Crawford, “Critical Questions for Big Data,” Information, Communica- tion & Society 15, no. 5, doi:10.1080/1369118X.2012.678878; Solon Barocas and Helen Nissenbaum, “Big Data’s End Run around Anonymity and Consent,” in Privacy, Big Data, and the Public Good: Frameworks for Engagement, eds. Julia Lane, Victoria Stodden, Stefan Bender, and Helen Nissen- baum (New York, N.Y.: Cambridge University Press, 2014), 46. 6. Jonathan Stuart Ward and Adam Barker, “Undefined by Data: A Survey of Big Data Defini- tions,” arXiv (2013), available online at http://arxiv.org/abs/1309.5821 [accessed 26 February 2018]. 7. Jules J. Berman, Principles of Big Data: Preparing, Sharing, and Analyzing Complex Information (Waltham, Mass.: Morgan Kaufmann, 2013). 8. Helen Nissenbaum, “Privacy as Contextual Integrity,” Washington Law Review 79, no. 1; Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, Calif.: Stanford Law Books, 2010). 9. Alan Rubel and Kyle M.L. Jones, “Student Privacy in Learning Analytics: An Information Ethics Perspective,” The Information Society 32, no. 2; Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think (New York, N.Y.: Houghton Mifflin Harcourt, 2013). 10. Andrew Gelman, “Too Good to Be True,” Slate (July 24, 2013): para. 11, available online at https://www.slate.com/articles/health_and_science/science/2013/07/statistics_and_psychol- ogy_multiple_comparisons_give_spurious_results.html [accessed 26 February 2018]. 11. George Siemens, “Learning Analytics: Envisioning a Research Discipline and a Domain of Practice” (paper presented at the Second International Conference on Learning Analytics and Knowledge, USA, 2012), 4. 12. 
Joanne Ingham, "Data Warehousing: A Tool for the Outcomes Assessment Process," IEEE Transactions on Education 43, no. 2: 132, doi:10.1109/13.848064. 13. John P. Campbell, Peter B. Deblois, and Diana G. Oblinger, "Academic Analytics: A New Tool for a New Era," EDUCAUSE Review 42, no. 4, available online at https://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era [accessed 26 February 2018]; Jerrold M. Grochow, "IT Infrastructure to Support Analytics" (2012), available online at https://library.educause.edu/~/media/files/library/2012/10/erb1212-pdf.pdf [accessed 26 February 2018]; Ronald Yanosky and Pam Arroway, "The Analytics Landscape in Higher Education" (2015), available online at https://library.educause.edu/~/media/files/library/2015/5/ers1504cl.pdf [accessed 26 February 2018]. 14. Niall Sclater, Alice Peasgood, and Joel Mullan, "Learning Analytics in Higher Education: A Review of UK and International Practice" (2016), available online at https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v3.pdf [accessed 26 February 2018]; Mark van Harmelen and David Workman, "Analytics for Learning and Teaching" (2012), available online at http://publications.cetis.org.uk/wp-content/uploads/2012/11/Analytics-for-Learning-and-Teaching-Vol1-No3.pdf [accessed 26 February 2018]. 15. Xanthe Shacklock, "From Bricks to Clicks: The Potential of Data and Analytics in Higher Education" (2016), available online at www.policyconnect.org.uk/hec/sites/site_hec/files/report/419/fieldreportdownload/frombrickstoclicks-hecreportforweb.pdf [accessed 26 February 2018]. 16. W. Kent Barnds, "Does Big Data Know Best?
NSA and College Admissions,” Huffington Post (June 19, 2013), available online at www.huffingtonpost.com/w-kent-barnds/does-big-data- know-best-n_b_3460096.html [accessed 26 February 2018]; Nelia Ereno, Kurt Junshean Espinosa, and Ryan Ciriaco Dulaca, “An Integrated Framework on Alumni Tracking, Individual Profiling for Automated Data Analytics and Engagement,” International Journal of Computer Systems 3, no. 1, available online at www.ijcsonline.com/IJCS/IJCS_2016_0301007.pdf [accessed 26 February 2018]. 17. van Harmelen and Workman, “Analytics for Learning and Teaching.” 18. Mayer-Schönberger and Cukier, Big Data, 113. 19. Phillip Long and George Siemens, “Penetrating the Fog: Analytics in Learning and Edu- cation,” EDUCAUSE Review 46, no. 5 (September/October 2011): 32, available online at https:// er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education [accessed 14 February 2018]. 20. C.L. Philip Chen and Chun-Yang Zhang, “Data-intensive Applications, Challenges, Tech- niques and Technologies: A Survey on Big Data,” Information Sciences 275 (2014), doi:10.1016/j. ins.2014.01.015; Steve LaValle, Eric Lesser, Rebecca Shockley, Michael S. Hopkins, and Nina Kruschwitz, “Big Data, Analytics and the Path from Insights to Value,” MIT Sloan Management Review 52, no. 2, available online at http://sloanreview.mit.edu/article/big-data-analytics-and- the-path-from-insights-to-value [accessed 26 February 2018]; James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers, “Big Data: The Next Frontier for Innovation, Competition, and Productivity” (2011), available online at www.mckinsey.com/business-functions/business-technology/our-insights/big-data-the-next- frontier-for-innovation [accessed 26 February 2018]. 21. Philip J. Goldstein and Richard N. Katz, “Academic Analytics: The Uses of Management Information and T in Higher Education” (2005), available online at https://er.educause.edu/~/ media/files/articles/2007/7/ekf0508.pdf [accessed 14 February 2018]. 22. Angela van Barneveld, Kimberly E. Arnold, and John P. Campbell, “Analytics in Higher Education: Establishing a Common Language” (2012), available online at https://library.educause. edu/~/media/files/library/2012/1/eli3026-pdf.pdf [accessed 14 February 2018]; Darrell M. West, “Big Data for Education: Data Mining, Data Analytics, and Web Dashboards” (2012), available online at www.brookings.edu/research/papers/2012/09/04-education-technology-west [accessed 26 February 2018]. 23. Doug Clow, “An Overview of Learning Analytics,” Teaching in Higher Education 18, no. 6, doi:10.1080/13562517.2013.827653. 24. Paul Black and Dylan Wiliam, “Inside the Black Box: Raising Standards through Classroom Assessment,” Phi Delta Kappan 92, no. 1, doi:10.1177/003172171009200119; Raquel M. Crespo García, Abelardo Pardo, Carlos Delgado Kloos, Katja Niemann, Maren Scheffel, and Martin Wolpers, “Peeking into the Black Box: Visualising Learning Activities,” International Journal of Technology Enhanced Learning 4, no. 1/2, doi:10.1504/IJTEL.2012.048313; Audrey Watters, “How Data and Analytics Can Improve Education,” O’Reilly Radar (July 25, 2011), available online at https://www.oreilly.com/ideas/education-data-analytics-learning [accessed 26 February 2018]. 25. Ryan Baker and George Siemens, “Educational Data Mining and Learning Analytics,” in Cambridge Handbook of the Learning Sciences: 2nd Edition, ed. R. Keith Sawyer (Cambridge: Cam- bridge University Press, 2014). 26. 
26. Ry Rivard, “Predicting Where Students Go,” Inside Higher Ed (Sept. 19, 2014), available online at https://www.insidehighered.com/news/2014/09/19/colleges-now-often-rely-data-rather-gut-hunt-students [accessed 26 February 2018].
27. Emmanuel Felton, “The New Tool Colleges Are Using in Admissions Decisions: Big Data,” PBS Newshour (Aug. 21, 2015), available online at www.pbs.org/newshour/updates/new-tool-colleges-using-admissions-decisions-big-data/ [accessed 26 February 2018]; Eric Hoover, “Facebook Meets Predictive Analytics,” Chronicle of Higher Education (Nov. 6, 2012), available online at https://chronicle.com/blogs/headcount/facebook-meets-predictive-analytics/32770 [accessed 26 February 2018].
28. Eric Hoover, “Getting Inside the Mind of an Applicant,” Chronicle of Higher Education (Sept. 18, 2015), available online at https://chronicle.com/article/Getting-Inside-the-Mind-of-an/233403 [accessed 26 February 2018].
29. Eric Hoover, “What’s a Trove of Insights into College Applicants Worth? $850-Million,” Chronicle of Higher Education (Dec. 16, 2014), available online at https://chronicle.com/article/What-s-a-Trove-of-Insights/150887 [accessed 26 February 2018].
30. Jay W. Goff and Christopher M. Shaffer, “Big Data’s Impact on College Admission Practices and Recruitment Strategies,” in Building a Smarter University: Big Data, Innovation, and Analytics, ed. Jason E. Lane (Albany, N.Y.: SUNY Press, 2014).
31. Carl Straumsheim, “Before the Fact,” Inside Higher Ed (Oct. 18, 2013), available online at https://www.insidehighered.com/news/2013/10/18/u-kentucky-hopes-boost-student-retention-prescriptive-analytics [accessed 26 February 2018].
32. Karl E. Burgher, “Indiana State University” (n.d.), available online at https://www.campusmanagement.com/higher-education-resources/webcasts/compete-and-succeed-with-knowledge-driven-crm-q-a/ [accessed 26 February 2018].
33. Northern Arizona University, “What Are Proximity Card Readers?” (2013), available online at http://www2.nau.edu/lrm22/blackboard/proximity_cards.html [accessed 7 March 2018].
34. Kathy F. Gates, “Classroom Attendance via ID Card [Msg 6],” message posted to http://listserv.educause.edu/scripts/wa.exe?A2=CIO;ffcd6452.1401 [accessed 14 February 2018]; Deetra Wiley, “Attendance Tracking Scanners for UM Classrooms,” TECHnews (Jan. 16, 2013), available online at http://technews.olemiss.edu/new-attendance-tracking-scanners-for-um-classrooms/ [accessed 26 February 2018].
35. Douglas Belkin, “Cracking Down on Skipping Class,” Wall Street Journal (Jan. 14, 2015), available online at www.wsj.com/articles/cracking-down-on-skipping-class-1421196743 [accessed 26 February 2018].
36. Richard J. Light, Making the Most of College: Students Speak Their Minds (Cambridge, Mass.: Harvard University Press, 2001).
37. Tristan Denley, “Degree Compass Course Recommendation System” (2013), available online at https://library.educause.edu/resources/2013/6/degree-compass-course-recommendation-system [accessed 26 February 2018].
38. Ben Wildavsky, “Nudge Nation: A New Way to Prod Students into and through College” (2013): para. 16, available online at https://www.air.org/sites/default/files/publications/Nudge.pdf [accessed 26 February 2018].
39. Marc Parry, “Big Data on Campus,” New York Times (July 18, 2012), available online at www.nytimes.com/2012/07/22/education/edlife/colleges-awakening-to-the-opportunities-of-data-mining.html [accessed 26 February 2018].
40. Kimberly E. Arnold and Matthew D. Pistilli, “Course Signals at Purdue: Using Learning Analytics to Increase Student Success” (paper presented at the Second International Conference on Learning Analytics and Knowledge, USA, 2012).
41. Alfred Essa, “Can We Improve Retention Rates by Giving Students Chocolate?” (Oct. 14, 2013), available online at http://alfredessa.com/2013/10/can-we-improve-retention-rates-by-giving-students-chocolates/ [accessed 26 February 2018]; Michael Feldstein, “Course Signals Effectiveness Data Appears to Be Meaningless (and Why You Should Care),” e-Literate (Nov. 3, 2013), available online at http://mfeldstein.com/course-signals-effectiveness-data-appears-meaningless-care/ [accessed 26 February 2018]; Carl Straumsheim, “Mixed Signals,” Inside Higher Ed (Nov. 6, 2013), available online at https://www.insidehighered.com/news/2013/11/06/researchers-cast-doubt-about-early-warning-systems-effect-retention [accessed 26 February 2018].
42. Oral Roberts University, “Oral Roberts University Integrates Wearable Technology with Physical Fitness Curriculum for Incoming Students” (Jan. 4, 2016), available online at www.oru.edu/news/oru_news/20160104_fitbit_tracking.php [accessed 26 February 2018].
43. Jeff Stone, “Not All Oral Roberts Students Need to Wear Fitbits, and They’re Not Tracked through Campus,” International Business Times (Feb. 3, 2016), available online at www.ibtimes.com/not-all-oral-roberts-students-need-wear-fitbits-theyre-not-tracked-through-campus-2291808 [accessed 26 February 2018].
44. Liana Heitin, “All in the Wrist: Can a Bracelet Measure Student Engagement?” Education Week (June 15, 2012): para. 1, available online at http://blogs.edweek.org/teachers/teaching_now/2012/06/the_student_engagement_bracelet.html [accessed 26 February 2018]; Stephanie Simon, “Biosensors to Monitor U.S. Students’ Attentiveness,” Reuters (June 13, 2012), available online at www.reuters.com/article/us-usa-education-gates-idUSBRE85C17Z20120613 [accessed 26 February 2018].
45. Stan Alcorn, “Facial Recognition in the Classroom Tells Teachers When Students Are Spacing,” Fast Company (Oct. 18, 2013), available online at www.fastcoexist.com/3018861/facial-recognition-in-the-classroom-tells-teachers-when-students-are-spacing [accessed 26 February 2018]; Ben Schiller, “It’ll Be a Lot Harder to Cut Class with This Classroom Facial-Recognition App,” Fast Company (Feb. 17, 2015), available online at www.fastcoexist.com/3042445/itll-be-a-lot-harder-to-cut-class-with-this-classroom-facial-recognition-app [accessed 26 February 2018].
46. Rui Wang, Gabriella Harari, Peilin Hao, Xia Zhou, and Andrew T. Campbell, “SmartGPA: How Smartphones Can Assess and Predict Academic Performance of College Students” (paper presented at the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Japan, 2015).
47. Rebecca Eynon, “The Quantified Self for Learning: Critical Questions for Education,” Learning, Media, and Technology 40, no. 4, doi:10.1080/17439884.2015.1100797; Erik Duval, Joris Klerkx, Katrien Verbert, Till Nagel, Sten Govaerts, Gonzalo Parra, Jose Luis Santos, and Bram Vandeputte, “Learning Dashboards and Learnscapes” (paper presented at CHI 2012, USA).
48. Margie Jantti and Brian Cox, “Measuring the Value of Library Resources and Student Academic Performance through Relational Datasets,” Evidence Based Library and Information Practice 8, no. 2: 135, doi:10.18438/B8Q89F.
49. Janice Simmons-Welburn, Georgie Donovan, and Laura Bender, “Transforming the Library: The Case for Libraries to End Incremental Measures and Solve Problems for Their Campuses Now,” Library Administration & Management 22, no. 3: 132, available online at https://journals.tdl.org/llm/index.php/llm/article/view/1740/1020 [accessed 26 February 2018]; Ed Cherry, Stephanie Havron Rollins, and Toner Evans, “Proving Our Worth: The Impact of Electronic Resource Usage on Academic Achievement,” College & Undergraduate Libraries 20, no. 3/4, doi:10.1080/10691316.2013.829378; Gregory A. Crawford, “The Academic Library and Student Retention and Graduation: An Exploratory Study,” portal: Libraries and the Academy 15, no. 1, doi:10.1353/pla.2015.0003; Joseph R. Matthews, Library Assessment in Higher Education, 2nd ed. (Santa Barbara, Calif.: Libraries Unlimited, 2014); Oakleaf, “The Value of Academic Libraries”; John K. Stemmer and David M. Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” College & Research Libraries 77, no. 3, available online at https://crl.acrl.org/index.php/crl/article/view/16514 [accessed 26 February 2018].
50. Association of College and Research Libraries, “Standards for Libraries in Higher Education” (2011), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/slhe.pdf [accessed 26 February 2018].
51. James J. Duderstadt, “Possible Futures for the Research Library in the 21st Century,” Journal of Library Administration 49, no. 3: 220, doi:10.1080/01930820902784770.
52. Kevin Cullen, “Delving into Data,” Library Journal 130, no. 13, available online at http://lj.libraryjournal.com/2005/08/technology/delving-into-data/ [accessed 26 February 2018]; Scott Nicholson, “Approaching Librarianship from the Data: Using Bibliomining for Evidence-based Librarianship,” Library Hi Tech 24, no. 3, doi:10.1108/07378830610692136; Chwei-Shyong Tsai and Mu-Yen Chen, “Using Adaptive Resonance Theory and Data-Mining Techniques for Materials Recommendation Based on the E-library Environment,” Electronic Library 26, no. 3, doi:10.1108/02640470810879455.
53. Margie Jantti, “Libraries and Big Data: A New View on Impact and Affect,” in Quality and the Academic Library: Reviewing, Assessing and Enhancing Service Provision, ed. Jeremy Atkinson (Cambridge: Chandos Publishing, 2016); John Renaud, Scott Britton, Dingding Wang, and Mitsunori Ogihara, “Mining Library and University Data to Understand Library Use Patterns,” Electronic Library 33, no. 3, doi:10.1108/EL-07-2013-0136.
54. Jantti and Cox, “Measuring the Value,” 166.
55. Ellen Collins and Graham Stone, “Understanding Patterns of Library Use among Undergraduate Students from Different Disciplines,” Evidence Based Library and Information Practice 9, no. 3, doi:10.18438/B8930K; Graham Stone and Bryony Ramsden, “Library Impact Data Project: Looking for the Link between Library Usage and Student Attainment,” College & Research Libraries 74, no. 6, doi:10.5860/crl12-406.
56. Shane Nackerud, Jan Fransen, Kate Peterson, and Kristen Mastel, “Analyzing Demographics: Assessing Library Use across the Institution,” portal: Libraries and the Academy 13, no. 2, doi:10.1353/pla.2013.0017; Krista M. Soria, Jan Fransen, and Shane Nackerud, “Library Use and Undergraduate Student Outcomes: New Evidence for Students’ Retention and Academic Success,” portal: Libraries and the Academy 13, no. 2, doi:10.1353/pla.2013.0010.
57. Nackerud, Fransen, Peterson, and Mastel, “Analyzing Demographics,” 133.
58. Brian L. Cox and Margie Jantti, “Capturing Business Intelligence Required for Targeted Marketing, Demonstration of Value, and Driving Process Improvement,” Library & Information Science Research 34, no. 4, doi:10.1016/j.lisr.2012.06.002; Margie Jantti, “One Score On—The Past, Present and Future of Measurement at UOW Library,” Library Management 36, no. 3, doi:10.1108/LM-09-2014-0103; Jantti, “Libraries and Big Data”; Jantti and Cox, “Measuring the Value”; Alison Pepper and Margie Jantti, “The Tipping Point: How Granular Statistics Can Make a Big Difference in Understanding and Demonstrating Value” (paper presented at the annual meeting of the Australian Library and Information Association Information Online, Australia, 2015).
59. Alan Rubel and Mei Zhang, “Four Facets of Privacy and Intellectual Freedom in Licensing Contracts for Electronic Journals,” College & Research Libraries 76, no. 4, doi:10.5860/crl.76.4.427.
60. Cox and Jantti, “Capturing Business Intelligence,” 310.
61. Jantti, “Libraries and Big Data.”
62. Ibid., 272.
63. Alexandra Alter and Karl Russell, “Moneyball for Book Publishers: A Detailed Look at How We Read,” New York Times (Mar. 14, 2016), available online at www.nytimes.com/2016/03/15/business/media/moneyball-for-book-publishers-for-a-detailed-look-at-how-we-read.html [accessed 26 February 2018]; Elizabeth Henslee, “Down the Rabbit Hole: E-books and User Privacy in the 21st Century,” Creighton Law Review 49, no. 1.
64. Trina J. Magi, “A Content Analysis of Library Vendor Privacy Policies: Do They Meet Our Standards?” College and Research Libraries 71, no. 3, doi:10.5860/0710254.
65. Sam Van Horne, Jae-eun Russell, and Kathy L. Schuh, “Assessment with E-textbook Analytics” (2015), available online at https://library.educause.edu/resources/2015/2/assessment-with-etextbook-analytics [accessed 26 February 2018]; Marc Parry, “Now E-textbooks Can Report Back on Students’ Reading Habits,” Chronicle of Higher Education (Nov. 8, 2012), available online at https://www.chronicle.com/blogs/wiredcampus/now-e-textbooks-can-report-back-on-students-reading-habits/40928 [accessed 26 February 2018].
66. Larry Johnson, Samantha Adams Becker, S.V. Estrada, and A. Freeman, “NMC Horizon Report: 2015 Library Edition” (2015), available online at www.nmc.org/publication/nmc-horizon-report-2015-library-edition/ [accessed 26 February 2018].
67. “Measure the Future” (n.d.), available online at http://measurethefuture.net [accessed 26 February 2018].
68. Edith A. Scarletto, Kenneth J. Burhanna, and Elizabeth Richardson, “Wide Awake at 4 AM: A Study of Late Night User Behavior, Perceptions and Performance at an Academic Library,” Journal of Academic Librarianship 39, no. 5, doi:10.1016/j.acalib.2013.02.006.
69. Jennifer Link Jones, “Using Library Swipe-Card Data to Inform Decision Making,” University Library Faculty Presentations (2010), available online at http://scholarworks.gsu.edu/univ_lib_facpres/21 [accessed 26 February 2018].
70. Matt Enis, “‘Beacon’ Technology Deployed by Two Library App Makers,” Library Journal (Nov. 18, 2014), available online at http://lj.libraryjournal.com/2014/11/marketing/beacon-technology-deployed-by-two-library-app-makers [accessed 26 February 2018]; Eddie Huebsch, “NavApp at the University of Oklahoma Libraries,” EDUCAUSE Review (June 27, 2016), available online at http://er.educause.edu/articles/2016/6/navapp-at-the-university-of-oklahoma-libraries [accessed 26 February 2018].
71. Andrew Walsh, “Blurring the Boundaries between Our Physical and Electronic Libraries: Location-aware Technologies, QR Codes and RFID Tags,” Electronic Library 29, no. 4, doi:10.1108/02640471111156713.
72. Association of Research Libraries, “Expenditure Trends in ARL Libraries, 1986–2012” (2012), available online at www.arl.org/storage/documents/expenditure-trends.pdf [accessed 26 February 2018]; Association of Research Libraries, “Library Expenditure as % of Total University Expenditure” (2013), available online at www.arl.org/storage/documents/eg_2.pdf [accessed 26 February 2018].
73. Paul Prinsloo and Sharon Slade, “Student Privacy Self-management: Implications for Learning Analytics” (paper presented at the Fifth International Conference on Learning Analytics and Knowledge, USA, 2015).
74. Roger Clarke, “Information Technology and Dataveillance,” Communications of the ACM 31, no. 5, doi:10.1145/42411.42413.
75. Ian Kerr and Jessica Earle, “Prediction, Preemption, Presumption: How Big Data Threatens Big Picture Privacy,” Stanford Law Review Online 66 (2013), available online at https://review.law.stanford.edu/wp-content/uploads/sites/3/2016/08/66_StanLRevOnline_65_KerrEarle.pdf [accessed 26 February 2018]; Rubel and Jones, “Student Privacy in Learning Analytics”; Daniel J. Solove, The Digital Person: Technology and Privacy in the Information Age (New York, N.Y.: New York University Press, 2004).
76. Rob Kitchin, “Big Data, New Epistemologies and Paradigm Shifts,” Big Data & Society 1, no. 1, doi:10.1177/2053951714528481; Nassim N. Taleb, Antifragile: Things That Gain from Disorder (New York, N.Y.: Random House, 2012).
77. Jennifer Heath, “Contemporary Privacy Theory Contributions to Learning Analytics,” Journal of Learning Analytics 1, no. 1; Mark MacCarthy, “Student Privacy: Harm and Context,” International Review of Information Ethics 21 (2014), available online at www.i-r-i-e.net/inhalt/021/IRIE-021-MacCarthy.pdf [accessed 26 February 2018]; Abelardo Pardo and George Siemens, “Ethical and Privacy Principles for Learning Analytics,” British Journal of Educational Technology 45, no. 3, doi:10.1111/bjet.12152; Rubel and Jones, “Student Privacy in Learning Analytics.”
78. Library Analytics and Metrics: Using Data to Drive Decisions and Services, ed. Ben Showers (London, U.K.: Facet Publishing, 2015).
79. Eric Hellman, “Libraries Are Giving Away the User-Privacy Store,” Go to Hellman (Aug. 14, 2014), available online at https://go-to-hellman.blogspot.com/2014/08/libraries-are-giving-away-user-privacy.html [accessed 26 February 2018].
80. Showers, Library Analytics and Metrics, 154.
81. Asilomar I, “The Asilomar Convention for Learning Research in Higher Education” (2014), available online at http://asilomar-highered.info [accessed 26 February 2018]; Asilomar II, “Student Data and Records in the Digital Era” (2016), available online at https://sites.stanford.edu/asilomar/ [accessed 26 February 2018].
82. Sclater, “Code of Practice for Learning Analytics.”
83. Showers, Library Analytics and Metrics, 156.
84. Nissenbaum, Privacy in Context, 132.
85. American Library Association, “Code of Ethics of the American Library Association” (2008): 1, available online at www.ala.org/advocacy/sites/ala.org.advocacy/files/content/proethics/codeofethics/Code%20of%20Ethics%20of%20the%20American%20Library%20Association.pdf [accessed 26 February 2018].
86. Kay Mathiesen and Don Fallis, “Information Ethics and the Library Profession,” in Handbook of Information and Computer Ethics, eds. Kenneth E. Himma and Herman T. Tavani (New York, N.Y.: John Wiley and Sons, 2008); Paul Sturges, “Information Ethics in the Twenty-first Century,” Australian Academic & Research Libraries 40, no. 4, available online at www.ifla.org/files/assets/faife/publications/sturges/information-ethics.pdf [accessed 26 February 2018].
87. American Library Association, “Intellectual Freedom and Censorship Q & A” (2016): para. 1, available online at http://www.ala.org/advocacy/intfreedom/censorship/faq [accessed 26 February 2018].
88. Eliza T. Dresang, “Intellectual Freedom and Libraries: Complexity and Change in the Twenty-first Century Digital Environment,” Library Quarterly 76, no. 2, doi:10.1086/506576.
89. Doug Johnson, “The Neglected Side of Intellectual Freedom,” Knowledge Quest (Sept. 26, 2013), available online at http://www.doug-johnson.com/dougwri/the-neglected-side-of-intellectual-freedom.html [accessed 26 February 2018].
90. Colleen Carmean and Philip J. Mizzi, “The Case for Nudge Analytics,” EDUCAUSE Review (2010), available online at http://er.educause.edu/articles/2010/12/the-case-for-nudge-analytics [accessed 26 February 2018].
91. Jeffrey Alan Johnson, “The Ethics of Big Data in Higher Education,” International Review of Information Ethics 21 (2014).
92. Elanor Hall, “University of Melbourne Defends Wi-Fi Tracking of Students as Planning Move amid Privacy Concerns,” ABC News–Australia (Aug. 11, 2016), available online at www.abc.net.au/news/2016-08-12/university-of-melbourne-tracking-students-through-wifi/7723468 [accessed 26 February 2018].
93. Neil Richards, Intellectual Privacy (New York, N.Y.: Oxford University Press, 2015); Neil M. Richards, “Intellectual Privacy,” Texas Law Review 87, no. 2.
94. Anna Kruse and Rob Pongsajapan, “Student-Centered Learning Analytics,” CNDLS Thought Papers (2012), available online at https://cndls.georgetown.edu/m/documents/thoughtpaper-krusepongsajapan.pdf [accessed 26 February 2018].
95. D. Knox, “Spies in the House of Learning: A Typology of Surveillance in Online Learning Environments” (paper presented at EDGE 2010–e-Learning: The Horizon and Beyond Conference, Newfoundland, Canada, 2010).
96. Neil Selwyn, “Data Entry: Towards the Critical Study of Digital Data and Education,” Learning, Media and Technology 40, no. 1, doi:10.1080/17439884.2014.921628.
97. Paul Hirschfield, “School Surveillance in America: Disparate and Unequal,” in Schools Under Surveillance: Cultures of Control in Public Education, eds. Torin Monahan and Rodolfo D. Torres (Piscataway, N.J.: Rutgers University Press, 2009).
98. Daniel J. Solove, “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy,” San Diego Law Review 44 (2007).
99. American Library Association, “Library Privacy Guidelines for E-book Lending and Digital Content Vendors” (2015): para. 2, available online at http://www.ala.org/advocacy/privacy/guidelines/ebook-digital-content [accessed 26 February 2018].
100. Alan Rubel, “Libraries, Electronic Resources, and Privacy: The Case for Positive Intellectual Freedom,” Library Quarterly 84, no. 2: 184, available online at www.jstor.org/stable/10.1086/675331 [accessed 26 February 2018].
101. Joshua Gans, “Harvard Business School Publishing Crosses the ‘Evil’ Academic Line,” Digitopoly (Oct. 6, 2013), available online at www.digitopoly.org/2013/10/06/harvard-business-school-publishing-crosses-the-evil-academic-line/ [accessed 26 February 2018]; Reference and User Services Association, “RUSA/BRASS Statement on Harvard Business Review Pricing and Access” (2013), available online at www.ala.org/rusa/sections/brass/publications/statement_hbr [accessed 26 February 2018].
102. April D. Lambert, Michelle Parker, and Masooda Bashir, “Library Patron Privacy in Jeopardy: An Analysis of the Privacy Policies of Digital Content Vendors” (paper presented at the annual meeting of the Association for Information Science and Technology, 2015).
103. Rubel and Zhang, “Four Facets of Privacy.”
104. Michael Zimmer, “Patron Privacy in the ‘2.0’ Era: Avoiding the Faustian Bargain of Library 2.0,” Journal of Information Ethics 22, no. 1, doi:10.3172/JIE.22.1.44.
105. David W. Leebron, “A Look at the Competitiveness of Higher Education,” The Hill (June 20, 2014), available online at http://thehill.com/blogs/pundits-blog/209980-a-look-at-the-competitiveness-of-higher-education [accessed 26 February 2018].
106. Human Values and the Design of Computer Technology, ed. Batya Friedman (Chicago, Ill.: University of Chicago Press, 1997); Langdon Winner, “Do Artifacts Have Politics?” Daedalus 109, no. 1, available online at www.jstor.org/stable/20024652 [accessed 26 February 2018].
107. National Information Standards Organization, “NISO Consensus Principles on User’s Digital Privacy in Library, Publisher, and Software-Provider Systems” (2015), available online at https://groups.niso.org/apps/group_public/download.php/16064/NISO%20Privacy%20Principles.pdf [accessed 26 February 2018]; Andrew Asher and Lisa Hinchliffe, “All the Data: Privacy, Service Quality, and Analytics” (paper presented at the annual meeting of the American Library Association, San Francisco, Calif., 2015), available online at http://alaac15.ala.org/m/node/28724 [accessed 7 March 2018].
108. McMenemy, “Rights to Privacy.”
109. Michael Khoo, Lily Rozaklis, and Catherine Hall, “A Survey of the Use of Ethnographic Methods in the Study of Libraries and Library Users,” Library & Information Science Research 34, no. 2, doi:10.1016/j.lisr.2011.07.010.
110. Douglas Blair et al., “The Compelling Case for Data Governance” (2015), available online at https://library.educause.edu/~/media/files/library/2015/3/ewg1501-pdf.pdf [accessed 26 February 2018].
111. Sheila Corrall, Mary Anne Kennan, and Waseem Afzal, “Bibliometrics and Research Data Management Services: Emerging Trends in Library Support for Research,” Library Trends 61, no. 3, doi:10.1353/lib.2013.0005; Andrew M. Cox and Stephen Pinfield, “Research Data Management and Libraries: Current Activities and Future Priorities,” Journal of Librarianship and Information Science 46, no. 4, doi:10.1177/0961000613492542.
112. Martin Rein, Social Science and Public Policy (New York, N.Y.: Penguin Books, 1976); E. Sam Overman and Anthony G. Cahill, “Information Policy: A Study of Values in the Policy Process,” Policy Studies Review 9, no. 4: 804, doi:10.1111/j.1541-1338.1990.tb01080.x.