The Impact of Distance Education on Institutional Research

Trudy Bers

The author discusses how technology’s impact on learning and pedagogy—distance learning—presents new challenges to the institutional researcher.

Why is distance education the focus of an article in a New Directions issue devoted to technology and institutional research? Why, anyway, should institutional researchers pay attention to distance education? The answer is simple: distance education is radically changing higher education.

An explosion is taking place in the number of distance education delivery agents and students enrolled in courses. Dolence (1998) notes the following statistics:

• Forty of the fifty states have adopted virtual university strategies.
• More than sixteen thousand courses are indexed on the World Wide Web.
• There are already over one million on-line learners.
• More than 350 companies produce courseware.
• More than one thousand corporations sponsor corporate universities.
• Commercial learning centers are proliferating and successful.

Dolence’s examples reflect distance education delivered primarily via technology in an asynchronous format, clearly the arena in which the most dramatic, exponential growth is occurring. However, distance education actually takes many forms, including old-fashioned correspondence courses; courses delivered via audiotapes or videotapes; interactive television courses requiring real-time, location-specific participation; and on-line courses delivered asynchronously through the Web. The newer forms of distance education simply could not exist were it not for technology.

Distance education is related to other changes occurring in higher education as well, such as movements toward competency-based education, credit for alternative learning experiences, and credentialing through industry or corporate mechanisms rather than through formal college degrees. Distance education facilitates and complements these other changes, delivering learning opportunities in nontraditional ways in nontraditional settings, often to nontraditional students.

I am convinced that distance education will have profound effects on the roles, necessary skills, relationships, and ways of doing business for institutional researchers. In this chapter, I explore what these effects may be. Specifically, I describe ways in which distance education is beginning to influence the current and emerging environment for conducting research about institutions, assessment, and strategic planning. Challenges brought forth by distance education are identified throughout. Next, I present results of a brief survey of institutional researchers in two- and four-year colleges regarding the ways in which they are responding to research demands driven by distance education. I conclude with some broad-based issues raised by distance education and some suggestions for how institutional researchers might address this new educational reality.

Change

Technology is an agent of change, both directly as it affects the ways in which people do their work and indirectly as it influences relationships and expectations. As Bolman and Deal (1991) note, change affects many aspects of an organization, including roles, necessary skills, power relationships, and existing agreements and pacts.
Though distance education is not necessarily based on technology—consider old-fashioned correspondence courses, for example—the reality is that the explosion of distance education is due primarily to technology: interactive television courses, on-line courses delivered asynchronously via the Web, and courses available on CD-ROM. Such courses are dramatically affecting the organization, availability, traditional roles, nature of educational providers, delivery systems, and even the definition of what constitutes a “course.” Taken one step further, distance education will inevitably change the way we think about higher education, because traditional definitions will apply only to a shrinking segment of the industry.

Though not directly caused by distance education, other changes emerging in higher education are often linked with it. Such changes include competency-based education, credit for alternative learning experiences, and credentialing through industry or corporate mechanisms rather than through formal college degrees. For simplicity, I include these related changes under the umbrella of “distance education.”

To illustrate the importance of these changes, consider two candidates for a job to maintain and support an Oracle database at your institution. One has just earned a bachelor’s degree in computer science; he has little work experience, though he claims to be familiar with Oracle databases. The other has passed the Oracle Certified Database Administrator test and has two years of work experience in an environment similar to yours; she has the equivalent of only one year of undergraduate work. Which person is more likely to be hired? (For a brief description of the Oracle Certified Database Administrator certification program and test, see Couchman, 1998. Other software companies, including Microsoft and Novell, have similar programs. These certification programs do not require formal study for individuals seeking certification; instead they focus on demonstrated competencies.)

Though it seems self-evident that distance education will affect institutional research, the extent to which the research community or key decision makers have thought about these effects is unclear. In 1997, the National Postsecondary Education Cooperative (NPEC)1 convened a panel of experts to explore the impact of technology on data systems. Though the panel was not charged to look at distance education per se, many panelists’ comments and the papers written afterward tended to define technology operationally as “distance education.” Because so much reporting of student enrollments, student outcomes, institutional characteristics, and revenues and expenses is in the domain of institutional research offices and also because distance education is virtually a creature of technology, the issues identified by the panelists illustrate ways in which technology affects institutional research.

Cartwright (1998) identified six major, albeit overlapping, themes that emerged from the NPEC panel.

1. Growth in distance and technology-based education renders traditional definitions of student, faculty load, cost, and other measures either meaningless or misleading.

2. The unbundling of educational services, such as curriculum development, course delivery, advising, and assessment, along with changing patterns of student attendance (multiple institutions, stop-in/stop-out), makes it difficult to evaluate outcomes.
A shift to learner-centered rather than institutionally centered data will further affect the utility and appropriateness of institutional research.

3. Faculty roles, including the definitions of workload and contact hours, are changing rapidly, but metrics for calculating and reporting these have not kept pace with these changes. Even more to the point, policies and practices associated with contracts, compensation, evaluation, and tenure have rarely been adapted or made sufficiently flexible to accommodate emerging faculty roles. Though not examined specifically by the NPEC panel, it should be noted that faculty involvement in activities such as advising is likely to change. For example, will advisers be expected to help students choose courses from a number of distance education providers to ensure that the courses are not duplicative and, in combination, satisfy degree requirements or otherwise meet students’ objectives?

4. Student participation patterns, such as attendance at multiple institutions simultaneously, taking courses through nontraditional education providers (for example, proprietary schools and corporate-based training sites), multiple transfers, and time away from school make it difficult to track students or to assess their educational outcomes. It is even difficult to report such basic indicators as completion or transfer.

5. New instructional delivery models will make it more difficult to evaluate students’ progress through postsecondary education and to assess their learning gains—cognitive or achievement measures attributable to enrollment in college courses. Competency-based measures are likely to grow in acceptability and feasibility, supplanting more traditional measures of seat time and credits earned. These new models relate as well to changing patterns of teaching and learning. There is growing emphasis on learner-centered instruction and lifelong learning for continual skill upgrading, professional development, and personal enrichment. Students taking courses for these reasons are less likely to want formal college credits than individuals seeking actual degrees.

6. The final theme of the panel was the impact on Integrated Postsecondary Education Data System (IPEDS) financial reporting: whether and to what extent IPEDS can accommodate and accurately portray revenue streams and expenditures associated with distance education.

In the next sections, I have chosen to focus on the impact of distance education in three primary areas germane to institutional researchers and often assigned wholly or in part to their offices: research about institutions, assessment, and strategic planning.

Impact of Distance Education on Research About Institutions

Institutional research often depends on data and information that are defined, entered, designed, and reported by a variety of other offices in the college or university, as well as by external organizations such as state governing or coordinating boards or other educational institutions. Distance education is putting new pressures on these critical aspects of research and adding complexities to them. Failure to make appropriate adjustments and to accommodate distance education issues in existing data definitions, compilations, management, and exchanges will seriously erode the validity, comprehensiveness, and utility of many institutional research projects.
Data Definitions and Calculating Variables. A major impact of distance education is that it forces the reconceptualization of data definitions and calculations for many variables that are part and parcel of routine reports and analyses conducted by institutional researchers. To illustrate, think of three “standard” measures: faculty load, student population, and credit hours.

Faculty Load. Faculty teaching distance education courses may no longer have the sole responsibility for creating instructional materials, delivering lectures, organizing and facilitating learning activities, and evaluating student performance. Individuals with specific skills, such as developing courseware, might be assigned that role, while other faculty members might provide the courseware content. External certification examinations prepared by industry representatives or testing agencies could well supplant traditional instructor-developed-and-graded examinations.

Several years ago, Armajani, Heydinger, and Hutchinson (1994) proposed a new model for higher education. The “Educational Enterprise” paradigm they conceived unbundles educational services and contracts to provide them through four separate organizations, each of which specializes in a particular area: teaching, facilities, learning resources, and learning technology. Faculty would be part of the teaching organizations, which would provide instruction under contract to the Educational Enterprise.

The Educational Enterprise paradigm is not operational, though the University of Phoenix, Synergistics, and other agencies that package courses for delivery by instructors hired to implement the delivery, but not to design and develop course content, come close. Nevertheless, the concept proposed by Armajani and his colleagues is intriguing for this chapter. The enterprise would foster new definitions of faculty roles by enabling faculty members to concentrate on the instructional services each was most interested in providing. The market-driven character of the enterprise means that either the market would sustain faculty members teaching in traditional ways or they would no longer have jobs. The entrepreneurial nature would also promote redefining faculty roles to improve cost efficiencies as well as quality of deliverables.

The National Center for Higher Education Management Systems (NCHEMS) has put forth the proposition that faculty are assets to an institution and that the nature and expectations for what these assets should be providing are changing (Jones, 1999). Jones suggests that the primary role of faculty is to deliver instruction. This delivery is distinguished by five activities, each of which could be undertaken by a different individual:

• Designing the course or curriculum
• Developing the course or curriculum through selection of materials and similar activities
• Delivering instruction through class meetings that cover previously selected material
• Mediating the learning process by helping students understand material
• Assessing individual student learning

Under the NCHEMS model, one could assign roles to different individuals, with the combination of their work incorporated into a single course.

An article in the Chronicle of Higher Education (Guernsey, 1998) provides another illustration of how faculty roles are changing.
It describes the emergence of a new career track, that of “instructional designer.” Former faculty members or individuals who had initially sought full-time teaching positions appear to be filling these positions, taking either primary or key support roles for preparing instructional materials that in the past faculty members were expected to produce by themselves. A key attribute of instructional designers is their expertise in both academic computing and college teaching.

This development is both practical and threatening. It is a natural extension and elevation of work done by audiovisual and academic technology support staff. Instructional designers provide valuable services for faculty who may feel overwhelmed by the demands of keeping up with their disciplines and becoming technologically savvy. At the same time, the more the work traditionally vested in faculty is outsourced, even to employees of the same institution, the less faculty might be perceived as pivotal to the institution, at least in terms of teaching. This has profound implications as well for faculty reward systems and criteria for tenure and promotion, particularly in institutions that give substantial weight to teaching.

Student Population. Distance education affects another variable typically used in institutional research, student population. No longer do courses begin and end during specific weeks of a semester. Rather, students may enroll continuously, often at more than one institution, so the course load of a single individual could vary by the week. Determining even the number of students enrolled at a single institution becomes problematic, unless the calculation is done at the end rather than near the beginning of an agreed-on period so as to include enrollments from courses that began at any point during the designated period.

This trend has led to some discussion about adding an annual unduplicated count of students to the IPEDS survey in addition to, or some might suggest in lieu of, the fall headcount now collected. No decisions have been made about this; such a change could certainly affect how and when institutions capture and tabulate data. Even if IPEDS continues to collect fall headcounts only, others interested in the number of students served by institutions will undoubtedly want to know the total number of individuals who take courses over the year. This change also has implications for schools with more transient or cyclical enrollments—for example, those with a significant number of individuals who enroll in winter, spring, or summer but are not included in the fall headcounts. Institutions themselves may benefit from having more complete counts of students, particularly if they want to report the total number of individuals served.
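To make the counting issue concrete, the sketch below contrasts a fall-census headcount with an annual unduplicated headcount. It is a minimal illustration in Python, not a reporting specification; the enrollment-record layout, field names, and dates are invented for the example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Enrollment:
    """One student-course enrollment record (hypothetical layout)."""
    student_id: str
    course_id: str
    start_date: date
    end_date: date

def fall_census_headcount(records, census_day):
    """Count distinct students enrolled in at least one course on the census day."""
    return len({r.student_id for r in records
                if r.start_date <= census_day <= r.end_date})

def annual_unduplicated_headcount(records, year_start, year_end):
    """Count distinct students whose enrollment overlaps the reporting year,
    regardless of when their courses began or ended within it."""
    return len({r.student_id for r in records
                if r.start_date <= year_end and r.end_date >= year_start})

# Illustration: a continuously enrolling student who starts in January is
# missed by the fall census but captured by the annual unduplicated count.
records = [
    Enrollment("A1", "ENG101", date(1999, 8, 23), date(1999, 12, 17)),
    Enrollment("B2", "MTH140", date(2000, 1, 10), date(2000, 3, 3)),   # late starter
    Enrollment("B2", "MTH141", date(2000, 3, 6), date(2000, 5, 12)),
]
print(fall_census_headcount(records, date(1999, 10, 1)))                            # 1
print(annual_unduplicated_headcount(records, date(1999, 7, 1), date(2000, 6, 30)))  # 2
```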
Credit Hours. A final example of a commonly used variable requiring reconceptualization because of distance education is the credit hour, the currency on which college degrees are based, in that earning a degree depends solely on obtaining a specific number of credits in designated courses at a specified grade level. The implications of rethinking credit hours are challenging and complex, and they intersect with other changes taking place in postsecondary education. A fuller discussion of some of the issues and implications is presented later in this chapter.

Sharing Data and Data Exchanges. Another area in which distance education affects institutional research derives from student mobility and enrollment in multiple institutions. This pattern has actually existed for years; Adelman (1998) found, for example, that 54 percent of students from the high school class of 1982 who had attended a four-year college by the age of thirty had actually enrolled in more than one school. The growth of distance education and the establishment of remote sites by colleges and universities are likely to swell the number of students who take courses from multiple institutions. A growing challenge will be to determine who these students are and how to count them, as well as how much duplication or overlap of services there might be as students avail themselves of assistance at more than one institution.

The institutional researcher who wants to portray his or her own institution can continue to rely on institutional databases. But to gain a greater understanding of what is really happening to students, it is essential to look beyond a single college or university, perhaps even beyond a single state’s higher education system. For example, we can look at retention or completion within an institution, but from a national perspective and for a richer understanding of what is happening to people, it would be more informative to take a systems approach to retention and completion. What if an institution were to document that 20 percent of its freshmen left but transferred successfully and earned bachelor’s degrees elsewhere? Would that not be an indicator of success for both the institution and those students? Such an approach demands both student-centered data collection and interpretation and a perspective extending beyond a single college or university.

The pressure to share data about students is already great; legislative and public tolerance for accepting answers such as “we don’t know” when colleges are asked about the number of students who graduate or who transfer has been nearly exhausted. A 1995 survey conducted by the State Higher Education Executive Officers (SHEEO) revealed that thirty-two states had comprehensive databases at the state level, including both two-year and four-year public institutions. An additional nine states had some level of statewide or significant systemwide databases (Russell, 1995). Independent institutions are more resistant to sharing, but their ability to hold out is likely to erode as legislators call for an accounting of the extent to which distance education, which so far carries a far greater expense than is usually realized, is really leading to greater productivity and efficiency. Most interpretations of the Family Educational Rights and Privacy Act (FERPA) continue to shield independent institutions from being required to share or exchange unit record data. Although there is resistance to sharing, there is also competitive pressure to deliver the services students and other stakeholders expect and demand. In their quest to understand the full extent and impact of distance education, private institutions will feel ever-growing obligations to share data.

Revamping Databases and Transcripts. Ewell (1998) suggests that, both to facilitate transfer and to represent students’ learning outcomes more comprehensively, transcripts and databases will have to be remodeled.
The student rather than the institution will need to become the principal unit of analysis, and learning experiences beyond traditional credit courses will need to be included. The key point is that as distance education evolves and related changes such as competency-based verification of learning expand, confusion about what constitutes a “credit” will grow. Student mobility across institutions will further exacerbate the confusion unless institutions are willing to accept transfer credits earned through nontraditional means at another college or university.

Currently, transcripts focus on courses taken, credits earned, and degrees awarded. Most transcripts provide detail only for courses taken at the institution issuing the transcript, so that a full history of a student’s postsecondary education requires examining transcripts from all institutions attended. Courses are equated to traditional semester or quarter credit hours.

Contact hours are the commonly used metric for determining the number of credits associated with each course and ultimately the earning of a degree. For example, fifteen to sixteen hours of lecture usually translate into one semester credit. However, most distance learning, particularly when delivered asynchronously, is self-paced, not tied to a given number of minutes or hours in class. When distance education courses are structured to be comparable to traditional on-campus courses in content, nature of assignments, and expected student outcomes, the number of credits attached to the distance education course is rarely in question.

Packer (1998) has proposed the creation of “career transcripts,” which combine features of academic transcripts and résumés. A career transcript will incorporate records of college courses and degrees, competencies documented through vehicles such as industry or corporate certification processes, and educational or workplace experiences and honors that indicate achievement or demonstrated abilities. The career transcript recognizes and emphasizes lifelong learning and the expansion of education across space and time.

For institutional researchers, replacing traditional transcripts with career transcripts will require dramatic revisions in conceptualizing and then calculating measures of student progress and institutional effectiveness. Who will lead efforts to create career transcripts or similar records is unclear. This may be an area where entrepreneurs both inside and outside the academy take the lead—for example, education administrators who see career transcripts as vehicles for generating fees for services and initiating novel ancillary services for students, or external businesspeople knowledgeable about formal colleges and universities but operating outside them who see this as a business opportunity.

The Relevance of Indicators of Quality. Although accrediting and accountability agencies of all sorts now stress outcomes more than inputs as indicators of quality, input variables are still used in a variety of national surveys and institutional promotions to illustrate the quality of a college. A recent special report prepared by the Institute for Higher Education Policy (1999) notes that distance education has the potential for undermining these traditional indicators of quality.
Books in the library, faculty-to-student ratios, and other input measures, which continue to be used as indicators of quality, are quite irrelevant in the context of distance education.

The Separation of Policy and Practice for Distance Education from IR. The design and implementation of distance education programs, courses, and services may be handled at institutions by individuals who are not accustomed to thinking about data collection and reporting. Typically, these people are unaware of nuances or issues for research and reporting. Thus it is important for institutional researchers to be closely linked with the individuals making both policy-level and operational decisions about distance education. Determining whether this is taking place is problematic, however, since it appears that many institutions are pushing to implement at least some distance education to meet governing board or other external funding incentives. Institutions no doubt feel pressures to be “on the cutting edge” without having the time or foresight to think through the implications of their actions. Conversations I have had with administrators at other institutions about subjects such as the contractual implications of distance education and handling services for distance education students suggest that many issues are addressed only when actual questions or problems arise.

Impact of Distance Education on Assessment

Distance education fosters a number of challenges regarding the assessment of student learning outcomes, a central component of accreditation self-studies, accountability reports, performance funding systems, and other mandates for reporting and accountability. Because assessment is often a responsibility of the institutional research office, which may act, for example, as assessment coordinator or faculty consultant, it is important for institutional researchers to be aware of these challenges and strategies for addressing them.

Modes of Delivery. Ewell (1998) has noted three changes in the teaching-learning environment induced by distance education, each of which affects the assessment of student learning outcomes. The first change is pressures resulting from dispersed modes of instructional delivery; these in turn increase the difficulty of aligning instruction with originally established learning goals and maintaining standards. Moreover, distance education may affect learning in ways that are not yet understood or measured.

The second change results from pressures created by increasingly asynchronous delivery modes. Because students progress at different paces, monitoring and measuring their progress must be detached from traditional time-based practices and data systems built on units of time such as contact hours and semesters. Though perhaps not directly germane to institutional researchers, asynchronous delivery and decoupling courses from usual metrics such as meeting hours or weeks in a semester raise questions about faculty office hours, when best to provide advising, and how long a student should be considered to be “enrolled” in a course and eligible for the institution’s support services even if apparently making no progress toward completion.

The third change suggested by Ewell is pressures arising from multi-institutional modes of instruction delivery.
Student mobility across institutions, complicated by students’ earning competency-based certifications through nontraditional means, raises real questions about the extent to which a single institution can assess learning outcomes achieved at that college or even keep track of “credits” and “competency verifications” acquired elsewhere.

Related to this is the issue of college transcripts. As noted, the medium of exchange for transfer, the earned credit recorded on institutionally based transcripts, is no longer applicable in a distance education setting. The typical college transcript records grades and credits earned at the institution and the total of credits transferred to that institution from elsewhere or awarded through alternative means such as proficiency credit or portfolio analysis. Institutional transcripts might not, however, list the specific courses or course equivalencies of transfer or alternative credits and are even less likely to include information about noncredit learning experiences or external certifications. Unfortunately, the transcript is the major resource used by institutional researchers to assess student progress and to calculate accepted, if not appropriate, indicators of institutional effectiveness such as graduation and transfer rates.

Academic Integrity. Distance education poses other assessment challenges as well. Many faculty remain skeptical about whether students in distance education classes are actually doing the work they submit. Though concerns about academic integrity are not a monopoly of distance education, they take on new dimensions in environments where instructors might never meet their students face to face, see examples of their handwriting, or hear their voices. In addition, faculty do not have the capability of giving in-person, real-time assignments that provide benchmarks about students’ knowledge and abilities against which to measure out-of-class work and thereby to verify that the work really was done by the student.

Works in Progress. Time and permanency are another set of assessment issues. Web-based assessment submissions such as papers or projects, whether for course, programmatic, or institution-level assessment, can be modified continually by students. Unless the person collecting materials prints or saves the work at a specific point in time, it is never clear when the material is “final.” This is analogous to the challenge of doing research on live rather than frozen databases. The former may be more current, but continual changes in the database make it virtually impossible to conduct research because one cannot return to the data source with confidence that it is the same each time.

Style Versus Substance. Another issue related to assessment is disentangling presentation from substance. McLean (1999) notes that skills and creativity are unevenly distributed in classes. Recall that students who take courses via the Web and who submit their papers via the Web have all the stylistic resources of the Web at their disposal. Some students are much more capable or interested than others in accessing these resources to enhance their Web-based assignments, using features such as background color and images, links, animation, and audio. This can be confusing and potentially misleading to the evaluator, who may inadvertently confuse style with content. Complex linkages can pull the evaluator off track and make the flow of the “paper” difficult to follow.
The use of nonstandard colors for links can also be problematic. McLean (1999) asserts that “when evaluating dozens of assignment products, the evaluator may come to depend upon the link colors as an indication of visited links (red means we have seen this one) and [become] disoriented if the student elects to reverse the colors (so red means not visited).” But having visited a link, denoted through color, is not the same thing as reading, critiquing, and using information from that site in completing one’s assignment. This is really no different from a student padding a bibliography, claiming to have consulted more references than he or she has, but evaluators will have to train themselves not to be seduced by color cues that come on Web-based assignments.

Reconceptualizing the Student Experience. Yet another assessment issue is reconceptualizing the student experience and then creating and administering assessment tools that are meaningful and appropriate to that experience. For example, it is normal to think of residential students as attending real-time, real-location courses, even if we acknowledge that some may supplement traditional classes with distance education classes. But there is a more dramatic pattern that can emerge: students living on campus because they want to be away from home and have the experience of campus life but taking all of their classes through distance education, never going to a classroom or interacting face to face with teachers or fellow students. How do we assess the experience and learning outcomes of these students? This scenario, suggested by Dan House, director of institutional research at Northern Illinois University, is but one illustration of the kinds of behavioral changes and attendance patterns that are likely to emerge as distance education becomes more ubiquitous and as students discover and create whole new ways to “attend” college.

Assessment of Learning in Traditional Classroom Settings. One of the unexpected consequences of distance education on assessment may be the challenge it poses for improved assessment of traditionally delivered education. Despite nearly a decade of accrediting agency demands for assessment and numerous state agency accountability mandates, institutions are still struggling with assessment. Skepticism about whether students learn through distance education and the need for distance education to “prove itself” may be prompting more thorough research about student learning outcomes in distance education than in traditional courses. But if a key criterion for demonstrating the value of distance education is that its students perform as well as or better than on-campus students, then assessment of student outcomes in those on-campus courses and programs has to occur as well. A book by Thomas L. Russell (1999), The No Significant Difference Phenomenon, and a related Web site (teleeducation.nb.ca/nosignificantdifference) review over three hundred studies on the effectiveness of all types of distance systems. Russell concluded that there is no significant difference in learning outcomes when face-to-face and distance learning options are compared for the same populations.2

Some of the most interesting work about assessment and distance education is taking place in the competency-based curriculum at Western Governors University (WGU), a virtual university.
The competency-based credential delivered through WGU is premised on these fundamental assumptions: competencies are skills or knowledge identified by professionals in a particular field as being essential for mastery of that field; one can demonstrate competencies by completing assessments; and assessments take varied forms, including paper-and-pencil or computer-based tests or practical demonstrations of skills (Dolence, 1998). Instead of completing a set of courses to earn a degree or certificate, WGU students in the competency-based curriculum must demonstrate that they have acquired a specific set of competencies. It may be that one of the most far-reaching effects of WGU will be advancing a national conversation about competency-based education and credentialing, regardless of setting.

Impact of Distance Education on Strategic Planning

A third major area in which distance education will affect institutional research is strategic planning. From a broad perspective, institutions may want to examine whether their use of distance education is truly promoting new ways of teaching and learning and of reengineering the institution. A narrower view would be to examine distance education as a new delivery mechanism premised on existing concepts of instruction (Privateer, 1999). The former is more frightening because it calls into question decades, if not centuries, of academic traditions. Therein lies the real challenge to strategic planning posed by distance education.

Regardless of which approach is taken, there are some key linkages that ought to be made, but rarely seem now to exist, between strategic planning and distance education. Indeed, it appears that in most institutions, planning for distance education is taking place as a separate process from more comprehensive or traditional planning.

Strategic Planning for Educational Delivery in an Integrated System. In most institutions, IR offices direct and manage or at least lend primary support to strategic planning efforts. Traditional strategic planning relies heavily on environmental scanning to detect external trends likely to have an impact on the institution, on competitive analyses to assess what other postsecondary institutions compete for and offer to the same pool of students, and on the identification of strengths, weaknesses, opportunities, and threats affecting the institution. Enrollment management, which is both strategic and tactical, will also be affected by distance education as opportunities for students to enroll in multiple institutions simultaneously or sequentially expand and as students opt to take certain courses through distance education from an institution other than their “primary” one.

In a recent Education Commission of the States policy paper, Mingle and Ruppert (1998) pose five issues on which states will have to play a leadership role. The issues exist at the institutional level as well and provide a framework for guiding strategic discussions about distance education and technology. The issues are state (or institutional) goals and priorities, statewide (or institutional) networks, new organizational structures, cost effectiveness, and financing and investment strategies.

Rapid Change. There are other factors to consider with respect to strategic planning. One of the most compelling is that changes are occurring so rapidly that it is difficult to project or imagine the future beyond two or three years.
Thus time horizons for strategic planning need to be adjusted, and flexibility has to be a key element both in strategic planning processes and in plans themselves.

Growth of Alternative Providers. Another factor is the exponential growth in the number and variety of agents delivering postsecondary education and training, described earlier in this chapter. Identifying, understanding, and addressing “the competition” is growing more complicated, unpredictable, and frustrating. The environment in which a single institution or system operates is no longer constrained by geography, time, national borders, or definitions of entities that have the capability and authority to develop, deliver, and certify learning.

Assessing the Consequences of Entering or Not Entering the Distance Education Market. Institutions must decide whether and to what extent they will offer distance education and what the consequences might be if they choose not to. For example, will a college or university that does not offer any distance education be perceived as old-fashioned, unresponsive to customer demand, resistant to technology? How can distance education be effectively and efficiently integrated with other programs and services and be consistent with the institution’s mission? Can the institution afford distance education? And how can the effectiveness and efficiency of its integration be measured?

Costs. Issues of cost are among the most important factors that need to be considered in strategic planning. Distance education has implications not just for resource use and allocations but for opportunity costs associated with investments in distance education. There are numerous generic and anecdotal assertions that technology will be a cost-effective approach to expanding the delivery of and access to education, enabling schools to do more with less. Privateer (1999) states:

The literal presence of computers on campus, together with a decade-old call to “do more for less,” factor heavily in the growing tendency of federal officials, governors, legislators, governing boards, and college and university administrators to envision instructional technologies as a panacea able to maintain the status quo while dramatically cutting delivery costs. The allure is certainly powerful: lower overall operating and administrative costs, more automated and time-independent instruction, less yet more “productive” instructors, greater course availability and offerings, and access to lower cost resource materials all translate into savings [p. 66].

Berge and Schrum (1998) suggest that a first step in assessing costs is to take an inventory of existing resources, including hardware, software, distance delivery technologies, and technical and faculty support staff, as well as to identify technology-enhanced projects already functioning. With these baseline data in hand, financial analyses can then be made. Berge and Schrum assert that “technology-enhanced courses usually cost more to produce and deliver than traditional courses. . . . Once the analyses are made, the distance education program needs to be compared to other resource allocation opportunities that are presented to . . . the broader institutional decision-making structure for assessment and decisions on whether to move forward with the program and resource commitments” (p. 5).
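To illustrate the kind of baseline comparison Berge and Schrum describe, the sketch below computes a rough per-student cost for a traditional section and a technology-enhanced alternative. It is not their method, and it is not the BRIDGE model discussed next; every figure, cost category, and parameter name is an invented assumption used only to show how amortizing development costs changes the “what if” picture.

```python
def cost_per_student(development, delivery_per_section, support_per_student,
                     sections, students_per_section, offerings_amortized=1):
    """Rough per-student cost: one-time development spread over the number of
    offerings expected to share it, plus per-section delivery and per-student
    support costs. All inputs are hypothetical."""
    students = sections * students_per_section
    amortized_development = development / offerings_amortized
    total = (amortized_development
             + delivery_per_section * sections
             + support_per_student * students)
    return total / students

# "What if" comparison with invented numbers: a course with high up-front
# development cost looks expensive or cheap depending on how many offerings
# (and therefore students) share that investment.
traditional = cost_per_student(development=5_000, delivery_per_section=9_000,
                               support_per_student=50, sections=4,
                               students_per_section=30)
online_first_run = cost_per_student(development=60_000, delivery_per_section=6_000,
                                    support_per_student=120, sections=4,
                                    students_per_section=30, offerings_amortized=1)
online_amortized = cost_per_student(development=60_000, delivery_per_section=6_000,
                                    support_per_student=120, sections=4,
                                    students_per_section=30, offerings_amortized=6)
print(round(traditional), round(online_first_run), round(online_amortized))  # 392 820 403
```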
Jewett (1999) has developed a simulation model, BRIDGE, designed to compare the costs of what he terms “distributed instruction” (television or asynchronous network courses) with those of traditional lecture or laboratory instruction. The model uses one hundred parameters, which can be modified by users to reflect their own situations or to engage in “what if” scenarios. Case studies testing the model provide findings about costs and benefits associated with various types of instructional delivery methods.

Yet another cost-related impact of distance education may result from students’ choosing to take selected courses elsewhere. Although this has always been an option, the availability of distance education courses that a student can take from one college while enrolled primarily at, or without even leaving, another increases the potential for this to occur. Consider the fact that many institutions derive a disproportionate amount of their revenue from a small number of high-enrollment, low-cost general education and service courses offered at the lower division. What budget consequences will there be if students choose to take these courses through distance education from other providers, attracted by course attributes such as nationally known faculty, the entertainment value of instructional materials that capitalize on multimedia productions, and the desire to escape large lecture courses?

Integrating Planning Across Programs. Another effect of distance education on strategic planning is the need to integrate planning for distance education with planning for all academic programs. Organizationally, it is possible for distance education to be lodged in a separate department or college, much as continuing education is often separated from credit and degree programs. However, such segregation can exacerbate what some observers perceive as competition between traditional and distance education for resources and for students and promote the view that distance education is somehow not as legitimate or central to the institution as campus-based courses.

It is possible to conceive of a separate institutional research office for distance learning as well. Already some institutions have quite separate offices to conduct enrollment management studies and research assessing student learning outcomes. Fragmenting institutional research responsibilities across several offices that may not be in close contact can further complicate not just strategic planning but also the institution’s decision-making processes, and it can lead to overlap, duplication, or wasted resources.

Real Experiences, Challenges, and Possible Next Steps

The literature about distance education and its impact on institutional research, or on data and information more generally, is largely speculative, looking toward what should be happening or what might occur in the future. To glean a sense of what is really happening now, I conducted an informal survey of institutional researchers in my state, asking colleagues from both two-year and four-year institutions to respond to open-ended questions about how their institutions defined distance education, what impact distance education has had on their offices so far, and what they anticipate the future impact might be. Their responses provide some important glimpses into the “real world” of institutional research and its expectations regarding distance education.
My colleagues indicated that, at least in Illinois, the impact of distance education on institutional research is largely anticipatory. Most schools do not even have an agreed-on, operational definition of distance education. Only a handful of people have been involved in policy discussions, assessments, or considerations of data definitions and databases that can capture data and information about distance education. Some are beginning to track students, though not everyone has even coded courses to permit identifying students enrolled in distance education classes. Several respondents said they treat distance education students no differently from other students.

I asked about changes expected to occur, realistically, in the collection and reporting of data about students engaged in distance education over the next three to four years. Some of my respondents anticipated that more data about distance education students will be needed, but most either did not answer this question or said they don’t expect distance education students to be differentiated from other students.

Conclusion and Next Steps

It appears that the impact of distance education on institutional researchers and their offices has not been meaningful in most institutions—yet. This generalization grows from a variety of indicators, including the informal survey I conducted, a review of the literature, conversations with individuals who are experts in distance education, and assertions of NPEC panelists.

Speculations abound. The NPEC panel, for example, posed a number of questions and challenges (U.S. Department of Education, 1998). Though the panel used the term technology, it really dealt with distance education and technology-mediated instruction in the broadest sense. According to the panelists, these are the broad-based issues that will need to be addressed, many of which have been discussed in this chapter:

• Current surveys—for example, modifications in institutional and longitudinal surveys that will be required to capture changes in student behavior and participation
• New relationships between learners and providers—for example, definitions of program completers, new sponsors of learning, and the undermining of the relevance of many traditional indicators of quality
• Using the student as the unit of analysis—for example, how we define students, how “completion” is determined, and how we can link students across multiple institutions, learning modes, and agencies that collect student-related data
• Student assessment in a technology-based environment

If we follow a more dramatic and extensive line of thinking, the challenges will be even greater. Distance education could prompt the reexamination and possibly the reconceptualization of the ways in which instruction and the academic enterprise are perceived, organized, staffed, managed, physically located, funded, marketed, and evaluated.

Given this, what should institutional researchers do to prepare themselves and to be proactive in meeting the research and data challenges of distance education?

• Think in new ways about what constitutes courses, credits, degrees, learning experiences, students, faculty, and institutions.
• Find and share concrete examples of what is actually being done in institutions that have some track record of distance education (this is not easy, since the literature is still replete with descriptions or speculations but contains few detail-oriented case studies or examples of problem solving).
• Look for opportunities to link with offices making policy and implementation decisions about distance education and the support systems underpinning not just distance education but also institutional databases, degree monitoring programs, and assessment of learning outcomes.
• Avoid being seduced by skepticism or the attitude that distance education is “just a phase” that will pass, leaving the traditional organization, structure, and delivery of higher education intact.
• Above all, look for opportunities to build bridges and create new partnerships and working arrangements. In the language of technology, enhance connectivity within and among institutions, because the most dramatic impact of technology and distance education is likely to be breaking down barriers—among postsecondary institutions; among roles of faculty and staff; among colleges and universities on the one hand and corporate or other education services providers on the other; among on-campus and off-campus courses; and among credit-earning and noncredit or experiential learning.

The landscape of what constitutes credible, viable, accessible, and valued learning options has become vastly more complex. It is more complicated for students and for institutions to understand, to make sensible decisions about, and to act within. Given institutions’ inherent levels of self-interest, the natural desire for self-preservation, and the decades during which institutions focused on data and information about what occurred on the premises but not in other learning environments, it is simply too early to predict what the real impact of distance education will be on institutional research. What is not too early to predict is that there will be an impact.

Notes

1. NPEC was created in 1994, when Congress authorized the National Center for Education Statistics to create a cooperative with a mission “to identify and communicate ongoing and emerging issues germane to postsecondary education and to promote the quality, comparability, and utility of postsecondary data and information that support policy development, implementation, and evaluation.” NPEC comprises individuals representing all levels of postsecondary education, as well as statewide governing and coordinating agencies, federal government agencies, and national associations.

2. I am indebted to Nofflet Williams, former associate dean for distance learning at the University of Kentucky, for suggesting to me that distance education may well be the agent provocateur in the assessment arena, finally forcing traditionalists to take assessment seriously.

References

Adelman, C. “What Proportion of College Students Earn a Degree?” AAHE Bulletin, 1998, 51(2), 7–9.
Armajani, B., Heydinger, R., and Hutchinson, P. A Model for the Reinvented Higher Education System: State Policy and College Learning. Denver: State Higher Education Executive Officers and Education Commission of the States, 1994.
Berge, Z. L., and Schrum, L. “Linking Strategic Planning with Program Implementation for Distance Education.” CAUSE/EFFECT, 1998, 21(3), 31–38.
Bolman, L. G., and Deal, T. E. Reframing Organizations: Artistry, Choice, and Leadership. San Francisco: Jossey-Bass, 1991.
Cartwright, G. P. “Technology Implications for Data Systems.” Change, July–August 1998, pp. 48–50.
Couchman, J. “Becoming a Certified Oracle DBA.” Oracle Magazine, November–December 1998, pp. 125–130.
Dolence, M. G. “Dawn of the Learning Age.” Paper presented at the Thirty-Second Annual National Conference of the Council for Resource Development, Washington, D.C., Dec. 3, 1998.
Ewell, P. T. “Assessing Student Progress and Learning Gains.” In U.S. Department of Education, National Center for Education Statistics, Technology and Its Ramifications for Data Systems: Report of the Policy Panel on Technology. Publication no. NCES 98-279. Washington, D.C.: National Postsecondary Education Cooperative, 1998.
Guernsey, L. “A New Career Track Combines Teaching and Academic Computing.” Chronicle of Higher Education, Dec. 11, 1998, pp. A35–A37.
Institute for Higher Education Policy. “Distance Learning in Higher Education.” CHEA Chronicle, 1999, 2(1), 1–8.
Jewett, F. “Benefits and Costs of Mediated Instruction Summary.” [www.calstate.edu/special_projects/mediated_instr/summary.html]. Jan. 25, 1999.
Jones, D. “Managing Faculty Assets to Accommodate New Realities.” NCHEMS News, Feb. 1999, pp. 2–5.
McLean, R. S. “Assessing Course Assignments Submitted as Web Pages.” [www.oise.utoronto.ca/~rmclean]. Jan. 13, 1999.
Mingle, J. R., and Ruppert, S. S. Technology Planning: State and System Issues. Denver: Education Commission of the States, 1998.
Packer, A. H. “A Community Human Resource Network.” Unpublished paper, Institute for Policy Studies, Johns Hopkins University, July 23, 1998.
Privateer, P. M. “Academic Technology and the Future of Higher Education: Strategic Paths Taken and Not Taken.” Journal of Higher Education, 1999, 70(1), 60–79.
Russell, A. B. “Advances in Statewide Higher Education Data Systems.” Unpublished paper available through State Higher Education Executive Officers, Oct. 1995.
Russell, T. L. The No Significant Difference Phenomenon. Raleigh: Office of Instructional Telecommunications, North Carolina State University, 1999.
U.S. Department of Education, National Center for Education Statistics. Technology and Its Ramifications for Data Systems: Report of the Policy Panel on Technology. Publication no. NCES 98-279. Washington, D.C.: National Postsecondary Education Cooperative, 1998.

TRUDY BERS is senior director of research, curriculum, and planning at Oakton Community College in Des Plaines, Illinois. She has been chairperson of the National Postsecondary Education Cooperative and president of the Association for Institutional Research. She can be reached at tbers@oakton.edu.