A View from the Trenches: Costing and Performance Measures for Academic Library Public Services

Charles R. McClure

This paper reports the results of a pilot study to identify the issues and concerns of public service middle managers on costing and the use of performance measures. Based on group interviews at two libraries that are members of the Association of Research Libraries (ARL) and additional individual interviews with other librarians, findings suggest that participants have little faith in the usefulness of producing cost data and using performance measures; they believe that the availability of such data rarely has an impact on decision making; in-house data frequently lack reliability and validity; and they are too understaffed to take time away from the provision of services to identify, collect, and analyze such data. Implications of these findings are discussed and specific recommendations are offered to increase the usefulness of costing and performance measures for academic library decision making and planning.

Charles R. McClure is a professor at the School of Information Studies, Syracuse University, Syracuse, New York 13244. The author gratefully acknowledges the funding support of this study from the Council on Library Resources. A version of this paper was presented at the Third Economics Seminar, Wye, Maryland, October 29, 1985, sponsored by the Council on Library Resources.

In recent years, increased concern about the efficiency and effectiveness with which academic libraries operate has encouraged a number of academic librarians, researchers, and consultants to consider methodologies to (1) cost selected services/operations and/or (2) develop performance measures for such services/operations. Despite this interest, evidence about how such methods and data collection techniques actually affect library decision making is difficult to identify.

Costing is a process by which dollar amounts describe a specified library service or operation. Costing frequently is based on assumptions regarding the "value" of a particular activity. Performance measures are quantitative and assess the efficiency (allocation of resources) or effectiveness (accomplishment of objectives) of the library. Output measures, a type of performance measure, concentrate specifically on the effectiveness or quality of a service or product that the library offers its clientele. Cost and performance measure data require carefully developed data collection procedures to ensure their reliability, validity, and utility for decision making.

Academic libraries justify the collection of cost data or the use of performance measures on the basis that they will be used to make better decisions, develop better plans, and, ultimately, increase the overall effectiveness and efficiency of the library (however defined by the library). But the link between organizational decision making and the process of identifying, collecting, and analyzing data for costs and performance measures is not clear.

If one defines decision making1 as that process whereby information is converted into action, then decision making is largely concerned with the process of acquiring, controlling, and utilizing information to accomplish some objective.2
Thus, a major task for the decision maker is to identify what information is needed to serve as a basis for making decisions, to determine the means by which the information can be obtained, and to decide when enough information has been gathered to make a decision. These tasks call for a conscious effort to manage information for library decision making.3

This rational view of decision making4 does not adequately consider organizational politics, personalities, cognitive styles of decision makers, organizational climate, knowledge and competency of the decision maker, and a host of other intervening variables. Yet such factors must be considered. Simply because a method has been established for creating cost data or quantifying the performance of selected library services does not mean that the data will be used to enhance organizational decision making.

This paper reports the results of a pilot project identifying issues and concerns of public service librarians related to costing and the use of performance measures. A formal review of the literature on the economics of academic libraries, costing, and performance measures is outside the scope of this paper. Useful background readings on these topics have been published by Baumol and Marcus,5 Cooper,6 Cummings,7 and Cronin.8

STUDY DESIGN AND METHODOLOGY

The study resulted from an invitation from the Council on Library Resources to prepare a paper for the Third Economics Seminar. The objectives of the study were to
• describe the perceptions and attitudes of a sample of public service academic librarians regarding the use and importance of cost data and performance measures in their library;
• assess the degree to which cost data and performance measures are currently used as a basis for library decision making and planning;
• identify issues that affected the success with which cost data and performance measures could be used to enhance organizational decision making and planning.
Based on the data collected, the investigator could make recommendations for the better utilization of cost data and performance measures in academic library decision making and planning.

Study Design

Given the pilot nature of this study, the lack of existing research on the general topic of middle management9 perceptions of cost and performance measure data, and the need to obtain in-depth data to identify additional areas for further investigation, the investigator relied on group and individual interviews. Group interviews took place with public service middle management librarians at two ARL libraries meeting the following criteria:
• relatively stable administrative leadership, i.e., the director has been at the library a minimum of two years;
• a formally organized public services area that typically included reference, circulation, and interlibrary loan services;
• willingness to participate in the study and allow the investigator to meet with professional library staff, typically the department heads, within the public services area;
• geographically accessible to the investigator.
Each study site was located in a different state, one in the South and the second in the Southwest. One was a public institution and one was private.

In addition to the group interviews, the investigator conducted eleven individual interviews.
This group of librarians included both ARL and non-ARL professionals who met the following criteria:
• did not hold the position of either director or assistant/associate director;
• were employed in a library with at least 500,000 volumes;
• had a minimum of five years' experience in public services;
• had administrative responsibilities for the allocation of resources within the public service area;
• were knowledgeable about, and would feel comfortable discussing, the project topics.
The eleven participants represented nine different states, primarily on the East and West coasts of the United States; seven were located in public institutions and four were located in private institutions. The investigator interviewed some in person, others by telephone.

In a case study setting, group interviews focus on a specific target group and describe various behaviors and the relationship of these behaviors to selected environmental variables or conditions. They allow the investigator to probe in depth and to identify variables and propositions that can serve to direct further research. Oftentimes, such designs provide "an opportunity for an investigator to develop insight into basic aspects of human behavior . . . [and] may lead to the discovery of previously unsuspected relationships."10 Further, such a design is especially relevant for studying knowledge utilization because "the topic covers a phenomenon that seems to be inseparable from its context."11

The selection of two group interview sites and eleven participants for individual interviews was determined by a number of factors. First, the investigator believed that the interviews would generate adequate data to accomplish the study objectives. Second, study constraints included a limited budget that had to cover travel to test-site libraries, long distance telephone interviews, research assistance, and a number of other miscellaneous expenses.

Data Collection

In addition to generating data to respond to the study objectives, the group interviews were designed to
• compare the views held by librarians within different areas of public services in the same library;
• observe the group interaction and discussion on the topic.
During each meeting, the investigator relied on a basic set of interview questions (see appendix A) to guide the discussion. The investigator assured participants that all comments would be confidential and would not be attributed to either a particular institution or individual. In both instances, a great deal of discussion took place about the topic, and the interview sessions averaged about two and one-half hours. During each session the investigator kept notes summarizing the discussion; these were later detailed and expanded.

Between May and July 1985 (after the group interviews were completed) the individual interviews were conducted. This data collection was designed to
• compare the attitudes and opinions from the group interviews with those of a different sample of academic librarians;
• follow up on questions and issues that were identified in the group interviews but not adequately explored by the investigator;
• obtain the attitudes and opinions of academic librarians who were not in a group context with either their peers or superiors.
Some interviews were as short as fifteen minutes, and one lasted for an hour and twenty minutes.
The notes from the group interviews and the individual interviews were analyzed together, and the results are reported later in this paper.

Quality of the Data

The investigator used several techniques to increase the reliability and validity of the data. In terms of reliability, the set of interview questions was pretested by three practicing public service academic librarians. As a result, ambiguous questions were reworded and additional questions were added. Second, the investigator recorded the responses of the participants during each interview and then detailed a summary immediately after the interview. Third, the same interview questions guided both the group interviews and the individual interviews, and there was a significant degree of similarity in their responses.

Validity is an assessment of the extent to which data collection procedures actually measure what the investigator intends them to measure. The specific "measure" investigated in this study was the perceived importance of cost data and performance measures for decision making; it was operationalized by (1) identifying the specific types of cost data and performance measures that were used in that particular library, and (2) determining the number of times interviewees actually used cost data and performance measures as input for decision making. The internal validity of the data was enhanced by clear operationalized definitions of key terms, matching questions both within and across the group interviews and individual interviews, and obtaining the opinion of practicing academic librarians that the questions and definitions had "face validity," i.e., that they would accurately represent the variables under study.

In a pilot study such as this, greater attention is placed on reliability and internal validity than on external validity, or the degree to which the results can be generalized to a larger population.12 Thus, the increased reliability and internal validity, as well as the increased ability to identify propositions for research, to probe into specific areas under investigation, and to obtain detailed information about the phenomenon under investigation, are counterbalanced by a reduced ability to generalize.

FINDINGS AND DISCUSSION

The views of the participating librarians are organized by specific topics. That a topic is discussed first, however, does not imply added significance. A brief summary of the librarians' views and related implications is provided.

Availability of Data for Costing

Interviewees reported that there is "spotty" coverage of cost data in their libraries. At best, the data available can be described as transaction data that quantify the extent or amount of a service that is provided. Instances where costs have been associated with these transactions are traditional areas such as cost per interlibrary loan, cost per online reference search, and personnel costs.

None of the libraries had a coordinated plan for the regular identification, collection, analysis, and reporting of data specifically for library decision making. Furthermore, some of the interviewees worked under authoritative and/or paternalistic management styles with closed organizational climates that could not support the development of an integrated plan for the use of cost and performance measure data.
Although the management styles at the various libraries differed, minimal attention was given to Information Resources Management (IRM) as an administrative device to assist in decision making.13

The process by which cost data are identified and collected is best described as reactive: the library "gears up" to supply data requested by the university, a professional association, or a government agency. There is little sharing or general knowledge about such data. Indeed, during the group interviews, some department heads were surprised to find that others had data that might be useful to their operations. Overall, there was consensus that middle managers (1) did not have access to the cost data that were perceived to be in the director's office, and (2) did not know exactly what cost data were available in the library as a whole.

Performance Measure Data

There is little understanding about public service performance measures: how they are established, their purpose, and their relationship to larger administrative activities such as planning, goal setting, and evaluation.14 One participant commented that "performance measures are just another administrative fad which will be forgotten in a couple of years." The majority of the respondents did not consider the process of establishing and maintaining data for performance measurement attractive because:
• They did not believe that such measures could accurately assess the complexities of their particular services.
• Inadequate time was available to spend on data collection for such measures.
• They did not think the use of performance measures would make a difference in how decisions were made in the library.
• They perceived that the mathematical and statistical skills necessary to compute such measures were excessive and, in some cases, beyond the existing capabilities of the interviewees.
Further, there was some anxiety that performance measures would be used to assess the competency of individual librarians.

Only a few of the interviewees professed familiarity with Objective Performance Measures for Academic and Research Libraries.15 Those familiar with it perceived it to be too complicated and time-consuming to be useful and doubted that the measures would be used for library decision making. Generally, the interviewees were unaware of the work being done by public libraries and state libraries in the area of planning and the use of performance measures. Only three or four knew of Output Measures for Public Libraries,16 and these were not knowledgeable about specific measures.

Interviewees had difficulty identifying specific objectives for individual areas within public services. Rather, they commented on the broad range of their responsibilities and the organization's inability to establish priorities. Given these conditions, it is not surprising that little data are either being provided or actually used for the computation of public service performance measures.

Distrust of Library Cost Data

Overall, the interviewees did not trust library cost data and had a number of reservations about their use in the decision-making process. One reason was that many had collected cost data themselves, were aware of the data's limitations, and knew that the data were not reliable. Also, the librarians clearly believed that it is too easy to manipulate cost and performance measure data, especially in the absence of agreed-upon cost categories, definitions, and performance measures.
This attitude is best summarized by a comment from a department head who said, "We can come up with any number we need on a three-day notice."

Another limiting factor mentioned by virtually all interviewees was the perception that in instances where cost data had been collected, the data had not been linked to the quality of service. The interviewees differed in their views on the relationships between costs and quality, but generally they believed that "high quality services" cost significantly more than mediocre services. No data were available, however, to substantiate this claim.

Because cost data tend not to be related to the quality of a service, decision making becomes cost driven rather than program or service driven. The interviewees provided a number of examples. In one instance, a bibliographic instruction program was eliminated after the department head conducted a study of the program's costs. Although the program was highly regarded by faculty and was assessed as very effective, its costs were seen as excessive and thus it was eliminated. When departments lack clear service objectives (as most apparently did), cost data are viewed in isolation and decisions, apparently, are made solely on a cost basis. The interviewees objected strongly to such uses of cost data.

Many of the comments indicated a distrust of collecting cost data because the data can be (and were) used for purposes other than those originally intended. Interviewees pointed out that cost data and performance measures can be used or interpreted differently by the department, the library, library users, and the university. A number of department heads maintained two sets of cost data on the same service: one for internal departmental use and one for submission to the library administration. In another instance, a department head provided a written explanation of the limitations and assumptions underlying a set of cost data. This information was eliminated from the report submitted to university officials.

Politically Based Decisions versus Cost-Based Decisions

The interviewees believed that even if the library could collect and analyze cost data conscientiously, the data would have little impact on actual decisions made. This sentiment was stressed where cost data were used to justify funding requests at the university administrative level. The interviewees believed that the interpersonal skills, personality, and political savvy of the director were far more important than going to the bargaining table well stocked with cost and performance measure data. As one librarian commented, "It is much more effective for the director to be playing tennis on a regular basis with the vice-president than producing cost data." Indeed, a number of interviewees recounted instances where political and interpersonal factors between library staff and faculty or university administration had significantly greater pay-offs than carefully designed and conducted cost analysis studies.

Political decision making inhibited the use of cost and performance measure data less frequently within the library; however, it was still present. One exchange between two department heads clearly indicated that one was a "favorite" of the director while the other was not. Both knew that the one "could get away with" limited supporting data to justify a change, while the other probably would be unable to justify the change even with high quality cost data supporting his/her position.
It is unlikely this would go unnoticed by other department heads.

Competition with Technical Services

According to respondents, new information-handling technologies, the relative ease with which technical services can be costed, and the "sexy" nature of automated systems placed technical services in "unfair competition" for library resources. Thus, public service activities frequently were shortchanged.

Most of the interviewees readily admitted that their technical service counterparts provided better cost data on activities than they themselves provided. They perceived that production of such cost and performance measure data was much easier to accomplish for technical services, that issues relating costing to the quality of the service were less complex, and that a number of the automated systems produced such data as a by-product of the service. Some interviewees were resigned to this situation and felt little could be done about it. They also believed the importance of public services would increase after the "love affair" with automation ended.

Perceptions versus Reality of "Quality" Services

Another interesting theme was the sense that user perceptions of the quality of public services were more important than specifying costs and assessing the actual quality of that service. Indeed, many interviewees maintained that the generation of cost and performance measure data could be detrimental. Given a choice, faculty probably would prefer to purchase more books and increase the number of periodical subscriptions, especially if they knew what it costs to maintain a high "correct answer fill rate."17 One department head said, "You can't sell services to the faculty but you can sell increased book collections and periodical runs."

Interestingly, the interviewees were not aware of any data in their library that determined how much it costs to provide a correct versus an incorrect reference answer, an in-person versus a telephone answer, reference service by staff category, or similar output measures. They preferred to assume that "high quality" services were provided, but little empirical evidence was produced to support these claims.

When the investigator mentioned studies in academic libraries that showed less than 50 percent accuracy of reference staff on quick-fact and bibliographic questions,18 the interviewees generally doubted that this would occur in their library. As one reference librarian commented, "Most of our questions call for in-depth answers and are not quick-fact or bibliographic in nature." In addition, the use of unobtrusive measures to assess the quality of public services19 was not seen as appropriate for their particular libraries. Yet such measures are valid.

Although performance measures can provide indicators of the quality of a service, the interviewees showed little enthusiasm for using such techniques. They had a high degree of confidence in their intuitive ability to recognize "poor quality" services and saw the use of cost and performance measure data as but a weak and time-consuming replacement for their intuitive skills. Further, the perceptual feedback that many of the librarians receive from direct contact with patrons was seen as providing more useful evaluative information.

Limited Reward Structures

The interviewees perceived few rewards for engaging in the use of cost and performance measure data for decision making.
They suspected that knowledge of actual costs and actual performance would make their jobs more difficult because:
• The time and effort necessary to produce cost and performance measure data would be taken away from direct provision of service to library patrons.
• Actual costs for provision of "quality" services are perceived to be so high that the continuation of these services may be questioned by library and university administration.
• Identification of "poor" performance on a specific service would require remedial action and would represent a change in the status quo.
Further, they did not believe that the availability of cost and performance measure data for public services would significantly enhance their ability to obtain additional funding to improve a service or develop new services.

One interviewee reported that after a significant effort on the part of departmental staff on a cost analysis of online database searching, no decisions resulted for correcting the problems identified in the study. It was "another study cast into the well of decision making never to be heard from again."

In another example, a reference services department completed a cost-effectiveness study on service demands and staffing. As a result of the study, the department was able to save 2.5 FTE staff while maintaining the same level of reference desk staffing. The positions were transferred to technical services despite the pleas of the department head to use them for other activities within public services. She commented, "That was the last time I tried to save money for the library."

In short, most interviewees believe that the use of cost data and performance measures is more trouble than it is worth. Librarians who used these techniques typically were punished rather than rewarded for their efforts. And there was support for the notion that cost and performance measure data are probably best used at the departmental level and may have to be "adjusted" before the librarian submits them to other administrative levels.

Summary Propositions

The views and attitudes expressed during these interviews frequently suggest that significant organizational change will be necessary before cost and performance measure data can be integrated successfully into academic library decision making. Despite the interviewees' degree of concern for the provision of high quality services and their high degree of professional commitment to their jobs, there is little sense that a rigorous and formalized methodology for the collection of cost and performance measure data will assist them, or their staff, in performing better than they do currently.

The general dissatisfaction with the use of cost and performance measure data is summarized below:
• Middle managers are not likely to use costing and performance measure techniques in decision-making processes.
• There are few rewards and benefits for those who provide empirical evidence to justify the costs and quality of public services.
• The perception by users that the quality of services is high will offset reality, if services are poor.
• Intuitive assessments are as accurate as decision making based on cost and performance measure data.
• Funding for public services has been limited, in part, because technical services can better justify their expenses and performance; however, use of similar methods for public services is not appropriate.
• Costing and performance measures cannot adequately assess the quality of public services.
• Costing, planning, and performance measure methodologies developed in public libraries or by state library agencies are inappropriate models for use in academic libraries.

The results from the interviews support the conclusion that many middle managers are
• distrustful of the use of cost and performance measure data;
• unaware of much of the research and development done on the general topic of performance measurement;
• unlikely to use such data, even if available, for library decision making.
But such findings within the individual academic libraries should not be surprising given the limited attention and exposure to these measures. Costing and performance measure methodologies seem to be concepts in search of a practice. Specific strategies will be necessary to remedy this situation.

INCREASING THE USE OF COSTING AND PERFORMANCE MEASURE DATA FOR ACADEMIC LIBRARY DECISION MAKING

Clearly there is a need for academic librarians to increase their basic knowledge of the uses and applications of public services cost and performance measure data for library planning and decision making. An excellent place to begin is with a review of the work done in the public and state library context during the past ten to fifteen years. This literature provides an excellent context to better understand why such measures are needed, how they can be developed and refined, where they are being used currently, and how they can be used to improve library planning and decision making.

A review of this literature is beyond the scope of this paper; it has been summarized, in part, by Lynch.20 However, Performance Measures for Public Libraries,21 published in 1973, stands as a benchmark because the study set the stage for the development of an attitude that public librarians must engage in a process that provides ongoing evaluative data about the performance of the library. The Public Library Association (PLA) supported this study, and it has provided direct support for the publication of A Planning Process for Public Libraries22 in 1980, Output Measures for Public Libraries23 in 1982, and Costing Library Services: A Management Handbook24 in 1985.

Also during the late 1970s and early 1980s, a number of state library agencies became actively engaged in the development of statewide planning and use of performance measures. The Oklahoma Department of Libraries published Performance Measures for Oklahoma Public Libraries in 1982,25 and the Utah State Library published in 1985 the manual Planning, Evaluation, and Measuring for Public Library Excellence.26 An overview of the state libraries' use of performance measures has been written elsewhere.27

PLA, with the Urban Library Council (ULC) and the Chief Officers of State Library Agencies (COSLA), funded the Public Library Development Project in 1985, which will result in an integrated set of manuals/reports related to:
• the planning process;
• performance measurement;
• role setting, i.e., selecting appropriate missions and activities given the library's resources and community;
• a national database of public library statistics.
Karen Krueger and Douglas Zweizig have described this project in greater detail elsewhere.28 The project is scheduled for completion in early 1987 and should provide important methodologies and tools for the assessment of public libraries.29
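Although the manuals differ in scope, most of the measures they promote share a simple arithmetic core. As a purely illustrative sketch (the department, dollar amounts, and question counts below are invented for this paper, not drawn from the study or from the works cited), a unit cost and a correct answer fill rate of the kind discussed earlier might be computed as follows:

```python
# Hypothetical illustration only: all figures are invented, not data from this study.
# Two simple public service measures discussed in the text: a unit cost per
# transaction and a correct answer fill rate (see note 17).

def unit_cost(total_cost_dollars: float, transactions: int) -> float:
    """Average cost per transaction, e.g., cost per interlibrary loan filled."""
    return total_cost_dollars / transactions

def correct_answer_fill_rate(correct_answers: int, questions_asked: int) -> float:
    """Proportion of test reference questions answered correctly."""
    return correct_answers / questions_asked

if __name__ == "__main__":
    # Assumed sample figures for one semester in a hypothetical ILL unit.
    ill_cost = 18750.00        # staff, postage, and supply costs attributed to ILL
    ill_transactions = 2500    # borrowing and lending requests filled

    # Assumed results of a small unobtrusive test of the reference desk.
    correct = 33               # test questions answered correctly
    asked = 60                 # test questions administered

    print(f"Cost per ILL transaction: ${unit_cost(ill_cost, ill_transactions):.2f}")
    print(f"Correct answer fill rate: {correct_answer_fill_rate(correct, asked):.0%}")
```

The arithmetic itself is trivial; as the findings above suggest, the difficulty lies in agreeing on cost categories and definitions, collecting the counts consistently, and tying the resulting figures to explicit service objectives.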
In short, the public and state libraries have a long and productive involvement with the development and testing of planning and performance measures. Over the years, they have promoted a positive attitude and respect for costing and performance measurement of public services by public librarians. Significant amounts of money have been used to finance these projects and produce the various manuals noted above. Despite obvious differences between the public and academic library settings, much of the methodology, many of the measures, and important lessons from this history of involvement can be applied to an academic library.

Findings from this study, as well as lessons from the public library experience with cost and performance measure data, suggest a number of possible strategies. Two different but related areas where strategies will have to be developed to increase their use in academic libraries are
• the professional level;
• the organizational level.

Professional Level Strategies

1. Assess and increase the degree of commitment that can be focused on the development of appropriate methodologies.

There may not be adequate support from the academic library professional community at this time to mount an effort for the development of cost and performance measure methodologies. While there clearly is much talk about the topic, specific actions and products have been few.30 To have a significant impact on the academic library professional community, a commitment to such a project will be necessary from recognized leaders in academic libraries and from professional associations such as the Association of College and Research Libraries (ACRL), with direct support from the Association of Research Libraries (ARL), the Council on Library Resources (CLR), and perhaps other potential funding agencies.

Individual library subscriptions to support research and development programs are an excellent strategy that public libraries have used successfully to increase commitment to such projects. For example, the Public Library Development Project is sponsored by PLA, COSLA, and the ULC. Some thirty state library agencies and more than sixty public libraries are funding the project, each contributing between $500 and $6,000. Such funding strategies lead to commitment and a sense of "owning" a part of the project.

2. Coordinate leadership and planning for developing the methodologies.

Assuming a potential commitment to such projects will materialize, a coordinated plan and vigorous leadership are needed to
• clarify the objectives and content for a development program related to costing and performance measures for academic libraries;
• seek and obtain funding to support research and development in the areas of academic library cost and performance measure methodologies;
• identify and obtain the support of a wide range of academic library leaders to participate in such projects.
The primary players have been ARL and CLR. Significant direct participation by ACRL, federal and/or private funding agencies, and direct involvement from major academic libraries in this country have yet to occur.

The success that public and state libraries have achieved is due largely to long-term coordinated leadership and planning. The PLA activities have covered more than fifteen years of concerted effort; the state library in Oklahoma has been involved in research and development on planning and performance measures for six years.
Both have spent considerable resources to produce and test practical, usable manuals. More importantly, the research and development products from PLA, state library agencies, individual public libraries, and individual researchers and consultants have benefited significantly from one another's contributions.

3. Increase academic librarian awareness of the importance and potential applications of these methodologies.

As the results from this study suggest, a major effort must be mounted to "sell" academic librarians and their directors on the importance of, and applications resulting from, costing and performance measure methodologies. Specific areas where increased awareness is necessary include
• explaining their purpose for library administration;
• describing the current status of the methodologies and how they can be developed and computed;
• relating the measures to library decision making and planning, and to increasing overall library effectiveness.
Although this will not occur overnight, greater national attention is needed. Many academic librarians are unconvinced that these data are either necessary or would contribute to increasing the quality of library services. Academic library leadership has increased the professional community's awareness of other key issues such as library education, preservation, and producing compatible machine-readable bibliographic records. It should be done here as well.31

Organizational Level Strategies

1. Review existing management styles and organizational climates within the academic library.

The results from this study indicated that in some instances the director's management style and the library's existing organizational climate would not support cost and performance measure-based decision making. The management style should include direct support for Information Resources Management (IRM) principles, including:
• Organizational information resources must be carefully identified, and those that contribute to the increased effectiveness, efficiency, or productivity of the organization must be acquired and exploited.
• The quality of library decision making and planning is directly dependent on the ability of the organization to manage cost and performance measure data.
• A carefully established plan for using organizational information resources is essential.
• All librarians should have wide access and exposure to these resources.
To enhance the use of cost and performance measure data it will be necessary to understand basic IRM concepts and to implement specific methodologies.

Second, the organizational climate of the library must support the use of cost and performance measure data for decision making. Organizational climate is a psychologically based method of describing how people's value systems coexist with those of the organization. Measures of organizational climate have proven to be a viable method to distinguish and diagnose an organization's psychological health. A validated methodology for assessing academic library organizational climates has been described elsewhere.32,33

2. Increase the knowledge level of the importance and potential applications of cost and performance measure data.
A program to increase academic librarians' knowledge should include both technical skills and administrative concepts:34
• philosophical underpinnings and justification for the use of cost and performance measure data in library decision making and planning
• background information describing existing methodologies
• in-house training sessions about identifying, collecting, organizing, analyzing, and reporting data
• demonstrations about how individual academic librarians can use specific data for decision making and planning.
Clearly, a broad range of educational content and formats can be used to increase the knowledge of library professional staff about the uses and potential applications of these data. But this must be individually tailored to the needs of the library staff.

3. Develop administrative systems that support the identification, collection, organization, analysis, and reporting of cost and performance measure data.

Careful consideration is needed to design Management Information Systems (MIS) and Decision Support Systems (DSS) that can assist librarians to identify, collect, organize, analyze, and report the data.35 The participants in the interviews reported that there were few structures in place for a regular, ongoing process to manage the cost data that were available. Typically, they described a "reactive" rather than a "proactive" approach.

Although figure 1 represents a broad overview of the reactive and proactive approaches, some differences are important to note:
• The reactive approach offers neither a philosophy of organizational information management nor a set of specific objectives/policies for the use of cost and performance measure data.
• The proactive system calls for a formal MIS or DSS.
• There are no feedback loops in the reactive approach: formal evaluation does not take place, integration of new cost and performance measure data typically does not occur, and thus no assessment can be made of the reliability and validity of the data.
• Wide accessibility to the data is provided in the proactive system, and data are integrated into organizational decision making and planning.
As suggested earlier, the reactive approach best describes the libraries as reported by the interviewees.

To generate usable data, an organizational system must be in place that recognizes the interactive aspects of policy making, encourages wide access to the data, and recognizes that empirical data are used in a much broader psychological context of organizational politics, personalities, and conflicting objectives. The proactive approach shown in figure 1 can be designed to accommodate these factors and still enhance rational decision making; the reactive approach discourages rational decision making and encourages a political, personality-based, intuitive decision-making process.

[Figure 1. Reactive versus Proactive Library Information Management Systems. The reactive path runs from a request for data, to supplying existing data or attempting to collect it (or no response), to decentralized data storage in the office of the data collector. The proactive path runs from an organizational philosophy of IRM and supportive management, to cost and performance measure policies and objectives (related to library goals and objectives), to a formalized MIS or DSS to identify, collect, organize, analyze, and report cost, performance measure, and other data, to integration of the information into library decision making and planning, with regular evaluation of the effectiveness and efficiency of the MIS/DSS and revision as needed.]

4. Establish reward structures for librarians who use cost and performance measure methodologies for library decision making.

Academic librarians who are involved in the use of cost and performance measure data frequently are not rewarded for such involvement, or worse, may be punished (indirectly).
Academic library administrators can use a wide range of motivational strategies and reward structures to encourage the use of these methodologies, including:
• Release time: Given the heavy load of responsibilities for many academic librarians, new responsibilities to produce cost and performance measures should occur only after release from other responsibilities.
• Provide evidence that the cost and performance measure data are used in decision making: Too often, librarians perceive the collection of such data as simply an exercise; if the data are to be collected, make certain, and show evidence, that they are used for library decision making and planning.
• Negotiate reallocation of resources saved as a result of cost or performance measure studies: If studies by an individual or a department identify instances where resources can be saved or services made more effective, that department should profit in some way.
• Provide resource support to assist in the use of cost and performance measure data: The need for training programs already has been discussed, but provision of equipment (e.g., microcomputers and software), staff assistance, or other resources to aid in the data collection and analysis process will facilitate the successful use of cost and performance measure data.
Other motivational techniques also are possible and include a wide range of both intrinsic and extrinsic reward structures. These structures will have to be designed in light of the unique characteristics of the library staff and the reward structures available.

FACING THE CHALLENGE

Much of the literature on the topics of cost and performance measure methodologies assumes that using such methods will increase library efficiency and effectiveness. However, there is limited empirical evidence that can be used to verify this assumption, despite the fact that a growing number of libraries have been involved in the use of cost and performance measure data and are willing to "testify" about their importance for increasing library effectiveness. The testimonials usually stress the use of such methodologies as an administrative self-diagnostic tool and not as a means to justify funding increases.

Academic librarians must determine if the formal use of cost and performance measure data is either appropriate or necessary in their library. The determination will have to consider other library priorities and the resources necessary to establish a system for the effective use of the data for library decision making and planning. But, most importantly, they should remember that effective use may have a significant overall organizational impact: resources can be allocated more efficiently and objectives can better meet user information needs.
The view from the middle management trenches is a bit paradoxical. Tools that will assist academic librarians to manage their areas better are badly needed. But the climate to support their use for better management is seldom available. If increased use is to occur, professional leadership and organizational development will be required. The methodological tools can be created, improved, and refined. Resources can be marshaled. A strong commitment and sense of purpose are necessary to meet the challenges to be faced in using cost and performance measure methodologies for improved academic library decision making and planning.

REFERENCES AND NOTES

1. This paper assumes that decision making is a daily occurrence for all professional librarians and not limited only to library administrators.
2. Colin Eden and John Harris, Management Decision and Decision Analysis (New York: Wiley, 1975), p.57.
3. Charles R. McClure, "Management Information for Library Decision Making," in Advances in Librarianship, V.13, Wesley Simonton, ed. (New York: Academic, 1984), p.1-47.
4. Herbert Simon, Administrative Behavior, 3d ed. (New York: The Free Press, 1976).
5. W. J. Baumol and M. Marcus, Economics of Research Libraries (Washington, D.C.: American Council on Education, 1973).
6. Michael D. Cooper, "Economies of Scale in Academic Libraries," Library Research 5:207-19 (Summer 1983).
7. Martin M. Cummings, "The Effect of U.S. Policies on the Economics of Libraries," Bulletin of the Medical Library Association 73:1-8 (Jan. 1985); and The Economics of Research Libraries (Washington, D.C.: Council on Library Resources, 1986).
8. Mary J. Cronin, Performance Measurement for Public Services in Academic and Research Libraries (Washington, D.C.: Association of Research Libraries, 1985).
9. The term middle management is used loosely in this paper to represent professional academic librarians who report to an assistant/associate director or the director, who allocate resources related to the provision of services and operations, and who have other professional librarians reporting to them.
10. Donald Ary, Lucy Chesser Jacobs, and Asghar Razavieh, Introduction to Research in Education, 3d ed. (New York: Holt, 1985), p.323.
11. Robert K. Yin, "The Case Study as a Serious Research Strategy," Knowledge: Creation, Diffusion, Utilization 3:97-114 (Sept. 1981).
12. Yvonna S. Lincoln and Egon G. Guba, Naturalistic Inquiry (Beverly Hills, Calif.: Sage Publications, 1985).
13. A detailed discussion of Information Resources Management (IRM) is outside the scope of this paper. Levitan and Synnott and Gruber provide useful background readings on the subject: Karen B. Levitan, "Information Resource(s) Management-IRM," in Annual Review of Information Science and Technology, V.17, Martha Williams, ed. (White Plains, N.Y.: Knowledge Industry, 1982), p.227-66; William R. Synnott and William H. Gruber, Information Resources Management (New York: Wiley, 1981).
14. The library profession is only recently considering methods of integrating cost and performance measure data into the planning and decision-making process. Both academic and public library publications on these topics (e.g., Cronin, Riggs, Kantor, Association of Research Libraries, Rosenberg, Zweizig and Rodger, and Palmour et al.) provide useful treatments of either costing, planning and decision making, or performance measures. However, an integrated treatment of these topics has yet to be accomplished successfully.
Cronin, Performance Measurement . . . ; Donald E. Riggs, Strategic Planning for Library Managers (Phoenix: Oryx, 1984); Paul Kantor, Objective Performance Measures for Academic and Research Libraries (Washington, D.C.: Association of Research Libraries, 1984); Association of Research Libraries, Strategic Planning in ARL Libraries [Spec Kit no. 108] (Washington, D.C.: ARL, 1984); Philip Rosenberg, Costing Library Services: A Management Handbook (Orlean, Va.: Philip Rosenberg and Associates, 1985), [Draft]; Douglas L. Zweizig and Eleanor Jo Rodger, Output Measures for Public Libraries (Chicago: American Library Assn., 1982); Vernon E. Palmour, Marcia C. Bellassai, and Nancy DeWath, A Planning Process for Public Libraries (Chicago: American Library Assn., 1980).
15. Kantor, Objective Performance Measures . . . , 1984.
16. Zweizig and Rodger, Output Measures for Public Libraries, 1982.
17. Correct answer fill rate is an unobtrusive performance measure that assesses the quality of reference service in terms of the percentage of correct answers provided by reference staff compared to all questions answered. See McClure for more discussion of these and other unobtrusive reference performance measures. Charles R. McClure, "Output Measures, Unobtrusive Testing, and Assessing the Quality of Reference Services," in Evaluation of Reference Services, Bill Katz and Ruth A. Fraley, eds. (New York: Haworth, 1984), p.215-34.
18. Crowley offers a summary table of the results from unobtrusive tests of academic and public library reference services. Terence Crowley, "Half-Right Reference: Is It True?" RQ 25:59-68 (Fall 1985).
19. McClure, "Output Measures, Unobtrusive Testing . . . ," 1984.
20. Mary Jo Lynch, "The Public Library Association and Public Library Planning," in Planning for Library Services: A Guide to Utilizing Planning Methods for Library Management, Charles R. McClure, ed. (New York: Haworth, 1982), p.29-42.
21. Ernest R. DeProspo, Ellen Altman, and Kenneth E. Beasley, Performance Measures for Public Libraries (Chicago: American Library Assn., 1973).
22. Palmour and others, A Planning Process for Public Libraries, 1980.
23. Zweizig and Rodger, Output Measures for Public Libraries, 1982.
24. Rosenberg, Costing Library Services . . . , 1985.
25. Oklahoma Department of Libraries, Performance Measures for Oklahoma Public Libraries (Oklahoma City, Okla.: Oklahoma Department of Libraries, 1982).
26. Utah Advisory Committee, Project Upgrade: Planning, Evaluation, and Measuring for Public Library Excellence (Salt Lake City, Utah: Utah State Library, 1985), [Draft].
27. Charles R. McClure, "The Role of the State Library Agency for Developing Statewide Performance Measures, Planning, and Standards," in State Library Services and Issues: Facing Future Challenges, Charles R. McClure, ed. (Norwood, N.J.: ABLEX, 1986), p.187-211.
28. Karen J. Krueger and Douglas L. Zweizig, "Getting It Together: A Report on PLA's Work in Progress," Illinois Libraries 66:233-36 (May 1984).
29. The author is currently serving as the principal investigator for the Public Library Development Project.
30. An ACRL Task Force on Performance Measures, of which this author was a member, issued a final report in 1984 that endorsed the use and appropriateness of performance measures in academic libraries. Subsequently, the ACRL Committee on Performance Measures was established with Virginia Tiefel as chair.
31. For example, the Council on Library Resources' program "Innovation and Improvement of Basic and Supplementary Education for Academic and Research Libraries" has brought considerable attention to the issues related to improving the quality of academic librarianship education.
32. Charles R. McClure and Alan R. Samuels, "Factors Affecting the Use of Information for Academic Library Decision Making," College & Research Libraries 46:483-98 (Nov. 1985); and Alan R. Samuels and Charles R. McClure, "Utilization of Information for Decision Making Under Varying Organizational Climate Conditions in Public Libraries," Journal of Library Administration 4:1-20 (Fall 1983).
33. This author and Alan R. Samuels have been involved in an ongoing research project to develop validated instruments that measure five dimensions of organizational climate, e.g., innovation, support, freedom, democratic governance, and esprit; and three dimensions of information processing, e.g., information acquisition, information dissemination, and information evaluation. Webrick has done additional testing of these measures. The measures appear to provide a practical method to assess either an academic or public library's organizational climate and information processing characteristics. Sue Webrick, The Effects of Personality Type and Organizational Climate on the Acquisition and Utilization of Information (Ph.D. diss., University of Pittsburgh, School of Library and Information Science, 1985).
34. Douglas L. Zweizig and Charles R. McClure, "Issues in Training Practitioners for Library Planning," in Planning for Library Services: A Guide to Utilizing Planning Methods for Library Management, Charles R. McClure, ed. (New York: Haworth, 1982), p.235-250.
35. Probably the best example of an operational Decision Support System is "Maggie's Place," Pikes Peak Library District, Colorado Springs, Colorado. A discussion of Decision Support Systems (DSS), Management Information Systems, and their similarities/differences is beyond the scope of this paper. The topic is specifically addressed in McClure. Additional background information on library DSS can be found in Akoka, Heindel and Napier, and Bommer and Chorba. Kenneth E. Dowlin, The Electronic Library (New York: Neal-Schuman, 1984); McClure, "Management Information for Library Decision Making," 1984, p.3-11; J. Akoka, "A Framework for Decision Support Systems Evaluation," Information and Management 4:133-41 (1981); Allan J. Heindel and H. Albert Napier, "Decision Support Systems in Libraries," Special Libraries 72:319-27 (Oct. 1981); Michael R. W. Bommer and Ronald W. Chorba, Decision Making for Library Management (White Plains, N.Y.: Knowledge Industry, 1982).

APPENDIX A: INTERVIEW QUESTIONS*

1. How is "effectiveness" of library public services determined?
2. To what degree do public services have to be "efficient" for them to be funded within the library?
3. What costing data are currently available at the library and for which specific public service areas?
4. What are the administrative needs at the library to produce specific cost analyses of public services?
5. Are cost data used with performance measure data to assess public services?
6. Is unobtrusive testing an appropriate methodology to assess the quality of public services?
7. On what basis are administrative decisions made regarding the expansion, elimination, and overall effectiveness of public services?
8. What specific data would be useful regarding costing and performance measures of public services?
9. How well do public services compare to technical services in the use of cost and performance measure data? What are the effects of disparities between the two areas in terms of library resource allocation?
10. Do the library's organizational climate and management styles support the use of cost and performance measure data for decision making and planning?

*These were used as introductory questions; additional follow-up and in-depth questions broadly related to the study objectives were also used, depending on the flow of the interview.