Conference Paper

 

Analyzing the MISO Data: Broader Perspectives on Library and Computing Trends

 

Laurie Allen
Coordinator for Digital Scholarship and Services
Haverford College Libraries
Haverford, Pennsylvania, United States of America
Email: lallen@haverford.edu

 

Neal Baker

Library Director

Earlham College

Richmond, Indiana, United States of America

Email: bakerne@earlham.edu

 

Josh Wilson

Director for Academic Support and User Services

Brandeis University

Waltham, Massachusetts, United States of America

Email: jwilson@brandeis.edu

 

Kevin Creamer

Director for Teaching, Learning and Technology

Boatwright Memorial Library

University of Richmond

Richmond, Virginia, United States of America

Email: kcreamer@richmond.edu

 

David Consiglio
Head of Research Support and Educational Technology
Bryn Mawr College
Bryn Mawr, Pennsylvania, United States of America
Email: dconsiglio@brynmawr.edu

 

 

© 2013 Allen, Baker, Wilson, Creamer, and Consiglio. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – To analyze data collected by 38 colleges and universities that participated in the Measuring Information Services Outcomes (MISO) survey between 2005 and 2010.

 

Methods – The MISO survey is a Web-based quantitative survey designed to measure how faculty, students, and staff view library and computing services in higher education. Since 2005, over 10,000 faculty, 18,000 students, and 15,000 staff have completed the survey. To date, the MISO survey team has analyzed the data by faculty age group and student cohort. Much of the data analysis has focused on changes in the use, importance, and satisfaction with services over time.

 

Results – Analysis of the data collected during 2008-2010 reveals marked differences in how faculty and students use the library. The most frequently used services by faculty are the online library catalog (3.39 on a 5-point scale), library databases (3.34), and the library website (3.29). In contrast, the most frequently used services by students are public computers in the library (3.61) and quiet work space in the library (3.29). Faculty reported a much higher use of online resources from off campus. Analysis of data from schools where the survey was administered more than once during 2005-2010 reveals that both faculty and students increased their utilization of databases over time. All other significant faculty trends reflected declines in usage, whereas, with the exception of use of the library website, all other student trends reflected no change or increased usage.

 

Conclusion – As the MISO survey has continued and expanded over the years, the usefulness of rich comparable data from a set of peer institutions over time has increased tremendously. In addition to providing a rich source of data, MISO can serve as a model for how a group of schools can collaborate on a shared assessment tool that meets the needs of individual institutions and provides a robust, aggregated dataset for deeper analysis.

 

Introduction

 

As higher education changes in response to budgetary, technological, and political pressures, library and technology leaders increasingly look for meaningful ways to assess how and to what extent our organizations support scholarship, teaching, and learning. The Measuring Information Services Outcomes (MISO) Survey is a Web-based quantitative survey designed to measure how faculty, students, and staff view library and computing services in higher education.

 

The core of the MISO Survey consists of questions designed to measure the use of library and IT services, their importance to the campus community, and the level of satisfaction with which the community views these services. The survey also measures the ownership of technology tools and their use for academic and personal purposes, as well as participants’ perceptions of their own technology skills and preferred learning methods. In addition, it measures overall attitude toward library and technology services on campus.

 

By looking at computing and library services together, the MISO Survey provides a richer context for each set of services while acknowledging the shared nature of many of the services as seen from the perspective of our constituents. While there are many distinct services offered by library and computing organizations on campuses, librarians and technologists also frequently work together to support instructional and academic computing needs on campus and to provide resources to off-campus students and faculty. In addition, on many campuses, library buildings are the site of many computing resources.

 

Launched in 2005, the MISO Survey has been taken more than 43,000 times at 38 participating institutions, 26 of which have participated more than once and 8 more than twice. Overall, more than 10,000 faculty, 18,000 students, and 15,000 staff have completed the survey.

 

The precursor to the MISO Survey was designed by David Consiglio and his colleagues at Bryn Mawr College to assess the effectiveness of the College’s recently merged Information Services department. When the Survey proved extremely useful, a group of chief information officers from the Council on Library and Information Resources (CLIR) agreed to use the Bryn Mawr Survey as the basis for a common survey to be administered across schools. This would allow each school to learn from the data gathered on its campus and also compare itself to a group of peer institutions. In addition, by conducting the Survey every year, each institution would be able to evaluate its services over time. Bates College, Middlebury College, the University of Richmond, and Wellesley College graciously agreed to donate a significant amount of a top manager’s time toward the project. In January of 2005, the team members met for the first time at Bryn Mawr College to begin this process.

 

During the Spring and Summer of 2005, the MISO Survey team prepared and tested the instrument. These five institutions participated in a pilot administration in Fall 2005. Additional schools administered the MISO Survey in Spring 2006 and in each Spring since.

 

The MISO Survey Team works together to develop long-term strategies, to conduct in-depth analysis of data, and to complete biennial revisions to the instrument. The co-investigators also liaise with participating institutions during the survey administration season to ensure that the process goes smoothly. The team has developed high standards for data quality by using tested questions, ensuring high response rates, and customizing the survey instrument so that participating institutions can address local concerns. Each participating school receives a summary dataset representing all institutions for the survey year for comparison.

 

This article will focus on analysis of the larger dataset of all schools and years, offering deeper analysis of user needs than any one school could conduct using its own data. To date, the MISO Survey Team has analyzed the data by faculty age group and student cohort and is now examining how views on services are affected by academic discipline. Finally, the Survey Team combined use and importance trends to provide a richer look at longitudinal changes and better predict how constituents will view services in the future.

 

Survey Method, Structure, and Process


At each participating institution, the Survey is administered to all teaching faculty, all staff members who are not members of the library or IT organizations, and a stratified sample of students selected randomly from the population. The Survey is generally administered starting on the fourth Thursday of each institution’s Spring semester. This approach helps ensure that each institution’s data can be compared to data gathered at other institutions.
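The exact sampling procedure varies by campus, but a minimal sketch of the general approach, drawing a proportional random sample within each class year, might look like the following. The column names, class-year strata, and sampling fraction are illustrative assumptions for this example, not part of the MISO specification.

```python
# Hypothetical sketch: draw a stratified random sample of students,
# sampling the same fraction within each class-year stratum so that
# every cohort is proportionally represented.
import pandas as pd

def stratified_student_sample(roster: pd.DataFrame, frac: float, seed: int = 2010) -> pd.DataFrame:
    """Return `frac` of the roster, sampled separately within each class year."""
    return (
        roster.groupby("class_year", group_keys=False)
              .sample(frac=frac, random_state=seed)
    )

# Example usage with a toy roster; a real administration would start from
# the registrar's full student list instead.
roster = pd.DataFrame({
    "student_id": range(1, 9),
    "class_year": [2011, 2011, 2012, 2012, 2013, 2013, 2014, 2014],
})
print(stratified_student_sample(roster, frac=0.5))
```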

 

The Survey’s strategy of regular outreach to respondents enables each campus to achieve high response rates compared to other surveys. In addition, surveying a sample of each institution’s student body helps to avoid student survey fatigue and further increases the student response rate. These methods helped to achieve response rates in 2010 of 48.8% for faculty, 44.9% for students, and 50.3% for staff.

 

In addition to the core questions included in the base instrument, most participating schools include an expanded set of optional questions and many include custom questions that ask about local services. Most of the core and optional questions can be customized to reflect the service names in use at each institution (e.g., those about the online catalog or the course management system).

 

Once a school has agreed to participate in the Survey in the coming year, its leadership selects a Campus Survey Administrator (CSA) from among the library or IT staff. This individual is responsible for all aspects of survey administration at his or her institution. A member of the MISO Survey Team liaises with each institution, helps its CSA prepare for upcoming administration deliverables, guides the CSA in working with the school’s Institutional Review Board, and answers questions as the process unfolds. These preparations for survey administration take place largely during the Fall semester so that the Survey is ready to go live early in the Spring semester. A more detailed timeline for MISO Survey administration is available on the MISO website (http://www.misosurvey.org).

 

Once all participating schools have concluded their survey administration, the results are summarized and analyzed during the Spring and Summer months. Each participating school receives a comprehensive spreadsheet that includes the mean values for questions included in its Survey instrument for each population surveyed (faculty, staff, and students), as well as comparable mean values for all other participating schools. These spreadsheets include results from the current year as well as all previous years. The spreadsheets allow for easy comparison of schools and cohorts to show where statistically significant differences exist. Each institution also receives its raw data as well as an SPSS file for further data analysis.
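To give a sense of the kind of comparison these deliverables support, the sketch below flags whether an institution’s mean response on a single survey item differs significantly from the pooled responses of all other participating schools. This is only an illustration under assumed column names, using Welch’s t-test at α = 0.05; it is not the MISO team’s published statistical procedure.

```python
# Hypothetical sketch: compare one school's mean for a survey item against
# the mean of all other participating schools and flag significant gaps.
import pandas as pd
from scipy import stats

def compare_to_peers(responses: pd.DataFrame, school: str, item: str, alpha: float = 0.05) -> dict:
    """`responses` holds one row per respondent, with a 'school' column and one column per item."""
    local = responses.loc[responses["school"] == school, item].dropna()
    peers = responses.loc[responses["school"] != school, item].dropna()
    t_stat, p_value = stats.ttest_ind(local, peers, equal_var=False)  # Welch's t-test
    return {
        "item": item,
        "local_mean": round(local.mean(), 2),
        "peer_mean": round(peers.mean(), 2),
        "p_value": p_value,
        "significant": p_value < alpha,
    }
```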

 

What Is Unique about MISO?

 

While each institution has a rich collection of data to analyze from its own survey and from the spreadsheet of mean data for all schools, the MISO team has also spent considerable time analyzing results from all schools and cohorts to see broader patterns within the data. This broader analysis is one of the unique features of the MISO Survey: it is done in a statistically rigorous way that allows us to differentiate between patterns that merely seem emergent, based on anecdotal evidence or changes at a single school, and those that are truly widespread. The team has been able to view changes in student attitudes about services as students move from freshman year to senior year, as well as some changes in student attitudes that are happening over time without regard to class year. We have also looked closely at trends in the use, importance, and satisfaction with our services as they relate to the age of our faculty members. Beginning with the 2010 Survey (Table 1), we will also look at how faculty and students within the various disciplines interact differently with our services. Below, we provide one example of the kind of trend analysis possible with the MISO Survey instrument by taking a deeper look at how the use of library services has changed over time, and how those changes differ for faculty as compared with students.

 

Examples of How Analyzing the MISO Data Provides Broader Perspectives on Library and Technology Services

 

Much of the data analysis has focused on changes in the use, importance, and satisfaction with services over time. In this section, we look more closely at trends in the reported use of library services, without consideration of importance or satisfaction, as an example of one kind of analysis possible with the data. This section first presents the mean frequency of use for faculty and students for 2008-2010 as a benchmark of current use patterns, followed by time trends taken from all institutions that participated in the Survey more than once between 2005 and 2010 (N=27). It is important to underline at the outset that an analysis of frequency of use alone is not a sufficient gauge of a service’s value to faculty and students. Such an analysis does, however, provide one informative, broader perspective on the IT landscape in higher education.

 

2010 Benchmarks: Faculty and Student Frequency of Use

 

Frequency of use in the MISO Survey is measured on a five-point scale in which, for example, a value of 3 corresponds to using a service one to three times a month.

 

 

It should be noted that while the numbers used in the scale increase in a linear fashion, the categories themselves do not. Each successive category represents an increase in use that is three or four times greater than the previous category. As a result, a person selecting category four uses a service roughly 16 times as much as a person selecting category two, even though the numbers “4” and “2” suggest only twice as much use.
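A small worked example may make this arithmetic concrete. The annual-use figures below are rough assumptions chosen only to reflect the "three or four times greater per category" relationship described above; they are not the survey’s actual category definitions.

```python
# Illustrative only: approximate uses per year implied by each scale category,
# assuming each step is roughly 3-4x the previous one (category 1 = never).
approx_uses_per_year = {1: 0, 2: 3, 3: 12, 4: 48, 5: 190}

ratio = approx_uses_per_year[4] / approx_uses_per_year[2]
print(ratio)  # 16.0 -- category 4 implies ~16x the use of category 2, not 2x
```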

 

Below are figures illustrating the frequency of use of all library and technology services for 2008-2010 (Figures 1 and 2). No attempt is made to isolate what constitutes a library service per se, so that nominal “library” services can be viewed in the context of all services; it is, of course, difficult to decouple two increasingly linked sets of services.

 

From the overall array of services, various combinations can be grouped under a more focused rubric labeled “the library.” Any attempt to do so is potentially problematic given local conditions at each institution. Librarian position descriptions at some colleges include campus course management system duties, for example, while librarians elsewhere help maintain off-campus access to online resources via proxy servers.

 

Despite differences in local conditions, there will likely be wide consensus as to what represents a typical library service. These standard library functions are grouped together for comparative analysis (Table 2).

 

Comparison of the data reveals marked differences in how faculty and students use “the library.”

 

The most frequently used services by faculty are the online library catalog (3.39), library databases like JSTOR (3.34), and the library website (3.29). These are the only library services that faculty use at least one to three times a month, on average.

 

In contrast, the most frequently used services by students are public computers in the library (3.61) and quiet work space in the library (3.29). These are the only library services that students use at least one to three times a month, on average.

 

Table 1
Sample Sizes and Response Rates

Population    Sample Size    Responses    Response Rate    Total Institutions
Faculty       9,482          4,707        49.6%            38
Students      22,757         8,605        37.8%            38

 

Figure 1

Faculty use benchmarks

 

Figure 2

Student use benchmarks

 

The implications for “library as place” are worth serious consideration. Across the board, students report using library facilities more than faculty (Table 3).

 

When planning library facilities upgrades, decision makers might do well to design with students foremost in mind. They could also synthesize MISO frequency of use data with other empirical research that yields similar results about faculty and library facilities (Schonfeld & Housewright, 2010).

 

Whereas students use a location-based library, faculty turn to online library services with greater frequency (Table 4). In addition, faculty report a much higher use of “Access to online resources from off campus” (3.54 vs. 2.38), which presumably reflects the use of proxy services that allow access to library materials outside of library facilities.

 

Note that it is difficult to determine the extent to which some library services are perceived as location-based or online. For example, library reference services can occur at a physical desk on campus or via email and/or chat. Likewise, interlibrary loan services are provided both via online forms embedded in proprietary databases and at location-based service points. Furthermore, library patrons can typically use circulation services either online (e.g., a “renew books” option available in the online library catalog) or in a physical facility. Overall, faculty use most of these hybrid online/place-based library services with greater frequency than students. However, library reference services are used to essentially the same extent by students and faculty (Table 5).

 

Table 2
Comparison of All Library Services Use Benchmarks

Service Name                                 Faculty Mean    Student Mean
Interlibrary Loan                            2.32            1.76
Library Circulation services                 2.70            2.25
Library Reference services                   2.18            2.22
Library website                              3.29            2.93
Online library catalog                       3.39            2.80
Library collections                          2.92            2.49
Library databases (e.g. JSTOR)               3.34            2.90
Digital image collections (e.g. ARTstor)     1.48            1.49
Library liaison/contact                      1.91            Not asked
Online course reserves                       Not asked       2.96
Study carrels in the library                 1.28            2.78
Quiet work space in the library              1.55            3.29
Group study spaces in the library            1.24            2.76
The Library café                             2.32            2.93
Public computers in the library              1.82            3.61

 

Table 3
Comparison of “Place-Based” Library Services Use Benchmarks

Service Name                          Student Mean    Faculty Mean
Public computers in the library       3.61            1.82
Quiet work space in the library       3.29            1.55
The library café                      2.93            2.32
Study carrels in the library          2.78            1.28
Group study spaces in the library     2.76            1.24

 

Table 4
Comparison of Online Library Services Use Benchmarks

Service Name                       Faculty Mean    Student Mean
Online library catalog             3.39            2.80
Library databases (e.g. JSTOR)     3.34            2.90
Library website                    3.20            2.93

 

2010 Trends: Faculty and Student Frequency of Use

 

To capture trends in the use of library services, the following analysis relies only on data from schools where the Survey was administered more than once since 2005 (N=27). New questions have been added to the MISO Survey since 2005, stemming from changes in the wider library and technology services landscape. As a result, trend data are available for a smaller number of services, because not all survey questions have yet been asked more than once by enough institutions to provide generalizable trends (denoted by “N.A.” in Table 6).

 

This section reports only on services where the change in use over time was statistically significant for faculty or students and where the change was large enough (±0.025) to merit attention. Consequently, a “--” value in Table 6 below denotes a slope (i.e., a possible change over time) that is not statistically significant or not large enough to be of real practical significance.
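As an illustration of this filtering logic, the sketch below fits a least-squares slope of mean use against survey year for a single service and retains it only if it is statistically significant and at least 0.025 in absolute value. The column names and the α = 0.05 cutoff are assumptions made for the example; this is not the Survey Team’s exact procedure.

```python
# Hypothetical sketch: estimate a use trend (slope per year) for one service
# and suppress it unless it is both statistically and practically significant.
import pandas as pd
from scipy import stats

ALPHA = 0.05           # assumed significance level for this example
MIN_ABS_SLOPE = 0.025  # magnitude threshold described in the text

def use_trend(yearly_means: pd.DataFrame, service: str):
    """`yearly_means` holds one row per school-year, with a 'year' column and per-service mean columns."""
    data = yearly_means[["year", service]].dropna()
    fit = stats.linregress(data["year"], data[service])
    if fit.pvalue < ALPHA and abs(fit.slope) >= MIN_ABS_SLOPE:
        return fit.slope
    return None  # corresponds to "--" in Table 6
```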

 

The only library services use trend common to both faculty and students is increased utilization of databases like JSTOR (0.0300 and 0.0348, respectively).

 

With the exception of library database use, all other significant faculty library services trends reflect declines in usage: reference services (-0.0380), circulation services (-0.0430), and the online library catalog (-0.0430).

 

With the exception of the library website (-0.0338), all other student library services trends reflect either no change or increased usage: a marked rise in the use of digital image collections like ARTstor (0.0711), with less pronounced but still significant growth in interlibrary loan (0.0338).

 

Taken as a whole, these divergent trends also suggest important differences in faculty and student library use patterns.

 

To focus only on notional “library” services, however, would obscure important developments of interest to librarians, and this is where the MISO data distinguish themselves relative to more narrowly circumscribed assessment tools. By way of conclusion, one additional technology frequency-of-use trend deserves careful attention. Both faculty and students increasingly turn to the course management system (0.2110 and 0.1399, respectively). The usage slopes for products like Blackboard and Moodle are much steeper than any increased library use trend. Librarians ought to consider embedding their services in the course management system, since that is increasingly where their patrons are to be found.

 

Table 5
Comparison of Hybrid Online/“Place-Based” Library Services Use

Service Name                     Faculty Mean    Student Mean
Library collections              2.92            2.49
Library Circulation services     2.70            2.25
Interlibrary Loan                2.32            1.76
Library Reference services       2.22            2.18

 

Table 6
Comparison of Statistically Significant Library Services Use Trends

Service Name                                 Faculty Trend    Student Trend
Interlibrary Loan                            --               0.0338
Library Circulation services                 -0.0430          --
Library Reference services                   -0.0380          --
Library Website                              --               -0.0337
Online library catalog                       -0.0430          --
Library collections                          N.A.             N.A.
Library databases (e.g. JSTOR)               0.0300           0.0348
Digital image collections (e.g. ARTstor)     --               0.0711
Library liaison/contact                      --               Not asked
Online course reserves                       Not asked        --
Study carrels in the library                 N.A.             N.A.
Quiet work space in the library              N.A.             N.A.
Group study spaces in the library            N.A.             N.A.
The Library café                             N.A.             N.A.
Public computers in the library              N.A.             N.A.

 

Conclusion

 

The data analyzed provide evidence of trends in stakeholder interactions with libraries through 2010. Faculty, for example, decreasingly use the online library catalog, library circulation services, and library reference services, and view these three service categories as decreasingly important. Of these three, the online library catalog and library circulation services experienced slight drops in perceived importance among faculty, while library reference services experienced a somewhat larger drop. On the other hand, faculty increasingly use library databases and are increasingly likely to access online resources from off campus, which potentially speaks to an increased importance of proxy services. At the same time, faculty consider library research instruction, library liaisons, the library website, and interlibrary loan to be increasingly important, in that order.

As for undergraduates, they are slightly less inclined to use library reference services and much less inclined to use the library website over time. Conversely, and more so than faculty, undergraduates increasingly use interlibrary loan, library databases, and particularly digital image collections. Like faculty but even more so, undergraduates consider library research instruction and interlibrary loan to be increasingly important, in that order. Unlike faculty, the undergraduate trend is to view the library website as slightly less important. Consistent with faculty, undergraduates view library databases and off-campus access as increasingly important.

 

The analysis above provides one look at the MISO data. By examining the use values for the subset of variables representing library services across time and institutions, we can see trends and patterns that would not have been as meaningful if taken from a single school. As the MISO Survey has continued and expanded over the years, the usefulness of rich comparable data from a set of peer institutions over time has increased tremendously. The MISO annual summary data help participating schools identify their relative strengths and weaknesses, create peer groups for analysis, and determine whether a problem is a local concern or a nationwide trend. The analysis of microdata provided by the Survey Team gives library and technology decision makers a wider perspective on trends and relationships between services.

 

In addition to providing a rich source of data, MISO can serve as a model for how a group of schools can collaborate on a shared assessment tool that meets the needs of individual institutions and provides a robust, aggregated dataset for deeper analysis. The process of designing, updating, and customizing the MISO Survey by a team of library and computing leaders from within participating institutions ensures that the instrument remains relevant to decision making and that the Survey is easy to conduct. As the dataset becomes larger, and a greater variety of institutions participate, we will continue to plan for ways to increase the usefulness and scope of analysis, while ensuring that all participating institutions continue to find useful measures of their own services.

 

 

References

 

Schonfeld, R. C., & Housewright, R. (2010). Faculty survey 2009: Key strategic insights for libraries, publishers, and societies. New York: Ithaka S+R. Retrieved 24 May 2013 from http://cyber.law.harvard.edu/communia2010/sites/communia2010/images/Faculty_Study_2009.pdf