EBL 101

 

Evaluating the Results of Evidence Application, Part Two: At the Practice Level

 

Virginia Wilson

Client Services Librarian

Murray Library

University of Saskatchewan

Saskatoon, Saskatchewan, Canada

Email: virginia.wilson@usask.ca

 

Originally published in:

Evidence Based Library and Information Practice, 5(4), 130–131. https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/9388/7539

 

 

Received: 01 Nov. 2010 Accepted: 02 Nov. 2010

 

 

© 2016 Wilson. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 


Evaluation after implementation of evidence is a step that can be easily overlooked, but its importance cannot be overstated. Last time, I wrote about evaluation at the practitioner level, that is, reflection on your own performance as an evidence based practitioner. This time, it is evaluation at the practice level: you want to discover if, as Booth puts it, “the service that you introduced or modified as a result of undertaking the evidence-based process actually made the anticipated difference” (p. 127). Evaluation will do one of two things: confirm that the actions taken had the anticipated effect, or lead you to rethink your original issue. Either way, evaluation can generate valuable information for your practice.

 

Evaluation or assessment is not exclusive to evidence based library and information practice. Many libraries are cultivating a culture of assessment. There is a conference that focuses on assessment (Library Assessment Conference, http://libraryassessment.org), academic libraries are using tools such as LibQual+ (http://library.queensu.ca/webir/canlibqual/carl-libqual.htm) for assessment activities, and some libraries have even established Assessment Librarian positions. There is great value in looking inward at the overall practice of the institution, whatever library sector you may be situated in. While LibQual+ is a massive assessment tool, evaluation can be done on a much smaller scale for individual evidence based projects.

 

Ideally, plans for evaluation are included at the beginning of an evidence based project, or at least at the start of the implementation of whatever change is being made. The exact steps taken to perform the evaluation will vary depending on the scope of the project and what changes are occurring or being evaluated. Evaluation of a newly implemented reference model will be different from evaluation of a new instructional design approach. Evaluation undertaken while the project is ongoing is referred to as formative evaluation, and focuses on the process. Summative evaluation, undertaken at the end of a project, illuminates the ultimate effectiveness of the implementation. Both can be helpful when looking at an evidence based project.

 

There are several questions to ask when thinking about evaluation: What are you attempting to evaluate or measure? What data do you need to collect? How will you know whether the change has had the desired effect?

 

 

There are many different methods of evaluation, depending on what you are attempting to evaluate and what data you need to collect. Approaches such as focus groups, interviews, surveys, questionnaires, and observation are all ways to assess whether your changes have achieved the desired effect.

 

While it is far beyond the scope of this column to explore the explicit details of these various kinds of evaluation, there are many resources available to help. Here is a brief bibliography to get you started.

 

Crawford, J. (2006). The culture of evaluation in library and information services. Oxford, UK: Chandos.

 

Mathison, S. (Ed.). (2005). Encyclopedia of evaluation. London: Sage.

 

Matthews, J.R. (2007). The evaluation and measurement of library services. Westport, CT: Libraries Unlimited.

 

Wallace, D.P. & Van Fleet, C. (2001). Library evaluation: A casebook and can-do guide. Englewood, CO: Libraries Unlimited.

 

In practical terms, evaluating the implementation of evidence from the perspective of practice involves undertaking the steps of evidence based library and information practice again: formulate a question (decide what it is you are evaluating or measuring), find the evidence (in this case, you will be generating your own evidence by evaluating the changes made), appraise the evidence (this will involve making sure your evaluation methods and techniques are sound), and apply the evidence (by holding it up to your indicators of success). In terms of the evaluation step in this instance, you can be reflective and evaluate your own work as an evidence based practitioner.

 

As with anything in the evidence based process, starting small can help overcome feeling overwhelmed. Eventually, the process will become more streamlined and easier to manage. Consulting with colleagues as well as the literature can help to get you going. All you need to do is take that first step.

 

Next time in EBL 101, I will take a look at the process of disseminating your research. You have done an evidence based project; now what? Get it out there to help others in the library world.

 

Reference

 

Booth, A. (2004). Evaluating your performance. In A. Booth & A. Brice (Eds.), Evidence-based practice for information professionals: A handbook (pp. 127-137). London: Facet.