Research in Practice
Evidence, Local Context, and the Hierarchy
Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice (C-EBLIP)
University Library, University of Saskatchewan
Saskatoon, Saskatchewan, Canada
Email: virginia.wilson@usask.ca
Received: 11 Nov. 2015    Accepted: 18 Nov. 2015
© 2015 Wilson. This is an Open Access article distributed under the terms of the Creative Commons-Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
A key piece of evidence based library and information practice (EBLIP) is evidence (obviously!). It’s a word that is, or at least has been, a bit loaded. What counts as evidence? Does the term “evidence” automatically suggest quantitative research? In the early days, as EBLIP emerged from evidence based medicine, some purveyors and users of research evidence privileged quantitative research: hard numbers generalizable across large populations. The research evidence hierarchy was espoused as the model to follow, with randomized controlled trials (RCTs) at the peak, the pinnacle of the pyramid, and case studies often forming its large lower layer or base. In later years, systematic reviews and meta-analyses deposed RCTs from the top of the hierarchy.
The idea of a hierarchy of evidence, as Koufogiannakis (2010) outlined, is problematic for various reasons as it pertains to EBLIP. I believe a hierarchy of evidence doesn’t make sense for one particular reason: local context. EBLIP consists of four components that must be present: research evidence, professional expertise and knowledge, user preference, and the local context. Each component should be explored, examined, and acknowledged when approaching a practice problem in an evidence based manner. I suggest envisioning these components as something like Figure 1.
If you do not acknowledge all of the elements, it’s not truly EBLIP, and you won’t have utilized everything at your disposal to make a decision or solve a problem. If everything is placed against the backdrop of the local context, as in Figure 1, then how can a hierarchy of evidence be effective? The best systematic review to be found pertaining to your particular question might not be applicable if it does not resonate with the way things are currently configured in your local setting. That’s why critical thinking and critical appraisal of whatever evidence is found are so important to the EBLIP process.
Figure 1. Components of EBLIP
In EBLIP, as the field has moved away from the rigid early hierarchy of evidence, the idea of research evidence has broadened to include and value all types of research: qualitative, quantitative, mixed methods, practical, theoretical, participatory. You name it; it’s included. Additionally, evidence arising from professional knowledge and expertise and from user preference has come to the fore in terms of acceptability. Of course, it must be acceptable if you refer to the Venn diagram in Figure 1.
Research conducted by Koufogiannakis looks at this very issue, and she reveals that librarians are using a wide variety of evidence to inform their practice. Koufogiannakis contends that “the focus of EBLIP over the past 15 years has neglected to incorporate . . . parts of what the movement in fact defines itself to include, namely the user-reported and librarian observed forms of evidence” (2011, p. 42). The evidence used in pursuit of EBLIP when approaching a problem or a decision in practice depends upon the nature of the problem and the question you need to answer. This question arises from where you work, your local setting. If you leave out your own expertise developed over time and the needs, wants, or desires of your users when approaching your question, you’ve left out two-thirds of EBLIP. And then if you apply some kind of arbitrary hierarchy to the research evidence you’ve found, the focus becomes the hierarchy itself rather than the local.
So, if you’re going to look through a lens when practicing EBLIP (and I would argue that lenses are ubiquitous no matter where we go and what we do), choose the local context as your lens. If we can keep the idea front and centre that we must not let our own settings slip out of sight when approaching a practice question, we will realize that the best evidence is the evidence that supports and informs our practice.
References
Koufogiannakis, D. (2010). The appropriateness of hierarchies. Evidence Based Library and Information Practice, 5(3), 1-3. Retrieved from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/8853/7348

Koufogiannakis, D. (2011). Considering the place of practice-based evidence within Evidence Based Library and Information Practice (EBLIP). Library and Information Research, 35(111), 41-58. Retrieved from http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/article/view/486/527