Evidence Summary

 

Assessment Related Skills and Knowledge Are Increasingly Mentioned in Library Job Postings

 

A Review of:

Passonneau, S., & Erickson, S. (2014). Core competencies for assessment in libraries: A review and analysis of job postings. Library Leadership & Management, 28(4), 1-19. https://journals.tdl.org/llm/index.php/llm/article/view/7080

 

Reviewed by:

Carol Perryman

Assistant Professor

Texas Woman’s University

Denton, Texas, United States of America

Email: cp1757@gmail.com

 

Received: 6 Dec. 2014     Accepted: 26 Jan. 2015

 

 

© 2015 Perryman. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – The authors sought to determine whether existing definitions of assessment agree with the assessment-related skills sought in job postings, and to identify key assessment-related skills, training needs, and trends in assessment.

 

Design – Content analysis.

 

Setting – Job postings from six library-specific websites: the American Library Association, the Library & Information Technology Association, the Society of American Archivists, the Council on Library and Information Resources, the Association of Research Libraries, and Library Assessment job announcements at http://libraryassessment.info/?cat=13.

 

Subjects – Job titles and descriptions published during an 18-month period between Summer 2012 and Winter 2013 that met the inclusion criteria (n=231).

 

Methods – Job postings were searched and analyzed in two separate sets with the following inclusion criteria: First, postings with the term assessment in the position title or as the main focus of the position (n=44) were retrieved; of these, three were too old to contain descriptions and were excluded from analysis. Second, postings were retrieved whose titles did not specifically mention assessment but whose descriptive text contained the terms assessment, evaluation, metrics, or strategic (n=187). The full text of both sets was imported into ATLAS.ti software for analysis using a grounded theory approach. Mutually exclusive terms emerging from the coding process were documented and defined; from this analysis, networks of code “families,” or co-relational groupings, helped to create categories and sub-categories. The context of assessment-related terms was closely examined to understand their meaning within job descriptions. Following this step, Microsoft Excel was used to generate tables and pivot tables to aid understanding and illustrate the data.
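The article does not reproduce the authors’ ATLAS.ti codebook or their Excel pivot tables. Purely as an illustration of the general shape of such a workflow, the following Python sketch tags each posting with codes from a small hypothetical codebook and tallies pairwise co-occurrence of codes, loosely analogous to the code “families” the authors describe. All postings, codes, and trigger terms below are invented for the example.

```python
# A minimal, hypothetical sketch of a coding-and-tallying workflow.
# Nothing here reproduces the authors' actual ATLAS.ti analysis.
from collections import Counter
import re

# Hypothetical codebook: each code mapped to the terms that trigger it.
CODEBOOK = {
    "assessment": ["assessment"],
    "evaluation": ["evaluation", "evaluate"],
    "metrics": ["metrics"],
    "strategic": ["strategic"],
}

def code_posting(text):
    """Return the set of codes whose trigger terms appear in a posting."""
    lowered = text.lower()
    return {
        code
        for code, terms in CODEBOOK.items()
        if any(re.search(r"\b" + re.escape(term) + r"\b", lowered) for term in terms)
    }

def cooccurrence(postings):
    """Tally pairwise co-occurrence of codes across postings, in the spirit
    of grouping related codes into 'families'."""
    pairs = Counter()
    for text in postings:
        codes = sorted(code_posting(text))
        pairs.update((a, b) for i, a in enumerate(codes) for b in codes[i + 1:])
    return pairs

if __name__ == "__main__":
    sample = [
        "Leads assessment initiatives and reports metrics to stakeholders.",
        "Responsible for evaluation of instruction and strategic planning.",
    ]
    for pair, count in cooccurrence(sample).most_common():
        print(pair, count)
```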

 

Main Results – All 44 job postings containing the term assessment in the position title were from research universities or four-year colleges; of these, most were ARL member libraries. For these postings, the concept of assessment was more clearly aligned with definitions of assessment as an ongoing process. The positions described, requiring a minimum of three years’ experience, ranged from entry-level to administrative in nature.

 

In the second set (187 postings), the interchangeable use of the terms “assessment” and “evaluation” was particularly evident in job postings unrelated to library instruction. No library types other than academic were recruiting for assessment librarians, but related skills, usually referred to as evaluation in public and special libraries, were mentioned in all areas of library practice, including instruction, administration, public services, user behavior, and, to a lesser extent, access services, archives, information technology, and cataloging. Although less prominent, these areas of practice also appear to be increasing their awareness of assessment.

 

Key skills and knowledge areas needed for assessment in libraries emerged from content analysis of the job postings. These were grouped into eight main areas of competency, augmented by the authors’ own experiences as assessment librarians, including background in library assessment, research methods, statistical and analytical skills, visualization and presentation skills, project management, and people skills.

 

Conclusions – Based upon analysis of this set of documents, a culture of assessment in libraries appears to be emerging, suggesting a possible upward trend when contrasted with the earlier research of Walter and Oakleaf (2010). Overall, assessment-related skills and knowledge were increasingly evident across all library types and positions. Suggestions for aiding the development of an emerging culture of assessment include fostering liaisons between ALA divisions and library schools to persuade the schools of the need for related coursework, workshops focused on assessment-related skills, certification programs, and a proposed minor in library assessment. Opening avenues for discussion between library types could extend the growth of an assessment culture beyond academic librarianship. Additional research to better understand the diffusion of assessment culture and practice into non-academic libraries is also recommended.

 

Commentary

 

This study is intended to augment the earlier work of Walter and Oakleaf (2010), who analyzed assessment-related job postings and found that “soft” skills (e.g., awareness of data needs) predominated over “hard” skills (e.g., the ability to gather and analyze data). The authors place their work in the context of numerous library job posting analyses. What is missing here is any definition of “competencies” as a concept distinct from “skills.” Throughout, the terms are used interchangeably, underscoring the authors’ point that consensus on terminology is sorely needed.

 

LIS-specific critical assessment tools developed by Glynn (2006) and Perryman (2014) were used to evaluate the quality of the article.

 

A limited number of postings were examined within an 18-month time frame, and it is unclear whether the data were sufficiently representative, particularly with respect to non-academic libraries. The authors mention a lack of familiarity with public and special library practices and may have failed to include job listing sites that serve these communities, leaving their needs underrepresented.

Although the methods description is detailed, insufficient information is provided for replication of the search processes used, including the rationale for the search terms used to retrieve and text-mine job postings. One of the posting sites (ALA) has a search engine that includes categories such as “other,” not just library positions; it is not made clear whether “other” was excluded from retrieval. Searching is not possible at the LITA job posting site, so the authors must have opened each listing individually, but this is not mentioned. Among the sites searched, the Library Assessment site lists only two job postings within the specified period, both from late 2012. The ARL job listings provide only job titles and are not searchable by keyword; the searchers would have had to navigate at least one and perhaps two layers deep to retrieve the full posting text, but this process is not described. Information about these processes would have aided clarity and informed future research aimed at replicating the study. In addition, there is no description of any data cleaning process or of validation methods used, including inter-rater reliability measures.

 

Given page limitations, providing data online would have benefitted readers, since the information mentioned is not sufficiently backed by statistics. As an example, the authors mention briefly that three years of experience are required by “most” postings, but no explanation is provided of the specific types of experience required. Examining required experience in future research would benefit both job seekers and employers. It is also surprising that this study makes no mention of whether “most” postings required an MLS degree.

 

The results of the analysis of co-occurring terms are less convincing than the other findings, due to term ambiguity and the nature of the job postings themselves. Measuring the use of specific phrases such as “program improvement” and “culture of assessment” presumes a relatively consistent vocabulary, a limitation the authors themselves acknowledge.
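To make the vocabulary problem concrete, consider a small, hypothetical illustration (both postings are invented): exact-phrase matching counts “culture of assessment” but misses the variant “assessment culture,” so frequency tallies built on fixed phrases can undercount the underlying concept.

```python
# Hypothetical example of how exact-phrase counting undercounts a concept
# when job ads use variant wording. Both postings below are invented.
postings = [
    "Fosters a culture of assessment across the library.",
    "Builds an assessment culture within public services.",  # variant phrasing
]

phrase = "culture of assessment"
hits = sum(phrase in posting.lower() for posting in postings)
print(f"{hits} of {len(postings)} postings matched")  # 1 of 2; the variant is missed
```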

 

Despite its limitations, the study is a commendable effort that adds to a small body of literature tracking trends in assessment hiring in library practice. Future research built on this work can continue to examine the diffusion of assessment into practice across all library types, informing the profession as it adapts to measuring and documenting its return on investment to stakeholders. Additionally, the expanded categories derived from content analysis provide a basis for training and education that can benefit library administrators, educators, and associations.

 

References

 

Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692154

 

Perryman, C. (2014). BibCAT: Evaluation tool for bibliometric studies. In JotForm – Form Builder. Retrieved from http://form.jotform.us/form/42397103556153

 

Walter, S., & Oakleaf, M. (2010, October 27). Recruiting for results: Assessment skills and the academic library job market. Slides presented at the ARL Library Assessment Conference, Baltimore, MD. Retrieved from http://libraryassessment.org/bm~doc/walter_scott.pdf