Evidence Summary

 

Urban Public Libraries Do Not Yet Meet Benchmarks for Web Accessibility by Individuals with Disabilities

 

A Review of:

Maatta Smith, S. L. (2014). Web Accessibility Assessment of Urban Public Library Websites. Public Library Quarterly, 33(3), 187-204. http://dx.doi.org/10.1080/01616846.2014.937207

 

Reviewed by:

Ann Glusker

Reference/Consumer Health Librarian

Business, Science and Technology Department

The Seattle Public Library

Seattle, Washington, United States of America

Email: ann.glusker@spl.org

 

Received: 1 Mar. 2015    Accepted: 7 May 2015

 

 

© 2015 Glusker. This is an Open Access article distributed under the terms of the Creative Commons-Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

Abstract

 

Objective – To determine the extent to which urban public libraries in the United States of America provide web sites that are readily accessible to individuals with disabilities, with reference to the Urban Libraries Council’s EDGE initiative (specifically Benchmark 11, “Technology Inclusiveness”).

 

Design – Web site evaluation.

 

Setting – Urban public libraries in the United States of America.

 

Subjects – The 127 library systems that were both members of the Urban Libraries Council at the time of the study and located in the United States of America.

 

Methods – Working within the “everyday life information seeking” conceptual framework, the author assessed the web site of each library system in the purposive sample, both with an online evaluation tool and through visual and physical inspection, to determine web accessibility and, by extension, technology inclusiveness.

 

Main Results – The online accessibility evaluation tool revealed that not one of the sites surveyed was free of errors or alerts. Contrast errors (problematic color combinations), missing alternative text (text alternatives for visual elements), and missing form labels (which prevent screen readers from performing searches and navigating to results) were the most common problems. The visual and physical scans revealed that many sites lacked specific links and/or resources for persons with disabilities, and that the resources that were available used oblique language and required many clicks to reach. In addition, the vast majority neglected to link to national resources such as the National Library Service for the Blind and Physically Handicapped.

 

Conclusions – The web sites of urban public libraries are not yet completely accessible to persons with disabilities. At the very least, they need coding fixes and ongoing maintenance to address the kinds of issues identified by the online evaluation tool used in the study. In addition, resources for persons with disabilities should be prominently and clearly linked and promoted. Further research is called for, both on non-urban library systems and with a wider range of access technologies. Improvement efforts should acknowledge that web design that improves access for persons with disabilities serves the broader community as well.

 

Commentary

 

There is no question that much remains to be done to make the internet accessible to persons with disabilities (Dobransky & Hargittai, 2006; Vicente & López, 2010). This study’s findings concur. Drawing on a purposive sample of urban public libraries that are members of the Urban Libraries Council, and using the EDGE initiative benchmarks as a touchstone, the study employed two methods for evaluating web sites. For this evidence summary, those methods were systematically assessed using the critical appraisal checklist by Glynn (2006).

 

The first method used in the study was an online evaluation tool for web sites, the WAVE tool, which, while not as powerful as expert inspection (Lazar et al., 2012), creates a consistent and rigorous assessment approach and increases the quantifiability of, and confidence in, the evaluation results. However, there was no information about whether any comparisons were made with other tools, or which other tools might have been considered. The W3C Web Accessibility Initiative lists 48 tools on its site, along with detailed criteria for choosing an appropriate tool, so this could easily have been noted (W3C Web Accessibility Initiative, 2015).
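
To make concrete the kinds of problems such a tool reports, the two most common error types found in the study, missing alternative text and missing form labels, can be detected with fairly simple automated checks. The following is a minimal illustrative sketch only, not the study’s procedure or the WAVE tool’s logic; it assumes Python with the third-party requests and beautifulsoup4 packages, and the URL shown is a placeholder rather than a site from the study.

import requests
from bs4 import BeautifulSoup

def scan_page(url):
    # Fetch the page and parse it; "html.parser" is Python's built-in parser.
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    # Images should carry an alt attribute; an empty alt is acceptable for
    # purely decorative images, so only a missing attribute is flagged here.
    images_missing_alt = [img for img in soup.find_all("img")
                          if not img.has_attr("alt")]
    # Form controls need an accessible name: a <label for="..."> pointing at
    # their id, or an aria-label/aria-labelledby attribute. (Implicit labels,
    # where the control is nested inside the <label>, are not handled in this
    # sketch.)
    labelled_ids = {lbl["for"] for lbl in soup.find_all("label")
                    if lbl.has_attr("for")}
    unlabelled_controls = []
    for control in soup.find_all(["input", "select", "textarea"]):
        if control.get("type") in ("hidden", "submit", "button"):
            continue
        control_id = control.get("id")
        if ((control_id and control_id in labelled_ids)
                or control.has_attr("aria-label")
                or control.has_attr("aria-labelledby")):
            continue
        unlabelled_controls.append(control)
    print(f"{url}: {len(images_missing_alt)} images without alt text, "
          f"{len(unlabelled_controls)} form controls without labels")

if __name__ == "__main__":
    scan_page("https://example.org/")  # placeholder URL, not a study site

Checking color contrast, the other common error type noted in the results, requires the rendered styles and color luminance ratios and is not attempted in a sketch of this kind.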

 

More importantly, no detailed criteria were given for the second method, a “visual and physical” inspection of the web sites. In a similar study of public libraries in Maryland, each home page was examined by five experienced evaluators working from an explicit set of guidelines that was included in the article (Lazar et al., 2012). Here, the author noted only that the sites were explored for certain features, such as ease of use by screen readers, without indicating who performed the evaluations or whether a standard list of features was applied to each site.

 

The author openly acknowledges the limitations and lack of generalizability of the study. The purposive sample used covers only 1.5% of libraries (presumably meaning library systems, but this is unclear), and while sites were tested with several operating systems and browsers, further exploration remains to be done. The section on future research is detailed and explicit.

 

The implications for practice are clear and concrete. There are easy, achievable ways to make sites more accessible, if a library has the will and the funds. Librarians understand both their users and the distinctive nature of the library resources being accessed, so they can do a better job of ensuring accessibility than jurisdictional IT staff. Improving web sites’ accessibility helps everyone, not just persons with disabilities; it would be useful to hear more about this, and to have a list of resources. In addition, library users with various disabilities should be consulted for input. As web sites carry more and more interactive content, and as they are increasingly accessed on mobile devices, the need for accessibility improvement becomes ever more urgent.

 

 

References

 

Dobransky, K. & Hargittai, E. (2006). The disability divide in Internet access and use. Information, Communication and Society, 9(3), 313-334. http://dx.doi.org/10.1080/13691180600751298

 

Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692154 

 

Lazar, J., Wentz, B., Akeley, C., Almuhim, M., Barmoy, S., Beavan, P., … Yatto, T. (2012). Equal access to information? Evaluating the accessibility of public library web sites in the state of Maryland. In P. Langdon, J. Clarkson, P. Robinson, J. Lazar, & A. Heylighen (Eds.), Designing inclusive systems: Designing inclusion for real-world applications (pp. 185-194). London: Springer-Verlag.

 

Vicente, M. R. & López, A. J. (2010). A multidimensional analysis of the disability digital divide: Some evidence for Internet use. The Information Society, 26(1), 48-64. http://dx.doi.org/10.1080/01615440903423245


W3C Web Accessibility Initiative. (2015). Accessibility evaluation resources. Retrieved from http://www.w3.org/WAI/eval/Overview.html