Evidence Summary
Personal Publication Lists Serve as a Reliable Calibration Parameter to
Compare Coverage in Academic Citation Databases with Scientific Social Media
A Review of:
Hilbert, F., Barth, J., Gremm, J., Gros, D., Haiter, J., Henkel, M.,
Reinhardt, W., & Stock, W.G. (2015). Coverage of academic citation
databases compared with coverage of scientific social media: personal
publication lists as calibration parameters. Online Information Review, 39(2), 255-264. http://dx.doi.org/10.1108/OIR-07-2014-0159
Reviewed by:
Emma Hughes
Freelance Information Professional
Norwich, England
Email: emma.e.hughes@outlook.com
Received: 1 Aug. 2016   Accepted: 19 Oct. 2016
© 2017 Hughes.
This is an Open Access article distributed under the terms of the Creative
Commons-Attribution-Noncommercial-Share Alike License 4.0
International (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – The purpose of this study was to explore coverage rates of information science publications in academic citation databases and scientific social media, using the new method of personal publication lists as a calibration parameter. The research questions were: How many publications are covered in the different databases? Which database has the best coverage? How are institutions represented, and what role does the language of publication play?
Design – Bibliometric
analysis.
Setting – Academic
citation databases (Web of Science, Scopus, Google Scholar) and scientific
social media (Mendeley, CiteULike, BibSonomy).
Subjects – 1,017 library and information science publications produced by 76 information scientists at 5 universities in the German-speaking countries of Germany and Austria.
Methods – Only documents published between 1 January 2003 and 31 December 2012 were included; in that period the 76 information scientists had produced 1,017 documents. The information scientists confirmed that their publication lists were complete, and these lists served as the calibration parameter for the study. Each publication on the lists was searched for, by author name and words from the title, in three academic citation databases (Google Scholar, Web of Science (WoS), and Scopus) and in three social media citation sites (Mendeley, CiteULike, and BibSonomy), and the results were compared.
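To make the coverage calculation concrete, the sketch below illustrates the general approach the authors describe: compare a verified publication list against the records each service returns, and report the share found per service. It is not taken from the original paper; all titles, services, and counts are hypothetical.

```python
# Minimal sketch of a coverage-rate calculation, assuming matching has
# already been done (the study matched by author name and title words).
# All publication titles and "hits" below are hypothetical examples.

# A verified personal publication list (the calibration parameter).
publication_list = {
    "Information behaviour of scholars",
    "Altmetrics in practice",
    "German LIS journals and visibility",
}

# Hypothetical search results: which list items each service returned.
database_hits = {
    "Google Scholar": {"Information behaviour of scholars", "Altmetrics in practice"},
    "Web of Science": {"Altmetrics in practice"},
    "Mendeley": set(),
}

# Coverage = publications found in the service / publications on the list.
for service, hits in database_hits.items():
    found = len(publication_list & hits)
    coverage = 100 * found / len(publication_list)
    print(f"{service}: {found}/{len(publication_list)} = {coverage:.0f}% coverage")
```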
Main results – None of the databases investigated had 100% coverage. Among the academic databases, Google Scholar had the highest coverage, averaging 63%, followed by Scopus at 31% and WoS, the lowest, at 15%. Among the social media sites, BibSonomy had the highest coverage, averaging 24%, followed by Mendeley at 19% and CiteULike, the lowest, at 8%.
Conclusion – Personal publication lists are reliable calibration parameters for comparing the coverage of information scientists' work in academic citation databases with that in scientific social media. Academic citation databases, Google Scholar in particular, had higher coverage of publications than scientific social media sites. The authors recommend that information scientists post their work on social media citation databases to increase exposure. Formulating a publication strategy may be useful for identifying the journals with the most exposure in academic citation databases. Individuals should be encouraged to keep personal publication lists, which can be used as calibration parameters to measure coverage in the future.
Commentary
Measuring the coverage and impact of information scientists' work is ever changing with the advent of scientific social media (Bar-Ilan et al., 2012). This study used a new calibration method, personal publication lists, to compare the coverage of publications by information scientists in both academic citation databases and scientific social media. The study was appraised using the EBL Critical Appraisal Checklist (Glynn, 2006). The strength of this study lies in its use of a new calibration parameter, personal publication lists. The study scored highly for data collection and study design, meaning that it could be replicated. Contacting the information scientists at the institutions to confirm that their publication lists were complete ensured a fairly accurate analysis of results, although the use of identifiers such as ORCID may have provided stronger accuracy.
The objectives and research questions of this study were clearly focussed. The methodology builds on an approach used in a previous study (Kirkwood, 2012), and the design, data collection, and analysis are clearly described. It is not wholly explained why these particular three academic databases and three scientific social media sites were selected over others, but the methodology would be easy to replicate, and the limitations are also discussed.
This study focussed on information scientists working in German-speaking institutions and highlights the potential limitations of publishing in non-English-language journals. The research found that coverage of papers from Düsseldorf was low in Web of Science because many information scientists there publish in German information science journals, which are not indexed by the database. Though the authors acknowledge that there is no recent evidence on the language skills of information scientists, it is possible that language skills are the deciding factor when information scientists decide where to publish. This suggests that library and information professionals should continue to promote their own work, particularly where it is not indexed in traditional databases and where non-English-language publications have less coverage.
The study limited its search to scholarly publications rather than informally published documents. Exploring differences in coverage between document types, and the coverage of papers in library-specific databases such as LISA and LISTA, would be interesting areas for further investigation.
Glynn's critical appraisal checklist advises that a study is valid if its overall validity score exceeds 75%. Overall, this study scored 77%.
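For readers unfamiliar with the tool, the overall score can broadly be read as the share of applicable checklist questions answered "yes". The sketch below shows only that arithmetic; the tallies are hypothetical, chosen to illustrate how a 77% score could arise, and are not the reviewer's actual worksheet.

```python
# Hedged sketch of an EBL checklist overall score, read as the
# percentage of applicable questions answered "yes".
# The tallies below are hypothetical, for illustration only.

yes, no, unclear, not_applicable = 24, 4, 3, 2  # hypothetical answer counts

applicable = yes + no + unclear       # N/A answers are excluded
validity = 100 * yes / applicable     # overall validity percentage

print(f"Overall validity: {validity:.0f}% -> {'valid' if validity > 75 else 'not valid'}")
```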
The methodology used in this study will be of interest to information and library practitioners who want to demonstrate their research impact in both academic citation databases and social media. The research also highlights the need to keep personal publication lists, and the value for library and information professionals of self-indexing their work in appropriate scientific social media.
References
Bar-Ilan, J., Haustein, S., Peters, I., Priem, J., Shema, H., & Terliesner, J. (2012). Beyond citations: Scholars' visibility on the social web. In Proceedings of the 17th International Conference on Science and Technology Indicators (pp. 98-109). Montréal, Canada.
Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692145
Kirkwood, P. E. (2012). Faculty publication checklists: A quantitative method to compare traditional databases to web search engines. Paper presented at the 119th American Society for Engineering Education Annual Conference & Exposition, San Antonio, TX, 10-13 June. www.asee.org/public/conferences/8/papers/3879/download