Issues in Science and Technology Librarianship | Spring 2006
DOI: 10.5062/F4HX19NP
The NIST Research Library's Lab Liaisons conducted an assessment of NIST-authored publications within a specific organizational unit. The assessment used commonly available information tools, including Journal Citation Reports and the WorldCat database, to evaluate the projected impact of NIST research products within their respective technical communities. The results of this analysis were provided to upper-level NIST technical management to assist in the creation of an overall publications management strategy.
Assessment activities are a routine part of library management. Librarians examine the state of the library's collection, electronic resources, and services in order to ensure that patrons' changing needs continue to be well met. As the role of the information professional advances, the range of assessments conducted is evolving as well.
Library liaison programs are a mainstay in many academic and special library settings. Liaison librarians work closely with a select portion of their customer community in order to focus on specific user needs. The National Institute of Standards and Technology (NIST) Research Library employs this liaison service model. Recently, library liaisons at NIST conducted an assessment to help researchers predict the reach and impact of their scientific publications.
Founded in 1901, NIST is a non-regulatory federal agency within the Department of Commerce's Technology Administration. NIST's mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve the general quality of life. NIST carries out its mission through four cooperative programs: the NIST Laboratories, the Baldrige National Quality Program, the Manufacturing Extension Partnership, and the Advanced Technology Program.
The Lab Liaison program was established to meet more effectively the needs of researchers in their respective technical areas. The Lab Liaisons provide direct, in-depth information research and analysis, as well as collection development and training support, to each NIST work unit; a liaison supports each major NIST technical area. The Lab Liaisons also collaborate frequently with one another to extend the range of services and analyses the program provides.
NIST's research results are disseminated through its publications, primarily reports, conference papers, and journal articles. Publications in the scientific and technical journal literature and in conference proceedings are the single largest avenue for distributing and publicizing NIST's scientific and technical results. Not all publications are created equal, however. How can researchers and technical managers quantify the "reach" and impact of their work? In what ways can publications be measured to represent how they affect the larger scientific community? These are valid and important questions for any technical organization, particularly one positioning itself strategically for the future. The ability to help answer them opens a new range of assessments for today's information professionals.
In the summer of 2005, the Electronics and Electrical Engineering Laboratory (EEEL) approached its liaison with a request for assistance. EEEL's director and his team wanted a deeper understanding of their relative success in "getting the word out" regarding their accomplishments and results. The director was also interested in collecting baseline data to support development of an overall publication strategy for the Laboratory's research staff. The Research Library's EEEL Lab Liaison considered the methodologies available to answer these questions.
A review of a subset of the library literature made it apparent that there is a substantial body of analyses studying journal publications, citation rates, and citation patterns. Much of this work demonstrates how citation analysis can be used to create core journal lists within specialized and even emerging disciplines (LaBonte 2005; Vincent & Ross 2000; Ramesh & Nagaraju 2000; Shiue et al. 2004; Kelsey & Diamond 2003; Shin 2004). Another area of emphasis is assessing the status of a library's collection through citation analysis (Tunon & Brydges 2005; Johnson 2000).
The use of citation analysis is not without its detractors, and its strengths and limitations have been well characterized by many authors (Glanzel & Moed 2002). The literature also presents studies in which citation analysis has been applied in more controversial ways, such as evaluating the success of individual faculty or research staff members (Rey-Rocha et al. 2001; Russell-Edu 2003). Citation analysis is not a panacea for representing the fullest view of the literature within a field (Wormell 1998). Other articles emphasize the use of citation analysis for non-traditional purposes, such as the kind of assessment described in this article (Wormell 1998; Tunon & Brydges 2005).
The ultimate purpose of this analysis was to examine EEEL's publishing trends over time, rather than to evaluate the citation rates of specific articles written by individual research staff members. In this context, analyzing the impact factors of the journals in which EEEL researchers publish, and how those choices change over time, provides a relative sense of each journal's "prestige."
The Laboratory Director's staff provided publication data covering October 2001 through June 2005. These data consisted of all journal titles in which EEEL researchers had published articles, as well as many conferences where Laboratory research staff had presented papers. To develop a reasonable dataset, the EEEL Lab Liaison initially focused on those journal titles in which researchers had published more than three times over the given period. The EEEL Director's staff and the Lab Liaison jointly selected a subset of conferences with published proceedings based upon a number of factors, including personnel attending, repeat attendance over multiple years, and the total size of the conference in question. Different analytical methods were used for the two publication categories, as their audiences can differ considerably. The basis for the Research Library's methodology, however, is that an article's "reach" can be assumed to be related to the prestige or status of the journal or conference in which it is published.
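The filtering step itself is mechanical. Below is a minimal sketch of how such a list could be produced, assuming the publication records were exported to a spreadsheet-style file; the file name, column names, and threshold constant are hypothetical and are not part of the original analysis.

    # Tally journal publications per title and keep the frequently used venues.
    import csv
    from collections import Counter

    MIN_PUBS = 4  # "more than three times" -- hypothetical constant

    counts = Counter()
    with open("eeel_publications.csv", newline="") as f:  # hypothetical export
        for row in csv.DictReader(f):
            if row["publication_type"] == "journal article":
                counts[row["journal_title"]] += 1

    frequent = {title: n for title, n in counts.items() if n >= MIN_PUBS}
    for title, n in sorted(frequent.items(), key=lambda item: -item[1]):
        print(f"{n:3d}  {title}")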
Thomson ISI's Journal Citation Reports (JCR) served as the basis for analyzing the journal publication data. For each journal title, the subject categories established by JCR were identified; this array of categories provided a "map" of the subject areas potentially relevant to EEEL research. For each category, the top journals, as ranked by impact factor, were identified and compared against EEEL's publication patterns.
The Journal Impact Factor was selected as a metric because it is arguably the most significant and well-known measure reported in Thomson ISI's JCR. The Journal Impact Factor is defined as the "number of current citations to articles published in a specific journal in a two-year period divided by the total number of articles published in the same journal in the corresponding two-year period" (Thomson Scientific 2006). In keeping with the general criticisms of citation analysis, the Journal Impact Factor's limitations and controversies are well documented, and several studies have raised concerns about using this figure as an absolute metric (Jacso 2001; Wormell 1998). One of the most basic concerns centers on using the Journal Impact Factor as a measure of an individual author's impact or level of contribution (Russell-Edu 2003). Eugene Garfield (1998), the original creator of this metric, indicates that "citation data and analysis should always be used in combination with other indicators when evaluating departments or individuals."
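Written out as a formula (the notation below is ours, not JCR's), with A_y denoting the set of articles a journal published in year y and C_y(A) the number of citations received in year y by the articles in a set A, the definition reads:

    IF_y = \frac{C_y(A_{y-1} \cup A_{y-2})}{\lvert A_{y-1} \rvert + \lvert A_{y-2} \rvert}

As a purely hypothetical example, a journal that published 200 articles across 2003 and 2004 and drew 300 citations to those articles during 2005 would have a 2005 impact factor of 300/200 = 1.5.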
The concerns and limitations noted above are well acknowledged. However, for the purposes of this analysis, the Journal Impact Factor is still a valid metric to consider (Garfield 2006). As one commentary notes:
Impact Factor is not a perfect tool to measure the quality of articles but there is nothing better and it has the advantage of already being in existence and is, therefore, a good technique for scientific evaluation. Experience has shown that in each specialty the best journals are those in which it is most difficult to have an article accepted, and these are the journals that have a high impact factor. . . . The use of impact factor as a measure of quality is widespread because it fits well with the opinion we have in each field of the best journals in our specialty (Hoeffel 1998).
The impact factor was not the only metric considered; the immediacy index was examined as well. This metric is defined by ISI as "the average number of times an article is cited in the year it is published" (Thomson Scientific 2006). It is calculated by dividing the number of citations to articles published in a given year by the number of articles published in that year. In addition:
Because it is a per-article average, the immediacy index tends to discount the advantage of large journals over small ones. However, frequently issued journals may have an advantage because an article published early in the year has a better chance of being cited than one published later in the year. Many publications that publish infrequently or late in the year have low immediacy indexes. For comparing journals specializing in cutting-edge research, the immediacy index can provide a useful perspective (Journal Citation Reports 2005).
Within EEEL's research portfolio, several scientists are engaged in work at the technical boundaries of their respective disciplines. For publications reporting these types of research results, the immediacy index may be an important consideration of impact.
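In the same notation as the impact factor formula above, the immediacy index compares same-year citations to same-year output:

    II_y = \frac{C_y(A_y)}{\lvert A_y \rvert}

For a hypothetical journal publishing 40 articles in 2005 that attract 10 citations within 2005 itself, the immediacy index would be 10/40 = 0.25; journals carrying cutting-edge results tend to score higher because such work is cited quickly.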
Using the web-based version of JCR, the Lab Liaison collected impact factor and immediacy index data for all titles in which EEEL authors had published three or more times over the period in question, along with the corresponding JCR subject categories for those titles. These categories represent publishing avenues of potential relevance for EEEL research findings, and within each one the journals were ranked by impact factor. The Laboratory Director's staff was presented with tabular data showing the leading titles in each category, where staff researchers had published relative to those titles, and the ranking of the other titles within each category.
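The comparison amounts to ranking each category's titles by impact factor and flagging EEEL's venues. A minimal sketch follows, with invented journal names and impact factor values standing in for the actual JCR data:

    # Rank titles in one JCR category by impact factor and flag EEEL venues.
    category = "Engineering, Electrical & Electronic"  # one JCR subject category
    impact_factors = {  # title -> impact factor (illustrative values only)
        "Journal A": 2.10,
        "Journal B": 1.45,
        "Journal C": 0.87,
        "Journal D": 0.52,
    }
    eeel_venues = {"Journal B", "Journal D"}  # titles used by EEEL authors

    print(f"Category: {category}")
    ranked = sorted(impact_factors.items(), key=lambda item: -item[1])
    for rank, (title, factor) in enumerate(ranked, start=1):
        flag = "  <-- EEEL venue" if title in eeel_venues else ""
        print(f"{rank:2d}. {title:10s} IF = {factor:.2f}{flag}")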
EEEL management specifically mentioned an interest in publications produced by the Institute of Electrical and Electronics Engineers (IEEE), as these represent a major subset of topically relevant journal titles. The assessment analyzed the full range of titles published by the Institute and projected these results against EEEL's publication history. The EEEL Lab Liaison also considered the other publication avenues used by EEEL authors, namely those titles in which articles had been published less frequently. These titles were compared to those selected more frequently by EEEL staff, and their rankings were assessed within their respective categories.
Within technical and scientific areas, conference publications are also an important means of disseminating research results and increasing the visibility of EEEL research efforts. However, as there is no equivalent of JCR for conference proceedings, it is not as easy to analyze the prestige or impact of participating in a specific conference and thereby appearing in its proceedings. ISI does not record data for conference proceedings as it does for journal literature.
Since there is no direct way to assess conference prestige, the EEEL Lab Liaison developed an approach based on types of data that are easily located. To estimate the status of a given conference, the analysis considered the following factors:
If libraries have purchased a conference's proceedings for their collections, it suggests that these volumes are well regarded. The WorldCat (OCLC) database provides cataloging records from many libraries worldwide, allowing an estimate of how many have added a specific proceedings volume to their collections.
The range of physics journals covered indicates both the breadth and depth of physics-related publications; JCR tracks eight separate sub-categories of physics publications. The areas emphasized in EEEL's publications history correspond with the three largest sub-categories within this discipline.
The "Instruments and Instrumentation" category was also pertinent, but it represents a distinct niche focus. This range of titles encompasses EEEL's traditional core metrology functions; however, emerging "cutting edge" technology research efforts would likely be better suited to publication in the subject categories referenced above.
Scientists select a publication venue based on a number of factors beyond the relevance of a title's specific subject matter or its plans for a special topical issue. Other considerations include a researcher's experience and comfort level with a title, as well as the editorial and review processes of specific journals. As a result, there are boundaries, both real and perceived, to the range of potential publication avenues for any given research article. On a general level, however, it is advisable to target publications within the top percentage of each pertinent subject category in order to increase the prominence and broaden the impact of EEEL's research. In an extremely broad and diverse area such as electrical and electronics engineering, a wider range of publications can likely be pursued. As part of a comprehensive strategy, the EEEL Lab Liaison suggested that it may be prudent to select the higher-rated publications for the findings of most significance to EEEL's strategic plans and future initiatives.
Using the WorldCat database, the EEEL Lab Liaison searched for records of a selection of conference titles. In addition to counting the libraries that had purchased specific conference proceedings, the Lab Liaisons examined which libraries had purchased these volumes. Holdings at prominent academic institutions such as MIT, Caltech, Carnegie Mellon, and Princeton, as well as at other entities such as the New York Public Library's research library, were taken as indirect indicators of quality.
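This cross-check can be expressed compactly. The sketch below uses hypothetical holdings lists standing in for records transcribed from WorldCat; the institution and conference names are placeholders:

    # Check which proceedings volumes are held by prominent institutions.
    flagship = {"MIT", "Caltech", "Carnegie Mellon", "Princeton", "NYPL Research"}
    holders = {  # proceedings title -> libraries holding it (hypothetical data)
        "Conference X Proceedings 2004": {"MIT", "Princeton", "Ohio State"},
        "Symposium Z Proceedings 2004": {"Ohio State"},
    }
    for title, libraries in holders.items():
        notable = sorted(libraries & flagship)
        print(f"{title}: {len(libraries)} holdings; "
              f"notable holders: {', '.join(notable) if notable else 'none'}")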
Also of interest were the holdings patterns for other conferences presented by the same organizations. This can provide further understanding of how well a particular conference is perceived: if the sponsoring organization hosts many different conferences and the various proceedings volumes are cataloged in several library collections, this may indicate a level of esteem within the technical community.
Longitudinal data examining the acquisition of specific proceedings over time also provided an indication of a conference's prestige: if attendees return to a particular conference year after year, the event may hold higher value in the eyes of the technical community. Noting the acquisition patterns for conference volumes over a range of years helps provide a basic assessment of those events, as sketched below.
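Here is a minimal sketch of that longitudinal check, assuming per-year holdings counts have been collected by hand from WorldCat; the conference names and counts are hypothetical:

    # Track year-over-year WorldCat holdings for conference proceedings.
    holdings_by_year = {  # conference -> {year: holding libraries} (hypothetical)
        "Conference X": {2001: 88, 2002: 95, 2003: 104, 2004: 112},
        "Symposium Z": {2001: 5, 2002: 4, 2003: 3, 2004: 2},
    }

    for conf, series in holdings_by_year.items():
        years = sorted(series)
        first, last = series[years[0]], series[years[-1]]
        change = last - first
        trend = "gaining" if change > 0 else "losing" if change < 0 else "steady"
        print(f"{conf}: {first} -> {last} holding libraries "
              f"({trend}, {change:+d} over {years[-1] - years[0]} years)")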
The analysis indicated that EEEL researchers are publishing in several conference proceedings that are widely purchased, with 90 to 120 copies catalogued worldwide. In some cases, the Lab Liaison could infer that a conference was gaining (or losing) circulation simply by noting how many libraries had procured its proceedings from year to year. Where the year-to-year differences were substantial, these findings were highlighted for the EEEL director.
One particular set of conferences frequented by EEEL research staff was very poorly represented, with only two to five libraries adding these symposium titles to their collections between 2000 and 2005. This could be for a variety of reasons, such as a lack of prestige or a narrower technical focus. Despite the small presence in WorldCat, however, attending may still hold strong value for researchers through networking opportunities, interactions with vendors, or other intangible benefits.
Using general keywords drawn from the list of EEEL's frequent publication venues, the EEEL Lab Liaison also conducted WorldCat searches to determine which conference proceedings were most frequently purchased. Such titles might suggest other conference venues available throughout a given year and may be good targets for EEEL research publications.
Analyzing Internet references provided a far more subjective assessment of a conference's perceived value; in the absence of more concrete data, it is useful only for understanding the "relative" importance of a specific conference. For nearly all of the conferences provided by the EEEL Director's staff, the Lab Liaison found a great variety of Internet references, indicating that many of these conferences are well attended within their target communities. The references ranged from industry and manufacturing web sites to publication lists of professors at international universities, suggesting a large and varied demographic "pool" of attendees, with industry and academic representation across multiple countries. Attendance and participation at such conferences will benefit EEEL research by increasing both the diversity and the geographic "reach" of its audiences.
The EEEL Director and staff ultimately found the final report to be very useful as a basis for developing an overall publications strategy. The data reported above must be used in concert with many other factors as EEEL scientists determine the ultimate avenues for publicizing their scientific results.
On a larger level, this analysis provided an opportunity for the NIST Research Library's Lab Liaisons to develop new assessment skills and to identify innovative ways to support customer needs. The report now serves as a template for all Lab Liaisons to apply to the individual technical operating units across the NIST organization. Furthermore, this experience has demonstrated that additional assessment activities can be undertaken using a common array of data sources available to today's information professional. The Lab Liaison program will continue to develop new ways to collaborate with the NIST scientific staff.
The author wishes to acknowledge the vision and collaboration of Dr. David Wollman, Scientific Advisor for EEEL at NIST.
Garfield, E. 2006. The history and meaning of the journal impact factor. JAMA 295(1): 90-93.
Glanzel, W. & Moed, H. 2002. Journal impact measures in bibliometric research. Scientometrics 53(2): 171-193.
Hoeffel, C. 1998. Journal impact factors. Allergy 53(12): 1225.
Jacso, P. 2001. A deficiency in the algorithm for calculating the impact factor of scholarly journals: the journal impact factor. Cortex 37(4): 590-594.
Johnson, B. 2000. Environmental impact: a preliminary citation analysis of local faculty in a new academic program in environmental and human health applied to collection development in an academic library. Library Philosophy and Practice 2(2) [Online]. Available: http://www.webpages.uidaho.edu/~mbolin/johnson.html [Accessed: February 1, 2006].
Journal Citation Reports. 2005. Journal Citation Reports help menu. [Online]. Available: http://clarivate.com/?product=journal-citation-reports [Accessed: February 3, 2006].
Kelsey, P. & Diamond, T. 2003. Establishing a core list of journals for forestry: a citation analysis from faculty at southern universities. College & Research Libraries 64(5): 357-377.
LaBonte, K. 2005. Citation analysis: a method for collection development for a rapidly developing field. Issues in Science and Technology Librarianship [Online]. Available: http://www.istl.org/05-summer/refereed.html. [Accessed: February 1, 2006].
National Institute of Standards and Technology. 2005. General Information. [Online]. Available: http://www.nist.gov/public_affairs/general_information.cfm [Accessed: January 30, 2006].
Ramesh, L.S.R.C.V. & Nagaraju, A.V.S.S. 2000. Citation analysis of the Indian Journal of Information, Library and Society. Indian Journal of Information, Library and Society 13(3-4): 171-179.
Rey-Rocha, J., et al. 2001. Some misuses of journal impact factor in research evaluation. Cortex 37(4): 595-597.
Russell-Edu, W. 2003. The impact factor: your job may depend on it -- but do you know what it is? Cancer Futures 2(3-4): 171-175.
Shin, E. 2004. Measuring the impact of electronic publishing on citation indicators of education journals. Libri 54(4): 221-227.
Shiue, Y-C., et al. 2004. A computer-aided bibliometrics system for journal citation analysis and departmental core journal ranking list generation. Journal of Educational Media & Library Sciences 42(2): 199-219.
Thomson Scientific. 2006. Glossary of Thomson Scientific terminology. [Online]. Available: http://thomsonscientific.com/support/patents/patinf/terms/ [Accessed: February 1, 2006].
Tunon, J. & Brydges, B. 2005. Improving the quality of university libraries through citation mining and analysis using two new dissertation bibliometric assessment tools. World Library and Information Congress: 71st IFLA General Conference and Council. [Online]. Available: http://www.ifla.org/IV/ifla71/papers/078e-Tunon_Brydges.pdf [Accessed: February 1, 2006].
Vincent, A. & Ross, D. 2000. Citation analysis of the Decision Sciences journal. Decision Line 31(1): 4-8.
Wormell, I. 1998. Informetric analysis of the international impact of scientific journals: how "international" are the international journals? Journal of Documentation 54(5): 584-605.