Cataloging & Classification Quarterly, 49:582–599, 2011
Copyright © Taylor & Francis Group, LLC
ISSN: 0163-9374 print / 1544-4554 online
DOI: 10.1080/01639374.2011.616264

Testing RDA at Dominican University's Graduate School of Library and Information Science: The Students' Perspectives

MARJORIE E. BLOSS
Graduate School of Library and Information Science, Dominican University, River Forest, Illinois, USA

Available online: 17 Nov 2011

Dominican University's Graduate School of Library and Information Science (GSLIS) was one of a funnel group of graduate schools of library and information science selected to test Resource Description and Access (RDA). A seminar specifically for this purpose was approved by the dean and faculty of the library school and was conducted from August to December 2010. Fifteen students participated in the test, creating records in Anglo-American Cataloguing Rules (AACR2) and in RDA, encoding them in the MARC (Machine Readable Cataloging) format, and responding to the required questionnaires. In addition to record creation, the students were also asked to submit a final paper in which they described their experiences and recommended whether or not to accept RDA as a replacement for AACR2.
KEYWORDS Resource Description and Access (RDA), Anglo-American Cataloguing Rules (AACR2), user studies, descriptive cataloging, cataloging

Received May 2011; revised August 2011; accepted August 2011.

Marjorie E. Bloss was a full-time lecturer in Dominican University's Graduate School of Library and Information Science from 2004 to 2011. She also served as the RDA Project Manager from 2005 to 2009. She retired at the end of June 2011. The author thanks Dean Susan Roman and the Dominican GSLIS faculty for their support, and the following students who participated in the RDA Testing seminar: Albulena Bruncaj, Mary Jo Chrabasz, James Hennelly, Andrea Jarratt, Phyllis Kastle, Concetta Kellough, Heidi Knuth, Richard Martin, Lauren Robb, Jennifer Rubin, David Sanborne, Anthony Santaniello, Stacy Taylor, Julie Tegmeier, and Amanda Vermeulen. Address correspondence to Marjorie E. Bloss, 2827 W. Gregory Street, Chicago, IL 60625, USA. E-mail: marjorie bloss@msn.com

INTRODUCTION

There is a saying that the author of this article first heard from her library director a number of years ago: "Pioneers are often found with arrows in their backs." What the director was referring to at the time was something called the MARC (Machine Readable Cataloging) format. We were in the early days of MARC and were busy trying to convince the university administration that converting our card catalog records into MARC was a cost-effective thing to do. To prove the point, we decided to generate a microfiche catalog from our MARC records in order to replace the card catalog. It did not go over very well with the users, even though our new-found ability to hold what used to be the card catalog in one hand impressed everyone considerably. While the microfiche catalog was mercifully replaced by an online catalog, we learned an important lesson: life is not easy for pioneers—even in the library world. So here we are, experiencing yet another example of the pioneering spirit in our exploration and testing of Resource Description and Access (RDA).

Dominican University's Graduate School of Library and Information Science (GSLIS), located in River Forest, Illinois, was one of fourteen library schools constituting a funnel group that was selected to participate in the formal testing of RDA. (A "funnel group" is a group of institutions, in this case library schools, working together as a single unit. Information and processes are "funneled" through an institution representing the group as a whole.) Each library school that agreed to participate in the test was given free rein with regard to how it wished to design its approach to the testing. This article focuses on the testing that took place specifically at Dominican University's GSLIS and includes the following:

• The RDA testing process
• How RDA testing was conducted at Dominican University
• Students' comments and observations—The Negatives
• Students' comments and observations—The Positives
• Perspectives on teaching RDA

THE UNITED STATES NATIONAL LIBRARIES' TESTING PROCESS

The plans to test RDA were created jointly by a steering committee consisting of representatives from the Library of Congress (LC), the National Agricultural Library (NAL), and the National Library of Medicine (NLM).
The process they identified was to select approximately 25 libraries from various communities (academic, public, special, and school libraries, library automation vendors, and library schools). Each participant would be expected to catalog the same 25 titles selected by the three U.S. national libraries and to create original cataloging records for them using both Anglo-American Cataloguing Rules, 2nd edition (AACR2) and RDA. Five additional records were selected that would be copy-cataloged using RDA. Finally, each participating institution would be expected to create a minimum of 25 "extra set" records using only RDA. The 25 extra set records were to be selected by each participating library, reflecting the materials the institution received as part of its normal acquisitions. Libraries not selected for the formal testing process were also encouraged to contribute records for the test.

Once the records were cataloged, participants were asked to submit them to the three U.S. national libraries for review. This could be done through OCLC or some other method of submission. For every record created, participants had to fill out a survey assessing such things as the amount of time it took to create the record, the difficulties they had along the way (be it with the content of the RDA rules or with using the online version of the cataloging instructions), the amount of time taken to consult with others, and ultimately, whether or not RDA should be adopted. These surveys were then analyzed by the three U.S. national libraries in order to determine whether RDA would become accepted cataloging practice in the United States.

TESTING TIMELINES

The three U.S. national libraries had identified timelines for the testing, dependent on RDA's release. When RDA was released on June 23, 2010, the testing period began. The end of June through the end of September 2010 was designated as a training period, when test participants were expected to become familiar with and experienced in using the RDA instructions and the RDA Toolkit (the online package that includes RDA itself plus additional features such as tools related to RDA's use, e.g., AACR2, workflows, and RDA-to-MARC and MARC-to-RDA mappings). From October through December 2010, participants were expected to catalog the 25 original records using both AACR2 and RDA, the five copy cataloging records, and a minimum of 25 "extra set" records.

Following the submission of the cataloging records and their related questionnaires, LC, NAL, and NLM analyzed the results from January through March 2011. During this time, they began to formulate their recommendations regarding the adoption of RDA. The three U.S. national libraries announced their recommendations at the 2011 American Library Association Annual Conference. The overriding recommendation was to adopt RDA, but not to do so until January 2013 at the earliest. LC, NAL, and NLM identified a number of modifications they felt should be made in the intervening 18 months, both to RDA content and to the RDA Toolkit, along with the development of a replacement for the MARC format. Many of these recommendations were based on comments received from the RDA test participants.
SELECTION OF THE LIBRARY EDUCATORS' GROUP TO TEST RDA

Prior to the selection of the RDA test participants, a call went out to the graduate library schools inquiring whether they would like to form a group of library school educators for the purpose of testing RDA. Educators from fourteen schools indicated they would be interested in doing so. The appropriate application forms were submitted, and the three U.S. national libraries selected the group to participate in the formal testing of RDA.

The group was a loose confederation—one in which each institution could decide how it wished to catalog and submit its records. In some institutions, only the faculty submitted records. In others, students contributed records either as part of a practicum, a seminar devised specifically for testing RDA, or voluntarily.1 Some of the educators decided not to participate in the testing once they saw how time-consuming the process was. Although the educators' group was considered a funnel group, coordination occurred only at the administrative level; bibliographic records and surveys were not created and submitted through a single institution.

RDA TESTING IN DOMINICAN UNIVERSITY'S GSLIS PROGRAM

During the spring of 2010, the author proposed to Dominican University's GSLIS faculty that she conduct a seminar designed specifically for the purpose of testing RDA. The faculty approved the proposal for the fall 2010 semester.

In order to register for the seminar, students had to have taken the cataloging-related core course, Organization of Knowledge, and the second-level cataloging course. Students who had taken only the core course were permitted to take the seminar if they could demonstrate cataloging knowledge and skills equivalent to the second-level cataloging course. To this end, they had to submit cataloging records demonstrating knowledge of both AACR2 and the MARC format.

Consequently, fifteen students were admitted to the seminar. This number was very advantageous when it came to divvying up the 25 original set records, the five copy cataloging records, and the extra set records. The final tally of records submitted by Dominican included all 25 original set records in both AACR2 and RDA, the 5 copy cataloging records, and 95 extra set records using only RDA. Of course, the related surveys for each record were also submitted. The students' record creation work for the test was divided as follows:

• Five students created 5 original set records each, using RDA; in addition, they each created 5 extra set records using RDA
• Five students created 5 original set records each, using AACR2; in addition, they each created 5 extra set records using RDA
• One student did the 5 copy cataloging records; in addition, she created 5 extra set records using RDA
• Four students created 10 extra set records each, using RDA

Classroom time was spent reviewing records, discussing successes and difficulties encountered with RDA content, the RDA Toolkit, and creating records in OCLC, as well as general observations about the testing process itself.
SOME STUDENT DEMOGRAPHICS

One of the goals in having GSLIS students participate in testing RDA was to gauge whether RDA was easier to use than AACR2. The hypothesis was that library school students who approached RDA without the baggage of many years of using AACR2 would have an easier time adjusting to the new cataloging code than those immersed in AACR2. To this end, the self-selecting process of students who registered for the class proved very effective. Cataloging experience for 11 of the 15 students ranged from none to one year. Four students had 1–2 years of cataloging experience. Unfortunately, the survey questions regarding cataloging experience were no more specific than this; therefore, it was impossible to know what, exactly, students' cataloging experience consisted of (e.g., experience gathered only during course work, copy cataloging experience, original cataloging experience).

Another interesting demographic was the ages of the students. Although no specific information was requested from the students regarding their ages, the author estimates that all of them were in their 20s, 30s, and 40s. Consequently, these particular students would have a number of years left as practicing librarians and would indeed witness the impact of RDA on library staff and users alike. In short, they had a vested interest in the future of cataloging and of RDA.

Prior to the seminar, students were advised that they would need to tolerate ambiguity and to be flexible. We knew instructions would be coming in quick succession, often while students were training or even after we had moved from the training period to creating the test records themselves. We also knew there would be modifications to testing instructions, since all of us (the U.S. national libraries and OCLC as well) were learning as we went along and no one knew all the answers right out of the box. Furthermore, and perhaps most important, students needed to understand that we would need to work collaboratively and make allowances for mistakes—even by the professor.

In addition to the requirements for the test itself, the students were required to submit a final paper describing their learning experiences using RDA and recording their observations on the management of the testing process as a whole. In other words, they were asked to look beyond record creation and to consider the testing of RDA from a project management perspective. Finally, they were asked to recommend whether or not RDA should be adopted and to explain the reasons for their recommendation. Students were asked to comment on the following in their final paper:

• How easy was the transition from AACR2 to RDA for you?
• What helped you acclimatize yourself to RDA? Were there certain "tricks of the trade" that you found useful?
• What is your assessment of RDA's content? Based on your experiences, what changes would you like to see?
• What is your assessment of the RDA Toolkit? Based on your experiences, what changes would you like to see?
• Do you feel RDA lives up to the goals that the Joint Steering Committee (JSC) and the Committee of Principals (CoP) identified for RDA (e.g., more cataloging efficiency, better internationalization, better accommodation of digital materials)?
• Do you prefer AACR2 over RDA or RDA over AACR2, and why?
• What has this course taught you from the perspective of management:
◦ Introducing a new cataloging code
◦ Introducing new software
◦ Observing how people learn and what helps them learn
◦ Assessing your leadership role should you be in a position of introducing RDA to your staff, or to the library as a whole
• Would you recommend that RDA (a) not be adopted, (b) be adopted with some changes along the lines of what you previously identified, or (c) be adopted as soon as possible, realizing that some changes are inevitable?

All the students were extremely enthusiastic and excited about being part of a national program to test RDA. Putting their library school experience into practice and contributing to a national decision about the future of cataloging gave them an experience few others would have. Through the semester we experienced moments of frustration, but in the end everyone felt immense satisfaction at having participated in testing RDA.

DISCONNECTS BETWEEN RDA AND SCHOOL SEMESTER TIMELINES

Even before the semester began, we were at a disadvantage. As has been mentioned previously, the training period for RDA began immediately after its release in late June 2010. The seminar, however, was not scheduled to begin until Dominican's fall semester, which commenced at the end of August. Technically, this meant a loss of two months in the testing process. Additionally, our time would be cut short at the end of the testing period, since the semester concluded in mid-December rather than at the end of the month. To compensate to some degree, the university allowed us to hold two sessions during early August (before the fall semester officially began). This helped immensely in terms of discussing some of the basics of the testing process and distributing training documents for the students to review prior to the beginning of the semester.

We were at another disadvantage with regard to the submission of our cataloging records through OCLC. While Dominican University's GSLIS most certainly has an account with OCLC, we are (understandably) allowed to use the system only in a limited mode. What this means is that we can save records in OCLC in order for them to be reviewed, but we cannot upload them into the system. OCLC's policy for testing RDA was that we were expected to create Institutional Records (IRs), a process with its own procedures and guidelines. This, then, was another component of the test (in addition to becoming familiar with RDA and the RDA Toolkit) that was part of our learning curve.

AVAILABILITY OF TRAINING MATERIALS

Even in the early stages of RDA testing, there were a number of excellent training materials available. Many of these were generated by LC, including its nine PowerPoint training modules as well as the training documents used by LC staff. LC also made available its RDA policy decisions (Library of Congress Policy Statements, or LCPS) that were specific to the RDA testing. Not long after the RDA Toolkit became available, the LCPS were integrated with the RDA content, making it very easy to go back and forth between a policy statement and the RDA instruction to which it referred.
The University of Chicago in particular began to catalog using RDA and generously provided access to its training documentation. This included a number of valuable workflows that were incorporated within the RDA Toolkit itself. A series of Webinars was given, with instruction and guidance for using both RDA content and the RDA Toolkit. In the end, there was such a plethora of good documentation that it became necessary to be selective about what to use and what to omit.

Even with the LCPS, test participants were encouraged to make their own decisions regarding certain RDA instructions and workflows. One of these decisions had to do with whether or not to include authority control as part of the testing process. In the Dominican GSLIS seminar, we decided not to create any authority records. The semester was simply too short, given the university schedule described above, and there was enough to do with training, bibliographic record creation, learning how to create IRs in OCLC, and filling out the survey for each record created, not to mention writing the final paper for the class. In hindsight, we felt this omission was a good decision.

WHAT WE LEARNED: GENERAL COMMENTS

Initially, the author thought she would be the only one having difficulty making the transition from AACR2 to RDA because of her many years of using AACR2. This was not the case: every member of the class commented on the steep learning curve required for creating bibliographic records using RDA. What surprised us most was the need for a detailed knowledge of the Functional Requirements for Bibliographic Records (FRBR) and, to a lesser degree, the Functional Requirements for Authority Data (FRAD). This knowledge goes beyond an overview of FRBR. It calls for a solid understanding of the attributes of the groups of entities and how they relate to one another when creating a bibliographic record.

We observed that RDA does not lay out cataloging instructions nearly as linearly as does AACR2. Attributes that we are used to seeing as a unit in AACR2 (e.g., extent data) are found in separate chapters in RDA (e.g., pagination and size are found under the instructions for "Manifestation," while the instructions for illustrations are found under "Expression"). We also found ourselves needing to adjust to splitting AACR2's General Material Designation (GMD) into three parts, especially as one of those elements (content type) has its instructions in chapter 6, whereas the instructions for the other two elements (carrier type and media type) are found in chapter 3.

Another learning curve was adjusting to RDA's terminology. In some cases, it was like learning a new language, as the vocabulary used in RDA comes very much from FRBR, as do the concepts that underlie RDA. In other cases, we discovered that the terminology in RDA does not always have an equivalent in AACR2 and vice versa. This caused frustration when searching for a term using AACR2 vocabulary that does not exist in RDA. There was nothing that immediately pointed us in the right direction in the RDA Toolkit. Consequently, we developed skills for what we called "going through the back door"—namely, reviewing training documentation or consulting materials familiar to us, like MARC and AACR2, that would then provide a map into the appropriate instruction in RDA.
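To make some of these points concrete, consider a hypothetical printed book; the title, author, and physical details below are invented for illustration and are not drawn from the test records. In FRBR terms, the intellectual creation itself is the work; the particular text that realizes it is an expression; the 2010 published volume is a manifestation; and a single copy on a library's shelf is an item. The sketch below shows how such a manifestation might be coded in MARC, first under AACR2 and then under RDA, with the spelled-out terms, the relationship designator on the access point, and the content, media, and carrier type fields that take the place of the GMD. (For a non-print resource, AACR2 would also have supplied a GMD in subfield $h of the 245 field, such as [electronic resource], which RDA drops in favor of these fields.)

AACR2:
100 1# $a Doe, Jane.
245 10 $a Cataloging in transition / $c Jane Doe.
300 ## $a xiv, 250 p. : $b ill. ; $c 24 cm.

RDA:
100 1# $a Doe, Jane, $e author.
245 10 $a Cataloging in transition / $c Jane Doe.
300 ## $a xiv, 250 pages : $b illustrations ; $c 24 cm
336 ## $a text $2 rdacontent
337 ## $a unmediated $2 rdamedia
338 ## $a volume $2 rdacarrier

Seeing the two side by side also makes the non-linearity visible: the 300 field draws on both manifestation instructions (extent, dimensions) and expression instructions (illustrative content), while the 336, 337, and 338 fields come from two different RDA chapters.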
Even with the modifications made to MARC to accommodate RDA, we discovered we had difficulty putting the round RDA pegs into the square MARC holes (unless it was the other way around). One major example is the 1xx field, the main entry field. RDA provides instructions for access points but not for main entries. We found ourselves needing to select an access point for a main entry in order to create a MARC record. We seriously considered putting all personal, corporate, or family access points into 7xx fields as a way to adhere to RDA more closely.

STUDENT COMMENTS: THE NEGATIVES

The author analyzed the students' comments from their final papers, grouping them into "negative" and "positive" categories and grouping like comments together. What follows is a listing of student comments based on their submitted final papers.

The students' comments underscored the importance of having a solid understanding of the details of FRBR and FRAD due to the lack of a linear approach for creating bibliographic records in RDA, the difficulty in correlating AACR2's vocabulary with RDA's, and the observation that MARC is not the best encoding scheme for RDA. Here are some of the other difficulties the students found in using the RDA content and the RDA Toolkit.

Negative Comments on RDA Content

• RDA's rules can be vague and lack clarity in places; the language of RDA should be simplified
• Initially, students ran into difficulty adjusting to RDA's vocabulary and the order of the rules
• Students felt that RDA's structure was based too heavily on that of FRBR and FRAD, making a linear approach to cataloging difficult.
While the students were very supportive of the concepts of FRBR and FRAD, they expressed a desire to see RDA recast for catalogers rather than having the instructions governed by FRBR and FRAD entity groups
• The rules for description and the creation of access points seemed fragmented and at times jumped around to different chapters rather than keeping instructions all together (e.g., the creation of access points)
• The lists of relationship designators should be combined into one list
• Students were not always sure when they had completed a bibliographic record—they often had an uneasy feeling that there was more information that they needed to include

Negative Comments on the RDA Toolkit

• The RDA Toolkit slowed down the cataloging process, as the software drilled down through the entire chapter before arriving at the specific rule
• The students were unable to reference multiple rules in the RDA Toolkit simultaneously
• The RDA Toolkit did not have an index (this has recently been rectified)
• Scrolling in the RDA Toolkit was often slow or erratic
• Some of the RDA Toolkit's functionality was unclear or awkwardly structured (e.g., "Previous" and "Next-hit" arrows and "Advanced search")

STUDENT COMMENTS: THE POSITIVES

Positive Comments on RDA Content

• Many students commented on the fact that RDA's de-emphasis on format and type of material led to greater flexibility in the rules
• RDA was much better equipped for cataloging digital materials
• RDA was much more "future-proof" than AACR2 in that it was much more in line with other digital knowledge and information communities
• Students found that many rules in RDA were similar or identical to those in AACR2 (especially those dealing with access points). Therefore, once they became familiar with where to find these rules in RDA, their comfort level in using RDA increased significantly
• For the most part, students liked replacing the GMD with the carrier, content, and media designators (especially for digital materials) and also supported the individual MARC tags for them, although one person felt that an expansion of the GMD identifiers would have sufficed
• Students believed that eliminating the "rule of three" was essential in providing better access to materials
• A number of students commented that they felt RDA would meet its goals regarding a broadening of scope internationally
• Students were highly in favor of the entity relationship-based database concepts of RDA, believing that this provides a greater ability to support user needs. They also commented on the importance of the vendor community in supporting this database architecture in order to realize the full potential of RDA
• Students noted that although the RDA learning curve was steep, they became more adept at using RDA as they gained practice in its use. Several students were surprised when, at the end of the semester, they were creating RDA bibliographic records very quickly.

Positive Comments on the RDA Toolkit

Generally, the RDA Toolkit received high marks from the students, although there were certainly areas for improvement.
One of the students commented that he could not see how anyone could easily learn RDA from a print document due, in large part, to the many interactive tools, mappings, and resources available in the Toolkit. Specific comments included the following:

• Students felt the RDA Toolkit was readable and easy to navigate
• Students felt that the Toolkit itself greatly aided in their learning the RDA content. Specifically, they liked such Toolkit functions as:
◦ Integration of the Library of Congress Policy Statements with the RDA instructions
◦ Quick Search function
◦ Various mappings (e.g., RDA to MARC and MARC to RDA, AACR2 to RDA)
◦ Bookmarks and notes
◦ Synch Table of Contents
◦ Workflows

RDA'S IMPACT ON USERS

Although the students did not have the opportunity to query catalog users as to which record was preferable, AACR2 or RDA, they made their own comparisons. Students noted that in many cases there was little difference in the representation of bibliographic data from an AACR2 to an RDA record; however, they still felt there were some improvements in the RDA records. A number of students pointed to the elimination of abbreviations as something that would benefit the user, especially in the effort to internationalize RDA.

The students also supported the separation of the GMD into three components—content, carrier, and media identifiers. While students did not believe this was overly useful for print material, they observed the great value in being able to be more specific when describing digital materials.

And finally, students noted the value of a FRBRized catalog as hugely beneficial for library users. Providing users with the ability to access all the manifestations of a work at once, rather than having to sort through several different records, was seen as a major plus. By extension, RDA's focus on attributes and relationships provides increased access to information, thus increasing responsiveness to user needs.

OTHER CONCERNS

The students' comments went beyond RDA content and the RDA Toolkit, demonstrating that they had broader concerns than simply RDA cataloging. These included:

• The cost of RDA—both the Toolkit and the hardware and software on which to run it
• Concern for the paraprofessional and clerical staffs needing to understand FRBR theory in order to accomplish their day-to-day cataloging tasks
• The need to address the important issue of authority control and whether AACR2 and RDA headings could, in fact, co-exist in a single file
• And again, the steep RDA learning curve, even for people who do not carry AACR2 baggage

GENERAL SUPPORT FOR RDA

In their final papers, the students were asked to make a recommendation about RDA's future. All of the students supported the adoption of RDA, although there was hesitation in some cases as well as differences of opinion as to how quickly RDA should be adopted. Nine of the 15 students recommended that RDA be adopted immediately. Six recommended that it be adopted only after modifications were made to the content, the Toolkit, or both.

TEACHING RDA

What is the best way to teach RDA to GSLIS students, regardless of whether or not they intend to become catalogers?
One thing that is obvious is that getting our minds and mouths around RDA is considerably different from the experience when AACR2 was implemented. Technology—the ability to share documents over the Web, to hold training sessions via Webinars and similar technologies, to ask questions and receive responses within hours if not minutes, to hold philosophical discussions about cataloging in general and cataloging codes in particular—has provided us with information overload. If anything, our difficulty is going to be sorting through all of the available material (much of it excellent) and deciding what is most appropriate to use for our particular circumstances.

Training in a classroom setting can be very different from training in a cataloging department. Dominican University's GSLIS prides itself on face-to-face classroom settings. Although the number of online courses taught is growing and such courses are certainly included in the curriculum, our current course ratio has us teaching a larger number of face-to-face courses. Face-to-face sessions are held once a week during the fall and spring semesters (twice a week is the norm in summer), with opportunities through Blackboard for asking questions, making observations, and holding discussions between class sessions. This process differs from a cataloging department, where staff have the opportunity to meet, review material first-hand, and ask questions on a daily basis. So what works in a classroom setting? What is the most effective way to teach RDA there?

For the immediate future, it is essential to include some fundamental instruction in AACR2 in basic cataloging syllabi. There are simply too many existing bibliographic records that were created using AACR2, and it is highly unlikely anyone will have the time or money to convert them to RDA cataloging. It would be a disservice to students and the people they ultimately assist not to include AACR2 in our curriculum. This is also true because a number of RDA instructions (particularly in the way access points are formatted) are based on AACR2 rules. As time goes on and more and more RDA records are integrated into our databases and files, we will see a shift toward spending more curricular time on RDA than on AACR2, but for the meantime we need to teach both.

As has been mentioned previously, anyone working with RDA must spend time with FRBR and FRAD, and that means consulting the complete documents, not only overviews of them. This will form the foundation for understanding RDA's organization, the terminology and vocabulary used in RDA, and the relationships between the various groups of entities.

The Library of Congress has made a number of its excellent documents available. These range from PowerPoint presentations to training documents to documents that compare AACR2 cataloging with RDA cataloging (see the list of recommended resources). Giving students examples of the similarities and differences between AACR2 and RDA immediately (even before introducing them to RDA's instructions) provides an excellent visual introduction to what they can expect to see in RDA bibliographic records. Once students have an idea of the content of an RDA bibliographic record, it is time to look at the RDA instructions themselves.
Providing an overview of the organization of RDA is essential, and tying it in with FRBR terminology and principles emphasizes how the FRBR conceptual model forms a foundation for the RDA instructions. Comparisons between AACR2 and RDA can prove helpful here if students have been exposed to AACR2. With regard to the creation of a descriptive cataloging record, RDA's core record attributes for manifestations are a valuable way for students to understand the elements of a bibliographic record and their related rules.

In addition to becoming familiar with the content of RDA's instructions, students will also need to become familiar with the features of the RDA Toolkit. Giving students the opportunity to spend time experimenting with RDA Toolkit functionality, developing a good understanding of how the Toolkit is organized, and examining its various resources and features is essential. Those training for and teaching RDA will need to factor in time for students to become comfortable with the Toolkit and how it supports RDA content.

The RDA Toolkit's functionality can also be an excellent instructional tool when learning the content of RDA instructions. As was noted in the students' comments, features of the Toolkit such as "Quick Search" (where you can key in either rule numbers or phrases, e.g., "statement of responsibility") help tremendously when attempting to find specific rules. The mappings, especially those from RDA to MARC and MARC to RDA (there are also mappings from Dublin Core to RDA and vice versa), are extremely valuable as another way of finding the appropriate RDA instruction. AACR2 is also included in the RDA Toolkit, and the rules there are hot-linked to the corresponding RDA instructions. The Toolkit's bookmarking system is yet another method that can help students track down a rule previously used; and the comments feature, which allows users to add their own observations on the application of instructions, assures them that their earlier work will not be lost.

The author previously noted that RDA is not nearly as linear as AACR2 in providing users with a step-by-step method of proceeding when creating a bibliographic record. The "Workflows" feature in RDA can help alleviate this difficulty. (It is here that RDA users can submit their own internal workflows—either for all to see and use or limited only to their own institution.) The "Workflows" section is already beginning to be populated, thanks initially to the Library of Congress and the University of Chicago. Students found the workflows invaluable when looking for step-by-step cataloging instructions. All in all, it is essential to give students time to understand the interactive nature of the RDA Toolkit and to use its features effectively.

JUMPING INTO THE DEEP END OF THE POOL

And after all the documentation is read and reviewed, and after all the PowerPoint presentations are given and the Webinars are attended, the best way to learn RDA's instructions is to take a deep breath and simply begin to use them. An effective method for doing this is to have all students catalog the same resource—often one of the course texts works best. Having students identify the descriptive cataloging attributes using RDA's core record elements is a good starting place; a sketch of what such an exercise might produce appears below.
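For instance, a student working through one of the course texts might be asked to pull out a handful of the core elements for the manifestation and record them before worrying about MARC coding at all. The title, publication details, and ISBN below are invented for illustration, and the elements listed are only a selection of RDA's core set:

Title proper: Cataloging in transition
Statement of responsibility: Jane Doe
Edition statement: Second edition
Publication statement: Chicago : Example Press, 2010
Extent: xiv, 250 pages
Carrier type: volume
Identifier for the manifestation: ISBN 978-0-000-00000-0 (hypothetical)

A worksheet of this kind keeps the focus on the elements themselves and on where each one comes from in RDA, before any encoding decisions are made.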
In addition to having the students record the attributes themselves, having them identify the specific rules they used is another important factor in familiarizing them with the RDA instructions (and with AACR2, for that matter). There is no question that students will have growing pains when first matching wits with RDA, but they had similar growing pains when they first began to use AACR2. We are introducing them to a new vocabulary and new concepts (as was the case with AACR2). There is no way we can expect students to create perfect records the first time they apply RDA. Therefore, it is essential to build in time for students to learn and to make mistakes, for us to provide them with feedback, and for all of us to analyze RDA's instructions critically. The more experience students have using RDA (or anything new, for that matter), the easier and faster record creation becomes. Cataloging is a new experience for the majority of GSLIS students. They need time to learn the language and understand the rules—whether AACR2 or RDA or both.

CONCLUSION

The students who participated in Dominican University's GSLIS RDA Testing Seminar went into the class knowing they would need to be flexible and open-minded, and that additional instructions and modifications to those instructions would appear at the same time as they were creating records. They knew they were working with both instructional content and software that had not yet been tested. They knew changes would be made once the test results had been analyzed and that they were, for all intents and purposes, beta testing RDA—both content and software. There was never a debate about the importance of AACR2, which held us in good stead for more than 30 years, but RDA moves us into the digital age in ways AACR2 cannot. Perhaps most important in having students participate in the RDA testing is that these are the people who will be the catalogers of the future. To quote one of the students:

I prefer RDA to AACR2 and am all for adopting it as a new standard as I am rather heavily invested in the future of cataloging. It is clear to me that the future depends on making some significant changes. RDA represents an important step forward . . . [and] is imperative for the growth of the cataloging profession, for the evolution of libraries as a whole, and for the fulfillment of the basic principles of library service.

We must pay attention to students' comments and observations regarding the RDA instructions, the effectiveness of the RDA Toolkit, and the impact of RDA on catalogers and users alike. Today's students are truly our pioneers, and they are moving us forward into a new cataloging frontier.

RECOMMENDED RESOURCES

(All URLs accurate as of July 26, 2011.)

About FRBR and RDA

About RDA (OCLC), http://www.oclc.org/us/en/rda/about.htm
Le Boeuf, Patrick, ed. 2005. Functional Requirements for Bibliographic Records (FRBR): Hype or Cure-All? Binghamton, NY: Haworth Information Press.
Joint Steering Committee for Development of RDA, http://www.rda-jsc.org/rda.html
Maxwell, Robert L. 2008. FRBR: A Guide for the Perplexed. Chicago, IL: American Library Association.
Oliver, Chris. 2010. Introducing RDA: A Guide to the Basics. Chicago: ALA Editions.
RDA listserv, RDA-L, http://www.rda-jsc.org/rdadiscuss.html. To subscribe, send an e-mail to LISTSERV@LISTSERV.LAC-BAC.GC.CA with "Subscribe RDA-L Firstname Lastname" in the body of the message.
RDA Toolkit, http://access.rdatoolkit.org/
Tillett, Barbara. 2004. What is FRBR?: A Conceptual Model for the Bibliographic Universe. Washington, DC: Library of Congress Cataloging Distribution Service, http://www.loc.gov/cds/downloads/FRBR.PDF
Tillett, Barbara. "RDA Changes from AACR2 for Texts," http://www.loc.gov/today/cyberlc/feature_wdesc.php?rec=4863
Taylor, Arlene G., ed. 2007. Understanding FRBR: What It Is and How It Will Affect Our Retrieval Tools. Westport, CT: Libraries Unlimited.
Zhang, Yin, and Athena Salaba. 2008. Implementing FRBR in Libraries: Key Issues and Future Directions. New York: Neal-Schuman.

RDA and MARC

MARC 21 Standards, http://www.loc.gov/marc/
RDA in MARC, http://www.loc.gov/marc/RDAinMARC29.html

The US RDA Test and RDA Examples

General Information on the US Test of RDA, http://www.loc.gov/bibliographic-future/rda/
Joint Steering Committee for Development of RDA: Working Documents: Complete Examples for RDA Toolkit 2010, http://www.rda-jsc.org/working2.html#rda-examples
Library of Congress Documentation for the RDA Test, http://www.loc.gov/catdir/cpso/RDAtest/rdatest.html
Library of Congress Choices for the RDA Test, http://www.loc.gov/catdir/cpso/RDAtest/rdachoices.html
Library of Congress Documentation: Examples for RDA Compared to AACR2, http://www.loc.gov/catdir/cpso/RDAtest/rdaexamples.html
OCLC Policy Statement on RDA Cataloging in WorldCat for the U.S. Testing Period, http://www.oclc.org/us/en/rda/policy.htm
RDA Test Partners Handout, http://www.loc.gov/bibliographic-future/rda/RDA%20test%20partners%20handout.xls
RDA Test "Train the Trainer" (Kuhagen and Tillett, 9 modules), http://www.loc.gov/bibliographic-future/rda/trainthetrainer.html and http://www.loc.gov/catdir/cpso/RDAtest/rdatraining.html
Resource Description and Access (RDA) Testing at the University of Chicago Library, http://www.lib.uchicago.edu/staffweb/depts/cat/rda.html (includes training materials, UC's timeline and testing decisions, and records created using RDA)
US RDA Test Record Collection Plan, http://www.loc.gov/catdir/cpso/RDAtest/admindoc2.doc
US RDA Test Policy for the Extra Set: Use of Existing Authority and Bibliographic Records (Common Copy Set), http://www.loc.gov/catdir/cpso/RDAtest/admindoc1.doc

Metadata Beyond RDA

Coyle, Karen. Coyle's InFormation, http://kcoyle.blogspot.com/
Hillmann, Diane. Metadata Matters, http://managemetadata.org/blog/
Open Metadata Registry: The RDA (Resource Description and Access) Vocabularies, http://metadataregistry.org/rdabrowse.htm
W3C Library Linked Data Incubator Group, http://www.w3.org/2005/Incubator/lld/

Recommendations for RDA's Future

Issued by the Library of Congress, National Agricultural Library, and National Library of Medicine. Testing Resource Description and Access (RDA): Report and Recommendations. Washington, DC, June 13, 2011, http://www.loc.gov/bibliographic-future/rda/
Report and Recommendations of the U.S. RDA Test Coordinating Committee: Executive Summary. Washington, DC, June 13, 2011, http://www.nlm.nih.gov/tsd/cataloging/RDA report executive summary.pdf
Response of the Library of Congress, the National Agricultural Library, and the National Library of Medicine to the RDA Test Coordinating Committee. Washington, DC, June 13, 2011, http://www.nlm.nih.gov/tsd/cataloging/RDA Executives statement.pdf
A Web site has been established as the central place for plans, news, and progress of the MARC Transition Initiative: http://www.loc.gov/marc/transition/

NOTE

1. The University of Illinois at Urbana-Champaign published the details of its site experiences in Robert Bothmann, ed., "Cataloging News," Cataloging & Classification Quarterly 49, no. 3 (2011): 242–256.