key: cord-1045423-zcg45848
authors: Chang, Yu-Che; Chen, Chien-Kuang; Chen, Jih-Chang; Liao, Chien-Hung; Lee, Ching-Hsing; Chen, Yu-Chuan; Ng, Chip-Jin; Huang, Jing-Long; Lee, Shih-Tseng
title: Implementation of the mini-clinical evaluation exercise in postgraduate Year 1 residency training in emergency medicine: Clinical experience at Chang Gung Memorial Hospital
date: 2013-09-30
journal: Journal of Acute Medicine
DOI: 10.1016/j.jacme.2013.06.004
sha: ad73fe4413e9e8ed60c04d7dec5e5bec4e5d9431
doc_id: 1045423
cord_uid: zcg45848

Abstract

Background: Although the mini-clinical evaluation exercise (mini-CEX) has been adapted to a broad range of clinical situations, few studies of the mini-CEX for postgraduate residency training in emergency medicine (EM) have been documented.

Aim: The purpose of this study is to analyze the results of implementing the mini-CEX in the 1-month postgraduate residency training in EM.

Materials and methods: This study is a retrospective review of mini-CEXs completed by emergency department (ED) faculty members from August 2009 to December 2010. All PGY-1 residents enrolled in this study rotated through the 1-month EM training. Each PGY-1 resident received 1 week of trauma training and 3 weeks of non-trauma training. The clinical competencies of each PGY-1 resident were evaluated with weekly mini-CEXs, one rated by a trauma surgeon and three by emergency physicians (EPs). We analyzed the validity of the weekly mini-CEX and the impact of the seniority and specialty training of ED faculty members on observation time, feedback time, and rating scores.

Results: Fifty-seven ED faculty members (42 EPs and 15 trauma surgeons) evaluated 183 PGY-1 residents during the 17 months of EM training. ED faculty members with different specialty training provided similar assessment processes. Most competencies were rated significantly higher by trauma surgeons than by EPs. On the computerized mini-CEX rating, no data were missing, and junior EPs rated all competencies significantly higher. The evaluators and PGY-1 residents were generally satisfied with the computerized format. Compared with the first assessment, only some competencies of PGY-1 residents were rated significantly higher in subsequent evaluations.

Conclusion: The seniority and specialty training of ED faculty members affected the mini-CEX ratings. The computer-based mini-CEX facilitated complete data gathering but showed differences for ED faculty with different levels of seniority. Further studies of the reliability and validity of the mini-CEX for PGY-1 EM training are needed.

Introduction

The mini-clinical evaluation exercise (mini-CEX) has been used to assess the clinical skills of residents in many specialty training programs. This tool provides both assessment and education for residents in training,1 and its validity has been established.2 The mini-CEX is also a feasible and reliable evaluation tool for internal medicine residency training.3 The number and breadth of feedback comments make the mini-CEX a useful assessment tool.4 To some extent, such a tool may predict the future performance of medical students.5 The mini-CEX has been well received by both learners and supervisors6,7; however, Torre et al8 showed that evaluators' satisfaction, observation time, and feedback time differed by the form of the mini-CEX, and the personal digital assistant (PDA)-based mini-CEX was rated more highly by students and evaluators than the paper format.9 Alves de Lima et al10 and Kogan et al11 found that feasibility was the principal defect of the mini-CEX.
Although the mini-CEX has been adapted to a broad range of clinical situations,12–14 few studies of the mini-CEX for postgraduate residency training in emergency medicine (EM) have been documented. In 2003, the severe acute respiratory syndrome epidemic uncovered deficiencies in Taiwan's system of medical education. To improve the competency of junior resident physicians, the Department of Health authorized the Taiwan Joint Commission to develop the Postgraduate Primary Care Medical Training Program. Each postgraduate resident then received 3 months of general medicine training during residency from 2003 to 2006. The postgraduate Year 1 (PGY-1) general medicine training course was extended to 6 months from 2006 to 2011, and then to 1 year in July 2011. Residents must complete this training within 1 year after being accepted for specialty training. One month of emergency medicine training has been part of this program since July 2009. Chang Gung Memorial Hospital is a 3800-bed tertiary hospital in Taoyuan, Taiwan. The emergency department (ED) of the hospital provides the PGY-1 residency-training program and uses the mini-CEX as an assessment tool for clinical competence. Because little is known about how mini-CEX assessment outcomes vary among different ED faculty members, this study analyzed the results of implementing the mini-CEX in the 1-month postgraduate residency training in EM. The purpose of this study is to determine the feasibility, validity, and impact factors of mini-CEX ratings in the ED setting.

Materials and methods

This study was a retrospective review of mini-CEXs completed by ED faculty members from August 2009 to December 2010. All PGY-1 residents enrolled in this study rotated through the 1-month EM training and had received the same 6-month Postgraduate Primary Care Medical Training Program. Each PGY-1 resident received 1 week of trauma training and 3 weeks of non-trauma training during the 1-month EM rotation. The clinical competency of each PGY-1 resident was evaluated with a weekly mini-CEX, one by a trauma surgeon and three by emergency physicians (EPs), during the 1-month rotation. Fifty-seven faculty members (42 EPs and 15 trauma surgeons) evaluated 183 PGY-1 residents during the 17 months of EM training. Most ED faculty members attended a brief video-based rater-training workshop during the preliminary stage of implementing this assessment tool (mini-CEX) in the PGY-1 EM training. The workshop provided raters with reference readings and assessment instructions for the mini-CEX; raters then scored videotaped encounters and subsequently discussed and reconciled scoring differences as training against common rater errors. This training prepared most evaluators in this study to perform this workplace-based assessment. In the mini-CEX, PGY-1 residents are rated on seven competencies (medical interviewing, physical examination, professionalism, procedural skills, clinical judgment, counseling skills, and organization) using a nine-point rating scale (1 = unsatisfactory and 9 = superior). Each PGY-1 resident was evaluated with a mini-CEX weekly. In the first 12 months, paper-based mini-CEXs were rated by EPs and trauma surgeons. In the following 5 months, the mini-CEXs rated by EPs were changed to a computer-based format. A satisfaction survey (9-point scale) of evaluators and students was also available in the new format.
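The paper does not describe how the computer-based mini-CEX form was implemented. Purely as an illustrative sketch of why a computerized format can guarantee complete data capture (a result reported below), the following Python fragment models a hypothetical weekly mini-CEX record that rejects submission until every competency carries a valid rating on the 1-9 scale. The class name, field layout, and validation logic are all assumptions, not the study's actual system.

```python
from dataclasses import dataclass

# The seven mini-CEX competencies named in the study.
COMPETENCIES = (
    "medical_interviewing", "physical_examination", "professionalism",
    "procedural_skills", "clinical_judgment", "counseling_skills",
    "organization",
)

@dataclass
class MiniCEXRecord:
    """Hypothetical record for one weekly mini-CEX evaluation."""
    resident_id: str
    evaluator_id: str
    ratings: dict          # competency name -> rating on the 1-9 scale
    observation_minutes: float
    feedback_minutes: float

    def __post_init__(self):
        # A computerized form can refuse to save an incomplete record,
        # which is how missing data is avoided by construction.
        missing = [c for c in COMPETENCIES if c not in self.ratings]
        if missing:
            raise ValueError(f"unrated competencies: {missing}")
        for name, score in self.ratings.items():
            if not 1 <= score <= 9:
                raise ValueError(f"{name}: rating {score} is outside 1-9")
```

Under this sketch, a paper form can be handed in with blank items, but constructing an incomplete MiniCEXRecord raises an error immediately, mirroring the study's observation that incomplete items disappeared once the computer-based format was introduced.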
We analyzed the validity of the weekly mini-CEX and the impact of the seniority and specialty training of ED faculty members on observation time, feedback time, and rating scores. We also explored the benefits of implementing the computer-based mini-CEX format. A value of p < 0.05 was considered statistically significant. This study was approved by the Hospital Ethics Committee on Human Research. The study protocol was reviewed and qualified as exempt from the requirement to obtain informed consent.

Results

From August 2009 to December 2010, 475 paper-based mini-CEX ratings (122 PGY-1 residents, 47 examiners) were collected in the first 12 months, and 248 computer-based and paper-based mini-CEX ratings (61 PGY-1 residents, 42 examiners) were collected in the subsequent 5 months. In the first 12 months, 385 of the paper-based mini-CEXs were rated by EPs and 90 by trauma surgeons. The demographic characteristics of all PGY-1 residents are listed in Table 1. The most frequently evaluated competencies were medical interviewing (99.1%), clinical judgment (98.9%), and physical examination (98.3%); the least frequently evaluated was the clinical skill of intubation (55.2%). In the subsequent 5 months, 208 mini-CEXs were rated by EPs using the computerized format, owing to the preliminary implementation of an electronic teaching system; the other 40 mini-CEXs, rated by trauma surgeons, remained paper based. In the analysis of paper-based mini-CEXs in the first 12 months, ED faculty members with different specialty training (EPs or trauma surgeons) provided similar observation times and feedback times when completing mini-CEXs. Except for professionalism, all competencies (medical interviewing, physical examination, procedural skills, counseling skills, clinical judgment, and organization) were rated significantly higher by trauma surgeons than by EPs (Table 2). In the analysis of paper-based mini-CEXs completed by EPs, those with more than 10 years of clinical attending experience provided observation times and feedback times similar to those of junior faculty and rated PGY-1 residents similarly in all competencies (Table 3). In the analysis of the 475 paper-based mini-CEXs from the first 12 months, data gathering was incomplete for all seven dimensions of clinical competency and for other items as well (Fig. 1). There were no missing data after the implementation of the computer-based format over the next 5 months. The evaluators and PGY-1 residents were generally satisfied with the computerized mini-CEX format; their ratings ranged from 5 to 9 (7.3 ± 0.8 and 7.9 ± 0.8, respectively). Junior faculty members were more satisfied with the format than were senior faculty members (7.5 ± 0.8 and 6.9 ± 0.9, respectively; p < 0.001). On the computerized mini-CEX rating, junior EPs rated all competencies of PGY-1 residents significantly higher and senior faculty rated most of them lower (Table 4). The scoring difference in each domain was larger for junior faculty than for senior faculty. In our analysis, junior faculty provided longer observation and feedback times than did senior faculty; however, that trend was not statistically significant. The feasibility of using the mini-CEX in ED training for PGY-1 residents was high, based upon the high rate of completion in the first 12 months (98.1%) and in the following 5 months (100%).
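The study reports mean ± SD scores and p-values but does not name the statistical tests used. As a hedged illustration only, the sketch below shows one plausible way to compare two groups of 9-point ratings (for example, junior versus senior faculty) against the study's p < 0.05 threshold. The function name is invented, and the choice of the Mann-Whitney U test, a common option for ordinal rating scales, is an assumption rather than the authors' documented method.

```python
from statistics import mean, stdev
from scipy.stats import mannwhitneyu  # nonparametric two-sample test

def compare_rating_groups(junior_scores, senior_scores, alpha=0.05):
    """Summarize two groups of 1-9 mini-CEX ratings and test whether
    they differ. Mann-Whitney U is a common choice for ordinal scales;
    the study itself does not state which test it used."""
    stat, p = mannwhitneyu(junior_scores, senior_scores,
                           alternative="two-sided")
    return {
        "junior_mean_sd": (mean(junior_scores), stdev(junior_scores)),
        "senior_mean_sd": (mean(senior_scores), stdev(senior_scores)),
        "p_value": p,
        "significant": p < alpha,  # the study's significance threshold
    }
```

For instance, calling compare_rating_groups([7, 8, 8, 7], [6, 7, 6, 7]) on toy inputs returns each group's mean and SD together with the test's p-value and whether it clears the p < 0.05 threshold.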
Analysis of the validity of using a weekly mini-CEX in ED training showed that, compared with the first assessment, some competencies (physical examination and organization/efficiency) of PGY-1 residents were rated significantly higher in the third week of the 4-week rotation. The other clinical competencies did not change significantly (Table 5).

Discussion

The mini-CEX is a method of clinical skills assessment developed by the American Board of Internal Medicine for graduate medical education. It is widely used to evaluate the clinical competence of medical students in different clerkships1,4,8,9,11 and of residents receiving various specialty training, such as internal medicine,11,15 cardiology,10 anesthesia,16 neurology,17 and dermatology.18 It promotes educational interaction and improves the quality of training.19 We retrospectively analyzed the outcomes of implementing the mini-CEX for PGY-1 residency training in EM, which has not been documented in the literature. The computer-based format of the mini-CEX in postgraduate emergency residency training effectively replicated the paper-based format for entering and gathering data associated with teaching and learning activities in the ED. The computer-based mini-CEX tool was highly rated by both residents and evaluators. Donato et al20 noted that modifying the mini-CEX format may produce more recorded observations, increase inter-rater agreement, and improve overall rater accuracy. Norman et al21 noted the need for a careful assessment of the advantages and disadvantages for trainees before utilizing the PDA-based format of the mini-CEX. These findings were important considerations for improving the quality and effectiveness of a workplace-based assessment tool for EM residency training in a crowded ED environment. The mean rating scores for PGY-1 residents in each domain of clinical competence did not differ on paper-based assessments by ED faculty with different levels of seniority. After implementation of the computer-based format, the mean scores in each domain were rated significantly higher by junior faculty and lower by senior faculty. Torre et al9 documented that residents rated students significantly higher than faculty did in all mini-CEX clinical domains in the PDA-based format. Kogan et al22 noted that faculty members' own clinical skills may be associated with their ratings of trainees. Torre et al8 found that evaluators' satisfaction, observation time, and feedback time differed according to the form of the mini-CEX. In previous studies, the electronic format of the mini-CEX seemed to elicit different responses from faculty with different levels of seniority.8,9 In our analysis, junior faculty members were more satisfied than senior faculty members with the new computerized format of the mini-CEX. Junior faculty members might be more familiar with operating the computer interface during the assessment process than senior faculty members. Alternatively, senior faculty members may have been better able to keep their evaluations of ED trainees consistent despite the change in mini-CEX format. In our review of the literature on mini-CEX studies, rater training was rarely described or did not occur at all because of limited resources and time. The necessity of rater training in the use of the mini-CEX has been questioned in the literature.23
Norcini et al12 studied examiner differences in the mini-CEX and found no large differences among the ratings. Torre et al8,9 showed that format and seniority were associated with rating outcomes on the mini-CEX. As this assessment tool is widely utilized in various clinical settings and is used to certify residents' clinical competence, a better and more standardized evaluation is crucial. Two randomized studies highlighted the value of rater training and its effect on scores23,24; however, training was likely to be ineffective if it was too brief.23–25 Most studies evaluating the mini-CEX focus on its educational impact as an assessment tool and on the perceptions of evaluators and trainees.1,6,16,19 We found no published studies examining the effect of this assessment tool on doctors' performance in the ED. In our analysis, the seniority and specialty training of ED faculty members were associated with differences in mini-CEX ratings of ED trainees. Moreover, the format of the mini-CEX also had an impact on ratings, and this impact differed between junior and senior ED faculty members. Rater training in the assessment process, together with awareness of the impact of the computer-based format on ratings, is suggested as a way to promote educational interaction and improve the quality of assessment for ED trainees. Assessing trainees who deal with a wide range of patient encounters in the crowded environment of an ED is a challenge. The feasibility of a weekly mini-CEX was confirmed by the high completion rate, which met the requirements of our PGY-1 training program regardless of the format of the mini-CEX; however, the validity of a weekly mini-CEX could not be documented by these study results, so the necessity of a weekly mini-CEX is questionable. It was difficult to show improvement by PGY-1 residents in all domains within a 1-month rotation; improvement in trainees' competence might become evident only if the postgraduate EM training course were extended.

Limitations

1. Although most ED faculty members received a brief course of video-based rater training, inter-rater and intra-rater variations existed at the preliminary stage of implementing this assessment tool (mini-CEX) in the PGY-1 EM training.
2. Our study was retrospective and the data were collected from computer and paper databases. Although we made every effort to remain objective, errors may have been introduced.
3. The computerized format was not utilized in the 1-week trauma-training curriculum, so the impact of these specialty raters on the mini-CEX ratings was limited to the paper-based format.
4. The domains of the computer-based mini-CEX were modified from the paper-based format: the clinical skills domain was replicated with Direct Observation of Procedural Skills, and an overall domain was added to the computerized format. However, most of the core competencies were the same in both formats for analyzing the impact of the format change on rating outcomes.
5. The analysis of the impact of seniority on ratings was limited to the EM faculty because mini-CEX ratings by senior faculty (>10 years) among trauma surgeons were too few.
6. There was no satisfaction survey accompanying the paper-based mini-CEX, so it was not possible to analyze differences in satisfaction between the paper-based and computer-based formats.
7. The observation time of around 15 minutes was hardly sufficient to assess all seven parameters of the mini-CEX for ED cases of moderate to high complexity. However, most cases treated by PGY-1 residents under ED faculty supervision were simple and consistent with the goal of the 1-month EM training curriculum.

Conclusion

The seniority and specialty training of the ED faculty affected the mini-CEX ratings. The computer-based mini-CEX facilitated complete data gathering but showed differences for ED faculty with different levels of seniority. Both evaluators and PGY-1 residents were satisfied with the format of the mini-CEX. Further studies of the reliability and validity of the mini-CEX for PGY-1 EM training are needed.

References

1. Internal medicine residents' perceptions of the Mini-Clinical Evaluation Exercise.
2. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review.
3. The reliability and validity of the American Board of Internal Medicine Monthly Evaluation Form.
4. Mini-clinical evaluation exercise as a student assessment tool in a surgery clerkship: lessons learned from a 5-year experience.
5. Predictive validity of the mini-Clinical Evaluation Exercise (mCEX): do medical students' mCEX ratings correlate with future clinical exam performance?
6. The mini clinical evaluation exercise (mini-CEX) for assessing clinical performance of international medical graduates.
7. Reliability and acceptance of the mini-clinical evaluation exercise as a performance assessment of practicing physicians.
8. Comparing PDA- and paper-based evaluation of the clinical skills of third-year students.
9. Feasibility, reliability and user satisfaction with a PDA-based mini-CEX to evaluate the clinical skills of third-year medical students.
10. Validity, reliability, feasibility and satisfaction of the Mini-Clinical Evaluation Exercise (Mini-CEX) for cardiology residency training.
11. Implementation of the mini-CEX to evaluate medical students' clinical skills.
12. Examiner differences in the mini-CEX.
13. The mini-CEX (clinical evaluation exercise): a preliminary investigation.
14. The mini-CEX: a method for assessing clinical skills.
15. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training.
16. Mini-clinical evaluation exercise in anaesthesia training.
17. Clinical skills evaluation of trainees in a neurology department.
18. Assessing the assessments: U.K. dermatology trainees' views of the workplace assessment tools.
19. Investigation of trainee and specialist reactions to the mini-Clinical Evaluation Exercise in anaesthesia: implications for implementation.
20. Evaluation of a novel assessment form for observing medical residents: a randomised, controlled trial.
21. Compliance of medical students with voluntary use of personal data assistants for clerkship assessments.
22. What drives faculty ratings of residents' clinical skills? The impact of faculty's own clinical skills.
23. Effect of rater training on reliability and accuracy of mini-CEX scores: a randomized, controlled trial.
24. Effects of training in direct observation of medical residents' clinical competence: a randomized trial.
25. How well do internal medicine faculty members evaluate the clinical skills of residents?

Conflicts of interest

There were no conflicts of interest related to this study.