key: cord-0929036-9soxwu0x
authors: Meyer, Eric G.; Battista, Alexis; Sommerfeldt, John M.; West, James C.; Hamaoka, Derrick; Cozza, Kelly L.
title: Experiential Learning Cycles as an Effective Means for Teaching Psychiatric Clinical Skills via Repeated Simulation in the Psychiatry Clerkship
date: 2020-11-10
journal: Acad Psychiatry
DOI: 10.1007/s40596-020-01340-8
sha: 86b28f5f33013c8d220433412f190d58facb498e
doc_id: 929036
cord_uid: 9soxwu0x

OBJECTIVE: This retrospective study compares differences in clinical performance on the psychiatry clerkship Objective Structured Clinical Examination (OSCE) between students receiving traditional repeated clinical simulation and those receiving repeated clinical simulation structured around the Kolb cycle.

METHODS: Psychiatry clerkship OSCE scores from 321 students who completed their psychiatry clerkship in 2016 and 2017 were compared. Specific performance measures included communication skills as determined by the Essential Elements of Communication, gathering a history, documenting a history and mental status exam, defending a differential diagnosis, and proposing a treatment plan. Results were calculated using repeated two-way analysis of variance, comparing the difference between students receiving no simulation and those receiving traditional repeated simulation training (TRS) with the difference between students receiving no simulation and those receiving repeated simulation utilizing the Kolb cycle (KRS).

RESULTS: Students who received KRS performed significantly better on three of the five components of the clerkship OSCE than students who received TRS. Specifically, students who received KRS performed better on gathering a history (+14.1%, p < 0.001), documenting a history (+13.4%, p < 0.001), and developing a treatment plan (+16.7%, p < 0.001). There were no significant differences in communication skills or in developing and defending a differential diagnosis.

CONCLUSIONS: Psychiatry clerkship students engaged in repeated simulations explicitly integrated with the Kolb cycle demonstrate improved clinical skills as measured by OSCE performance. Integration of the Kolb cycle in designing simulation experiences should be carefully considered and may serve as a model for individualized coaching in programs of assessment.

In 1992, the Association of American Medical Colleges (AAMC) supported the use of simulation-based education (SBE) (e.g., standardized patient simulation) in teaching and evaluating students' clinical skills [1]. The use of SBE to assess participants has become widespread: according to a 2011 AAMC survey, 86% of responding medical schools employ standardized patient simulations (SPSs) in their preclinical curricula [2]. Furthermore, prior to the COVID-19 crisis, the National Board of Medical Examiners (NBME) required SPSs as part of medical licensing [3]. Students who participate in psychiatry clerkship programs utilizing SPS have been found to perform significantly better on comprehensive clinical competency examinations at the end of their clerkship year [4]. They were also more likely to receive favorable scores for key items, including professionalism and assessing patients for thoughts of self-harm [4]. In 2008, a systematic review of simulation methodologies across various levels of psychiatric education found limited results in graduate medical education (GME), while medical students reported the use of SPS as helpful in increasing exposure to different patients and in improving interviewing skills [5].
A more recent review (2017) [6] found limited advancement in the application of simulation in psychiatry education since the earlier 2008 review [5]. The authors called for more research focused on curriculum development employing SBE methods in psychiatry, including expanding beyond communication-focused scenarios to more robust clinical psychiatry scenarios, increased application of medical knowledge to simulated contexts, and increased participation in critical reflection [6]. They also highlighted Kolb's experiential learning cycle (Kolb cycle) as the gold standard for SBE [7, 8]. The Kolb cycle employs four distinct stages: experiencing, observing, conceptualizing, and experimenting [7, 8]. Abdool et al. pointed out that only one of the 63 reviewed articles explicitly utilized all four stages of the Kolb cycle [6]. In response to this 2017 review, and as part of routine quality assurance, the Uniformed Services University of the Health Sciences (USUHS) psychiatry clerkship directors (EM and CW) preliminarily reviewed 3 years of Objective Structured Clinical Examination (OSCE) performance data (2013-2016), comparing the overall OSCE performance of students who received weekly simulation-based education with that of students who received no simulation-based education. There were no meaningful differences in overall OSCE performance or in subsection performance (i.e., communication skills, gathering a history, documenting a history and mental status exam, defending a differential diagnosis, and proposing a treatment plan), raising concerns about the efficacy of the clerkship's weekly SBE, especially given its expense, labor requirements, and time away from clinical activities. In light of these preliminary results, the clerkship directors contemplated cancelling the weekly simulation. However, given that overall average OSCE performance for these years, regardless of simulation exposure, was only 70%, it seemed more prudent to improve the weekly simulations. While aspects of our weekly SBE could be loosely mapped to the stages of the Kolb cycle (Fig. 1) [7, 8], they were not explicitly designed to achieve the goals of each stage, nor were they designed to function as a continuous learning cycle. As a result, our SBE failed to achieve the purported benefits of the Kolb cycle. We hypothesized that explicitly implementing the Kolb cycle as part of a repeated simulation experience would improve students' OSCE performance. This retrospective cohort study tests this hypothesis by evaluating the differences in clinical performance on the psychiatry OSCE between students receiving traditional repeated simulation and students receiving repeated simulations explicitly utilizing the Kolb cycle.

Preexisting TRS-Based Education Program (Fig. 1)

The USUHS psychiatry clerkship educates 10 students at four local clinical sites and 10 students at four geographically distant sites for nine 5-week rotations, totaling approximately 170 students per year. Students complete their clerkship rotations in three 15-week blocks and are evaluated with an OSCE at the end of each block to assess clinical skills relevant to the three rotations completed. For more than a decade, local sites utilized weekly SPSs to strengthen student interviewing skills. These SPSs were not available to students at the distant sites due to their geographical separation from the simulation center and the lack of local simulation resources.
Each week, the ten local students completed a 20-min interview with a standardized patient (SP) after a 30-min generic discussion of a psychiatric topic. Students were broken into two groups of five: the first group interviewed the SP while the second group completed an unrelated learning activity. The first group later watched the second group interview their SPs before completing the unrelated learning activity. Students rotated going first each week, ensuring everyone received a similar experience. The SP's chief complaint changed each week to align with the clerkship's overall curriculum, covering affective disorders, thought disorders, trauma- and stressor-related disorders, and neurocognitive disorders. Following each encounter, SPs completed the Essential Elements of Communication (EEC) checklist [9] to capture each student's communication skills in areas such as opening and closing a discussion, building a relationship, gathering information, understanding the patient's perspective, sharing information, and reaching agreement on a plan [9, 10]. SPs also completed an internally developed checklist of key clinical history elements to rate each student's ability to gather a history (available upon request from the corresponding author). Simultaneously, students completed a post-encounter note modeled after the NBME Step 2 CS template [11], which included a brief written history, a focused physical exam/mental status exam, differential diagnoses with diagnostic reasoning, and proposed diagnostic studies with a brief treatment plan. Lastly, students received 5 min of one-on-one feedback from the SP using the debriefing with good judgment model [12]. After the first group completed their simulation, they observed a peer complete the same simulation, providing additional unstructured peer feedback. When all ten students had completed their simulation, they met as a group with a faculty facilitator to review key points of the clinical history (i.e., interviewing, documenting) and to discuss differential diagnoses and treatment planning. All information (EEC, history checklist, SP feedback, Step 2 CS-style note [11], and a video of their interview) was stored in a digital learning system [13] that was not directly provided to the students or reviewed as part of the debrief. Students were, however, encouraged to review their data on their own.

The Kolb cycle is a continuous process of concrete experiences, reflective observation, abstract conceptualizations, and active experimentation [7, 8]. While most images depict the cycle as starting with concrete experience, the cycle can start at any point [8]. According to Kolb, abstract conceptualization involves forming new ideas, frequently based on previous reflection, while active experimentation provides learners with a chance to apply these new ideas to achieve new outcomes. Concrete experience provides a "publicly shared reference point for testing the implications and validity of ideas" (p. 21) [7]. This allows the learner to get explicit feedback on their active experimentation before beginning reflective observation, in which the learner reviews their performance and begins considering new strategies, starting the cycle over again [7, 8].
The Kolb cycle provided an ideal approach for improving the clerkship's SPSs because the simulation program already had opportunities for active experimentation with new experiences, explicit assessments of student performance that could be shared as a concrete experience, and time for feedback and reflection, consistent with reflective observation. Furthermore, our preexisting SPS also employed a series of simulations and observation experiences, enabling cycles of learning [14-17]. Aligning the psychiatry clerkship simulation experience more fully with the Kolb cycle required adding steps emphasizing abstract conceptualization while explicitly highlighting how the other events supported their respective stages. Specific improvements, organized by the four stages of the Kolb cycle, included:

- Abstract conceptualization: This stage of the cycle was identified as our starting point based on best practices of SBE, namely the use of a prebriefing to prepare students for the SPS and establish expectations and psychological safety [18, 19]. As such, we replaced our generic 30-min opening discussion with an explicit prebrief focusing on the type of patient (diagnosis and communication difficulties) students would encounter and the potential interviewing strategies they could employ.

- Active experimentation: Students completed a 20-min interview with an SP and then documented a Step 2 CS-style note immediately afterward. To explicitly encourage active experimentation, students were told that this experience was their "sandbox," a place where they could experiment with new interviewing and documentation strategies without consequence.

- Concrete experience: Immediately following the SP interview and writing their note, students received concrete feedback on their efforts in several ways. First, the SP they interviewed provided specific feedback on their communication strategies. Second, all students received objective feedback on their interview using an internally developed history checklist and the EEC checklist [9]. Third, students evaluated their note with an internally developed documentation rubric. These objective, concrete evaluations provided students with a reference point to determine the success of their experimentation, a critical step before asking the students to reflect on their performance (all rubrics are available upon request from the corresponding author).

- Reflective observation: Students were first provided with a structured approach to visually map a peer's interview with an SP, providing an additional opportunity to reflect on similarities and differences with their own interview strategies. They were also provided with a charting tool to help them monitor their performance over time (also available upon request). Students were then given dedicated time to synthesize all of their performance data before the faculty debrief.

- Abstract conceptualization: Returning to the start of the cycle, during a group debriefing session, faculty helped students understand their performance and collaborated on generic solutions to common problems. Students were asked to identify and document a specific goal for future improvement along with potential strategies. Students were encouraged to incorporate these strategies with actual patients in preparation for their next simulation.
These goals, and any successes or difficulties in implementing them, were then carried forward into the following week's cycle.

Aligning the simulations with the Kolb cycle did not involve changes to the simulated cases, SP training, case length, or number of cases. Total time dedicated to each week's simulation experience increased from 105 to 150 min. The additional time came from refocusing the 30-min prebrief from a general topic to introducing a specific skill to be employed in the simulation, increasing one-on-one feedback with the SP from 5 to 10 min, and adding 10 min of dedicated time for students to self-grade their encounter note. Because most of these changes simply re-prioritized existing content, there was no increase in the actual time spent at the simulation center and thus no additional costs or faculty time requirements.

This retrospective protocol (#908875) was approved by our institution's IRB as exempt. Students who completed their psychiatry clerkship in 2016 and 2017 were included in this study. Students who left the clerkship before completing the OSCE and those who were remediating the clerkship were excluded from the analysis because they would have had incomplete or duplicate exposures (i.e., TRS, KRS, or no simulation). Students were categorized into four cohorts: those receiving no simulation in 2016, those receiving TRS in 2016, those receiving no simulation in 2017, and those receiving KRS in 2017. OSCE performance data from 2016 and 2017 was collated and de-identified.

The psychiatry clerkship OSCE utilizes two SP cases depicting common psychiatric conditions. No changes to these cases occurred during the study timeframe. Clinical skills evaluated for each case include communication, gathering a history, documenting a history and mental status exam, developing and defending a differential diagnosis, and proposing a treatment plan. Communication skills were measured with the EEC checklist [9]. Gathering a history was assessed using an explicit checklist of key elements developed by the Mid-Atlantic Consortium of Medical Schools. Both of these checklists were completed by highly reliable [20] SP trainers who observed the OSCE in real time. The three remaining skills were assessed immediately after the OSCE by the same five psychiatry faculty in both years. Using a rubric developed for each case (available upon request from the corresponding author), faculty reviewed each student's USMLE Step 2 CS-style note. Scores for all components were reported as percentages, and average performance for each clinical skill across both stations was calculated for each student.

Student performance in each of the clinical skills measured by the OSCE was considered a unique dependent variable and assessed separately. Differences between cohorts were first calculated within year groups: students receiving no simulation in 2016 were compared with students receiving TRS in 2016, while students receiving no simulation in 2017 were compared with students receiving KRS in 2017. The difference in differences (DID) was then calculated between year groups to control for potential differences between the classes. Finally, time within the clerkship year (first, middle, or last third) was controlled for as a fixed factor to further reduce bias due to clinical experience gained from other clerkships. All analyses were done using two-way analysis of variance (ANOVA). Type I error was controlled at 5% for all analyses.
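As an illustration only (not the authors' analysis code), the difference-in-differences comparison described above can be framed as a two-way ANOVA in which the simulation-by-year interaction term is the DID estimate and clerkship block enters as a fixed factor. The following minimal Python/statsmodels sketch assumes a hypothetical long-format file and hypothetical column names.

```python
# Minimal sketch of the analysis described above (illustrative only; not the
# authors' code). File name, column names, and factor levels are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Long-format data: one row per student per OSCE skill.
# Columns: score (0-100), skill, simulation ("none"/"repeated"),
#          year (2016/2017), block (first/middle/last third of clerkship year).
df = pd.read_csv("osce_scores.csv")

# Each OSCE skill is treated as a separate dependent variable.
for skill, sub in df.groupby("skill"):
    # Two-way ANOVA: the simulation x year interaction is the DID estimate;
    # clerkship block is included as a fixed factor.
    model = smf.ols("score ~ C(simulation) * C(year) + C(block)", data=sub).fit()
    print(skill)
    print(anova_lm(model, typ=2))  # F tests, Type I error held at 5%
```

In this framing, the Δ values reported below correspond to the interaction term, i.e., the 2017 contrast (KRS minus no simulation) minus the 2016 contrast (TRS minus no simulation).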
OSCE performance data was collated and de-identified for 150 students who completed the psychiatry clerkship in 2016 and 171 students who completed the clerkship in 2017 (N = 321 students). Baseline demographics [21] for each group revealed no meaningful differences (Table 1). Data on (1) documentation, (2) developing a differential diagnosis, and (3) proposing a plan could not be included for 56 students from the first third of 2016 due to subtle differences in the rubrics used to score these sections (n = 265).

When comparing 2016 student OSCE performance between students who received TRS (n = 70) and those who received no simulation (n = 80), statistically significant differences were found in communication skills (+3.4%, F(1,321) = 3.917, p = 0.049) and in the ability to document a history and mental status exam (+5.3%, F(1,265) = 4.432, p = 0.036). Differences in gathering a history, developing a differential diagnosis, and management were not statistically significant (Table 2, 2016 data). Comparison of the 2017 student OSCE performance between students who received KRS (n = 85) and those who received no simulation (n = 86) revealed statistically significant improvements in all domains of performance (Table 2, 2017 data).

Evaluation of OSCE performance differences between 2016 students (no simulation and TRS) and 2017 students (no simulation and KRS) revealed a significantly larger improvement in three of the five areas of performance on the psychiatry clerkship OSCE following implementation of the Kolb cycle (Fig. 3). Specifically, these students showed significant improvement in gathering a history (Δ = +14.1%, F(1,321) = 41.544, p < 0.001), documenting a history and mental status exam (Δ = +13.4%, F(1,265) = 17.477, p < 0.001), and developing a plan (Δ = +16.7%, F(1,265) = 15.092, p < 0.001). The improvements in communication and in developing a differential diagnosis seen in students who received KRS were not significantly larger than the corresponding changes seen in students who received TRS relative to students receiving no simulation.

In this retrospective cohort study, the explicit integration of all four components of the Kolb cycle in a repeated simulation experience resulted in a significant improvement in all clinical skills assessed on the psychiatry clerkship OSCE as compared to students who received no simulation. TRS that did not fully follow the Kolb cycle had limited to no effect on clerkship students' clinical performance on the same OSCE as compared to students who experienced no simulation. Comparison of TRS to KRS demonstrated a significant and meaningful improvement in gathering a history, documenting a history and mental status exam, and proposing a treatment plan. These results suggest that careful instructional design explicitly following the Kolb cycle increases the educational benefits associated with SBE. These findings add to the body of evidence indicating that SBE should incorporate specific instructional design features across the broad continuum of activities that students engage in, including focusing students' preparation efforts [22], affording repeated and/or regular participation [23], providing guidance for students' observation efforts [24], and structuring feedback and/or reflection [25]. Furthermore, the findings provide a worked example of how integrating educational theory (such as the Kolb cycle) can improve the efficacy of SBE, thereby ensuring that the energy spent in planning and implementing simulation is worthwhile.
This example also highlights how theory helped us view simulation not as a set of single elements (i.e., simulation activity, feedback) but as a robust collection of learning activities.

The marked improvement in developing a treatment plan and the absence of improvement in communication skills (EEC) were both unexpected findings. While each of the five areas of performance was given equal weight in the prebrief and debrief, we had anticipated that, given the robust nature of the tools used to teach and assess communication skills (i.e., the EEC), these skills would improve the most, while treatment planning would remain stable, as this skill is typically considered beyond the developmental stage of a clerkship student. We were not surprised that "developing and defending a differential diagnosis" was an ongoing area of difficulty, as this skill is developmentally one of the primary struggles for clerkship students. The lack of improvement in communication skills may be due to potential psychometric problems with the EEC (variable weighting of double-barreled behavioral anchors) [9] or a hidden curriculum de-emphasizing the importance of effective communication [26]. The marked improvement in developing a treatment plan may have been due to students seeing what was expected of treatment planning on the note grading rubric. It may also have been due to the explicit guidance, given during the active experimentation phase, that students should discuss management plans with their patients, which resulted in more robust treatment plans in their documentation. These findings offer potential areas for future investigation and program improvement.

Our study was limited to psychiatry clerkship students at our institution over the course of two academic years. Analysis of available Step 2 CS performance was limited to pass/fail data, which did not provide additional insight into our student population. The limited scope and time frame may have resulted in an over- or underestimation of the true differences in performance after implementation of the experiential learning cycle. While our data suggests that incorporating the Kolb cycle was beneficial, it is not possible to discern the full impact of each stage of the Kolb cycle, since all four stages were implemented at once. It is also unclear whether the additional time on task or the particular starting phase of the cycle contributed to the observed improvements. Another limitation was that, while all 321 students had comparable data for two of the five skills measured on the OSCE, data from 56 students could not be included for the other three skills (n = 265). It is also possible that there were differences between the four local sites and the distant sites; however, since there were no meaningful programmatic changes to the clerkship other than the implementation of the Kolb cycle in local simulation training, this source of bias seems unlikely. Lastly, although there were no baseline differences between classes, and assignment to local clerkship sites with simulation versus distant clerkship sites without simulation is random, it remains possible that there were differences in baseline characteristics between local and distant groups that could have influenced our results. Future work to expand these efforts to other disciplines throughout the clerkship experience may provide additional evidence of efficacy.
Further evaluation of the generalizability of our results is also needed to determine whether implementation of KRS is beneficial across other specialties and examinations, such as the USMLE Step 2 CS examination. It is also possible that improvements in clinical skills measured by a single OSCE may not translate directly into improved Step 2 CS examination performance or changed behavior during actual patient care encounters. Further study is needed to determine whether the improvements noted in OSCE performance will also result in enhanced clinical skills with future patients. Given the success of implementing the Kolb cycle as part of our local repeated simulation efforts, and in an effort to provide comparable learning experiences across clerkship sites, we have implemented simulation experiences for students at distant sites using "telesimulation." Telesimulation consists of simulated patient encounters offered via a distance learning platform in which participants receive remote guidance from faculty, similar to live simulation. Future investigations will compare the effectiveness of the Kolb cycle in live repeated simulation with that in repeated telesimulation. A final area in which to consider the Kolb cycle's utility is the ongoing international discussion of moving to programs of assessment, which require multiple repeated formative measures. As outlined in the 2018 consensus framework for good assessment [27], students will need a faculty coach who provides specific feedback at each formative assessment. The Kolb cycle could provide a framework for such a model of competency-based instruction.

References

Should we use standardized patients instead of real patients for high-stakes exams in psychiatry? Acad Psychiatry.
Medical simulation in medical education: results of an AAMC survey.
Step II Clinical Skills Overview.
Use of standardized patients during a psychiatry clerkship.
Psychiatric education and simulation: a review of the literature.
Simulation in undergraduate psychiatry: exploring the depth of learner engagement.
Professional education and career development: a cross-sectional study of adaptive competencies in experiential learning. Lifelong Learning and Adult Development Project.
Experiential learning: experience as the source of learning and development.
Use of the Kalamazoo essential elements communication checklist (adapted) in an institutional interpersonal and communication skills curriculum.
Essential elements of communication in medical encounters: the Kalamazoo consensus statement.
Step II Clinical Skills - Sample.
There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment.
Simulation-based interprofessional education guided by Kolb's experiential learning theory.
Repetitive pediatric simulation resuscitation training.
FIRST2ACT: educating nurses to identify patient deterioration - a theory-based model for best practice simulation education.
Optimisation of simulated team training through the application of learning theories: a debate for a conceptual framework.
Standards of best practice: Simulation Standard IV: Facilitation.
Establishing a safe container for learning in simulation: the role of the presimulation briefing.
Validity evidence for medical school OSCEs: associations with USMLE Step assessments.
Standards of best practice: Simulation Standard IX: Simulation design.
Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis.
Observer roles that optimise learning in healthcare simulation education: a systematic review.
Standards of best practice: Simulation Standard V: Facilitator.
Using an online forum to encourage reflection about difficult conversations in medicine.
Consensus framework for good assessment.

Acknowledgments: Special thanks to the staff of the Val G. Hemming Simulation Center for their ongoing support of the USUHS Psychiatry Clerkship. We also wish to thank Dr. Cara Olsen for her assistance in designing our analysis.

Disclosures: None.

Ethics Approval: This research protocol (#908875) was reviewed and approved by the USUHS institutional review board (IRB) in accordance with all applicable Federal regulations governing research.

Disclaimers: The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Uniformed Services University of the Health Sciences (USUHS), the Department of Defense, or the US Government. This work was prepared by a military or civilian employee of the US Government as part of the individual's official duties and therefore is in the public domain and does not possess copyright protection (public domain information may be freely distributed and copied; however, as a courtesy, it is requested that the Uniformed Services University and the author be given an appropriate acknowledgment).