title: Survival Radiology: How a popular in-person interactive medical student radiology workshop pivoted online during the COVID-19 pandemic
authors: Velaga, Jyothirmayi; Lee, Sonia Shu Yi; Tran, Nguyen Tuan Anh; Vora, Bimal Mayur Kumar; Cheng, Lionel Tim-Ee
date: 2022-02-10
DOI: 10.1177/20101058211055306

BACKGROUND: Survival Radiology (SR) is a flagship annual full-day in-person radiology workshop targeted at final-year medical students in Singapore to prepare them for internship. Previous in-person editions consistently received positive reviews from 2014 to 2019. However, the COVID-19 pandemic necessitated a rapid online pivot for its sixth edition in 2020.

OBJECTIVES: This study aims to (a) identify key success factors of a traditional in-person medical student radiology workshop, (b) describe the rapid online pivot in 2020 and (c) identify key success factors for online educational initiatives.

METHODS: Post-workshop survey responses for SR from 2014 to 2020 were evaluated. Likert-scale data were analysed quantitatively, while free-text responses were analysed qualitatively.

RESULTS: A total of 1248 post-workshop surveys (2014–2020 workshops) and 266 free-text responses (2020 workshop) were received from 2640 participants over the years. Progressive changes that sustained or improved participant feedback for the in-person SR workshops included adoption of a case-based approach, use of 'live' audience response systems and incorporation of quizzes, with favourable overall feedback ratings of 4.42–4.89 from 2014 to 2019. The webinar version of SR in 2020 became the best-rated edition since inception, with a rating of 4.9. Qualitative analysis of feedback from SR 2020 showed that participants preferred the webinar model and its online modes of engagement and interactivity.

CONCLUSION: Our experience shows not only that it is possible to pivot such workshops online successfully, but also that blended educational formats using online engagement supplemented by in-person activities will be well received by 'Generation Z' learners even after the COVID-19 pandemic.

Survival Radiology (SR) is an annual full-day in-person radiology workshop for final-year medical students from the three medical schools in Singapore. It is an interactive workshop focusing on important and urgent radiological findings, equipping future interns with the essential knowledge and basic interpretation skills for common radiological studies relevant to safe practice. SR underwent five in-person iterations from 2014 to 2019, evolving from consultant-led didactic lectures to resident-led interactive case-based workshops. The 2020 iteration involved pivoting from a large in-person gathering of about 250 students in 2019 to a fully online webinar. This study aims to (a) identify key success factors of a traditional in-person medical student radiology workshop, (b) describe the rapid online pivot in 2020 and (c) identify key success factors for online educational initiatives.

The evolving model of the in-person editions of SR was based on the Kolb cycle of experiential learning, integrating a real-time audience response system (ARS) (Poll Everywhere, San Francisco, California) into case-based scenarios. 1-3 The SR programme had various subspecialty-based modules, each consisting of multiple case-based scenarios.
Each case scenario provided a clinical vignette accompanied by one or more radiological images, with participants' responses recorded via the ARS. The use of a real-time ARS allowed participants to attempt cases (active experimentation) and enabled a simulated clinical encounter (concrete experience) through interactive options such as a multiple-choice quiz (Figure 1), an open-ended question with a word cloud feature (Figure 2) and an image-based interactive response (Figure 3). The real-time participant responses allowed the faculty to tailor their explanations accordingly, and gave participants a chance to reflect on the whole cohort's answers (reflective observation). At the end of each case, a summary of learning points was provided, enabling participants to conclude the case encounter (abstract conceptualization).

The COVID-19 pandemic curtailed medical student training due to strict social-distancing measures. 4 In-person lectures were temporarily halted, as were many educational activities in the hospitals. In Singapore, the strictest restrictions were imposed during the 'Circuit Breaker' period from April to June 2020, 5 which coincided with the usual timing of the SR workshop. These restrictions made it unfeasible to conduct SR in the traditional in-person model. However, in response to numerous requests for radiology teaching sessions from both local and international medical students whose medical education had been disrupted during this time, the organising team put aside the initial notion of cancelling the workshop and began exploring alternative avenues for holding SR 2020. Multiple technical and human factors that could pose challenges to a successful implementation were considered.

During this time, work-from-home arrangements and virtual meetings were just starting to become commonplace. Hospitals and educational institutions also transitioned to videoconferencing and webinars for multidisciplinary team meetings and teaching sessions. 6 A timely collaboration with the College of Radiologists, Singapore (CRS) allowed the team to tap into their experience with hosting online workshops. Together with strong faculty support, SR 2020 (Pandemic Edition) was launched within 6 weeks of its conception as an online webinar series via a cloud-based web conferencing platform (Zoom Video Communications, San Jose, California, USA) (Figure 4).

Can People 'Zoom' for an Entire Day?

The phenomenon of 'Zoom fatigue' emerged as the world embraced video-conferencing platforms for work, education and personal communication. Zoom fatigue is the exhaustion arising from extended virtual online video-conferencing; factors such as excessive close-up eye gaze, cognitive load, unhealthy self-evaluation and limited mobility have been proposed as causes. 7 Since the traditional SR model was a full-day workshop, the organising team mitigated Zoom fatigue by dividing the workshop into smaller components spread over several weekend afternoons. The pilot session was a 1-hour webinar titled 'Lines and Tubes on Radiographs'. The second and third sessions were longer 3.5-hour sessions with three modules each (brain, spinal cord and chest for session 2, followed by abdomen, spine and limbs for session 3). Each module comprised 45 minutes of case-based scenarios followed by a 15-minute moderated question and answer (Q&A) segment. A mandated 5-minute break between modules was implemented, during which engaging interfaces such as music and space-filler PowerPoint slides were presented to break the monotony.
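For illustration only, the module timing described above can be laid out as a rough session timetable. This is a minimal sketch: the 13:00 start time is an assumption and not taken from the workshop programme, while the module line-up and durations follow the description of session 2 in the text.

```python
# Rough timetable sketch for one 3.5-hour SR 2020 session (three modules,
# each with 45 min of cases + 15 min of moderated Q&A, and a mandated
# 5-min break between modules). The start time is a hypothetical assumption.
from datetime import datetime, timedelta

start = datetime(2020, 6, 6, 13, 0)          # assumed weekend-afternoon start
modules = ["Brain", "Spinal cord", "Chest"]  # session 2 line-up from the text

t = start
for i, module in enumerate(modules):
    print(f"{t:%H:%M}  {module}: case-based scenarios (45 min)")
    t += timedelta(minutes=45)
    print(f"{t:%H:%M}  {module}: moderated Q&A (15 min)")
    t += timedelta(minutes=15)
    if i < len(modules) - 1:                 # break only between modules
        print(f"{t:%H:%M}  Break with music/space-filler slides (5 min)")
        t += timedelta(minutes=5)

print(f"{t:%H:%M}  Session ends (~{(t - start).seconds // 60} min of programme)")
```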
Current webinar capabilities include 'live' chat and Q&A options through which queries can be submitted in real time. With this feature, the team anticipated more queries from the audience while the session was in progress than in a classroom setting. 8 The team therefore deployed a larger group of faculty members to answer 'live' ad hoc questions, allowing students to quickly clarify simple concepts without disrupting the flow of the main interactive presentation.

A dry run with the entire team was performed 1 week before the webinar to identify and resolve any technical issues, including the audio-visual clarity of the presenters and presentations, and internet connectivity problems such as lag time. The short pilot session and additional mini dry runs prior to each session served as further checks to pick up any last-minute technical difficulties. Team debriefing after each webinar session reviewed the overall experience and participant feedback, with the aim of improving subsequent sessions.

Online safety is a major challenge when using video-conferencing platforms, and there have been reports of security breaches, unwanted disruption and trolling. 9,10 Measures employed to secure the webinar included the use of meeting passcodes, a waiting room function to admit only registered participants, requiring authentic profiles, disabling the screen-sharing option for participants and keeping meeting details private. 11

Changes from the Traditional In-Person Model

PowerPoint Slide Optimization. The existing SR modules were case-based scenarios with short clinical vignettes and radiological images on PowerPoint (Microsoft, Washington, USA) slides. The team ensured that all content was modified to suit a virtual platform, for example by using a darker background for better visual appeal and a legible font type, size and colour. 12 The radiological images were adjusted to the appropriate contrast and size, and stack images were animated to simulate scrolling through a cross-sectional radiology study. Additionally, annotation and drawing tools were used to illustrate key concepts or structures on images, enhancing understanding and engagement.

Cancellation of Quizzes. Quizzes with prizes were an integral part and highlight of prior in-person editions of SR. However, conducting the individual and team-based quizzes was technically challenging on an online platform, and a decision was made to discontinue this activity.

Continued Use of Audience Response Systems. The interactive case-based lectures with an audience response system (Poll Everywhere) were easily adaptable to the online platform, having been implemented in many previous SR workshops with good audience reception.

Event Host as the Main Presenter. The event host was responsible for guiding the audience and faculty members through the webinar to ensure a smooth and well-timed workshop programme. The host focused on engaging the participants in a more informal, and at times entertaining, manner to sustain interest and engagement. The designated host was one of the workshop directors, who had an overview of the webinar programme and the roles of all team members.

Event Moderator as an Advocate for the Audience. The moderator was responsible for monitoring the 'live' chat and Q&A discussion, consolidating recurrent themes and presenting them to the speaker during the Q&A segment for each module. The designated moderator was one of the senior faculty members, with 3 years of experience conducting previous SR quizzes.

Dedicated Administrative Team.
The scale of the online webinar required a dedicated administrative team to admit participants, ensure presentations ran smoothly and manage any acute technical issues. The administrative team was a mix of personnel from previous SR workshops and new members from the CRS who had experience hosting virtual workshops.

Non-Presenting Faculty Members. Faculty members other than the speaker were responsible for answering questions as they arose in the 'live' chat and Q&A function. These faculty members also highlighted technical issues to the administrative team.

All participants were presented with a post-workshop survey consisting of a series of questions on course performance, rated on a Likert scale ranging from 1 (very poor/strongly disagree) to 5 (excellent/strongly agree). The survey questions are presented in Table 1. Responses to questions (a), (b) and (c) were used as surrogates of audience feedback. Direct comparison with the immediately preceding edition was performed to assess audience reception to each new key change from 2014 to 2020. The mean rating for question (c) (overall quality of the course) was tracked from 2014 to 2019 alongside the incremental improvements to the workshop format over the years, to identify key changes that were well received by participants in the in-person sessions (Table 2). All comparisons were performed using the Mann-Whitney U test for non-parametric data from independent populations. The free-text responses in the survey forms were analysed qualitatively by inductive thematic analysis, with responses coded manually. 13 The codes were then categorised into themes.
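As a minimal sketch of the year-to-year comparison described above (not the authors' actual analysis script), Likert ratings for question (c) from two editions could be compared with a Mann-Whitney U test; the rating lists below are hypothetical placeholders, not the survey data.

```python
# Sketch of the year-to-year comparison of Likert ratings (question (c),
# overall course quality). The rating lists are hypothetical examples.
from scipy.stats import mannwhitneyu

ratings_2015 = [5, 4, 5, 4, 4, 5, 3, 5, 4, 5]  # hypothetical Likert responses (1-5)
ratings_2017 = [5, 5, 5, 4, 5, 5, 5, 4, 5, 5]  # hypothetical Likert responses (1-5)

# Two-sided Mann-Whitney U test for independent, non-parametric samples
stat, p_value = mannwhitneyu(ratings_2015, ratings_2017, alternative="two-sided")

print(f"Mean 2015: {sum(ratings_2015) / len(ratings_2015):.2f}")
print(f"Mean 2017: {sum(ratings_2017) / len(ratings_2017):.2f}")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}")
```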
The sample selection is illustrated in Figure 5. The number of participants (with survey response rates in parentheses) in 2014, 2015, 2017, 2018 and 2019 was 49 (83.6%), 96 (75.0%), 118 (78.0%), 215 (33.0%) and 236 (51.2%), respectively. In 2020, the number of participants increased to 1911 (44.5%). Altogether, from 2014 to 2020 there were 2640 participants in SR workshops, and a total of 1248 post-workshop surveys submitted by participants were included in the analysis. In 2020, among the 850 survey responses, there were 266 free-text responses (Figure 5). The number of participants and survey responses for each year is shown in the bar chart in Figure 6.

Quantitative Analysis (Table 2). The overall course rating was consistently above 4, with the highest in 2017 (mean = 4.89) and the lowest in 2014 (mean = 4.42) and 2015 (mean = 4.53); a significant increase in rating was observed between 2015 and 2017 (p < 0.005). The rating for whether the course helped participants perform their jobs effectively was also consistently above 4, with the highest in 2017 (mean = 4.81) and the lowest in 2014 (mean = 4.46) and 2015 (mean = 4.56); a significant increase was again observed between 2015 and 2017 (p < 0.005). The rating for whether the course was pitched at the right educational level was likewise consistently above 4, with the highest in 2017 (mean = 4.82) and the lowest in 2014 (mean = 4.33) and 2015 (mean = 4.52), with a significant increase between 2015 and 2017 (p < 0.005).

(II) Evaluation of SR 2020 Webinar Format

Quantitative Analysis (Table 3). A total of 99.5% of respondents responded favourably to 'overall quality of the course'. Favourable responses to 'if the course helped them perform their job effectively', 'if the course was at the expected level of education' and 'if the quiz platform was easy to use' were 99.8%, 99.2% and 99%, respectively. Of the participants, 86% responded that the course duration was just right, and 99.7% responded that they would recommend this course to a friend.

Qualitative Analysis. From the 266 free-text responses, 436 coded segments were obtained for qualitative analysis. Thirteen codes were identified, and two themes emerged from these codes, namely, (i) the overall learner experience and perception of the webinar, and (ii) engagement and interactivity of the webinar (Table 4).

Overall Learner Experience and Perception of the Webinar. This theme emerged from the grouping of nine codes, namely, perception of the general webinar experience (50%), webinar format, host and moderator, content and case-based approach, presentation style, annotating images, clarity of images, provision of notes and webinar logistics. The webinar was generally well liked and well received by the participants, and many felt that it was useful, enjoyable and improved their confidence in image interpretation.

· 'I believe this year's sessions has shown that online webinars can work as well as in real life classes. Thank you'.
· 'More importantly this made me more confident in my interpretation skills and encouraged me to keep trying and practicing'.

The participants enjoyed both the humorous interactions with the host and the dedicated moderated Q&A segments facilitated by the moderator.

· 'Host was friendly, humorous, and time management was excellent, thank you so much to the whole team'.
· 'Having moderators for Q&A is excellent because it doesn't disrupt the speaker from delivering the content'.

The case-based approach and presentation styles were well liked. Many participants enjoyed the personalised viewing of images on their own screens and suggested continuing this format for future editions of SR. Interestingly, there were many requests for handouts without annotations to be provided before the sessions, so that participants could annotate and make additional notes in real time.

Engagement and Interactivity of the Webinar. This theme combined codes such as general interactivity of the online format, interactivity via online tools (Poll Everywhere and the 'live' Q&A function) and faculty engagement. The participants felt that it was easier to ask questions via the 'live' Q&A and easy to follow the answers without disrupting the ongoing lecture. The team effort by the entire faculty to answer the 'live' Q&A was highly appreciated, and participants found the faculty engaging and encouraging. The participants particularly enjoyed the audience response system.

· 'Hosting the Q&A session online is beneficial as it allows all participants to see both important and miscellaneous questions accounted for by the doctors through text, which is easier for us to keep track of than verbal explanation sometimes'.

Both the overall course rating and the rating on whether the course helped participants perform their jobs effectively were the highest in 2020 (mean = 4.9) compared with previous years (Table 2) (Figure 7). The rating on whether the course was at the right educational level was the same as in 2017 (mean = 4.82) but higher than in the other previous years (Table 2) (Figure 7).

The evolution of the SR format mirrored the learning journey of the organising team as it gained experience over time.
The first edition in 2014 consisted primarily of consultant-led didactic sessions with some case-based content, following a more traditional radiology tutorial style. Interactive 'live' audience response systems and individual quizzes were incorporated in 2015 for better participant engagement. A senior resident was also invited to participate as faculty, in line with efforts to nurture future educators. In 2017, there was a major revamp of SR: the entire workshop was led by a resident team, with consultants taking up a supervisory role. The senior resident who had participated as faculty in 2015 took on the role of workshop director, the workshop content was converted to fully case-based sessions, and a new team-based quiz was added to complement the individual quiz. The top scorers of the individual quizzes from the different schools were grouped into teams to compete in a 'grand finale' team-based quiz conducted in a 'Jeopardy' game-show format. Paralleling this major revamp, the 2017 workshop also achieved the highest course rating of the in-person editions, with the biggest improvement in course rating occurring between 2015 and 2017. It is likely that radiology residents are more cognisant of the needs of new-generation learners in view of their own recent medical student training. Younger faculty may also impart a 'softer touch', bridging the usual generation gap between the audience and the faculty. Incorporating quizzes likely enhanced audience engagement while nurturing friendly competition, making the workshop more enjoyable.

By 2017, the workshop format was standardised, with only smaller changes incorporated in the subsequent editions in 2018 and 2019. The 2017 resident team also led the 2018 edition, with the inclusion of new resident understudies who subsequently led the 2019 edition. In 2018, the presentations were enhanced to cater to a larger venue and a bigger audience, with minor improvements in content delivery. The concluding team quiz was conducted between the different medical schools, in contrast to 2017 where the teams were all mixed, which further created a sense of gamification and excitement. In 2019, SR underwent a curriculum revamp with a greater focus on systems-based, exam-relevant clinical scenarios using guidelines from the Singapore Ministry of Health (MOH) National Outcomes for Medical Schools. The 2019 iteration retained the concluding team quiz conducted between the different medical schools. These progressive changes allowed SR to maintain favourable overall course ratings of more than 4 from 2014 to 2019.

SR 2020 clearly surpassed the previous ratings of the in-person editions, including the most successful 2017 edition. As in previous iterations, participants did not have to pay to attend SR 2020. Previous success factors of the in-person editions, such as a resident-led team, case-based scenarios and audience response systems, were easily ported to SR 2020. Accessibility and the ease of attending the workshop from any location, personalised viewing of the screen and images, and the ability to interact via Poll Everywhere, 'live' chat and Q&A functions appealed to tech-savvy participants. 14-16 Although the well-liked quizzes of previous in-person SR editions had to be discontinued in SR 2020, the introduction of new players such as a host and moderator, together with webinar tools such as the 'live' chat and Q&A functions, contributed to its success.
Poll Everywhere has been one of the main components of our workshops, leveraging audience interactivity since 2015, and it was easy to tap into our experience incorporating Poll Everywhere into the presentations for the webinar version. Poll Everywhere is a type of audience response system that allows active, 'live' audience participation in a fun and engaging manner through a variety of interactive features (Figures 1-3). 17-19 The multiple-choice question (MCQ) format may be better suited for difficult questions or new content, where a prompt can help participants arrive at an answer. 20 On the other hand, an open-ended word cloud can add a layer of challenge and motivate critical thinking for easier questions and concepts. 20,21 In the context of radiology, interactive response images are best suited to identifying abnormalities directly on radiological images. Usage of Poll Everywhere can be anonymous while still allowing participants to visualise answers from their peers in real time, which enables reflective observation.

Leveraging the webinar capabilities of the 'live' chat and Q&A functions also increases audience participation. Participants may find it less intimidating and be less hesitant to put forward queries and doubts, as there is room to accommodate more questions on an online platform. It would also be beneficial to provide the participants with all the questions and answers at the end of each session.

Table 5. Key learning points.

- Use 'live' polling within case-based scenarios: This not only allows students to see how their colleagues are responding, but also provides real-time information to the tutor about possible knowledge gaps that may not be apparent at the outset. The case-based format is more engaging for students, and allows learning points to be divided over a series of cases for better understanding rather than as one 'full-blown' didactic session.
- Avoid 'too much of a good thing' (judicious use of 'live' polls): There was a deliberate effort to limit the number of polls, as too many interactive components can be disruptive to the learning process and take up too much time. We used a rule of thumb of not more than two polling spots (either image-based, word cloud or MCQ) per 7-min case.
- 'Know your audience' (customise poll type according to learner level): In our experience, MCQ polls tend to be better for junior medical students, or for a topic that is hard to grasp. Word cloud and image-based answers can be more useful for senior medical students, as the MCQ format could give the answer away; the word cloud format forces students to think of an answer without the potential hints that MCQ options would otherwise provide.
- Introduce 'new players' (separate the roles of 'host' and 'moderator'): The host focuses on engaging the participants in a more informal and, at times, entertaining manner in order to sustain interest and engagement. The moderator spends time following the 'live' Q&A discussion, sieving out recurrent themes or learning points that can be highlighted to the speakers for reinforcement at the end of each segment.
- 'Leveraging the familiar' (consider 'live' chat in addition to Q&A): Most post-millennials are more comfortable interacting in a 'live' chat format rather than speaking out. Apart from familiarity, the 'live' chat also allows faculty to provide quick or simple clarifications that students may have while the lecture is ongoing (e.g. what does IOCM stand for), so that the query is rapidly addressed and the student can re-engage fully with the lecture. The traditional Q&A at the end of each segment allows the speaker to summarise key learning points and address more complicated questions that are posed (e.g. should an ultrasound not be done before the CT scan for this case?).

IOCM: iso-osmolar contrast medium.
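As a hypothetical sketch of the 'no more than two polling spots per 7-minute case' rule of thumb above, a case list could be checked programmatically before a session; the case titles and poll types below are invented for illustration, not taken from the SR modules.

```python
# Sketch of a pre-session check for the "no more than two polling spots
# per 7-min case" rule of thumb. Case titles and poll types are invented.
CASES = [
    {"title": "Breathless after central line insertion",
     "polls": ["image-based", "mcq"]},
    {"title": "Sudden-onset headache",
     "polls": ["word-cloud"]},
    {"title": "Post-fall hip pain",
     "polls": ["image-based", "word-cloud", "mcq"]},  # one poll too many
]

MAX_POLLS_PER_CASE = 2

for case in CASES:
    n = len(case["polls"])
    verdict = "OK" if n <= MAX_POLLS_PER_CASE else "trim polls"
    print(f"{case['title']}: {n} polling spot(s) -> {verdict}")
```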
Unlike the in-person model, the scope for personal interaction in a webinar model is limited, but there are other ways to improve personalisation and enhance participant engagement and interactivity. Whilst a moderator acts as an audience advocate, sieving through the live stream of questions to highlight recurrent themes or important queries to the tutors, an engaging host can add a touch of humour and keep the sessions lively. The host and moderator roles were separately defined, and both were well received by participants.

Irrespective of the advanced functions and software available, the driving force behind all interactions is the faculty itself. The time taken to build a strong and dedicated team of radiology resident educators was paramount to the success of previous as well as current editions of SR, and the importance of nurturing the next generation of educators during residency cannot be overstated. 22 For SR, some residents progressed from participant to faculty and eventually to workshop director over the years. Incremental yearly teaching experience, coupled with mentoring by more experienced senior faculty, allowed SR to continue to generate a strong team of motivated and enthusiastic resident faculty. This is reflected in the consistently positive participant feedback on faculty engagement.

Overall, participants expressed a preference for the new webinar format and recommended the continued incorporation of online components such as personalised viewing, 'live' chat and Q&A options even if future workshops are organised in an in-person format. In our experience, the image-rich content of radiology was easy to port to an online platform. We have since incorporated online teaching for our elective medical students and intend to extend SR to overseas students as a future direction. Key learning points, based on a combination of author interpretations and extrapolation of the qualitative results, are summarised in Table 5. These key learning points can be applied to both medical student and resident education initiatives.

The quantitative analysis is based on learner perception and how participants reacted to the workshop experience (Kirkpatrick evaluation level one); the level and impact of learning were not assessed. Certain factors, such as the host's humour and the likability of instructors, are difficult to measure and have limited generalisability to other educational contexts.

An online pivot can be intimidating, especially if one is used to prior successful in-person engagements. However, an online format can be successful and appeal to post-millennial learners. Not only can prior interactive elements be sustained, one can even engage learners at a level beyond what is achievable during traditional in-person sessions while reaching a far larger audience. Image-intensive specialties (e.g. radiology, dermatology and pathology) may be uniquely suited to online versions because of the inherent advantages of viewing high-quality images on personal devices. It is likely that blended learning formats (both online and in-person, synchronous and asynchronous) will be useful ways of engaging the 'Generation Z' learner in the years ahead.
References

Experiential learning: experience as the source of learning and development.
The impact of live audience participation teaching on medical education at The Surgical Scousers, an undergraduate surgical society.
Experiential learning: AMEE Guide No. 63.
Medical student education in the time of COVID-19.
COVID-19 and Singapore: from early response to circuit breaker.
From trial to implementation, bringing team-based learning online: Duke-NUS Medical School's response to the COVID-19 pandemic.
Nonverbal overload: a theoretical argument for the causes of Zoom fatigue.
Perspectives of physicians regarding the role of webinars on medical education during the COVID-19 pandemic.
A feature on Zoom secretly displayed data from people's LinkedIn profiles.
Answering your top 10 questions about Zoom for healthcare.
Death to weak PowerPoint: strategies to create effective visual presentations.
Using thematic analysis in psychology.
Online liver imaging course: pivoting to transform radiology education during the SARS-CoV-2 pandemic.
What's in your teaching toolbox.
Students' perceptions of lecturing approaches: traditional versus interactive teaching.
Examining the benefits and challenges of using audience response systems: a review of the literature.
Undergraduate radiology education during the COVID-19 pandemic: a review of teaching and learning strategies.
Poll Everywhere to encourage learner satisfaction and participation in internal medicine fellowship didactics.
Very short answer questions: a viable alternative to multiple choice questions.
Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: cross-sectional study.
Why teaching should be recognized as an essential component of radiology residency training.

Acknowledgments: N/A.
All authors of this paper have directly participated in the planning, execution and writing of this article.
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
The author(s) received no financial support for the research, authorship, and/or publication of this article.
The datasets generated for the current study were obtained from the post-course survey responses of the participants via Google Forms.
Ethical Approval: N/A.
Jyothirmayi Velaga, https://orcid.org/0000-0002-1863-4330