title: A three-arm single blind randomised control trial of naïve medical students performing a shoulder joint clinical examination
authors: Brewer, P. E.; Racy, M.; Hampton, M.; Mushtaq, F.; Tomlinson, J. E.; Ali, F. M.
date: 2021-07-21
journal: BMC Med Educ
DOI: 10.1186/s12909-021-02822-5

BACKGROUND: Technological advances have previously been hailed as a new dawn in Higher Education, with the advent of 'massive open online courses' (MOOCs) and online learning. Virtual platforms have potential advantages such as accessibility and availability, but simply transferring educational material to the online environment may not ensure high-quality learning. Clinical examination is a fundamental principle of medical assessment, and this study aimed to assess the role of technology in teaching these skills.

AIMS/OBJECTIVES: To determine whether three teaching modalities were of equal efficacy in teaching examination of the shoulder joint to naïve medical students.

METHODS: Sixty-seven pre-clinical medical students naïve to large joint examination were recruited. Participants completed a learning style questionnaire and were then block randomised to one of three study arms: textbook study, face-to-face seminar, or video tutorial via an online platform. The same examination technique was taught in all groups, with the intervention being the method of delivery. All second year students were eligible for inclusion; the single exclusion criterion was previous exposure to clinical examination teaching. Students were assessed using a standardised scoring system at baseline (pre-intervention) and at days 5 and 19 post-intervention (maximum score 30). Assessors were blinded to group allocation. The primary outcome was assessment score at day 5 post-intervention.

RESULTS: There was no difference between the three groups at baseline assessment (mean scores 2.4 for textbook, 2.8 for face-to-face, and 3.1 for video; p = 0.267). Mean post-intervention scores were 16.5 for textbook, 25.5 for face-to-face, and 22.4 for video (p < 0.001, η² = 0.449). There was no change between day 5 and day 19 post-intervention assessment scores in any group (p = 0.373). Preferred learning style did not affect scores (p = 0.543).

CONCLUSION: Face-to-face teaching was the most effective method for teaching clinical examination of the shoulder. Technology can potentially increase accessibility and remove geographic barriers, but it is not as effective if teaching techniques are simply mirrored in an online format. Online platforms allow in-depth analysis of how learners interact with educational material; this may have value in improving the design of online educational materials and is a potential area for further research.

SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12909-021-02822-5.

Taking a detailed history and performing a thorough clinical examination have always been the foundation of making a diagnosis. Good clinical examination skills are thought to increase the quality of care and reduce cost [1, 2]. Investigations are performed as an adjunct to confirm or refute the differential diagnoses. Undergraduate medical education has sought to instill these skills at the earliest opportunity.
The core principles of teaching clinical examination have not evolved significantly over the last 100 years; it is typically taught via a combination of large-group didactic and small-group seminar-based teaching styles [3]. Simulated encounters have served as an adjunct to support this [4]. Several studies have shown that clinical examination skills are often lacking in both undergraduate and postgraduate trainees [1, 5-10]. There is also a lack of agreement over the ideal time to teach clinical examination skills, and more importantly the most effective method. Particular concerns have been raised over the quality of musculoskeletal clinical skills teaching [7, 11-14]. The discrepancy between teaching time and the proportion of patients seen with musculoskeletal conditions is striking; Oswald et al. report that in Canada only 2.25% of curriculum time is devoted to teaching musculoskeletal examination skills, whereas up to 20% of presentations in primary care relate to the musculoskeletal system [7]. Furthermore, Freedman et al. found that only 18% of postgraduates had exposure to musculoskeletal medicine and 86% of those surveyed had low confidence performing these examination skills. This points to a potential problem in both the quantity of time allocated to teaching in the undergraduate setting and the exposure needed to reinforce this as a postgraduate [15].

Modern technology means educational content can be hosted online and accessed via multiple devices, at times that suit the learner, and can support new forms of pedagogy such as reverse classrooms [16]. Education via technological platforms is not without difficulty, however: only around 10% of those enrolling for 'massive open online courses' (MOOCs) complete the course, although this may not be generalisable to the well-motivated and goal-orientated medical student. There are a multitude of reasons for this, many of them yet to be successfully addressed [17]. It is clear that simply recording taught content and hosting it online is not enough to fully engage learners, and that social adaptation of technology does not immediately translate to education [18].

Our primary aim was to assess the effectiveness of different teaching modalities on naïve medical students performing a clinical examination of the shoulder joint. This randomised control trial is the first to directly compare small-group seminar teaching, textbook learning, and a custom-made video tutorial delivered via an online platform.

This study was a prospective randomised trial comparing three different modalities for teaching clinical examination skills to second year medical students: face-to-face teaching, a custom-made educational video, and a textbook chapter. An identical examination technique was taught to each group, with the mode of delivery being the intervention. Detailed information on the intervention in each group is given below. The content was standardised across each modality: with permission from Cambridge University Press, the shoulder examination chapter from Examination Techniques in Orthopaedics [19] was reproduced in three formats. Once participants had been recruited, they were asked to complete a 'VARK' learning styles questionnaire and were then block randomised by a computer random number generator into the three intervention groups.
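As an illustration of this allocation step, a minimal sketch of block randomisation into three arms using base R is shown below. The block size, seed, and sample size are assumptions for illustration only; the paper does not report the exact procedure, and the actual randomisation also took account of candidates' learning styles, which is not modelled here.

set.seed(2019)  # hypothetical seed, purely so the illustration is reproducible

arms <- c("Textbook", "Video", "Seminar")
n_participants <- 67
n_blocks <- ceiling(n_participants / length(arms))

# Each block is a random permutation of the three arms, so group sizes
# remain close to equal throughout recruitment.
allocation <- unlist(lapply(seq_len(n_blocks), function(i) sample(arms)))
allocation <- allocation[seq_len(n_participants)]

table(allocation)  # approximately equal numbers per arm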
The intervention in each group took place on day zero of the study, to ensure standardisation of timelines. Participants underwent a formal assessment of their clinical examination skills before randomisation, and at days 5 and 19 post-intervention. Participants were asked to examine a patient without shoulder pathology and were assessed by an examiner blinded to the intervention type, using a standardised scoring matrix. The study was undertaken in the medical school building so that students were in a familiar environment, and was conducted in the evenings to avoid timetable clashes with scheduled teaching. The study ran for 21 days from baseline assessment to final assessment.

Through opportunity sampling, we aimed to recruit second year medical students at the University of Sheffield (cohort size n = 290). The only exclusion criterion was previous teaching on musculoskeletal examination. The study proposal was presented to the entire year group, and students were subsequently emailed by the University with contact details for the research team. Students were provided with an information booklet detailing the aims and objectives of the study and outlining its requirements, and those wishing to enroll were asked to complete a consent form. University of Sheffield ethical approval was obtained for the study (ref number 013097), and all methods were performed in accordance with the approved proposal. All students were screened for previous musculoskeletal examination teaching; none of those recruited had received any, so no students were excluded.

All candidates completed a 'VARK' questionnaire [20] ascertaining their dominant learning style, and all performed a baseline assessment. The VARK questionnaire measures four perceptual preferences of learning style: visual, auditory, reading/writing and kinesthetic. There are 16 questions exploring preferences, and the total score indicates the individual's preferred learning style. This method has been shown to be valid and reliable [21]. The baseline assessment examined each candidate performing a shoulder examination on a patient with no shoulder pathology, marked against a predetermined marking scheme. The assessment tool comprised the following domains, marked out of a total of 30: inspection (6 marks), palpation (3 marks), movement (5 marks), rotator cuff (4 marks), special tests (6 marks), joint above/below and neurovascular assessment (3 marks), and correct sequence (3 marks).

Based on the candidates' learning styles, they were block-randomised by a computer random number generator into three intervention groups: textbook, seminar session, and access to a custom-made online video. All three versions followed a consistent sequence and exhibited the same techniques.

The textbook group were given a copy of the shoulder examination chapter from the 'Examination Techniques in Orthopaedics' textbook. The chapter contains both written and pictorial descriptions of a standardised approach to examination of the shoulder, and describes the technique developed and taught by one of the senior authors. Study participants in this group all received their own copy of the chapter and were free to annotate it as required for their own learning.

For the video group, a custom-made video showing the senior author teaching the same comprehensive shoulder examination was developed.
The video was recorded in studio conditions by a cameraman with significant experience in making surgical educational video resources. It was then uploaded to an online platform (SproutVideo), which allowed each candidate an individual password-protected log-on. The platform allows advanced analytics: who accessed the video, what time of day it was accessed, what device it was watched on (i.e. smartphone versus laptop), and which parts of the video were watched. The video was an exact replica of the technique covered in the textbook chapter, following the same examination sequence.

The third group attended a face-to-face teaching session lasting 30 min, delivered by the senior author to groups of 10 to 11 candidates per session. This session was held at the University of Sheffield Medical School to ensure learners were in a familiar learning environment. The examination sequence followed the exact method used in both the textbook chapter and the custom video. The senior author demonstrated the clinical examination technique and tests on a healthy volunteer. Students did not practice or receive feedback. The teaching sequence and method were an exact replication of the educational video used in the study.

Candidates were explicitly told not to access other groups' teaching modalities (e.g. by sharing log-ons for the online video), but they could practice and use any other available resources, to mirror real-life preparation for a student assessment.

Five participants withdrew from the study. As this was a voluntary study, no further action was taken and the study continued as per protocol. No outcome data were collected for these participants.

To ensure that the three groups showed equivalent ability prior to the intervention, we performed a one-way ANOVA (Video vs. Textbook vs. Seminar) on baseline scores. To examine the impact of each intervention, we calculated the change in performance relative to baseline for the Assessment on day 5 and the Retention Test on day 19. This performance change measure served as the dependent variable for a 2 (Phase: Assessment vs. Retention) × 3 (Training Group: Seminar vs. Textbook vs. Video) mixed ANOVA. We used simple effects to decompose interactions and report Bonferroni-corrected p values for post-hoc comparisons. To explore the contribution of learning style, we recapitulated the analysis using a mixed effects model with Intervention Group and Stage (Assessment vs. Retention) as fixed effects and subject and sex as random effects. We then compared this model to one that included VARK score as a fixed effect, to see whether the inclusion of learning styles would provide greater explanatory power (a minimal sketch of this model comparison is shown below). We set an alpha threshold of .05 for statistical significance but, given the notable limitations of this approach [22-24], we report eta squared (η²) and Cohen's d for effect sizes and include measures of standard error and 95% confidence intervals so that readers are able to more accurately interpret the extent of any differences. Statistical analyses were performed using R version 3.0.2, with the mixed model analysis implemented using the "lme4" package [25].

A total of 72 students were recruited to participate in the study, with no exclusions. Five students did not attend for the baseline assessment and dropped out. The mean age of the 67 candidates who completed the study was 21 years (SD = 2.03, range 19 to 27) and 37 participants (55%) identified as female.
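To make the model comparison described in the statistical analysis concrete, a minimal sketch using R and the lme4 package is given below. The data frame and all column names (dat, score_change, group, stage, vark, subject, sex) are hypothetical placeholders, and the simulated data exist only so the sketch runs end to end; this is not the authors' code or data.

library(lme4)

# Simulate a small hypothetical data set in long format: one row per
# participant per phase (Assessment at day 5, Retention at day 19).
set.seed(1)
dat <- expand.grid(subject = factor(1:67), stage = c("Assessment", "Retention"))
dat$group <- factor(rep(sample(c("Textbook", "Video", "Seminar"), 67, replace = TRUE), 2))
dat$sex   <- factor(rep(sample(c("F", "M"), 67, replace = TRUE), 2))
dat$vark  <- factor(rep(sample(c("V", "A", "R", "K"), 67, replace = TRUE), 2))
dat$score_change <- rnorm(nrow(dat), mean = 20, sd = 5)  # change from baseline

# Fixed effects of intervention group and stage, random effects for subject
# and sex, as described in the analysis plan; the second model adds VARK.
m_base <- lmer(score_change ~ group * stage + (1 | subject) + (1 | sex), data = dat)
m_vark <- lmer(score_change ~ group * stage + vark + (1 | subject) + (1 | sex), data = dat)

# anova() refits both models by maximum likelihood and reports a chi-square
# (likelihood-ratio) statistic testing whether learning style adds anything.
anova(m_base, m_vark)

A non-significant comparison of this kind indicates that adding learning style provides no additional explanatory power over a model containing intervention group and stage alone.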
Figure 1 shows the participant flow through the trial. A total of 67 participants completed the study: 24 in the video group, 22 in the textbook group and 21 in the face-to-face teaching group. Inter-rater reliability of the performance scores, examined by having six of the candidates independently assessed by two examiners at the same time, indicated strong agreement between the assessors (κ = .839, p < .001). A comparison of pre-test examination scores showed no differences between the training groups (p = 0.267). Post-intervention, there was a significant effect of training group on performance (mean post-intervention scores 25.5 face-to-face, 22.4 video, 16.5 textbook; p < .001, η² = .449), with the face-to-face group outperforming both other groups. In turn, the video group significantly (t = 4.4, p < .001, d = 1.1) outperformed the textbook group. To provide an intuitive visualization of the difference across groups, we collapsed scores across Assessment and Retention and plotted the average performance improvement between groups in Fig. 2. Finally, there was no statistically reliable impact of participants' learning styles on the results, as a mixed model with learning styles included as a fixed effect showed no superiority over one without (χ² = 9.07, p = .247).

This study demonstrated no influence of learning styles on learning, but there was an effect of intervention type, with face-to-face teaching being the most effective way to teach clinical examination of the shoulder. Notably, there was no difference in performance between the initial post-intervention assessment and the retention test 2 weeks later. Multiple studies have shown that physical examination skills are lacking in both undergraduate and postgraduate settings [12-14]. Given the importance of these skills in delivering high quality care and reducing the cost of care [2], it is critical to understand the ideal way to teach them. Previous work by Kurihara et al. suggested no difference between interventions in their four-arm randomised trial [26]. This study suggests that face-to-face teaching remains the gold standard and is the most effective way to teach shoulder examination to naïve medical students when compared to a custom-made educational video. Educational videos have the advantage of being accessible at any time, from any location, from a mobile phone, tablet, or computer. Despite this accessibility, the video was less effective than a single face-to-face teaching session. This study particularly highlights the difference between 'learning' and 'teaching' [27]: each group was taught in a different way, yet how the material was learned or practiced, in small groups or individually prior to testing, may have varied. Despite balanced groups prior to the intervention, learning style played no role in the outcome, whereas the method of delivery did. We hypothesize that this relates to the engagement and interaction with the learner, with reinforcement throughout the contact time. Lave and Wenger [28] have previously described the importance of dialogue and social interaction in learning, and this is likely to play a role in the superior outcomes seen with face-to-face teaching.

Of the cohort included in this study, 100% had a smartphone and therefore had a wealth of information at their fingertips to learn from for their undergraduate education. The advantages offered by such resources include fluid and ever-evolving content; textbooks can be outdated by the time they are published and distributed. Additionally, online resources can be viewed during opportune moments without the need to pre-plan. An example of how unreliable online resources can be has recently been published by Singh et al.
[29]; they found that of over 100 YouTube educational videos on rheumatoid arthritis, only 55% were accurate and 30% were misleading. This inaccuracy emphasizes the importance of directing the learner to valuable and reliable resources.

A great deal of controversy surrounds the VARK questionnaire [30]. Fleming et al. [31] linked learning styles with four main categories: visual, auditory, reading/writing and kinesthetic. This method has been validated [21], but evidence shows that changing the way in which material is taught to align with learning style does not improve outcomes enough to justify the financial outlay [32, 33].

The balance of power has shifted over the last 10 years; the medical student is now a consumer of medical education and a customer of the medical school. Massive increases in the cost of attending medical school in the UK mean that demand for good quality teaching is as high as it has ever been, and that teaching must be proven to be effective. Cost analysis was not specifically examined during this study, but we estimate our video would cost £5000 for professional filming and production. If this were to be expanded to all systems examinations or 'simple procedures', the cost may impede their production. However, looking to the future, examination techniques would not be expected to evolve significantly over time, so after the initial effort and expense of making a bespoke video it could be reused for the foreseeable future.

To our knowledge this study is the only randomised control trial comparing a custom-made, internet-hosted educational video to traditional methods for teaching orthopaedic clinical examination skills. The major strength of this study is the robust methodology and adherence to protocol for each candidate once randomised. The blinded assessment and high consistency between our independent assessors add confidence to the reliability of the results. The limitations of this study include the inherent bias associated with the enthusiasm and motivation of students voluntarily enrolling in a 'clinical examination' study. The importance of an instructor who is enthusiastic and knowledgeable also plays a significant role in information retention and cannot be underestimated.

Given the widespread adoption of virtual platforms for meetings and education, accelerated by the pandemic, there is a need for further research comparing live teaching delivered face to face with live teaching delivered via an online platform such as Zoom. Looking further into the future, there is a plethora of emerging technologies, such as virtual and augmented reality, showing a great deal of promise for delivering undergraduate healthcare education [34-40]. The need for alternative solutions to face-to-face delivery has been exacerbated by the COVID-19 pandemic [41], and we suggest that the successful implementation of these technologies will rely on how well they are able to mimic the key features of face-to-face teaching. The present work shows that this approach remains the gold standard for teaching joint examinations.

Face-to-face teaching is the most effective way to teach clinical examination of the shoulder to previously naïve medical students, with significantly superior scores on blinded assessments when compared with a custom-made educational video or reading a textbook. These results indicate that clinical examination skills should be taught face to face where possible.
Supplementary Information
The online version contains supplementary material available at https://doi.org/10.1186/s12909-021-02822-5. Additional file 1.

References
How do novices learn physical examination skills? A systematic review of the literature
Should auscultation be rehabilitated?
Medical education today: all that glitters is not gold
Simulation in medical education: brief history and methodology
Cardiac auscultation skills of physicians in training
Pulmonary auscultatory skills during training in internal medicine and family practice
The current state of musculoskeletal clinical skills teaching for preclerkship medical students
Performance in physical examination on the USMLE step 2 clinical skills examination
International medical graduates' performances of techniques of physical examination, with a comparison of US citizens and non-US citizens
Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: a multicenter study
Teaching musculoskeletal clinical skills to medical trainees and physicians: a Best Evidence in Medical Education systematic review of strategies and their effectiveness: BEME Guide No
Use of structured musculoskeletal examination routines in undergraduate medical education and postgraduate clinical practice - a UK survey
The inadequacies of musculoskeletal education
Doctors likely to encounter children with musculoskeletal complaints have low confidence in their clinical skills
The adequacy of medical school education in musculoskeletal medicine
The effects of an e-textbook and the 'reverse classroom' on surgical training
Psycho-social aspects of learning in MOOC courses
A psychological perspective on media literacy
Shoulder examination
Attempted validation of the scores of the VARK: learning styles inventory with multitrait-multimethod confirmatory factor analysis models
The fallacy of the null hypothesis significance test
Moving to a world beyond
Using effect size - or why the P value is not enough
R: A language and environment for statistical computing
Academic performance and comparative effectiveness of computer- and textbook-based self-instruction
Another nail in the coffin for learning styles? Disparities among undergraduate anatomy students' study strategies, class performance, and reported VARK learning styles
Situated learning: legitimate peripheral participation
YouTube for information on rheumatoid arthritis - a wakeup call?
Student engagement and learning preferences in an online health information technology course with asynchronous instructor-created videos: Gwynedd Mercy University
Not another inventory, rather a catalyst for reflection
Exploring the relation between visualizer-verbalizer cognitive styles and performance with visual or verbal learning material
Does tailoring instructional style to a medical student's self-perceived learning style improve performance when teaching intravenous catheter placement? A randomized controlled study
Feedback and motor skill acquisition using a haptic dental simulator
Early assessment with a virtual reality haptic simulator predicts performance in clinical practice
Capturing differences in dental training using a virtual reality simulator
A systematic review of immersive virtual reality applications for higher education: design elements, lessons learned, and research agenda
Using virtual reality in medical education to teach empathy
Mixed reality as a time-efficient alternative to cadaveric dissection
Mixed reality anatomy using Microsoft HoloLens and cadaveric dissection: a comparative effectiveness study
The role of immersive technology in healthcare training & education: three principles for progress. Leeds, UK: Centre for Immersive Technologies

Acknowledgements: Many thanks to The University of Sheffield Medical School for their support, and to the students who voluntarily participated in the study.

Authors' contributions: PB, JT and FA were involved in the study idea and design and performed the intervention. MR and FA recorded and edited the video. All authors were involved in participant assessments. FM analysed the data. PB and JT were major contributors to writing the manuscript. All authors reviewed and approved the manuscript.

Funding: No funding was required for this study.

Availability of data and materials: The data sets used and analysed during this study are available from the corresponding author on reasonable request.

Ethics approval and consent to participate: This study was approved by the ethics committee of the University of Sheffield (ref 013097). All participants signed a written consent form and participation was voluntary. The methodology was executed in accordance with the agreed protocol.

Consent for publication: Not applicable.

Competing interests: The authors declare there are no competing interests.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.