key: cord-1026060-dwz8n2aq authors: Sam, Amir H.; Reid, Michael D.; Amin, Anjali title: High‐Stakes Remote‐Access Open‐Book Examinations date: 2020-05-18 journal: Med Educ DOI: 10.1111/medu.14247 sha: 03c67736e0ff2486241f458773962ae49e486329 doc_id: 1026060 cord_uid: dwz8n2aq

The COVID-19 pandemic has led to unprecedented challenges in medical school assessments. Final-year high-stakes assessments have classically used closed-book examinations (CBEs). Alternative methods of assessment, such as open-book examinations (OBEs), are emerging but are not routinely used in final-year medical school exams. OBEs encourage the use of problem-solving skills more akin to those used in real life. There are currently limited data comparing OBEs with CBEs. A systematic review found insufficient evidence to support the exclusive use of either CBEs or OBEs in assessment; however, the studies conducted to date have rarely looked at high-stakes assessments, owing to concerns about the validity of OBEs (1).

Due to the restrictions put in place secondary to COVID-19, we opted to deliver the two final-year applied knowledge tests, originally scheduled as CBEs, as remote-access OBEs. Candidates were able to access the exams from anywhere in the world, using any device with internet access, via an online platform. The papers were constructed from the UK Medical Schools Council bank of examination questions, which assess candidates' ability to integrate clinical reasoning and decision-making skills. As the examination aimed to assess the synthesis of knowledge rather than factual recall, there was no theoretical advantage to sitting it in an OBE rather than a CBE format.

The psychometric analyses of the OBEs were compared with those of the written CBEs from the last three years. The OBEs were the same duration as the previous CBEs. Only answers submitted to the online platform during the approved timeframe of the OBEs were accepted. The order of the items in the OBEs was randomised for all candidates to mitigate the risk of conferring.

The median mark for the OBEs was identical to the median mark of the last three years of CBEs. The average discrimination of the OBEs, measured by the mean point-biserial correlation, was comparable to that of the CBEs. The number of distinctions and merits awarded was similar to that in previous years. Furthermore, the Cronbach's alpha for the OBEs remained above 0.80, demonstrating good reliability, similar to that of the CBEs over the last three years. To the best of our knowledge, this is the first time that a final-year high-stakes medical school examination has been delivered both remotely and in an open-book format.
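The reliability and discrimination indices referred to above can be computed from a simple candidates-by-items score matrix. The sketch below is a minimal illustration, not the analysis code used for these examinations: it assumes dichotomously scored (0/1) items held in a hypothetical NumPy array named `responses`, and the simulated Rasch-style data are purely for demonstration.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a candidates x items matrix of 0/1 item scores."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)        # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def mean_point_biserial(responses: np.ndarray) -> float:
    """Mean corrected item-total (point-biserial) correlation: each item is
    excluded from its own total so it does not inflate its discrimination."""
    totals = responses.sum(axis=1)
    r = [np.corrcoef(responses[:, i], totals - responses[:, i])[0, 1]
         for i in range(responses.shape[1])]
    return float(np.mean(r))

# Illustrative only: crude Rasch-style simulation of 300 candidates x 100 items,
# so that alpha and discrimination take plausible positive values.
rng = np.random.default_rng(0)
abilities = rng.normal(0.0, 1.0, (300, 1))
difficulties = rng.normal(0.0, 1.0, (1, 100))
prob_correct = 1.0 / (1.0 + np.exp(-(abilities - difficulties)))
responses = (rng.random((300, 100)) < prob_correct).astype(float)

print(f"alpha = {cronbach_alpha(responses):.2f}, "
      f"mean point biserial = {mean_point_biserial(responses):.2f}")
```

The corrected item-total correlation shown here is one common way of reporting point-biserial discrimination; the article does not specify whether a corrected or uncorrected index was used.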
Our results suggest that concerns about the use of OBEs in high-stakes assessment may be unfounded, and that remote OBEs present a viable alternative to traditional CBEs provided the questions assess the integration and synthesis of knowledge rather than factual recall. We propose that a combination of remote-access online OBEs and proctored CBEs might be used in the future to strike a balance between the authenticity and validity of assessment programmes. Further studies should examine the value of online proctoring in high-stakes OBEs.

Delivering the OBEs effectively required having the appropriate people, platform and processes in place. A dedicated team was available throughout the exams to address any issues that students encountered. We developed processes for addressing common problems such as internet connectivity issues. Having an appropriate online assessment platform was also crucial. Candidate feedback was positive, and candidates were accepting of the changes to their expected assessments in light of the unprecedented circumstances.

1. Comparing open-book and closed-book examinations: A systematic review.