key: cord-0843332-edjevr2g titles: Kaell, Alan; Sangwan, Jay; Boryushkina, Varvara; Haggerty, Greg title: The use of the NEJM knowledge + online platform to supplement traditional pulmonary didactic: a resident-led educational quality improvement project at a community hospital IM GME program date: 2021-06-21 journal: Journal of community hospital internal medicine perspectives DOI: 10.1080/20009666.2021.1935595 sha: 8995e50647d5d287e506719c3bda181af76d20d7 doc_id: 843332 cord_uid: edjevr2g Introduction: Many internal medicine residents struggle to prepare for both the ITE and the board exam. Most existing resources are simply test question banks that are not linked to supporting literature from which residents can study. Additionally, program directors are unable to track how much time residents spend on test preparation or how they perform. We sought to evaluate the benefit of using this online platform to augment our pulmonary didactics and to track time and performance on the pulmonary module and the ITE pulmonary section. Method: During the month-long live didactic sessions, residents had free access to the pulmonology NEJM K+ platform. A platform-generated post-test was administered with new questions covering the same key elements, including the level-of-confidence meta-metric. An anonymous feedback survey was collected to assess the residents' feelings about using the NEJM Knowledge+ platform as compared to other prep resources. Results: 44 of 52 residents completed the pre-test. 51/52 completed the month-long didactic sessions and the post-test. Residents' score improvement from % correct pre-test (M = 46.90, SD = 15.31) to % correct post-test (M = 76.29, SD = 18.49) correlated with levels of mastery (t = 9.60, df = 41, p < .001). The % passing improved from 1/44 (2.3%) pre-test to 35/51 (68.6%) post-test, also correlating with levels of mastery. Accurate confidence correlated with improvement from pre- to post-test score (r = −.51, p = .001).
Survey feedback was favorable.

A team of four internal medicine residents (drawn from all three PGY years) selected an educational intervention for their performance improvement project. Suboptimal scores on the September 2019 Internal Medicine In-Training Exam (IM-ITE) Pulmonary section for all internal medicine residents prompted the team to make an online, self-paced question bank tool freely available to all residents to supplement their traditional pulmonary didactic lectures scheduled for February 2020. Self-paced, online question banks are used predominantly for American Board of Internal Medicine Certification Exam (ABIM-CE) preparation. Once the Accreditation Council for Graduate Medical Education's Next Accreditation System (ACGME-NAS) was implemented in July 2014, a new continuing accreditation metric, requiring a program to achieve and maintain a recent, rolling 3-year first-attempt pass rate of >80% on the ABIM-CE by its graduates [1], likely led to a proliferation and increased use of such online tools. IM-ITE performance has been correlated with passing the ABIM-CE, and supplemental use of such tools for remediation has been reported successfully [2]. We report our experiential descriptive and exploratory analysis, with resident survey feedback, on the use of a single module to supplement our scheduled month-long pulmonary module for all residents. In March 2020, following our experience, the COVID-19 pandemic affected our and many other GME programs, necessitating remote, live or on-demand didactic sessions and online, self-paced training. We felt it important to share our experience with readers who may consider incorporating similar strategies and conducting educational performance improvement projects for internal medicine GME during these challenging times.
Internal medicine residents in an ACGME-accredited program at a community teaching hospital participated during February 2020, the traditional didactic pulmonary lecture month, in a resident-led educational performance improvement project. The team reviewed commercially available online, self-paced question bank platforms and evaluated the published literature to select the project's platform. No article directly addressed using such a platform to supplement a single module, but one platform had demonstrated success both for remediation [3] and in ABIM-CE prep satisfaction surveys [4]. The NEJM Knowledge+ pulmonary module was therefore selected. This platform captures individual medical knowledge improvement, degree of mastery of the key points, and confidence (a metacognitive metric) for each answer (Supplement 1), and it permits monitoring of individual attention and performance in real time. The residents had 1 hour to complete a 30-question pre-test. The platform generated questions that covered 10 key points deemed important by faculty from the pulmonology module. During the month between the pre- and post-test, residents had 24/7 free online access to the NEJM Knowledge+ pulmonology module and were encouraged to complete enough questions until the system showed they had mastered 100% of all 167 key learning points. A post-test followed the month-long series of six didactic sessions. This post-test, similar to the pre-test, was platform-generated with 30 different questions that covered the same key learning points. The pre- and post-tests' percent correct answers and metaknowledge metrics (where residents selected their level of confidence in each answer) were captured. Each question's answer had to be rated as follows: 'know it,' 'think so,' 'unsure,' or 'no idea.' The 'unaware' metric is calculated as the percent of questions answered incorrectly that were rated 'know it.'
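The 'unaware' metric described above can be sketched in a few lines of Python. This is a minimal illustration, not the platform's actual implementation; the tuple-based record format is a hypothetical stand-in for the platform's per-question data:

```python
def percent_unaware(responses):
    """Percent of all questions that were rated 'know it' yet answered
    incorrectly (confidently wrong answers).

    responses: list of (answered_correctly: bool, confidence_rating: str)
    tuples -- a hypothetical stand-in for the platform's per-question data.
    """
    if not responses:
        return 0.0
    unaware = sum(1 for correct, rating in responses
                  if not correct and rating == "know it")
    return 100.0 * unaware / len(responses)

# Example: 1 of 4 questions was confidently wrong -> 25.0% unaware.
answers = [(True, "know it"), (False, "know it"),
           (False, "unsure"), (True, "think so")]
print(percent_unaware(answers))  # 25.0
```

The denominator here is all questions answered, matching the description of the platform's calculation in the Results section.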
An anonymous resident feedback survey of 7 questions, designed by the team (Supplement 2), was sent using Microsoft Forms to all 52 available residents via their work email. This QI educational project was determined by the Chief Academic Officer and Chief Medical Officer not to meet the 45 CFR 46 Federal Common Rule definition of human subjects research and did not require IRB submission. Fifty-two of the 55 internal medicine residents were available during the February 2020 traditional pulmonary didactic sessions. The 52 residents included 18 PGY1s, 19 PGY2s, and 15 PGY3s. 44/52 took the pre-test and 51/52 completed the post-test. All 51 residents who took the post-test demonstrated varying use of the supplemental online NEJM K+ tool, as captured by the percent mastery of the module. Table 1 is a descriptive analysis that presents means and standard deviations of captured NEJM Knowledge+ data. 42/52 residents took both the pre-test and post-test. Only 1/44 (2.3%) passed the pre-test with 21 of 30 correct answers. We used a paired t-test to determine whether the improvement from pre-test to post-test was statistically significant; the residents (n = 42) demonstrated improvement (t = 9.97, df = 41, p < .001). Table 1 notes: a = passing rate was determined by a score of 70% correct or higher. b = score is reported as a percent out of 100. c = unaware is a metaknowledge metric calculated by assessing whether the resident answered correctly and also reported confidence in the answer; after each question, residents rated whether they 'know it,' 'think so,' are 'unsure,' or have 'no idea,' and the platform calculated percent unaware as the percent of questions rated 'know it' but answered incorrectly. Note: although 44 residents completed the pre-test, only 42 completed both the pre- and post-tests; we therefore used those 42 residents' data to calculate ∆ scores and report that number here.
Both improvement from pre- to post-test and improvement in the post-test passing rate were observed (Figure 1). The residents' improvement in their confidence (metaknowledge metric) in answers from pre- to post-test correlated with their change in pre- to post-test score (r = −.51, p = .001) and passing rate. A subset of residents (15.2%) did not like the question format and chose not to continue with mastery; whether these residents had low levels of mastery or post-test performance is impossible to say from an anonymous survey. The use of the NEJM Knowledge+ platform correlates with improvements in both residents' Medical Knowledge (MK) domain and metaknowledge metrics. MK, assessed by comparison of pre-test and post-test performance and passing rate, correlates with the individual resident's mastery of all 167 key points during the month of traditional didactic sessions. Since all residents participated in the traditional didactic sessions to the same degree, as assessed by attendance, this correlation implies that the comparative absolute and relative improvement in performance on the post-test and meta-metric measures is likely attributable to the degree of mastery achieved. Resident feedback revealed that many felt the platform's functionality and content were good and would recommend it to colleagues. Many also felt it was at least as good as any test prep program they had previously used. Whether such a tool will prove useful to them as a lifelong supplement to the domain of Practice-Based Learning (PBL), and how it affects the Patient Care (PC) domain, remains to be explored. A limitation of this project is that we could not adequately demonstrate that use of the NEJM Knowledge+ platform improved ITE scores, which many would view as the most important metric. We did not find a relationship between use of the NEJM Knowledge+ platform and mastery of key points for pulmonology on the one hand and the ITE's pulmonology subscores on the other (Supplement 1).
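The confidence-change result reported above (r = −.51) is a Pearson correlation between paired per-resident changes. A self-contained sketch of the coefficient, using made-up values in which larger drops in 'unaware' responses pair with larger score gains, yielding a negative r as in the study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired samples x and y."""
    assert len(x) == len(y) and len(x) > 1
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Numerator: sum of cross-products of deviations from the means.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    # Denominator: product of the root sums of squared deviations.
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative per-resident changes (not study data): change in the
# "unaware" metric vs. change in percent-correct score.
delta_unaware = [-10.0, -5.0, 0.0, -20.0]
delta_score = [30.0, 20.0, 10.0, 45.0]
r = pearson_r(delta_unaware, delta_score)  # negative, as in the study
```

The sign convention matters here: a negative r means that residents whose confidently-wrong ('unaware') answers decreased the most tended to show the largest score gains.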
It is difficult to assess the benefit of using NEJM Knowledge+ on ITE exams because of multiple factors: 1. Not all residents participated and used the platform fully. 2. The COVID-19 pandemic peaked in our hospital the week after this project finished in March 2020 and persisted up until the ITE exam 6 months later; this could have had a very non-random effect on residents' mental and physical well-being. 3. The use of the NEJM Knowledge+ platform was too distant from the date the residents completed the ITE exam. Importantly, this project took place the month before the COVID-19 pandemic spread into our GME training hospital. Our experiential descriptive and exploratory analysis demonstrates this platform was effective, and residents provided favorable survey ratings. Since that time, affected residency programs have switched to remote didactics, and many program directors have expressed interest in additional self-paced, remote platforms that may be used as a supplemental resource to help residents learn medical knowledge for patient care training and ongoing board preparation.

Disclosure: No potential conflict of interest was reported by the author(s). Alan Kaell http://orcid.org/0000-0001-8473-4551

References:
[1] First-time taker pass rates, initial certification.
[2] Correlations between the USMLE Step examinations, American College of Physicians In-Training Examination, and ABIM Internal Medicine Certification Examination.
[3] Design and implementation of an academic enrichment program to improve performance on the internal medicine in-training exam.
[4] An exploratory study of a novel adaptive e-learning board review product helping candidates prepare for certification examinations.

Supplement 2. Survey regarding NEJM Knowledge+: We want to get your feedback on how you felt about using the NEJM Knowledge+ program.