key: cord-0706796-9904m76c
authors: Pratt, David; Wu, Aaron; Huppertz, John W
title: Does Coaching Hospitalists in a Community Hospital Improve the Measured Experience of Care?
date: 2021-02-26
journal: J Patient Exp
DOI: 10.1177/2374373521996964
sha: 3b20eb1e9550204c06038837a4d49c823264d7c5
doc_id: 706796
cord_uid: 9904m76c

Hospitals initiate physician communication training programs expecting to improve patient experience measures. However, most efforts have relied on methods with limited attention to bedside physician–patient interactions. We conducted an intensive in-person hospitalist coaching program to improve patient experience in a community hospital. Full-time hospitalists were coached twice monthly by physician-coaches using a structured process featuring direct observation of care and immediate recommendations. Coach-observed care measures improved marginally. Difference-in-differences analysis of 1137 Hospital Consumer Assessment of Healthcare Providers and Systems surveys revealed no significant improvements by trained hospitalists in preintervention versus intervention comparisons, calling into question the strategy of using coaching programs to improve hospitals' doctor communication measures.

As patient experience has become an integral component of the Hospital Value-Based Purchasing program, hospitals have attempted to improve doctor communication domain scores on the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey by implementing training programs designed to enhance interactions between physicians and patients. Training methods usually involve some combination of lecture, simulations, coaching, rounding, reminders, and recognition (1-5). The results of these efforts have been mixed: some have failed, while others have demonstrated higher doctor communication scores after the training, and improvements have been modest or limited to 1 or 2 survey questions (6-9). Furthermore, most studies that assess training effectiveness have taken place in academic medical centers with research capabilities that enable formal evaluations to be conducted. There is scant evidence of formal evaluations of HCAHPS-inspired doctor communication training in community hospital settings. Additionally, prior research has focused on training aimed at internists, residents, and specialists whose HCAHPS surveys could be tracked by unit within the hospital (10). To our knowledge, no published studies have addressed doctor communication training specifically targeted to hospitalists in community hospitals, who have emerged as important providers of inpatient care. Although anecdotal evidence suggests that hospitalists' HCAHPS scores fall below those of other attending physicians, no formal studies have examined differences between HCAHPS ratings of hospitalists and other admitting specialists (11, 12).

In this article, we report a quantitative assessment of an intensive hospitalist training program designed to improve a community hospital's scores on the HCAHPS doctor communication survey section (questions 5-7). Our initiative relied on multiple sessions of in-person coaching at the bedside with immediate feedback. The HCAHPS surveys attributed to these hospitalists were analyzed before and after the training and were compared to surveys attributed to physicians in multiple specialties who did not undergo the training.
The study was conducted in a 438-bed community hospital that experienced lower than average HCAHPS scores over several quarters in 2017 to 2018; ratings on the doctor communication domain emerged as a particular concern. In 2019, hospital management and physician leadership initiated a program in which hospitalists received coaching and feedback in joint rounding sessions from 2 consultants, both highly experienced internal medicine physicians, using a prescribed instructional method. After an in-hospital pilot test, the program began in November 2019 and ended in February 2020. The program concluded earlier than planned because the Covid-19 pandemic disrupted normal hospital operations.

Twenty employed hospitalists were offered 2 coaching sessions each month. Although participation was not mandatory, the program was strongly recommended by physician leadership, and 15 hospitalists participated in the training during the intervention period. Prior to the intervention, 24 measured elements of the coaching were determined based on a literature review and past experience (13). In the training program, both coaches accompanied each hospitalist to the bedside and observed the actual patient encounter, noting everything from hand hygiene to the physical examination, with special attention to clear communication and to sitting versus standing at the bedside. Immediate feedback was provided after each coaching session, and the results were recorded in a database. Interobserver bias was controlled by comparing the coaches' aggregate measurements and reconciling their scoring. Encounter-focused scores among hospitalists participating in the program improved marginally from the preintervention to the post-intervention period (summary provided in the online Appendix, Tables A-1 and A-2).

We conducted a retrospective analysis of 1307 HCAHPS surveys that assessed inpatient experience from June 2019 through October 2019 (preintervention period) and November 2019 through February 2020 (intervention period). One hundred seventy surveys were excluded because of missing data, resulting in a final sample of 1137 patients; no significant demographic differences were observed for patients with missing data. Surveys were conducted by an approved external vendor within the hospital's usual HCAHPS data collection periods, following established HCAHPS protocols.

The primary outcome measures were "top box" scores on the HCAHPS questions from the doctor communication domain as well as 2 global questions, overall hospital rating and willingness to recommend the hospital. Analyses of top box scores are recommended over arithmetic means from continuous measures because a high proportion of respondents give favorable ratings, skewing the distributions (4, 14). The HCAHPS scores were attributed to the provider of record at the time of admission, and we used this attribution to link physicians to patients' HCAHPS surveys for both the preintervention and intervention periods. As noted in prior research, the physician identified as the provider of record for HCAHPS may not be entirely responsible for the care of the patient during their stay, and this represents a limitation of the study data (10). Because hospitalists are not coded as a specialty in the HCAHPS survey and reporting process, we identified hospitalists using the hospital's internal provider job classification and attributed HCAHPS surveys to each physician using their National Provider Identifier (NPI) number.
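To make the attribution and scoring steps concrete, the following is a minimal sketch, in pandas, of how survey records might be linked to a provider cohort through the NPI and how top-box proportions might then be computed. It is an illustration under stated assumptions, not the vendor's or hospital's actual pipeline: the column names, NPI values, and roster are hypothetical.

```python
import pandas as pd

# Hypothetical survey extract: one row per returned HCAHPS survey, carrying
# the admitting provider's NPI and the raw 4-point responses
# ("Never" ... "Always") on the three doctor-communication items (questions 5-7).
surveys = pd.DataFrame({
    "npi": ["1111111111", "2222222222", "1111111111", "3333333333"],
    "q5_courtesy_respect": ["Always", "Usually", "Always", "Sometimes"],
    "q6_listen": ["Always", "Always", "Usually", "Always"],
    "q7_explain": ["Usually", "Always", "Always", "Never"],
})

# Hypothetical internal roster flagging which providers are hospitalists,
# standing in for the hospital's provider job classification.
roster = pd.DataFrame({
    "npi": ["1111111111", "2222222222", "3333333333"],
    "is_hospitalist": [True, True, False],
})

# Attribute each survey to a provider cohort via the NPI.
linked = surveys.merge(roster, on="npi", how="left")

# "Top box" = proportion answering "Always"; preferred over means because
# responses cluster at the favorable end of the scale.
items = ["q5_courtesy_respect", "q6_listen", "q7_explain"]
top_box = linked[items].eq("Always").groupby(linked["is_hospitalist"]).mean()
print(top_box)
```

Converting each item to a boolean "Always" indicator and averaging within cohort yields the top-box proportion per question, which is the statistic compared in the analyses that follow.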
The HCAHPS surveys attributed to nonhospitalist physicians formed a control cohort ("specialists") who did not receive the training protocol. All analyses were performed on raw HCAHPS data supplied by the survey vendor, unadjusted for patient mix, survey mode, or hospital characteristics. We used χ² and t statistics to test for differences between the trained hospitalist and specialist cohorts and between the preintervention and intervention phases. A difference-in-differences analysis was performed using multivariate logistic regression on the 2 global HCAHPS measures and the 3 doctor communication domain scores for the hospitalist (intervention) and specialist (control) samples. Five hospitalists who declined the training were not included in the analysis. Covariates included patient characteristics, self-rated overall and mental health, admission from the emergency department (ED), and discharge to home. Stata version 13 and SPSS version 25 were used for statistical analysis.

The trained hospitalists showed modest improvements on key measures of patient interaction at the bedside (ie, being seated, offering reassurance, use of the communication whiteboard), based on the trainers' assessments (see Table A-1 in the online Supplemental Appendix). Patient characteristics from the HCAHPS surveys are presented in Table 1. Self-rated health and mental health were significantly lower among patients attributed to hospitalists versus specialists. In the preintervention period, the education level of patients attributed to specialists was higher than that of patients attributed to hospitalists, and in both periods, patients attributed to hospitalists were less likely to be discharged to home. Additionally, the proportion of patients admitted from the ED was significantly higher for hospitalists than for specialists. No differences were found for race or ethnicity, and these covariates were excluded from the multivariate analysis.

In the difference-in-differences regression analysis that included covariates, adjusted top box scores on both the doctor communication ratings and the global HCAHPS measures were not significantly different between the preintervention and post-intervention periods (Table 2). However, in both periods, the trained hospitalists had significantly lower HCAHPS scores than specialists after adjusting for overall health status, mental health status, discharge to home, and ED admission. The significantly higher proportion of patients admitted from the ED by hospitalists may have affected these results, since over 80% of HCAHPS surveys attributed to trained hospitalists were completed by patients admitted through the ED. In both the preintervention and intervention periods, patients admitted through the ED gave significantly lower HCAHPS scores than patients admitted directly, with top box percentages on both the doctor communication ratings and the global HCAHPS measures differing by almost 10 to 15 percentage points between the 2 groups (see Appendix Table A-2).
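For readers who want to see the shape of the primary analysis, the following is a minimal sketch of a difference-in-differences logistic regression on a single top-box outcome. The original analyses were run in Stata and SPSS; this version uses simulated data and statsmodels, and every variable name is an assumption. The coefficient on the hospitalist-by-intervention interaction carries the difference-in-differences estimate.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated survey-level records; all columns are hypothetical stand-ins
# for the covariates described above.
rng = np.random.default_rng(0)
n = 1137
df = pd.DataFrame({
    "hospitalist": rng.integers(0, 2, n),    # 1 = trained hospitalist, 0 = specialist control
    "intervention": rng.integers(0, 2, n),   # 1 = intervention period, 0 = preintervention
    "age": rng.integers(18, 90, n),
    "self_rated_health": rng.integers(1, 6, n),         # 1 = excellent ... 5 = poor
    "self_rated_mental_health": rng.integers(1, 6, n),
    "ed_admission": rng.integers(0, 2, n),
    "discharged_home": rng.integers(0, 2, n),
    "topbox_courtesy": rng.integers(0, 2, n),  # 1 = "Always" on a doctor-communication item
})

# hospitalist * intervention expands to both main effects plus their
# interaction; the interaction term is the difference-in-differences estimate
# of the training effect on the log-odds of a top-box response.
model = smf.logit(
    "topbox_courtesy ~ hospitalist * intervention + age"
    " + self_rated_health + self_rated_mental_health"
    " + ed_admission + discharged_home",
    data=df,
).fit()
print(model.summary())

# Odds ratio and 95% CI for the interaction term.
or_did = np.exp(model.params["hospitalist:intervention"])
ci = np.exp(model.conf_int().loc["hospitalist:intervention"])
print(f"DiD odds ratio: {or_did:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

An interaction odds ratio near 1, with a confidence interval spanning 1, corresponds to the kind of null result reported above: no detectable change for trained hospitalists relative to the specialist control group.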
Our analysis showed no significant improvement in patient experience for the coached hospitalists, either over time or relative to specialists, after a concentrated communication training program intended to enhance patient experience. The findings are consistent with other studies that have failed to show significant improvement in HCAHPS doctor communication among providers who participated in communication training programs.

Because the training focused intensively on hospitalists in a community hospital setting, administrators were optimistic that the program could produce gains that had eluded earlier efforts, since hospitalists are more involved in daily inpatient care than specialists, who spend less time in the hospital. However, hospitalists received lower HCAHPS scores than specialists, both before and after the training intervention. Notably, hospitalists were much more likely than specialists to be evaluated by patients admitted through the ED, who gave lower HCAHPS scores than patients admitted directly. Additionally, patients attributed to hospitalists rated their overall and mental health lower (ie, poorer) than patients attributed to specialists, while the demographic profiles of hospitalists' and specialists' patients were not significantly different. These findings suggest the need for further research on the complex relationships among the variables that affect hospitalists' service and subsequent HCAHPS ratings (15).

Our study has several limitations. Attribution of HCAHPS surveys to individual physicians is imprecise, and patients who see multiple providers during their stay may complete the HCAHPS survey based on their experience with one doctor while the survey is attributed to another. Further, although the longitudinal data allowed comparisons between trained and untrained groups, the design was not experimental, and caution must be used when interpreting the findings. Additionally, our sample size was limited by the smaller community hospital setting and the size of the attending staff. Because of Covid-19, the training program ended earlier than planned; however, substantial training had occurred and sufficient data had been collected for our analyses to proceed. Finally, during the intervention period, the hospital's HCAHPS percentile scores improved broadly, prompting the hospital's administrators to declare the program a success. The findings from this study call that conclusion into question, reminding us that myriad factors influence patient experience and that evaluation of more comprehensive hospital-wide initiatives is needed.

This study was approved by the institutional review board of the hospital.

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

The author(s) received no financial support for the research, authorship, and/or publication of this article.

John W Huppertz, PhD https://orcid.org/0000-0001-6610-1494

Supplemental material for this article is available online.
Educational roles as a continuum of mentoring's role in medicine - a systematic review and thematic analysis of educational studies
From MD to MD coaching: improving physician–patient experience scores
Improving patient satisfaction through physician education, feedback, and incentives
Impact of hospitalist communication-skills training on patient-satisfaction scores
Doctor coach: a deliberate practice approach to teaching and learning clinical skills
Improving physician communication with patients as measured by HCAHPS using a standardized communication model
Patient satisfaction with hospital care provided by hospitalists and primary care physicians
The importance of physician to physician coaching, medical director and staff engagement and doing 'one thing different'
Physician communication coaching effects on patient experience
Do HCAHPS doctor communication scores reflect the communication skills of the attending on record? A cautionary tale from a tertiary-care medical service
The $64,000 question: how can hospitalists improve their HCAHPS scores
Hospital impact: the powerful benefits of one ER's inpatient rounding program
Development, implementation, and public reporting of the HCAHPS survey
Community hospitalist time-flow

David Pratt is a board-certified internist and care consultant with over 30 years of experience in healthcare.

Aaron Wu is a second-year medical student at Albany Medical College.

John W Huppertz is an associate professor and chair, MBA Healthcare Management Program, Clarkson University.