key: cord-0858099-gho3cx1n
authors: Duhon, Gabrielle F.; Simon, Andrea R.; Limon, Danica L.; Ahmed, Kelli L.; Marzano, Gabriela; Goin-Kochel, Robin P.
title: Use of a Best Practice Alert (BPA) to Increase Diversity Within a US-Based Autism Research Cohort
date: 2022-01-08
journal: J Autism Dev Disord
DOI: 10.1007/s10803-021-05407-9
sha: 450b3c8c727eb649ebffe6a5f8cf516ed7bcc129
doc_id: 858099
cord_uid: gho3cx1n

We evaluated the success of a best practice alert (BPA) in recruiting underrepresented families into an autism spectrum disorder research cohort by comparing BPA-response outcomes (Interested, Declined, Enrolled, Dismissed) in pediatric primary care practices (TCPs) serving diverse communities with those of subspecialty clinics. Compared to subspecialty clinics, TCPs had higher proportions of Interested responses for patients with private insurance (60.9% vs. 46.2%), Dismissed responses for patients with public insurance (30.1% vs. 20.0%), and Interested responses for non-white patients (47.7% vs. 33.3%). A targeted BPA can help researchers access more diverse groups and improve equitable representation. However, select groups more often had their alert dismissed, suggesting possible selection bias among some pediatricians regarding who should receive information about study opportunities.

Underrepresentation of racially and ethnically diverse populations in clinical research limits our ability to generalize results for treatment intervention, which contributes to healthcare inequities (Haley et al., 2017). Lack of representation is driven by challenges with investigators' recruitment of diverse samples, as well as patients' distrust of the medical and scientific communities (Zamora et al., 2016). Studies examining recruitment strategies report a need for research teams to increase their time spent in active recruitment, coordination of logistics, and community involvement, yet time constraints, competing study demands, and insufficient resources pose challenges to investing in these efforts (Amorrortu et al., 2018; Zamora et al., 2016). Researchers are in desperate need of efficient and practical tools that enable them to connect with underserved populations.

Studies have shown that having healthcare providers introduce a study to their patients during clinical visits is an effective recruitment strategy and can be helpful in engaging diverse participants (Chhatre et al., 2018; Tilley et al., 2017). However, initiating and maintaining provider cooperation in this setting is challenging (Amorrortu et al., 2018). Electronic health records (EHRs) have been shown to facilitate outreach to historically marginalized populations but traditionally involve passive methods that do not lend themselves to effective follow-up and enrollment efforts (Devoe et al., 2019; Lai & Afseth, 2019). Tools that can increase provider engagement with research referrals at the point of care for large numbers of patients, as well as best-practice recommendations for successful implementation of these strategies, are essential. We recently demonstrated the efficacy of an EHR-based best practice alert (BPA) in engaging subspecialty providers in research recruitment for large numbers of participants at the point of care (see Simon et al., under review).
However, many families of color and/or families with lower levels of socioeconomic status (SES) experience challenges that make it difficult for them to access subspecialty services (e.g., language barriers, difficulties completing referral paperwork, lack of transportation; Haley et al., 2017; Zamora et al., 2016); subsequently, they are not exposed to clinical-research opportunities offered in those settings. In these cases, it is possible that primary care providers (PCPs) may serve as the only community resource to promote the benefits of research engagement within the healthcare system.

The current study focused on how a BPA can be used to overcome recruitment barriers that contribute to racial/ethnic and socioeconomic disparities in clinical research. Specifically, we evaluated the success of a BPA in recruiting underrepresented families into a research cohort of individuals with autism spectrum disorder (ASD) by implementing it in primary care practices that serve racially, ethnically, and socioeconomically diverse communities. Our primary objectives were to compare (a) patient-demographic characteristics and responsiveness to the BPA between practice types (subspecialty providers versus PCPs) and (b) study interest/dismissal within practice types by patient-demographic characteristics.

Texas Children's Pediatrics (TCP) is a network of 52 primary care pediatric clinics that is linked with Texas Children's Hospital and serves approximately 23.0% of the greater Houston pediatric population. We initially examined characteristics of the patient community at each TCP practice and approached five that served relatively high proportions of both children with ASD and families of color. All practices were willing to participate in recruitment efforts for this pilot project. The number of providers at each practice ranged from 6 to 11. A map showing the distribution of the five TCP sites across greater Houston and in relation to the multidisciplinary Texas Children's Hospital autism center can be found in Fig. 1. Prior to the launch of the BPA, providers and practice managers at each TCP site were educated on how to introduce the study and respond to the alert, as well as on the importance of sample diversity in research studies.

As noted earlier, we previously implemented a BPA for recruitment of patients with ASD and their family members into the Simons Foundation Powering Autism Research for Knowledge (SPARK) study (Feliciano et al., 2018). For this initial effort, 22 subspecialty clinic locations under Psychology, Neurology, Psychiatry, and Developmental-Behavioral Pediatrics had been selected to receive the BPA because these sections served the largest numbers of children with ASD across Texas Children's Hospital (Simon et al., under review). BPA data from these subspecialty practices were used as a baseline for comparison with the TCP practices in the current study. As with the TCP sites, providers in each of these subspecialty practices were educated about the study and how to respond to the alert prior to launch of the BPA; however, they did not receive information on the importance of sample diversity in research studies.

A complete description of the BPA and its development is provided in Simon et al. (under review). Briefly, upon opening the patient's chart in the EPIC EHR, the provider sees a message that succinctly describes the SPARK study and alerts him/her that the current patient is potentially eligible (see Fig. 2).
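For readers less familiar with point-of-care alerts, the following is a minimal sketch of the kind of firing logic described here and in the sections that follow: the alert fires for charts carrying an ASD diagnosis, an English-fluency filter suppresses it for patients who require an interpreter, and trigger criteria prevent duplicate referrals. The `PatientChart` fields and the `should_fire` helper are hypothetical illustrations under those assumptions, not the actual EPIC build.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class PatientChart:
    """Hypothetical, simplified view of the chart fields the alert logic would need."""
    has_asd_diagnosis: bool            # the BPA fired when an ASD diagnosis was on the chart
    needs_interpreter: bool            # SPARK required English fluency, so these charts were filtered out
    recorded_response: Optional[str]   # "Interested", "Declined", or "Enrolled" once a provider responds


def should_fire(chart: PatientChart) -> bool:
    """Return True if the alert should display when this chart is opened.

    In this sketch, dismissing the alert records no response, so the alert
    fires again the next time the chart is opened; a recorded response
    suppresses re-firing so the same patient is not referred twice.
    """
    if not chart.has_asd_diagnosis:
        return False
    if chart.needs_interpreter:
        return False   # English-fluency eligibility filter
    if chart.recorded_response is not None:
        return False   # trigger criteria prevent duplicate referrals
    return True


# Example: a newly diagnosed, English-speaking patient with no prior response
print(should_fire(PatientChart(has_asd_diagnosis=True,
                               needs_interpreter=False,
                               recorded_response=None)))  # True
```

In this model, a dismissal simply leaves `recorded_response` unset, which is why the alert reappears at the next chart opening, as described below.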
The provider then has the option of reporting the family's level of interest in the study by selecting either Interested (the family would like more information/to be contacted by the study team), Declined (the family is not interested/does not want to be contacted), or Enrolled (the family indicates they are already participating in the study). Alternatively, providers can dismiss the alert if it is not an appropriate time to discuss the study; the BPA will then fire again the next time the chart is opened. Trigger criteria were put into place to prevent duplicate referrals of the same patient over time. When a provider records a response, an EHR-inbox message with this information is sent to the research team, which then contacts all Interested and Enrolled families by email and telephone to send them information about the study and offer enrollment assistance.

Data on the total number of patients triggered and their responses to the alert, as well as practice location, provider participation (i.e., responding to versus dismissing the alert), and patient characteristics, were extracted from the EPIC EHR after the first eight months the BPA was active in the TCP practices (10/9/19-6/9/20). Descriptive statistics were used to calculate the frequencies of response types by practice group (i.e., TCP versus subspecialty). Chi-square tests of homogeneity were used to compare demographic information and proportions of Declined, Dismissed, Enrolled, and Interested responses to the BPA between patients in the TCP versus subspecialty groups. Within practice types, chi-square tests of homogeneity were also used to compare proportions of Interested and Dismissed responses to the BPA by race (white vs. families of color), ethnicity (Hispanic vs. non-Hispanic), and payor type (private insurance vs. public insurance; public insurance plans included Medicaid, CHIP, STAR, and other government-subsidized health plans).

A total of 1388 patients triggered the BPA during the 8-month surveillance window, with 466 coming from the 5 TCP practices and 922 coming from the 22 subspecialty clinics. Most patients in both groups were reportedly white and non-Hispanic; however, compared with the subspecialty group, the TCP group included significantly larger proportions of Black/African American (n = 133) and Asian patients and a smaller proportion of Hispanic patients. Full demographic characteristics are presented in Table 1. For comparative purposes, information on the racial/ethnic breakdown of the Houston population is provided in Table 2.

Responses to the BPA varied significantly between practice types. Subspecialty providers dismissed the alert more frequently than TCP providers (37.2% [n = 343] vs. 27.0%). To evaluate responses to the BPA in relation to patient demographic factors, we compared rates of both Interested and Dismissed responses, first within practice types and then between practice types. Because of small sample sizes across many racial groups, race data were recoded for these analyses to dichotomize patients as white or individuals of color, with those of unknown race or ethnicity excluded. Similarly, payor-type data were organized to exclude those with self-pay or unknown payor status to afford the clearest comparisons. Within the subspecialty clinics, there were no differences in the proportions of either Interested or Dismissed responses between racial, ethnic, or payor-type groups. However, within the TCP practices, significant differences did emerge; these results are summarized in Table 4.
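As a rough illustration of the analytic approach just described, the sketch below runs chi-square tests of homogeneity with standard Python tooling on a randomly generated, placeholder extract. The field names and simulated counts are hypothetical assumptions; they are not the study data and will not reproduce the reported results.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# Placeholder patient-level extract (hypothetical field names; not the study data).
patients = pd.DataFrame({
    "practice_type": np.repeat(["TCP", "Subspecialty"], [466, 922]),
    "race": rng.choice(["White", "Black/African American", "Asian", "Unknown"], size=1388),
    "response": rng.choice(["Interested", "Declined", "Enrolled", "Dismissed"], size=1388),
})

# Recode race as described above: white vs. individuals of color, excluding unknowns.
patients = patients[patients["race"] != "Unknown"].copy()
patients["race_group"] = np.where(patients["race"] == "White", "White", "Person of color")

# Between practice types: chi-square test of homogeneity for the proportion of
# Interested responses (TCP vs. subspecialty).
table = pd.crosstab(patients["practice_type"], patients["response"] == "Interested")
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"Between practice types: chi2({dof}) = {chi2:.2f}, p = {p:.3f}")

# Within one practice type: compare Interested proportions by race group.
tcp = patients[patients["practice_type"] == "TCP"]
table_race = pd.crosstab(tcp["race_group"], tcp["response"] == "Interested")
chi2_r, p_r, dof_r, _ = chi2_contingency(table_race)
print(f"Within TCP, by race group: chi2({dof_r}) = {chi2_r:.2f}, p = {p_r:.3f}")
```

The same pattern extends to the Dismissed-response, ethnicity, and payor-type comparisons by swapping the outcome or grouping column.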
Placing a research-recruitment BPA into strategically selected primary care settings (i.e., TCPs) successfully reached families of color who were managing ASD, particularly Black/African American and Asian families. Moreover, most of these families were interested in learning more about the SPARK study. Notably, just five TCP practices identified nearly the same volume of Interested patients of color as all 22 subspecialty-practice locations combined (102 in TCP vs. 120 in subspecialty), despite the higher concentrations of patients with ASD at subspecialty practices. This suggests that many families of color are interested in clinical research opportunities and that their receptivity to information about research studies may be enhanced when it is delivered in community-based care settings. It is possible that, even when seeking participants with complex diagnoses such as ASD, investigators may increase sample diversity by engaging primary care providers in medically underserved communities more than condition-specific specialists. Relatedly, the TCP group had a significantly higher rate of Declined responses to the BPA compared to the subspecialty group. This could indicate that TCP providers more regularly engaged in a conversation about the study with families (as opposed to dismissing the alert) in order to accurately record their wishes, which is helpful to (a) reduce unwanted contact from the study team and (b) funnel research-staff efforts toward other activities.

Although the TCP group had a lower proportion of Hispanic families compared to the subspecialty group, this finding was not surprising, as the practices in this pilot were chosen based on the racial and ASD-diagnostic makeup of their patient populations, not their ethnic makeup. However, examining responses to the BPA within practices revealed that the alert was dismissed significantly more often for Hispanic patients in the TCP group compared to the subspecialty group. One eligibility requirement of SPARK is that families be fluent enough in English to understand the consent language; for this reason, the BPA included a filter so that it would not fire for patients who required an interpreter. It is possible that some providers dismissed the BPA for Hispanic families who were minimally fluent in English if they had concerns about the family being able to fully understand what participation entailed. Relatedly, it is possible that more TCP providers exercised selection bias in the types of families they perceived as "appropriate for" or "ready for" the study and that this led to partiality in the sharing of study information. Future studies of this nature may consider ways to distinguish responses reported by providers from those reported by families directly.

An important observation was that subspecialty providers dismissed the BPA significantly more often than TCP providers (37.2% vs. 27.0%). The BPA had already been active in the subspecialty practices for more than a year, so long-term engagement and responsiveness to the BPA (i.e., introducing the study and recording a response of Interested, Enrolled, or Declined) may have become a challenge. It is also possible that the subspecialists were more prone to alert fatigue because they saw many more children with ASD and therefore received the alert more frequently.
At the same time, subspecialty visits are often complex and time-intensive, with many competing priorities, and providers may have been more likely simply to dismiss the alert to remove it from the screen if they felt there was not time to introduce the research study. Finally, the BPA fired when an ASD diagnosis was added to the patient's chart, so some providers may have felt uncomfortable discussing research opportunities in the context of a new ASD diagnosis.

Dismissal patterns also revealed potential socioeconomic disparities within the TCP practices that were not observed in the subspecialty group. Patients with public insurance (e.g., Medicaid, government-subsidized health plans), which we used as a surrogate measure of SES, were more likely to have the alert dismissed than to have a response recorded. Similarly, when analyzing demographic factors by BPA-response type between practices, we saw in the TCP group (a) higher proportions of both Interested and Dismissed responses for patients of color, (b) a higher proportion of Dismissed responses for Hispanic patients, and (c) a lower proportion of Interested responses for patients with public insurance. While our aim was to see whether we could increase the proportion of Interested responses for families of color when the BPA was placed strategically, we did not anticipate a higher rate of Dismissed responses in this group as well. However, this could be explained by the fact that the TCPs served a higher proportion of families of color than the subspecialists, so higher proportions of both response types are to be expected. The additional findings of increased alert dismissals among Hispanic patients and fewer Interested responses for patients with public insurance could suggest that TCP providers exercised selection bias in whom they perceived as "a good fit" or "likely to engage" with the study and that this led to inequities in offering study information. Alert fatigue and overall higher dismissal rates at subspecialty locations may have obscured the effects of similar biases among subspecialist providers. Given that people are often unaware of their own biases, sharing these findings and educating providers about the need for participation from racially, ethnically, and socioeconomically vulnerable groups, whose involvement is critical to answering questions that can empower these communities, may help minimize potential biases in the distribution of research information at the point of care.

Although we demonstrated that a targeted BPA can increase outreach about study opportunities to marginalized communities, some limitations should be noted. While the total number of patients who triggered the BPA within the 8-month surveillance period was substantial (N = 1388), dividing this number into subsamples for comparisons resulted in smaller groups that may have limited our power to detect further significant differences. For this reason, we were unable to make meaningful comparisons by individual races and instead examined general differences between white patients and patients of color. It will be important to continue this line of research with larger samples to understand whether targeted BPA-based research recruitment at the point of care is more or less successful with certain groups, as well as how race and ethnicity interact with payor type to influence both provider responsiveness to the BPA and patient interest in research opportunities.
Additionally, while the current study underscores the success of a BPA in identifying underrepresented populations who are interested in research, we did not examine actual study enrollment and completion rates. The primary reason for this is that the surveillance period coincided with the emergence of the COVID-19 pandemic, which had widespread disruptive effects on people's lives. Indeed, in the weeks and months following the onset of the pandemic, we observed reduced enrollment rates across recruitment efforts and slower study progression, particularly among people of color. In addition, the majority of healthcare visits were converted to virtual formats, which may have affected providers' comfort with introducing study opportunities and/or the likelihood that families would enroll. Further research is needed to investigate whether people of color recruited through primary care visits are as likely to enroll in studies as those recruited in other ways (e.g., virtual visits, social media, community events), as well as methods to increase conversion rates from Interested patients to enrolled and completed participants.

Advancements in clinical care for underrepresented groups and the quest for health equity depend upon the ability of clinical researchers to obtain sufficiently large, representative study samples. EHR tools like BPAs can be used at the point of care to systematically improve awareness of clinical research opportunities among underrepresented groups. However, even with these tools, institutional biases can lead to exclusionary practices in participant selection, which may undermine equitable access to research. Investigating, quantifying, and addressing biases that influence provider practice patterns when sharing information about clinical research opportunities, as well as identifying ways to successfully engage interested families of color in the research process, is fundamental to overcoming healthcare disparities that are perpetuated by the ongoing exclusion of diverse populations.

References

Amorrortu et al. (2018). Recruitment of racial and ethnic minorities to clinical trials conducted within specialty clinics: An intervention mapping approach.
Chhatre et al. (2018). Patient-centered recruitment and retention for a randomized controlled study.
Devoe et al. (2019). Use of electronic health records to develop and implement a silent best practice alert notification system for patient recruitment in clinical research: Quality improvement initiative.
Feliciano et al. (2018). SPARK: A US cohort of 50,000 families to accelerate autism research.
Haley et al. (2017). Barriers and strategies for recruitment of racial and ethnic minorities: Perspectives from neurological clinical research coordinators.
Lai & Afseth (2019). A review of the impact of utilising electronic medical records for clinical research recruitment.
Simon et al. (under review). Utilization of a best practice alert (BPA) at point-of-care for recruitment into a US-based autism research study.
Tilley et al. (2017). Design of a cluster-randomized minority recruitment trial: RECRUIT.
Zamora et al. (2016). Brief report: Recruitment and retention of minority children for autism research.

Acknowledgments: The authors express their thanks to the SPARK team at the Simons Foundation, the SPARK clinical sites, and the SPARK cohort participants. Although the authors contribute to SPARK as a clinical site (#385052, RPG-K), this project was initiated by the authors without specific or supplemental funding.
The sponsors played no role in designing, executing, or writing up the results of this analysis. We are also grateful to Ms. Robin Barton and the EPIC-support team at Texas Children's Hospital for their collaboration in developing the BPA and exporting the data described in this study. This work was partially supported by the Intellectual and Developmental Disabilities Research Center (1U54 HD083092) at Baylor College of Medicine.

Conflict of Interest: Robin P. Goin-Kochel has served as a paid consultant in the design of clinical trials for Yamo Pharmaceuticals. All other authors have no known conflicts of interest to disclose.

Ethical Approval: The research reported herein was approved by the Institutional Review Board (IRB) of Baylor College of Medicine. Human participants were involved, and the IRB approved a waiver of informed consent because of the chart-review methodology.