key: cord-0812294-gh1e3get
authors: Keating, Brendan J.; Mukhtar, Eyas H.; Elftmann, Eric D.; Eweje, Feyisope R.; Gao, Hui; Ibrahim, Lina I.; Kathawate, Ranganath G.; Lee, Alexander C.; Li, Eric H.; Moore, Krista A.; Nair, Nikhil; Chaluvadi, Venkata; Reason, Janaiya; Zanoni, Francesca; Honkala, Alexander T.; Al-Ali, Amein K.; Abdullah Alrubaish, Fatima; Ahmad Al-Mozaini, Maha; Al-Muhanna, Fahad A.; Al-Romaih, Khaldoun; Goldfarb, Samuel B.; Kellogg, Ryan; Kiryluk, Krzysztof; Kizilbash, Sarah J.; Kohut, Taisa J.; Kumar, Juhi; O'Connor, Matthew J.; Rand, Elizabeth B.; Redfield, Robert R.; Rolnik, Benjamin; Rossano, Joseph; Sanchez, Pablo G.; Alavi, Arash; Bahmani, Amir; Bogu, Gireesh K.; Brooks, Andrew W.; Metwally, Ahmed A; Mishra, Tejas; Marks, Stephen D.; Montgomery, Robert A.; Fishman, Jay A.; Amaral, Sandra; Jacobson, Pamala A.; Wang, Meng; Snyder, Michael P.
title: Early detection of SARS-CoV-2 and other infections in solid organ transplant recipients and household members using wearable devices
date: 2021-05-05
journal: Transpl Int
DOI: 10.1111/tri.13860
sha: 1b3c91cde2f9e1c91fed850a02d825798eac115e
doc_id: 812294
cord_uid: gh1e3get

The increasing global prevalence of SARS-CoV-2 and the resulting COVID-19 disease pandemic pose significant concerns for the clinical management of solid organ transplant recipients (SOTR). Wearable devices that can measure physiologic changes in biometrics including heart rate, heart rate variability, body temperature, respiratory patterns, activity (such as steps taken per day), sleep patterns, and blood oxygen saturation show utility for the early detection of infection before clinical presentation of symptoms. Recent algorithms developed using preliminary wearable datasets show that SARS-CoV-2 is detectable before clinical symptoms in >80% of adults. Early detection of SARS-CoV-2, influenza, and other pathogens in SOTR, and their household members, could facilitate early interventions such as self-isolation and early clinical management of relevant infection(s). Ongoing studies testing the utility of wearable devices such as smartwatches for early detection of SARS-CoV-2 and other infections in the general population are reviewed here, along with the practical challenges to implementing these processes at scale in pediatric and adult SOTR, and their household members. The resources and logistics, including transplant-specific analysis pipelines to account for confounders such as polypharmacy and comorbidities, required in studies of pediatric and adult SOTR for the robust early detection of SARS-CoV-2 and other infections are also reviewed.

Post-transplant infectious disease complications are a leading cause of mortality in solid organ transplant recipients (SOTR) [1, 2]. In particular, complications of respiratory infections have been shown to have devastating consequences in SOTR, with earlier diagnosis and treatment resulting in better outcomes [3]. Recent prospective multicenter studies in adult SOTR with clinically managed influenza infection showed that ~66-71% of recipients required hospitalization, with >30% developing pneumonia, 11-16% requiring intensive care unit (ICU) admission, and mortality rates of 4-4.6% [4, 5]. Notably, SOTR who received antiviral treatment within 48 hours of influenza A (H1N1) symptom presentation showed decreased rates of ICU admission (8%) compared to those who received treatment after 48 hours (22%), as well as decreased incidence of hospital admission and mechanical ventilation [4].
The recent COVID-19 pandemic presents an increased risk of severe SARS-CoV-2 infection in immunosuppressed SOTR. Literature reviews show 16-28% COVID-related mortality rates in SOTR [6-8], although larger studies are needed to dissect known comorbidity/risk factors. The mean incubation period of SARS-CoV-2 reported in large studies varies from 5.7 days (95% CI 5.1-6.4) to 7.7 days (95% CI 7.02-8.53) [9, 10]. This period is longer than the median incubation periods for other common respiratory viral infections: influenza B = 0.6 days (95% CI 0.5-0.6); influenza A = 1.4 days (95% CI 1.3-1.5); rhinovirus = 1.9 days (95% CI 1.4-2.4); parainfluenza = 2.6 days (95% CI 2.1-3.1); SARS-CoV-1 = 4.0 days (95% CI 3.6-4.4); respiratory syncytial virus (RSV) = 4.4 days (95% CI 3.9-4.9); and adenovirus = 5.6 days (95% CI 4.8-6.3) [11]. Furthermore, a number of recent studies have shown prolonged viral shedding, and meta-analyses show that SOTR have higher viral burdens of SARS-CoV-2 [12-14]. Importantly, a number of studies have estimated that up to 50% of individuals infected with SARS-CoV-2 have asymptomatic infection courses, which significantly increases the risk of viral spread in a household or care center [15, 16]. The mean serial interval, a key parameter for assessing the dynamics of a disease, has been shown to range from 3.03 to 7.6 days for SARS-CoV-2 between the initial infectious person and the person they infect, indicating that there is ample time for transmission of SARS-CoV-2 within a household or care facility while individuals are in pre-symptomatic or asymptomatic phases of infection [17]. Sequencing of airway microbiota in pneumonia patients with COVID-19 (n = 62) and without COVID-19 (n = 125) showed that COVID-19 patients had more perturbed airway microbiota, with identification of another potential pathogen in 47% of cases, of which 58% were respiratory viruses. In nasopharyngeal and sputum samples from COVID-19 patients, enrichment of other putative pathogenic microbes was identified, including respiratory syncytial virus (RSV), influenza, and other opportunistic pathogens [18]. Therefore, early detection of infection and early therapeutic intervention with promising corticosteroid and antibody-based regimens may be essential to mitigating the consequences of severe COVID-19 infection in SOTR. As of January 20th, 2021, over 291 million SARS-CoV-2 viral tests had been performed in the United States and ~1.361 billion worldwide [19]. With an asymptomatic incubation period of up to ~14 days and wide heterogeneity in clinical symptoms, early detection of SARS-CoV-2 is imperative, yet there remain major barriers to widespread and continuous testing. Most existing testing platforms are not practical to administer on a daily or weekly basis due to transmission risks and significant logistical barriers. Furthermore, the results of diagnostic tests can take several days, restricting the window for early intervention and contact tracing and impeding data-driven healthcare decisions for high-risk individuals [20]. Finally, there is understandable reluctance from SOTR and their families to enter healthcare settings for routine visits due to potential nosocomial SARS-CoV-2 exposure. The lengthy asymptomatic incubation period of SARS-CoV-2 and its remarkable transmissibility, combined with a presentation altered by immunosuppression and polypharmacy among transplant populations, reflect the urgent need for tools that can detect pre-symptomatic infection.
As SARS-CoV-2 sero-prevalence rises, more SOTR and family members will become infected, and many cases may not be detected early enough for effective intervention. In the last decade, advances in wearable devices such as fitness-tracker smartwatches allow a range of important phenotypes to be measured and offer the potential to shift clinical care from being reactive to proactive. A study conducted in June 2019 showed that ~21% of the US population have, and regularly wear, a smartwatch [55], and this trend appears to be increasing as these devices become more affordable. Generally, an increase in heart rate (HR) of 10 beats per minute in children equates to an increase of one degree centigrade above their baseline temperature [21]. While activity can impact HR short-term, prolonged periods of sustained HR increase over 12-36 hours may indicate a physiological reaction to infection. With the ability to monitor physiological parameters such as HR, body temperature, oxygen saturation (SpO2), blood pressure (BP), sleep and respiratory patterns, and electrodermal activity, commercially available wearables provide the opportunity for real-time, continuous infection monitoring to complement conventional diagnostic tests. Many commercially available wrist watches utilize photoplethysmography (PPG) sensors, which shine light into the skin and measure the reflection back to determine blood flow and color (green light is absorbed by hemoglobin). These blood flow measurements are used to determine HR and to estimate BP and SpO2 [22]. Inflatable wrist-cuffs can measure arterial pressure to obtain oscillometric BP, and some wearable devices, for example the Apple Watch, use single-lead electrocardiography (ECG) to detect heart rhythm. Over the past few years, wearable devices have been rigorously explored for the detection and/or monitoring of pathologies across a range of diseases, including atrial fibrillation, Parkinson's disease, convulsive seizure onset, and continuous glucose monitoring in individuals with type 2 diabetes [23-26]. A growing number of studies have shown that wearable devices are also a powerful and promising tool for infection detection. While wearable technologies have yet to be used extensively for monitoring SOTR, in a study of 88 Australian adults with chronic kidney disease (CKD) and kidney transplant recipients, a clinical-grade wearable device measuring peripheral body temperature with an infrared thermopile correctly identified infection in 65 patients with 80% sensitivity and 98% specificity [27]. Another study found that Bluetooth-enabled devices for at-home physiological monitoring of lung transplant recipients resulted in lower incidences of hospital readmission [28]. The at-home monitoring consisted of daily updates of BP, HR, weight, blood glucose, SpO2, pulmonary function, and activity levels, which could be measured using wearable devices. The rates of hospital readmission and readmission days with home monitoring versus standard care were 56% and 46%, respectively, demonstrating the potential value of consistently monitoring SOTR with wearable devices to reduce hospitalizations. One of the first studies to report the use of smartwatches to detect SARS-CoV-2 infection was published recently by a number of co-authors of this manuscript. Using primarily retrospective data from ~5,300 wearable devices, a focus was placed on individuals wearing similar devices where sufficient continuous and robust measurements were available [29].
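Many of the detection approaches discussed below start from a resting heart rate (RHR) series derived from raw minute-level wearable exports. The following is a minimal illustrative sketch, not the authors' pipeline: resting HR is approximated as the HR during minutes with no recent steps. The column names ('timestamp', 'heart_rate', 'steps') are assumptions that depend on the device and export format.

```python
# Illustrative sketch (not the published pipeline): derive a daily resting heart
# rate series from minute-level smartwatch exports. Column names are hypothetical.
import pandas as pd

def daily_resting_hr(minute_df: pd.DataFrame, window_min: int = 10) -> pd.Series:
    """Median HR over minutes with no steps recorded in the preceding window."""
    df = minute_df.sort_values("timestamp").set_index("timestamp")
    # A minute counts as "resting" only if no steps were taken in the last `window_min` minutes.
    recent_steps = df["steps"].rolling(f"{window_min}min").sum()
    resting = df.loc[recent_steps == 0, "heart_rate"]
    return resting.resample("1D").median().rename("resting_hr")

if __name__ == "__main__":
    # Synthetic one-week example: constant HR of 65 bpm and no activity.
    idx = pd.date_range("2020-02-01", periods=7 * 24 * 60, freq="min")
    demo = pd.DataFrame({"timestamp": idx, "heart_rate": 65, "steps": 0})
    print(daily_resting_hr(demo).head())
```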
The algorithms studied three parameters: increased resting HR (RHR) relative to previous "healthy day" windows; increased HR-to-activity (step count) ratio; and sleep measures including sleep duration and time in wake/light/deep/REM stages. Wearable data from 32 individuals pre-, peri-, and post-confirmed SARS-CoV-2 infection identified aberrant physiological signals associated with illness using various algorithms, including a proof-of-concept for real-time disease detection. The study showed that it is possible to identify infection prior to symptom onset using just these three parameters from consumer-grade wearable devices. A similar study demonstrated that combining symptom data (fatigue, breathing difficulty, fever, etc.) with wearable sensor data (resting HR, sleep, and activity) resulted in greater ability to discriminate between COVID-19 and non-COVID-19 infection compared to symptoms alone (AUC 0.80 vs. 0.71, P < 0.01) [30]. The recent TemPredict study, using Oura wearable ring data from 65,000 subjects, examined 50 confirmed COVID-19 cases and showed the ability to detect early signs of fever in 93% of the cases, on average 3 days before symptoms manifested [31]. Fitbit watch data on 2,745 SARS-CoV-2-confirmed subjects showed that even with self-reported symptoms alone, an AUC of 0.82 ± 0.017 was observed for prediction of the requirement for hospitalization [32]. Wearable biosensors have a significant advantage over conventional diagnostic tools for early infection detection in that they can provide remote, continuous monitoring of vulnerable populations. With the introduction of newer clinical-grade wearables that can monitor additional biometrics such as temperature, respiratory rate, SpO2, BP, and ECG, it is envisioned that these additional physiological metrics have the potential to transform early SARS-CoV-2 detection for both broad and additional high-risk populations, such as patients with CKD and end-stage renal disease (ESRD). Among kidney transplant waitlisted patients (n = 56) and transplant recipients (n = 80) with COVID-19, waitlisted patients required hospitalization more often (82% vs. 65%, P = 0.03) and had a higher risk of death (34% vs. 16%, P = 0.02) [33]. Patients with COVID-19 can present with severe hypoxemia without proportional features of respiratory distress; this is defined as silent or apathetic hypoxia. The exact mechanisms of silent hypoxia are still not fully understood; however, different hypotheses, including the effect of SARS-CoV-2 on the respiratory system, have been postulated. Sudden and rapid deterioration may occur in this subset of patients. In the initial phase of infection with SARS-CoV-2, there is alveolar and interstitial inflammation impairing gas exchange, which may progress to acute respiratory distress syndrome (ARDS) due to ACE2-mediated vasoconstriction, inflammation, and apoptosis [34]. Prospective physiological surveillance with SpO2 and respiration-related biometrics during SARS-CoV-2 progression may offer unique insight into this serious complication. Immunosuppressed subjects have a weakened immune response and may develop ARDS and a progressive decline in SpO2. They may also present with above-average viral copies per oropharyngeal swab, and exhibit a temporal pattern of elevated viral load concomitant with physiological changes and worsening symptoms that require urgent medical attention [35].
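Returning to the heart-rate-to-step-count ratio introduced at the start of this section, the sketch below shows one simplified way such a signal can be turned into an anomaly flag: hourly HR/steps values are standardized against a presumed healthy baseline window and one-sided z-score outliers are flagged. This is a stand-in for this class of method, not the published HROS-AD implementation, and the hourly DataFrame layout is an assumption.

```python
# Simplified stand-in for an HR-over-steps anomaly score (not the published
# HROS-AD implementation). `hourly` is assumed to be indexed by a sorted
# DatetimeIndex with 'heart_rate' and 'steps' columns.
import pandas as pd

def hros_zscores(hourly: pd.DataFrame, baseline_end: str, cutoff: float = 3.0) -> pd.DataFrame:
    feats = hourly.copy()
    # +1 avoids division by zero during fully sedentary hours.
    feats["hros"] = feats["heart_rate"] / (feats["steps"] + 1.0)
    baseline = feats.loc[:baseline_end, "hros"]            # presumed healthy period
    feats["z"] = (feats["hros"] - baseline.mean()) / baseline.std(ddof=1)
    feats["anomaly"] = feats["z"] > cutoff                  # one-sided: only elevations flagged
    return feats[["hros", "z", "anomaly"]]
```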
Given these unique pathophysiological challenges, effective, actionable wearable sensor data hold clear promise for optimizing early infection detection and thus improving outcomes for SOTR. Robust algorithms are needed to transform physiological data collected from wearables into reliable "triggers/alerts" for early clinical intervention. Some schemes can attain sufficient sensitivity and specificity using simple single-parameter algorithms, such as temperature cutoffs [27] and HRV [36], but robust algorithmic techniques are needed for adequate performance in large, heterogeneous populations and phenotypes. As more algorithms with different biometrics become available, better discrimination will likely be possible for different infection types based on the given clinical population (e.g., young vs. elderly) and different comorbidities, concurrent medications, and other factors. The first algorithmic approach for early infection detection using wearables was Change of Heart (CoH), which scans for signals in continuous HR data from wearable devices to find outlier peaks of HR elevation [36]. This approach was able to identify multiple periods of illness, as defined by elevated high-sensitivity C-reactive protein or self-reported illness, with an AUC greater than 0.9 for each of four individuals. Importantly, CoH identified all periods of illness, and significant signals were evident prior to reported symptoms, indicating the potential of this approach to detect illness with high sensitivity. In one of the first COVID-19 wearable studies, published by co-authors of this manuscript, an adaptation of CoH was used. Termed the RHR-Difference detection (RHR-Diff) method, this approach systematically identifies periods of elevated HR based on outlier interval detection and compares each HR observation to a normal baseline to calculate standardized residuals [29]. A second method, termed "heart rate over steps" anomaly detection (HROS-AD), integrates heart rate and activity (step) data to flag anomalous periods. To illustrate these detection approaches, Fig. 1 outlines different algorithmic outputs for a single individual positively diagnosed with COVID-19. The participant's HR, activity steps, and sleep record were collected over two months (February and March 2020), which encompassed pre-, peri-, and post-SARS-CoV-2 infection. The average resting HR from healthy baseline days in February 2020 was compared to the average from all days in March 2020 (test days). Periods around COVID-19 infection correlated with HRs that were above the baseline HR, supporting the hypothesis that HR is elevated during COVID-19 onset. RHR-Diff reported elevated time intervals, identifying a 10-day window of significant HR elevation before the onset of reported symptoms (Fig. 1a), during which the subject was likely contagious and may have benefited from early intervention. To enable real-time COVID-19 detection, outlier detection algorithms were developed with the goal of being both time- and activity-adaptive. Online algorithms have the advantage of continuously reporting alerts on each abnormal day. One modeling framework to test for the presence or absence of infection using biometric readouts is based on the CuSum procedure [37], which assesses changes in the frequency of an event through time [38]. CuSum has been adapted to create a non-parametric test (the CuSum Sign test) that is no longer dependent on an assumption of normality and only assumes symmetry in the distribution underlying the observations [39]. In the Mishra et al.
study of early COVID-19 infection detection, a CuSum alarm model raised an alarm in 15 of 24 COVID-19-positive individuals with at least 28 days of data prior to symptom onset, showing good agreement with the offline RHR-Diff approach in 13 cases [29]. In cases that were not detected, some missed triggers appeared to be due to pre-existing conditions such as respiratory illnesses, indicating the importance of further study and analysis within transplant populations. Figure 1b shows results from an online detection method based on the number of successive outlier hours, in comparison to an online detection method adapted from CuSum (Fig. 1c). Both online algorithms successfully identified the abnormal intervals, indicating the potential of applying these approaches for real-time COVID-19 detection.

Figure 1. Algorithmic analyses of wearable device biometric datasets from a single individual pre-, peri-, and post-SARS-CoV-2 infection. The patient's HR, activity steps, and sleep record were collected over all of February and March 2020, which encompassed pre-, peri-, and post-SARS-CoV-2 infection. The average resting HR from healthy baseline days in February was compared to the average from all days in March 2020 (test days). The date in red indicates the day the patient reported initial symptoms, and the subsequent date in purple shows the date of formal SARS-CoV-2 diagnosis by RT-PCR. Periods around SARS-CoV-2 infection correlated with heart rates (HR) that were significantly increased above the baseline HR. The RHR-Difference detection method (RHR-Diff) was used to systematically identify periods of elevated HR based on outlier interval detection, comparing each HR observation to a normal baseline to calculate standardized residuals. Panel (a) shows the RHR-Diff elevated time intervals (red arrowed horizontal line), identifying a 10-day window of significant HR elevation before the onset of reported symptoms. Panels (b) and (c) show online detection results based on the number of successive outlier hours and the CuSum continuous real-time alerts, respectively. Individuals for this study were recruited with appropriate informed consent under protocol number 55577 approved by the Stanford University Institutional Review Board. The dates shown were staggered by +/-7 days to protect study participants' identities.

Extension of such online detection methods into the monitoring of lung transplant recipients has already been established. CuSum algorithms were implemented in lung transplant recipients to examine an automatic detection system for events of bronchopulmonary infection or rejection. Patients used an electronic spirometer to measure forced expiratory volume (FEV) and recorded symptoms daily. Detection algorithms could be tuned for specificity, and the study optimized algorithms using FEV data at a specificity of 80% with 3.8 false alarms per patient-year for the learning set and 86% with 2.8 false alarms for the validation set. Algorithms using symptom data had a sensitivity of 82-83% at 4.3-4.4 false alarms per patient-year [40]. Although this study used spirometry data rather than wearable devices, it demonstrates the value of using CuSum baseline distributions for SOTR. Recent studies have been designed to recruit wearable users from the general public into COVID-19 studies, such as COVIDENTIFY at Duke University, DETECT at Scripps Research Institute, and TemPredict.
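For readers unfamiliar with the CuSum procedure underpinning several of the alarm models above, the sketch below shows a generic one-sided CuSum over standardized resting-HR residuals. This is a textbook form for illustration, not the exact statistic used in the cited studies; the drift k and alarm threshold h are tuning parameters that trade sensitivity against false alarms.

```python
# Illustrative one-sided CuSum on standardized resting-HR residuals (generic
# textbook form, not the exact statistic from the cited studies).
import numpy as np

def cusum_alarm(residuals, k=0.5, h=5.0):
    """residuals: standardized daily/hourly RHR residuals vs. a healthy baseline.
    Returns (cusum_path, index of first alarm or None)."""
    s, path = 0.0, []
    for t, r in enumerate(residuals):
        s = max(0.0, s + r - k)   # accumulate only upward drift beyond k
        path.append(s)
        if s > h:
            return np.array(path), t
    return np.array(path), None

# Example: 28 days of baseline noise followed by a sustained +2 SD elevation.
rng = np.random.default_rng(0)
resid = np.concatenate([rng.normal(0, 1, 28), rng.normal(2, 1, 7)])
_, alarm_at = cusum_alarm(resid)
print("first alarm at index:", alarm_at)
```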
Researchers in Hong Kong recently published a protocol for a study in which asymptomatic subjects under mandatory quarantine following COVID-19 exposure wear biosensors to continuously monitor skin temperature, respiratory rate, BP, pulse rate, SpO2, and proxies of daily activity (such as steps taken daily) [41]. The primary study outcome is time to SARS-CoV-2 diagnosis using wearables for remote monitoring, with the aim of earlier diagnosis. Based on the outcomes of these COVID-19 studies, early infection detection algorithms will likely have strong utility for most healthy household members of a SOTR, which would enable self-isolation until a trigger is verified or downgraded. Should RHR-Diff, HROS-AD, or other algorithms/pipelines be validated in SOTR, real-time triggers can be deployed in which local clinical care teams are contacted after false-positive factors have been ruled out. If a clinical care team assesses the trigger to be indicative of early infection, then the parents/caregivers can be contacted and asked to perform predetermined orthogonal measures of the triggering biometric(s), for example, validation of the participant's temperature using a thermometer, and a telemedicine consult may be initiated. My Personal Health Dashboard (MyPHD) was developed as an open-source tool that integrates medical records, genomic and other -omic studies, and research databases, and allows for flexible data aggregation and integration while remaining compliant with HIPAA and the highest security protection requirements. Importantly, it is currently scaling to allow hundreds of thousands of individuals with wearables to self-enroll, or enroll through specific studies/trials, and link their devices to generate RHR-Diff, HROS-AD, and additional algorithm outputs [29]. There are several filters, such as sustained elevation thresholds for specific biometrics, that can be used to limit the number of false alarms. After these models have been validated on robust sample sizes, a wider network can be used to implement such a system at scale for early-stage detection of, and alerts for, COVID-19 and other infections, and to further optimize RHR-Diff and HROS-AD. The algorithms' sensitivity can be adjusted to reduce errors arising from confounding factors such as medications impacting HR and ambulatory BP. In preliminary studies, the baseline curve can be set by detecting "healthy days". There are several tuning parameters available for detection optimization, for example, how many healthy days need to be included in the initial step and the best resolution at which to estimate the baseline day trend to balance bias and variance. Implementation in the transplant setting should ideally utilize existing prospective studies where infectious disease is carefully monitored and wearable devices can be added. One such study is a prospective multicenter pediatric kidney transplant biomarker study being performed at 12 North American sites, in which >270 recipients have already been recruited, with data and samples from ~10 standard-of-care time-points collected per recipient in the first year post-transplant [42]. Initial wearables recruitment is beginning in a number of these existing study sites for pediatric recipients, their sibling(s), and parents/caregivers, for a period of 48 months, anticipating various infections, including EBV, CMV, and BK virus, in large proportions of these recipients.
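Returning to the healthy-day baseline and its tuning parameters mentioned above, the sketch below shows one minimal way this step could be parameterized. It is an illustrative example, not the MyPHD implementation: it assumes a daily resting-HR series and a boolean healthy-day mask are already available, and min_healthy_days and smooth_days stand in for the kinds of tuning parameters described in the text.

```python
# Minimal sketch of a healthy-day baseline estimation step (hypothetical, not
# the MyPHD implementation). `min_healthy_days` and `smooth_days` illustrate the
# bias/variance trade-off discussed in the text.
import pandas as pd

def healthy_baseline(daily_rhr: pd.Series,
                     healthy_mask: pd.Series,
                     min_healthy_days: int = 28,
                     smooth_days: int = 7) -> pd.Series:
    """daily_rhr: daily resting HR; healthy_mask: True on symptom/medication-free days."""
    healthy = daily_rhr[healthy_mask]
    if len(healthy) < min_healthy_days:
        raise ValueError(f"Need >= {min_healthy_days} healthy days, got {len(healthy)}")
    # A centered rolling median smooths day-to-day variance while tracking slow trends.
    return healthy.rolling(smooth_days, center=True, min_periods=1).median()
```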
In these studies, there is an observational arm (Phase 1) in which up to 2,300 participants (575 pediatric SOTR and approximately 1,725 household members) are observed for a period of ~35 days during which no triggers are implemented, at least 28 days of baseline healthy-day data are gathered, and the data processing and algorithm deployment are assessed. If the study teams deem all of Phase 1 to have conformed to the existing MyPHD pipeline in this study population, then a Phase 2 (interventional) arm is initiated, lasting 48 months, to assess the alert system, with the primary outcome being the number of hours/days by which verified infection(s) are detected early, including the following: (a) respiratory viruses such as respiratory syncytial virus (RSV), influenza A and B, rhinovirus, and SARS-CoV-2; and (b) de novo or significantly elevated viremia of other post-transplant infections, including BK virus, EBV, and CMV, as assayed by local standard-of-care clinical laboratory testing. In Phase 2, the adult non-transplanted individuals are alerted via their phones, but the alert for the transplant recipient is further assessed. In Phase 2, a warning alert is triggered the first time a test statistic is observed to be more extreme than the null distribution, with a P value generated by comparing the current test statistic with the baseline measurements. To reduce the number of alarms, a two-tiered warning system is employed. The first time the CuSum P value is <0.01 (typically within the first few hours), an initial warning alert is generated and logged automatically, but not sent. Monitoring continues in the MyPHD system, and if the trigger remains elevated for 24 hours, it signals a positive event. For Phase 2, each transplant center and clinical care team can decide the criteria for an acceptable balance between under- and over-triggering, as the existing algorithms can easily balance sensitivity and the number of false alarms by setting an ideal detection threshold based on the existing dataset (from the general population studies). If a sustained Phase 2 alert/trigger has been verified to be of high confidence, it is sent to the clinical care teams, and contact with the transplant recipients is left to their judgment. Triggering can thus be streamlined to the clinical care teams' preference, with a priori criteria for which alert thresholds are optimal and how to deal with an alert, for example, re-evaluating it every hour before engaging the clinical care team. Such approaches are anticipated to save time for the clinical care team by decreasing the amount of data they need to examine, as they may only need to consult via telemedicine with the patients/parents whose smartwatches trigger a sustained high-confidence alert. Additional recruitment of pediatric heart, liver, and lung transplant recipients is also underway through sites in the International Genetics & Translational Research in Transplantation Network (iGeneTRAiN) [43, 44], and details and resources related to the wearable studies are continuously updated [56]. There are a number of important challenges in wearable studies in SOTR populations. First, the physiology and concurrent diseases of SOTR are complex, with multiple comorbidities often evident, which need to be accounted for when generating better algorithms to monitor these populations.
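The two-tiered warning escalation described earlier in this section can be summarized schematically as below. The thresholds mirror the text (CuSum P value < 0.01, 24-hour persistence), but the class, its method names, and the notification handling are hypothetical simplifications rather than the production MyPHD code.

```python
# Schematic of the two-tiered warning logic described in the text (a sketch, not
# production code): an initial warning is logged when the hourly CuSum P value
# first drops below 0.01, and it escalates to a positive event only if the
# trigger persists for 24 consecutive hours.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TwoTierAlert:
    p_threshold: float = 0.01
    sustain_hours: int = 24
    _run: int = 0
    log: List[str] = field(default_factory=list)

    def update(self, hour: int, p_value: float) -> Optional[str]:
        if p_value < self.p_threshold:
            self._run += 1
            if self._run == 1:
                # Tier 1: logged automatically, but not sent to the care team.
                self.log.append(f"hour {hour}: initial warning logged (not sent)")
            if self._run >= self.sustain_hours:
                # Tier 2: sustained elevation signals a positive event.
                return f"hour {hour}: positive event, notify clinical care team"
        else:
            self._run = 0   # elevation not sustained; reset the counter
        return None
```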
Importantly, SOTR are typically prescribed large numbers of medications which can impact HR, HRV, BP, and body temperature, and the transplantation procedure itself can result in physiological changes in SOTR; for example, a transplanted heart is denervated, which impacts HR and HRV. The most relevant confounders, listed in Table 1, include the following: medications; indication for transplant; presence and recurrence of pre- and post-transplant infections; and other comorbidities impacting physiological signals. There are a number of considerations when integrating these confounding variables into algorithms: whether the variable alters the baseline physiologic characteristics or the physiologic responses to infection; whether there is any discernible effect on the biometric readout of interest for early COVID-19 detection; and the robustness of each data element, for example, whether taking a specific medication always results in a change, or only under certain conditions. Furthermore, vaccinations can pose another challenge, as previous groups have identified subtle changes in HR and temperature following immunization with certain vaccines [45]. This potential confounder can be mitigated by actively documenting patient vaccination history and identifying any COVID-19 vaccine-related physiological fluctuations. Ongoing wearables studies in iGeneTRAiN sites are weighting all known transplant-specific confounders for RHR-Diff and HROS-AD algorithm iterations using available retrospective and prospective SOTR datasets and non-transplant controls. Organ-specific infection signatures are also being investigated; for example, Table 2 outlines a number of kidney transplant-specific covariates, including primary disease and post-transplant complications, which may impact wearables outputs. Refining transplanted-organ-specific RHR-Diff algorithms to recognize "healthy days" (including medication data for each SOTR and the temporal impact on their HR, HRV, temperature, SpO2, and other biometrics) and examining downstream periods for the earliest signs of infection using the improved algorithm(s) is ongoing. This approach has the significant advantage that all comparisons used to trigger clinical interventions are intra-participant, allowing control for variation in physiology between individuals. Pre-transplant wearable and phenotype data would allow dissection of the effects of transplant medications and other stressors that alter physiological baselines. Implementation and compliance issues with the wearable devices are also anticipated. In order to effectively establish baseline healthy days as well as track potential signs of infection, the smartwatches should be worn consistently during both asymptomatic and symptomatic periods. The recent adult retrospective COVID-19 wearable study showed that many individuals neglected both charging and wearing the devices during symptomatic periods (as many only wore their devices when exercising), so compliance by all transplant household members in wearing the devices is imperative [29, 46]. It is important in such studies to emphasize to the SOTR/household participants that they should not rely on smartwatch devices as a means of infection detection, and that if they show signs of infection or other symptoms that would typically prompt them to contact their clinical care team, they should do so.
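One way the medication and vaccination confounders discussed above might be handled when selecting baseline "healthy days" is sketched below: days with self-reported symptoms, a recent vaccination, or a change in an HR-affecting medication are excluded from the intra-participant baseline. The field names and example medication list are hypothetical illustrations, not part of the study protocol.

```python
# Hypothetical illustration of confounder-aware "healthy day" selection; field
# names and the medication list are examples only, not the study's actual rules.
import pandas as pd

HR_CONFOUNDING_MEDS = {"metoprolol", "carvedilol", "ivabradine"}  # example list only

def healthy_day_mask(day_log: pd.DataFrame, vaccine_washout_days: int = 3) -> pd.Series:
    """day_log: one row per day with 'symptoms' (bool), 'vaccinated' (bool),
    and 'med_changes' (set of drug names changed that day)."""
    recent_vaccine = (
        day_log["vaccinated"].astype(int)
        .rolling(vaccine_washout_days, min_periods=1).max().astype(bool)
    )
    med_conflict = day_log["med_changes"].apply(
        lambda meds: bool(HR_CONFOUNDING_MEDS & set(meds))
    )
    # A day is "healthy" only if none of the exclusion conditions apply.
    return ~(day_log["symptoms"] | recent_vaccine | med_conflict)
```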
In Phase 2, should an alert be triggered, participants can be guided to take actions to validate any signs of potential illness with an orthogonal reading of the relevant biometric(s), such as using a thermometer to assess potential fever, and they may self-isolate until the trigger is either sustained or downgraded. The most anticipated hurdles for implementation of smartwatches for early detection of infection in transplant households are the workflows for managing large, complex datasets systematically, accurately, and securely while reducing the burden on the clinical management teams. Resources such as MyPHD can support these workflows, and future use of data, where allowable, can include aggregating large-scale prospective transplant datasets to generate pathogen-specific early infection detection algorithms. Figure 2 illustrates the various components of clinical decision support (CDS) and return-of-results (RoR) used to manage care of SOTR and outlines some of the basic workflows. Legal frameworks that set guidelines for the collection and processing of personal information, such as the General Data Protection Regulation (GDPR) for participants from the European Union (EU), require that consent must be clear, properly informed, freely given, and specific to the study in question [47]. Other GDPR principles that relate to the lawful basis of processing data include the following: data minimization, so that the minimum relevant amount of personal data necessary for processing is collected; and data security, to ensure personal data are processed in a manner that ensures appropriate security. Such frameworks also govern retention of data (personal data are not stored for longer than necessary) and require that data are securely encrypted. Consents in such studies may include sharing of de-identified metadata for comparison of existing algorithms and development of newer algorithms. Patient medical records and device data can be linked to secure data encryption keys and held on highly secure hospital, university, or institute servers, and using resources such as MyPHD ensures that studies can be performed under local institutional review board (IRB) regulatory guidelines, with encrypted wearables data streamed to servers that have the same high-level security as other electronic medical record (EMR) servers containing protected health information (PHI) (this is illustrated in Figure S1). This approach also avoids the need to stream smartwatch data onto a smartwatch vendor's system. Both traditional machine learning and deep learning approaches have been used to successfully perform numerous classification tasks related to COVID-19, including prediction of SARS-CoV-2 seropositivity and risk stratification of confirmed positive cases using multimodal datasets. Gradient boosting classifiers, a traditional approach involving an ensemble of "weak learners" such as decision trees combined in an additive process that is optimized via gradient descent, have been used to predict COVID-19 infection with an AUC of up to 0.74 using clinical variables such as presence of anosmia, cough, shortness of breath, and patient age [48]. Convolutional neural networks, a high-dimensional "deep neural network" design utilized for image processing and pattern recognition, can distinguish COVID-19 pneumonia from non-COVID-19 pneumonia on chest CT scans with an AUC of 0.95 [49].
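As a toy example of the gradient-boosting approach mentioned above, the snippet below fits scikit-learn's GradientBoostingClassifier on synthetic data whose feature names mirror the clinical variables in the text. The data, labels, and resulting AUC are illustrative only and are not drawn from the cited study.

```python
# Toy sketch of a gradient-boosting classifier for COVID-19 prediction on
# synthetic data; features mirror the clinical variables named in the text,
# but the data and performance are illustrative, not from the cited study.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),      # anosmia (0/1)
    rng.integers(0, 2, n),      # cough (0/1)
    rng.integers(0, 2, n),      # shortness of breath (0/1)
    rng.integers(18, 85, n),    # age in years
])
# Synthetic label loosely dependent on the symptom features.
logit = 1.5 * X[:, 0] + 0.8 * X[:, 1] + 1.0 * X[:, 2] + 0.01 * X[:, 3] - 2.5
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=2)
clf.fit(X_tr, y_tr)
print("hold-out AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```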
The promise that artificial intelligence tools have shown in these areas of COVID-19 detection makes it easy to envision how such approaches could be translated to the analysis of data from wearable devices. Geographical and environmental exposures are also important considerations for wearable datasets. The "exposome" field has advanced significantly in the last few years [50, 51], and additional data relating to weather, pollution, and socioeconomic status can be derived by geocoding a subject's 9-digit zip code [52]. Epidemiological models clearly demonstrate that slowing the spread of COVID-19 will prevent hundreds of thousands of deaths [53]. While SARS-CoV-2 therapeutics and vaccine administration and further development are being carried out at a frenetic pace, there are still uncertainties about how they will perform long-term in vulnerable, immunocompromised SOTR, and vaccines are not yet available to SOTR <16 years of age. The long-term consequences of post-COVID-19 disease are also very unclear in the general population, let alone in SOTR. A recent preprint of a meta-analysis of 47,190 patients estimated that an appreciable proportion of patients infected with SARS-CoV-2 developed one or more long-term symptoms [54]. The increased sophistication of wearable sensors and inexpensive data storage/analysis have made data from commercially available devices extremely useful for accurate health surveillance. By combining clinical-grade wearable devices, advanced algorithms, and novel data management platforms, new processes are being developed to assess the health of SOTR and their household members by guiding decision-making in COVID-19 testing and triggering risk-informed medical actions such as self-isolation and early pathogen detection, especially with the emergence of specific clinical SARS-CoV-2 home-testing assays to rule in or rule out infection. Smartwatch studies of over 100,000 participants from the general population, including the DETECT, TemPredict, and Stanford MyPHD wearable studies, show very promising data for early infection detection. It is therefore reasonable to envisage, in a clinical study setting, household members of transplant recipients using such smartwatch systems to allow early detection and reduce the risk of infection transmission, including through self-isolation until an alert/trigger is verified. The ability to receive continuous measurements of temperature, blood pressure, and heart rate from a participant's smartwatch every 5-15 minutes allows short- to long-term assessment of biometric correlations with various infection-related outcomes. Using wearable devices in conjunction with open-source secure systems is an innovative approach that can also allow effective integration with electronic health records (EHR). Smartwatch datasets from large-scale transplant studies have only recently begun to be collected, many challenges remain, and larger, more rigorous studies still need to be conducted to truly prove their impact on transplant patient care. Quantitative and qualitative metrics of outcomes, including those related to morbidity and mortality, of using smartwatches/wearables for surveillance and early intervention in infections need to be assessed in properly designed, well-powered clinical studies before wearable devices enter routine clinical practice.
The infrastructure needed to take this important step of proving the impact on patient care can be facilitated using existing resources like MyPHD, where SOTR and household members can provide remote consent and link their own wearable datasets and EHR under remote, HIPAA-compliant systems for remote monitoring. It is also important that the sensitivity of infection detection based on biometrics such as HR, HRV, and activity can be tuned to reduce false alerts; the algorithms used for infection detection will become more sensitive and specific for a range of post-transplant pathogens as monitored populations increase in size and diversity. This will also allow suspected SARS-CoV-2 or other infection cases to be quickly identified and managed, limiting risk for the transplant patient and additional burden on healthcare systems.

References
Infection in solid-organ transplant recipients
Infection in organ transplantation
Community-acquired respiratory viruses in transplant patients: diversity, impact, unmet clinical needs
Outcomes from pandemic influenza A H1N1 infection in recipients of solid-organ transplants: a multicentre cohort study
A 5-year prospective multicenter evaluation of influenza infection in transplant recipients
Transplantation in the era of the Covid-19 pandemic: How should transplant patients and programs be handled
COVID-19 in solid organ transplant recipients: dynamics of disease progression and inflammatory markers in ICU and non-ICU admitted patients
Covid-19 and kidney transplantation
Incubation period of severe acute respiratory syndrome novel coronavirus 2 that causes coronavirus disease 2019: a systematic review and meta-analysis
Estimation of incubation period distribution of COVID-19 using disease onset forward time: a novel cross-sectional and forward follow-up study
Incubation periods of acute respiratory viral infections: a systematic review
Viral clearance and serological response to SARS-CoV-2 in kidney transplant recipients
COVID-19 in heart transplant recipients: a multicenter analysis of the northern Italian outbreak
Cycle thresholds among solid organ transplant recipients testing positive for SARS-CoV-2
Presymptomatic SARS-CoV-2 infections and transmission in a skilled nursing facility
Temporal dynamics in viral shedding and transmissibility of COVID-19
Rapid review of available evidence on the serial interval and generation time of COVID-19
Metatranscriptomic characterization of COVID-19 identified a host transcriptional classifier associated with immune signaling
Interpreting diagnostic tests for SARS-CoV-2
The relationship between body temperature, heart rate and respiratory rate in children
Current progress of photoplethysmography and SPO2 for health monitoring
WellDoc mobile diabetes management randomized controlled trial: change in clinical and behavioral outcomes and patient and physician satisfaction
Monitoring motor fluctuations in patients with Parkinson's disease using wearable sensors
Large-scale assessment of a smartwatch to identify atrial fibrillation
Multicenter clinical assessment of improved wearable multimodal convulsive seizure detectors
Monitoring skin temperature at the wrist in hospitalised patients may assist in the detection of infection
Use of a Bluetooth tablet-based technology to improve outcomes in lung transplantation: A pilot study
Pre-symptomatic detection of COVID-19 from smartwatch data
Wearable sensor data and self-reported symptoms for COVID-19 detection
Feasibility of continuous fever monitoring using wearable devices
Assessment of physiological signs associated with COVID-19 measured using wearable devices
COVID-19 outcomes in patients waitlisted for kidney transplantation and kidney transplant recipients
Silent hypoxia: a frequently overlooked clinical entity in patients with COVID-19
COVID-19 in a severely immunosuppressed patient with life-threatening eosinophilic granulomatosis with polyangiitis
Digital health: tracking physiomes and activity using wearable biosensors reveals useful health-related information
The cusum test of homogeneity with an application in spontaneous abortion epidemiology
Distribution-free tests for sparse heterogeneous mixtures
Automatic event detection in lung transplant recipients based on home monitoring of spirometry and symptoms
Artificial intelligence mobile health platform for early detection of COVID-19 in quarantine subjects using a wearable biosensor: protocol for a randomised controlled trial
Validating Injury to Renal Transplant Using Urinary Signatures in Children (VIRTUUS)
Design and implementation of the international genetics and translational research in transplantation network
Genome-wide study updates in the international genetics and translational research in transplantation network (iGeneTRAiN)
Characterization of potential biomarkers of reactogenicity of licensed antiviral vaccines: randomized controlled clinical trials conducted by the BIOVACSAFE consortium
Pre-symptomatic detection of COVID-19 from smartwatch data
Secure stream processing for medical data
A prediction model to prioritize individuals for SARS-CoV-2 test built from national symptom surveys
Artificial intelligence augmentation of radiologist performance in distinguishing COVID-19 from pneumonia of other origin at chest CT
Dynamic human environmental exposome revealed by longitudinal personal monitoring
Deep longitudinal multiomics profiling reveals two biological seasonal patterns in California
Geographic information systems and chronic kidney disease: racial disparities, rural residence and forecasting
More than 50 long-term effects of COVID-19: a systematic review and meta-analysis

Funding: King Abdulaziz City for Science and Technology (KACST) grant number 12-MED2799-46 for Amein K. Al-Ali, Fatima Abdullah Alrubaish, and Fahad A. Al-Muhanna; UnitedHealthGroup R&D and Optum Labs.
Conflicts of interest: M.P.S. is cofounder and a member of the scientific advisory board of Personalis, Qbio, January, SensOmics, Protos, Mirvie, and Oralome. He is on the scientific advisory board of Danaher, GenapSys, and Jupiter. The co-authors have no conflicts of interest.
Supporting information: Additional supporting information may be found online in the Supporting Information section at the end of the article. Figure S1. Data Protection and Processing Pipeline for Smartwatch data from consented participants.