Oral Abstracts. Am J Transplant. 2019; 19(suppl 3). DOI: 10.1111/ajt.15405. Date: 2019-04-29.

sleeve arm compared to the control arm (51 vs 62 ml/min, p=0.16). Overall, operative time was longer (3.5 vs 4.8 hours, p<0.05) for the sleeve arm, but warm ischemia time was comparable between the two groups (50.4±13.5 vs 51.6±15.7, p=0.92). In the sleeve arm, one patient required a conversion due to poor perfusion of the kidney allograft and another had a small bowel obstruction, while in the control arm one patient had a laceration and thrombus in the allograft kidney that was successfully treated, with the kidney reimplanted at the time of the original transplant (Clavien-Dindo Class IIIa or greater). Although length of stay was longer in the sleeve arm (8.2±3.3 vs 6.6±3.7, p=0.28), the difference was not significant. Conclusions: The combination of simultaneous kidney transplant and sleeve gastrectomy in obese end-stage renal disease patients is safe and results in significant weight loss and a numerically higher GFR. It appears to be a feasible approach to address renal failure and obesity with a high likelihood of sustained graft outcomes.

drives expansion of CD4 TEM and CD8 TEMRA. CMV-specific T cells have significantly more TEMRA whereas EBV-specific T cells have more TCM. viSNE and Citrus identified the appearance of CD4+ and CD8+ populations with reduced costimulatory receptors CD28 and/or CD27 but increased expression of CD57, CD11a and the co-inhibitor CD244 (Fig 2). Rphenograph revealed 17 clusters of EBV and CMV tetramer+ cells. EBV-specific cells were enriched for CD28+CD57- TEM whereas CMV-specific cells were enriched for CD28-CD57+ TEM and CD28- TEMRA. Studies of belatacept- and tacrolimus-treated kidney recipients are underway. Conclusions: Latent CMV infection impacts both CD8 and CD4 subsets, resulting in loss of naïve cells and expansion of populations associated with belatacept-resistant rejection. Exploration of donor-reactivity in the CMV memory pool and stratification based on CMV risk status should be considered in biomarker studies.

Purpose: Whole organ perfusion decellularization has been proposed as a promising method for the generation of non-immunogenic organs from allogeneic or xenogeneic donors. However, the ability to recellularize organ scaffolds with multiple patient-specific cells in a spatially-controlled manner remains challenging. This study describes a modified decellularization technique in an attempt to address these limitations. Methods: Rat and porcine organs (including kidneys, liver, heart, limbs) were treated in order to selectively eliminate donor endothelial cells while keeping the remaining tissue intact and viable. We used in-situ, isolated cold perfusion decellularization, followed by normothermic perfusion recellularization. This model allows an easily obtainable, single-site central cannulation, useful for accessing any target organ. Stem cells isolated from human placentae were used to assess the ability to replace endothelial cells in rat kidneys. Results: Perfusion decellularization of organs under controlled flow conditions resulted in successful selective removal of endothelial cells (Fig. 1). Sub-endothelial tissues remained intact and viable. Placental stem cells were shown to readily engraft within de-endothelialized glomeruli (Fig. 2).
In-situ organ perfusion while keeping it in its native anatomical location yielded less peri-organ dissections and better control of perfusate leakage (Fig. 3) . Conclusions: Our findings suggest that limited decellularization of donor endothelial cells followed by re-endothelization with non-immunogenic cells is feasible and may be used to generate fully functional, possibly tolerable organs for transplantation. CITATION INFORMATION: Cohen S., Partouche S., Gurevich M., Mezhybovsky V., Tennak V., Eisner S., Nesher E., Mor E. Endothelial Cell Replacement -A Closer Step to Personalized, Tissue-Engineered Transplants Am J Transplant. 2019; 19(suppl 3) . Purpose: "Off the shelf" tissue-engineered organs will require use of allogeneic cell sources. Allogeneic human endothelial cells (ECs), necessary for organ perfusion, can initiate graft rejection in absence of professional antigen presenting cells. ECs express both class I and class II MHC molecules that can both bind donor-specific antibody and activate alloreactive T effector memory (T EM ) cells. Here we employ CRISPR/Cas9 ablation of MHC molecule expression to engineer ECs that evade alloimmune rejection and retain core EC characteristics including the ability to form microvessels in vivo. Methods: Human cord-blood derived ECs were subjected to CRISPR/Cas9-mediated biallelic ablation of both β 2 -microglobulin and CIITA. Clones of ablated and control ECs, before and after cytokine activation, were analyzed by FACS for surface protein expression and by RNA sequencing for transcriptional profiling. Human alloantibody binding was also assessed by FACS and complement-induced signaling was assayed by immunoblotting. The ability of ECs to activate allogeneic CD4 + and CD8 + T EM cells in vitro was assayed by CFSE-dilution and CTL-and NK cell-mediated cytolysis was assayed by calcein AM release. The ability to form perfused microvessels in vivo was assessed by implantation of collagen gel-suspended ECs into SCID/bg mice and immunoevasion was tested by quantifying microvessels with and without introduction of allogeneic human peripheral blood mononuclear cells (PBMCs) into mice with implants. Results: Cultured β 2 -microglobulin null /CIITA null human ECs lack both class I and class II MHC molecule expression before and after IFN-γ treatment. RNA sequencing of β 2 -microglobulin null /CIITA null cells does not reveal significant changes in expression of genes unrelated to MHC molecule expression. Normal cultured EC characteristics are also maintained, such as formation of tight monolayers with junctional expression of CD31 and CD144, thrombin or TNF-α-induced leakiness, and normal TNF-α-dependent upregulation of E-selectin and ICAM-1. Importantly, these cells no longer bind human alloantibodies, have markedly diminished capacity to activate allogeneic CD8+ and CD4+ T EM cells and are resistant to killing by CD8+ alloreactive cytotoxic T lymphocytes in vitro. β 2 -microglobulin null /CIITA null cells do not trigger NK cell activation or cytolysis. When suspended in a collagen gel and implanted into an immunodeficient mouse, β 2 -microglobulin null /CIITA null ECs form perfused microvessels and, unlike control ECs, are protected from destruction following inoculation of allogeneic PBMCs. Conclusions: These data suggest that tissue engineered constructs can be perfused through vessels lined by allogeneic MHC-absent ECs that likely will be significantly less prone to rejection. 
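For readers unfamiliar with the calcein AM release readout used in the methods above, such cytolysis assays are conventionally summarized as percent specific lysis. The abstract does not state its exact calculation; the standard formulation, shown here only for reference, is

$$\%\ \text{specific lysis} = \frac{F_{\text{experimental}} - F_{\text{spontaneous}}}{F_{\text{maximum}} - F_{\text{spontaneous}}} \times 100,$$

where F_spontaneous is the fluorescence released by labeled target cells alone and F_maximum is the release after complete (detergent) lysis of the targets.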
CITATION INFORMATION: Merola J., Reschke M., Qin L., Spindler S., Pierce R., Manes T., Li G., Bracaglia L., Kirkiles-Smith N., Baltazar T., Saltzman W. M., Tietjen G., Tellides G., Pober J. MHC Molecule-Ablated Human Endothelial Cells Form Alloimmune-Evasive Microvessels Am J Transplant. 2019; 19(suppl 3) . Purpose: Realizing the liver's unique ability to regenerate, this ongoing study examines Normothermic Machine Perfusion (NMP) as a promising platform for evaluating & potentially simulating regeneration of liver segments in an ex-vivo environment. Methods: 16 rat liver grafts were perfused for 12 hours at mean temperature of 37°C. Following the first 1.5 hours of perfusion, 8 livers underwent a 2/3 partial hepatectomy (PHx) & were subsequently perfused for the total duration of 12h. Perfusion was performed using an oxygenated, William's E based medium flowing continuously through a closed circuit. Parameters of biochemical & synthetic liver function, as well as molecular & immunohistochemical parameters of early liver regeneration were analyzed pre & post hepatectomy. Ki-67 was used as an immunohistochemistry target, Proliferating Cell Nuclear Antigen (PCNA) & p27 -a cyclin dependent kinase inhibitor in quiescent cells -were targeted in western blot analyses. Ki-67, PCNA & p27 levels were correlated to a control group of livers that underwent in-vivo 2/3 PHx. Results: Perfused livers showed positive Ki-67 staining, comparable but lower than the amount observed in vivo after PHx. Notably, the staining was predominantly in the hepatocyte nuclei. Moreover, the number of stained cells increase with time similar to the in vivo case. In parallel p27 -a key inhibitor protein of cells which needs to be reduced for cells to undergo proliferation -displayed a significant decrease over 12 hours in perfused livers (p<0.05) similar to in vivo PHx. This decrease was common in both perfused groups, although it appears to be faster in PHx perfused livers compared to whole perfused livers. On comparing hemodynamic performance of whole & partial livers during NMP, both groups behaved similarly during the first 1.5 hours of perfusion. After PHx there was a significant & sudden increase in Purpose: A Shroom3 intronic risk locus & Shroom3 protein have been associated with increased fibrosis but reduced albuminuria in recipients of allografts with the risk genotypes. Methods: To understand specific mechanisms downstream of Shroom3 giving rise to these opposing phenotypes, we utilized inducible knockdown mice. Results: We identified that glomerular Shroom3 knockdown (CAGS-TG mice), which associated with albuminuria and podocyte actin cytoskeletal changes, also interestingly induced reduction in glomerular volume (Vglom) [Fig 1a] , podocyte fraction of Vglom and single podocyte volume[1b] vs control mice(NTG). In Shroom3 knockdown human podocytes, a corresponding reduction in cell volume was identified vs control podocytes. To investigate whether glomerular Shroom3 knockdown regulated Vglom hypertrophy, we performed unilateral nephrectomy in control and knockdown mice and examined Vglom in remnant kidneys. At day 7, Shroom3 knockdown mice showed restricted Vglom hypertrophy vs controls(17% vs 28% delta Vglom, respectively; P<0.001), and compared to published FSGS controls (35-50%) [1c] . To investigate whether reductions of cell size were related to inhibition of cellular protein synthesis we measured protein:DNA ratios. 
Here Shroom3 knockdown podocytes had markedly reduced protein:DNA ratios (n=5 sets; P<0.01) [1d]. Since MTOR is a key cell growth pathway regulating protein biosynthesis, we examined whether Shroom3 impinges on MTOR signaling. Phospho-AMP kinase, a negative regulator of MTORC1, was significantly increased with Shroom3 knockdown (n=3 sets) [representative image, 1e]. Ribosomal biogenesis (18S/5S RNA), downstream of MTORC1, was significantly inhibited by qPCR in Shroom3 knockdown podocytes. Other MTOR signaling proteins are being investigated. Conclusions: In summary, our data suggest that Shroom3 regulates podocyte volume and Vglom by regulating MTOR signaling. These findings could have implications for glomerular damage in allografts from high-risk donors and in live donors with high-risk Shroom3 genotypes.

Purpose: Lymphatic endothelial cells (LEC) highly express S1PR1 and S1PR2. The ligand S1P is produced by LEC and regulates T cell migration. The mechanisms of how LEC utilize S1P/S1PRs to regulate T cell migration are not understood. We hypothesized that LEC use S1PRs to regulate CD4 T cell trafficking from tissues through afferent lymphatics into draining lymph nodes (dLN). Methods: CD4 T cells were transferred into mice to measure migration into afferent lymphatics and dLN. Specific pharmacologic and genetic S1PR blockade was employed in vitro and in vivo. Mouse primary LEC and LEC lines were used to assess migration, chemokine signals, adhesion molecules and S1PR function in vitro. Results: S1P selectively promoted various CD4 T cell subsets (naïve, activated, memory, regulatory) to migrate across LEC. Blocking S1PR2, but not S1PR1, specifically inhibited T cell migration toward S1P across the LEC. Similarly, various CD4 T cells also migrated less efficiently into the lymphatic vessels and dLN of S1PR2 antagonist-treated or S1PR2-deficient recipients. S1PR2 blockade increased lymphatic endothelial cell layer integrity and decreased permeability in vivo and in vitro. The S1PR2-ERK pathway controlled adherens and tight junction molecule expression (VE-cadherin, zonulin, occludin) and TEM migration. S1PR2 antagonists decreased VCAM-1 expression and reduced T cell contact with LEC surface VCAM-1+ domains, which were important for TEM migration. Treatment with neutralizing antibodies against adhesion molecules demonstrated that S1P-driven migration was dependent on VLA-4/VCAM-1 interactions. S1P-S1PR2-driven migration preferentially utilized a transcellular instead of a paracellular migration pathway, and S1PR2 antagonists and anti-VCAM-1 treatment reduced T cell transcellular migration. Conclusions: While both S1PR1 and S1PR2 are expressed by LEC, only S1PR2 is used to regulate T cell migration across afferent lymphatics into dLN. S1PR2 downstream ERK signaling regulates VCAM-1 expression, which is required for CD4 T cell transendothelial migration. S1PR2 also regulates VE-cadherin expression as well as LEC junctional integrity and permeability. These findings implicate S1PR2 as a novel and specific drug target for regulating the lymphatic migration of CD4 T cells in immunity and tolerance.

We assessed the impact of the hematopoietic cell dose and the immunosuppressive drug (IS) withdrawal on the persistence of mixed chimerism after combined HLA haplo-identical kidney and hematopoietic cell transplantation.
Methods: Twenty-two patients were given combined HLA haplotype-matched living related kidney and hematopoietic cell transplants to establish persistent mixed chimerism and facilitate IS drug withdrawal. Patients were conditioned with total lymphoid irradiation (TLI) and anti-thymocyte globulin (ATG) after kidney transplantation, and given an infusion of a defined escalating dose of donor CD3+ T cells (3 to 100x10^6 cells/kg) and a dose of enriched G-CSF-mobilized CD34+ hematopoietic progenitors from the donors. Results: There was no correlation between the dose of injected T cells and the persistence of mixed chimerism for at least 1 year. The dose of CD34+ cells varied from 8 to 26x10^6 cells/kg, and a second mobilizing agent, plerixafor, was administered to the last 10 consecutive donors in order to increase the yield of CD34+ cells. All of the latter donors had yields of at least 10x10^6 cells/kg, whereas 5 of 12 donors without plerixafor had yields of less than 10x10^6 CD34+ cells/kg. None of the recipients given less than 10x10^6 CD34+ cells/kg developed persistent mixed chimerism for at least 1 year, and about 60% (10/17) of recipients given at least 10x10^6 cells/kg developed persistent mixed chimerism without GVHD. The latter chimeric patients were withdrawn from steroids and mycophenolate mofetil, and were maintained on tacrolimus monotherapy at the end of the first year posttransplant. Persistence of mixed chimerism during the second year was dependent on continuation of tacrolimus monotherapy. Chimerism among T cells at all time points was significantly reduced as compared to all other cell lineages including B cells, NK cells, and granulocytes. The mean levels of T cell chimerism were about 30% below that of B cell chimerism during the first year posttransplant. The recovery of naïve T cells among all T cells was markedly delayed as compared to the recovery of naïve B cells among all B cells. Chimerism among CD4+ T cells was confined to the naïve T cells, and almost all memory CD4+ T cells were of recipient origin. In addition, the delayed recovery of naïve T cells during the first year was associated with delayed recovery of the diversity of the T cell repertoire as determined by TCR gene sequencing by RNA-seq. The persistence of recipient CD4+ and CD8+ memory T cells was associated with freedom from opportunistic and chronic viral infections. Conclusions: Mixed chimerism for at least 1 year was safely achieved in the HLA haplotype-matched patients reduced to tacrolimus monotherapy. The dose of donor CD34+ cells rather than the CD3+ T cell dose correlated with persistence of mixed chimerism.

this period. Freedom from rejection off IS drugs was observed with or without loss of chimerism. No patients developed evidence of graft versus host disease (GVHD). Chimerism among T cells in the blood at all time points was significantly reduced as compared to all other lineages including B cells, NK cells, granulocytes, CD34 hematopoietic progenitors, and whole blood. Mean percent chimerism among B cells was about 30% higher than that of T cells. Although there was a similar profound depletion of blood naive T and B cells immediately after conditioning with TLI and ATG, there was a difference in the kinetics of immune reconstitution during the first year post-Tx.
There was a marked increase in the percentage and absolute number of B cell precursors (CD19+CD20-CD38+IgD- pro-B cells) among all CD19+ B cells that peaked at about 6 weeks post-Tx, in association with a marked reduction in the percentage of naive B cells (CD19+CD20+IgD+). However, by 6 months post-Tx, the percentage of naive B cells and pro-B cells returned to pre-Tx levels. In contrast, the frequency of thymic T cell precursors in the blood, identified by TCR excision circle (TREC) analysis, was reduced to almost undetectable levels at 6 weeks post-Tx along with a marked reduction of (CD45RO-CD62L+) naive T cells. T cells and T cell precursors began to recover gradually after 6 months, and did not approach pre-Tx levels until the second year post-Tx. Conclusions: Persistence of mixed chimerism for at least 6 months was associated with successful withdrawal of IS drugs in a majority of recipients of combined HLA-matched kidney and hematopoietic cell transplants receiving a TLI/ATG conditioning regimen. We observed a marked delay in T cell reconstitution as compared to B cell reconstitution, associated with low levels of chimerism among T cells as compared to B cells. The low levels of donor chimerism among T cells are likely to have contributed to the absence of GVHD. CITATION INFORMATION: Busque S., Scandling J., Lowsky R., Shizuru J., Meyer E., Jensen K., Shori A., Hoppe R., Engleman E., Pham T., Strober S. Persistent Mixed Chimerism, Immune Reconstitution, and

T. Lim1, P. Ruiz2, A. Kurt2, E. Kodela2, M. Martinez-Llordella2, T. Tree3, A. Sanchez-Fueyo1, 1Institute of Liver Studies, King's College Hospital, London, United Kingdom, 2Department of Liver Sciences, King's College London, London, United Kingdom, 3Department of Immunobiology, King's College London, London, United Kingdom

Purpose: Low dose interleukin-2 (LDIL-2) is capable of expanding endogenous CD4+CD25+FoxP3+ regulatory T-cells (Tregs) in vivo, but its role in human solid organ transplantation is unclear. We investigated the effects of LDIL-2 treatment in liver transplant (LT) recipients under calcineurin inhibitors and its capacity to promote the complete discontinuation of maintenance immunosuppression. Methods: 'LITE' is an open-label, prospective, single-arm activity, safety and efficacy clinical trial in which stable LT recipients <50 years old and 2-6 years post-LT receive 1 million IU s.c. injections of IL-2 daily for up to 6 months. After 4 weeks of LDIL-2 treatment, recipients with a >2-fold increase in circulating Tregs and no subclinical rejection in a liver biopsy initiate weaning of immunosuppression with the aim of achieving complete discontinuation in 3 months. The primary endpoint is complete immunosuppression discontinuation for 1 year with biochemical and histological stability. Results: Six patients initiated LDIL-2 treatment and all achieved a sustained >2-fold increase in the absolute number of circulating Tregs after 1 month of treatment (median 4.4, range 2.1-13.3). One patient was found to have sub-clinical rejection while 5 initiated drug withdrawal. Although immunosuppression was discontinued in 2 patients, all patients eventually rejected and required immunosuppression reinstitution. Expansion of non-regulatory T and NK cell subsets expressing CD25 was noted in all patients. Conclusions: A regimen of 1 million IU daily doses of IL-2 effectively expands circulating Tregs in LT recipients on tacrolimus but fails to promote the successful discontinuation of immunosuppression.
The increases observed in non-Treg circulating immune cell subsets and intra-hepatic lymphocytes indicate lack of selectivity and the need to explore alternative dosing regimens. Purpose: Screening for anti-HLA antibodies in heart transplantation candidates is complicated by transfusions as well as by the presence of ventricular assist devices (VADs), which have been shown to be an independent risk factor for developing anti-human leukocyte antigen (HLA) antibodies. As a result, these patients require frequent antibody screens to monitor for the development of anti-HLA antibodies following these sensitizing events. The goal of this study is to define the frequency of HLA alloimmunization in patients that have received VADs. Methods: We performed a retrospective analysis of heart transplantation candidates at our center who received a VAD from 1/2015-10/2018, had at least one negative antibody screen pre-VAD (median of 13 days pre-VAD), and were screened at least once post-VAD (first screen performed at a median of 26 days post-VAD). Patients were screened an average of four times post-VAD, and screens after transplantation were not included. Results: Of the 52 patients that had a negative pre-VAD antibody screen, 30 remained negative on all subsequent screens post-VAD, while 22 had at least one positive antibody screen post-VAD. The two groups were similar in terms of age, sex and VAD type. On detailed analysis, 15/22 patients with positive post-VAD screens had antibodies that did not fall into the well-characterized epitope group patterns observed in post-transplant or post-pregnancy alloimmunization. All 15 had antibodies with MFI < 3000 on the first positive screen (average MFI ~1700), which would not lead to a positive virtual crossmatch, and 9/15 were inconsistently present on subsequent screening. Furthermore, a subset of these antibodies were detected only by Luminex single antigen beads and not by Luminex screens or panels, suggesting that they may be binding to cryptic epitopes exposed on the recombinant HLA proteins present on single antigen beads. The 15 patients with non-epitope grouped antibodies received a median of 2 transfusions, and the 7 with canonical epitope patterns received a median of 8 transfusions. Conclusions: Taken together, these results suggest that the epitope grouped antibodies arose due to transfusion, while the non-epitope grouped antibodies could represent solid phase HLA cross-reactivities related to the presence of a VAD. In conclusion, the implantation of a VAD appears to lead to canonical HLA alloimmunization in <15% of patients, likely due to multiple transfusions, while frequently leading to development of low-level antibodies of unclear clinical significance. We prospectively evaluated all CMV+ HT recipients in 1 institution in whom we performed an IFN-γ ELISPOT prior to HT and 2 weeks post-HT. The ELISPOT result could be low risk (IE-1 > 23 spots and pp65 >135 spots) or not low risk. This information plus clinical assessment led to deciding between pre-emptive therapy or prophylaxis with valgancyclovir 900 mg daily for 3 months. We collected incidence of CMV viremia (any positive CMV PCR), CMV infection (CMV PCR > 1500 IU/ml) and CMV disease and incidence of leucopenia (< 3.5x10E9/L) during follow-up. We collected need for prophylaxis if previously on pre-emptive or treatment if the patient developed CMV infection. Results: We included 29 HT recipients with 26 IFN-γ ELISPOTs pre-HT and 28 IFN-γ ELISPOTs post-HT. 
Mean age was 56 years, 24% were female, and 45% were emergency HT. Donor CMV serology was positive in 62% of cases. Induction therapy was given in 90% of patients. All received tacrolimus + mycophenolate mofetil + steroids. Figure 1 shows the main outcomes of the study. One patient died before the second IFN-γ ELISPOT determination and is not included. No patient with a low-risk ELISPOT post-HT developed CMV infection, giving a positive predictive value of the test of 100%. Leucopenia occurred in 7 of 18 patients (39%) treated with prophylaxis; 2 of these patients needed G-CSF. Conclusions: Evaluation of the CMV-specific T cell response using the ELISPOT assay in CMV+ HT recipients can help stratify risk for CMV infection. Patients with a low-risk ELISPOT 2 weeks post-HT may safely undergo pre-emptive therapy.

CGI with equal distribution (32%) of Dme CpGs on the promoter and gene body regions. Genes mapped to the Dme CpGs were related to the global GE microarray data. A total of 208 genes in the LI group and 45 genes in the HI group were common between the DNAm and GE datasets. The analysis of these genes showed activation of fibroblast cell proliferation, cell survival, cell cycle progression and inhibition of cell death in LI, while cell movement of neutrophils, phagocytosis of blood cells, and cell viability were affected in the HI group. From the integrative analysis, the ALOX5AP and S100P genes associated with the high-injury group were identified as hypo-methylated at the TSS site in the DNAm analyses and concurrently up-regulated (logFC >1; FDR<0.1) in the corresponding GE data set. Expression of the ALOX5AP and S100P genes was also validated by RT-PCR. ALOX5AP showed significant up-regulation (HI Pre-Imp vs. Post-Rep: adjusted P-value 0.003, LI Pre-Imp vs. Post-Rep: adjusted P-value 0.0003). S100P followed the same trend of up-regulation but with less statistical significance (P = 0.058). MMP9 gene expression validated by RT-PCR showed significant up-regulation during reperfusion within the high-injury group (adjusted P-value = 0.0001) and also between the high- and low-injury groups (adjusted P-value = 0.002). Conclusions: A key effect of ischemia on graft DNAm patterns was identified. For Pre-Imp biopsies, it was observed that DNAm modifications occurred mainly in DD samples. From integrative analyses it was observed that DNAm patterns play a key role in I/R injury by inducing differential GE changes affecting severity of graft injury and outcome.

In this prospective, non-randomized controlled study, 30 donor livers were transplanted in the IFLT group. Livers that were transplanted using a conventional procedure during the same period were treated as controls. The primary end-point was the incidence of early allograft dysfunction (EAD). Results: During the operation, the body temperature and mean arterial pressure were more stable and the incidence of post-reperfusion syndrome was much lower in IFLT. Of the 30 patients in the IFLT group, 1 (3%) had EAD, compared with 47 of 89 (53%) patients in the control group (absolute risk difference, -46 percentage points; 95% confidence interval [CI], -65 to -25). The peak aspartate aminotransferase (365 versus 1551 U/L, P<0.001) and alanine aminotransferase (169 versus 660 U/L, P<0.001) serum levels were much lower in the IFLT group. The total bilirubin level on day 7 post-transplantation was lower in the IFLT group than in the control group (2.3 versus 5.7 mg/dL, p<0.001).
No PNF occurred in the IFLT group, while there were 5 cases of PNF (6%) in the control group. The patient and graft survival were comparable between the two groups. The pathological studies revealed minimal injury of hepatocytes and biliary epithelium during IFLT. The bile with good quality was continually produced throughout procurement, preservation and implantation in IFLT. There was no non-anastomosis biliary stricture in the IFLT group, compared with 8 cases in the control group (9%). Histological analysis of IFLT allograft biopsies and TUNEL showed a minimal injury to the liver tissues and bile duct. long term outcomes. We identified 5 risk factors (cold ischemia time, warm ischemia time, recipient hypertension, donor hypertension, and male donors) for moderate to severe IRI. Careful consideration of these factors during donor-recipient matching may assist in optimizing graft utilization and LT outcomes. Purpose: Belatacept is a costimulatory blocker that is used as de novo maintenance immunosuppression in kidney transplantation to avoid toxic effects of calcineurin inhibitors and improve long term outcomes. However, in patients maintained on belatacept, mycophenolate (MPA), and corticosteroids, acute rejection has been noted to be more frequent and severe than in calcineurin inhibitors. In view of experimental studies showing synergy between costimulatory blockade and mTOR inhibitors (mTORi), we developed a belatacept regimen incorporating everolimus. Methods: We prospectively enrolled 67 kidney transplant recipients (43 deceased; 24 living donors) at our center to receive denovo belatacept from August 2012 to October 2018. PBMCs were collected prior to transplantation and at the time of cause and protocol biopsies. All patients received thymoglobulin for induction (3mg/kg divided into 2 doses) with belatacept 10mg/kg administered on POD 1, 4, 14, 28, 56, and 84 . Monthly maintenance dose of 5mg/kg was given starting week 16. Patients were started initially on MPA but were converted to everolimus after 1 month, and all patients were maintained on prednisone. Protocol biopsies were performed at 6 months. Results: In the first 12 months post-transplant, 16.4% of patients developed acute rejection: 2 patients were noted to have ACR 1a (at 4 weeks, 6 weeks), 1 with ACR 2a (at 6 months), 6 with ACR 2b (at 6 weeks (n=2), 2 months, 3 months, 6 months, 9 months), 1 with AMR (at 4 months), and 1 with simultaneous ACR 1a and AMR (at 9 months). All 11 rejections occurred in those who were on MPA and not on mTORi (3 switched back from mTORi to MPA due to intolerance; 8 had not switched to mTORi due to clinical contraindications). In addition, 12 patients were found to have borderline rejection on protocol biopsies (5 on mTORi, 7 on MPA). 33 patients did not have any inflammation on biopsies. 57 patients remained on belatacept, and 10 patients were converted to tacrolimus (1 patient was switched to tacrolimus due to poor intravenous access and not due to rejection). Conclusions: This trial of combining belatacept with mTORi shows that it is possible to significantly reduce the rate of acute rejection in belatacept-based regimens. All acute rejection occurred in patients who could not be converted to mTORi or needed to be converted back to MPA due to intolerance to mTORi. The synergy between mTORi and belatacept may be related to mTORi's inhibitory effect on memory CD8+CD28-CD38+ cells that are refractory to costimulatory blockade (Castro C et al, ATC abstract 2018) . 
Pretransplant immunotyping of circulating T cells to identify those with a low percentage of CD8+CD28- cells may select patients who will be at lowest risk of rejection on a belatacept-MPA based regimen.

In this multicenter, open-label study, de novo RTxRs were randomized to receive EVR+rCNI (N=1022) or MPA+sCNI (N=1015) with induction and steroids. Efficacy assessments included incidence of the composite efficacy failure (treated biopsy-proven acute rejection [tBPAR], graft loss, or death), its components, antibody-mediated rejection, and de novo donor-specific antibodies (dnDSA; any mean fluorescence intensity [MFI]). Results: Baseline characteristics were mostly comparable between arms. At M24, the incidence of composite efficacy failure, other efficacy outcomes, and dnDSA was comparable between arms (Table). The overall tBPAR and recurrent tBPAR rates were low in the study and comparable between arms. The incidence of tBPAR was low in both arms in patients who received rabbit anti-thymocyte globulin induction vs those with basiliximab induction. In patients with tBPAR, higher incidences of deceased donors, diabetes, Caucasians, and higher cold ischemia time were reported in the EVR+rCNI arm. Most tBPAR events were mild (grade IA or IB) and occurred in the first 3 months posttransplant in both arms (EVR+rCNI [6.2%] vs MPA+sCNI [5.7%]; Table); severe rejections (grade IIB and III) were more frequent in the EVR+rCNI arm. At tBPAR onset, the mean EVR trough level (C0) was 5.1 ng/mL and the mean TAC C0 values were 5.4 (EVR+rTAC) and 9.5 (MPA+sTAC) ng/mL. Similar TAC and cyclosporine A C0 were reported between patients with and without tBPAR in the EVR+rCNI arm from Week 4 onwards. Conclusions: A de novo EVR+rCNI regimen provides comparable antirejection efficacy to MPA+sCNI, with most tBPAR events of mild severity, onset within the first 3 months, and comparable drug exposure patterns in patients with vs without tBPAR. CITATION INFORMATION: Tedesco Silva H., Citterio F., Henry M., Srinivas T., Watarai Y., Steinberg S., Basic-Jukic N., Garcia V., Mor E., Peddi V., Narvekar

Purpose: The safety of early steroid withdrawal (ESW) in African American (AA) deceased donor renal transplant recipients (DDRTs) is unclear. This study assessed the impact of ESW with alemtuzumab induction on graft outcomes of AA DDRTs as compared to non-AA. Methods: This was a single-center, retrospective cohort study of de novo DDRTs transplanted 1/2011-9/2015. Adults aged 18-65 years received alemtuzumab and a 21-day prednisone withdrawal taper with tacrolimus and mycophenolate maintenance therapy. Exclusion criteria were pre-transplant steroid use, pre-existing donor-specific antibodies (DSA), multi-organ transplant, primary non-function, and conversion from tacrolimus within 3 years. Graft loss, biopsy-proven acute cellular (ACR) and antibody-mediated rejection (AMR), and patient survival were evaluated at 1 and 3 years. Results: Baseline characteristics of the included 155 AA and 100 non-AA DDRTs are summarized in Table 1. Patients were followed for a median duration of 3.7 years. Maintenance immunosuppression was similar throughout the study period, with comparable prednisone reinitiation rates at 1 year (9.2% vs 10.3%, p=0.819) and 3 years (12.7% vs 13.9%, p=0.832). A total of 311 surveillance and for-cause biopsies were performed in the AA group and 180 in the non-AA group. There was no difference in graft loss at 1 year (2.6% vs 3%, p=1.000) or 3 years (10.3% vs 5%, p=0.164) (Figure 1).
While glomerular filtration rate (GFR) was higher throughout the study period in AA DDRTs, the incidence of chronic AMR and Class II DSA development at 3 years was significantly greater (Table 2). Conclusions: These data suggest AA DDRTs undergoing ESW with alemtuzumab induction have similar graft survival at 3 years as non-AA DDRTs. However, the increased incidence of chronic AMR and DSA development raises concern, and longer follow-up is warranted to sufficiently assess graft outcomes. Further evaluation is needed to identify the subpopulation benefiting from steroid continuation.

Purpose: To compare two immunosuppressive protocols, steroid-free with antithymocyte globulin (ATG) induction versus steroid-containing with basiliximab induction, with regard to the incidence of PTDM one year after renal transplantation. The study is an open-label, multicenter, randomized controlled trial in which eligible patients were assigned, prior to renal transplantation, to receive either ATG, low-dose tacrolimus and mycophenolate mofetil (MMF) (arm-A), or basiliximab, low-dose tacrolimus, MMF, and prednisolone (arm-B). Adult patients at low immunological risk, without pre-transplant diabetes mellitus, receiving a single-organ, ABO-compatible kidney transplant were included. The primary objective was to determine the cumulative incidence of PTDM at one year, assessed according to the American Diabetes Association recommendations. The secondary objective was the composite measure of freedom from biopsy-proven active rejection (BPAR), graft survival and patient survival. Results: A total of 222 patients were included in the study and randomized to arm-A (N=113) and arm-B (N=109). The cumulative incidence of PTDM one year after renal transplantation was overall low: 13.3% in study arm-A compared to 18.3% in control arm-B (p=0.4). The rate of BPAR (arm-A 19% vs. arm-B 15%, p=0.5) and graft and patient survival were similar in both groups. A higher proportion of patients in arm-B needed ≥3 antihypertensive drugs (arm-B 33% vs. arm-A 20%, p=0.03) and lipid-lowering medication (arm-B 33% vs. arm-A 20%, p=0.05). Conclusions: Steroid-free immunosuppression with ATG induction and low-dose tacrolimus did not reduce the incidence of PTDM within the first year after renal transplantation, but was associated with more easily managed hypertension and hyperlipidemia, as well as a good safety profile.

Purpose: Eculizumab is a recombinant humanized IgG monoclonal antibody that inhibits C5, thereby preventing the generation of C5a and formation of the membrane attack complex C5b-9. In renal transplant patients, eculizumab has been used to treat or prevent antibody-mediated rejection and disorders associated with abnormal complement activation. One study (Herlitz et al, JASN 2012) showed that eculizumab may deposit in native kidneys in patients with C3 glomerulonephritis (C3GN) or dense deposit disease, with IgG kappa deposition resembling monoclonal immunoglobulin deposition disease (MIDD). Eculizumab deposition in renal transplants has not been systematically studied. Methods: We identified 50 renal transplant patients who received eculizumab for at least 4 weeks from June 2008 to June 2018. Eculizumab was used in these patients to prevent or treat acute antibody-mediated rejection or C3GN. To be included in this study, the patient must have had at least one renal biopsy during the eculizumab treatment or within a month of the last dose. Biopsies were evaluated by light microscopy, immunofluorescence (IF) and electron microscopy (EM).
Results: 50 kidney transplant recipients (19 M, 31 F) met the inclusion criteria and had 391 allograft biopsies in total (including 49 implantation biopsies). 44 kidneys were from living donors and 6 from deceased donors. The mean follow-up time was 35 months (range 2-108). Among the 50 patients, we identified 5 (10%) with post-transplant glomerular IgG deposition. Only one patient (2%) showed glomerular changes resembling MIDD with IgG2, IgG4, and kappa staining, consistent with eculizumab deposition; both subsequent biopsies over 4 months from this allograft showed similar changes. The other biopsies with glomerular IgG deposits had diagnoses of segmental membranous glomerulonephritis (n=1), recurrent lupus nephritis (n=1), recurrent proliferative GN with monoclonal IgG3 lambda deposits (n=1), and de novo mesangial proliferative immune complex GN, not otherwise specified (n=1). Interestingly, the single patient with eculizumab deposits had recurrent C3GN, whereas none of the other patients in this series had C3GN or DDD as the native disease. Conclusions: Only rare kidney transplants (<1% of biopsies) from eculizumab-treated patients show eculizumab deposits, and such deposits were found only in a patient with C3GN. Glomerular eculizumab deposition may be associated with abnormalities in the alternative complement pathway. Other glomerular diagnoses with IgG deposits are more common.

We previously showed that patients who were randomized to receive C1INH (a serine protease inhibitor that inhibits MBL/MASP) had earlier recovery from delayed graft function than placebo. Here we report a post-hoc analysis of long-term outcomes in the subgroup of HS patients enrolled in the prior randomized placebo-controlled study. Methods: From November 2014 to February 2017, 70 patients at risk for IRI/DGF were enrolled and randomized 1:1 to receive C1INH 50 U/kg (n=35) vs. placebo (n=35) intra-operatively and 24 hours later. Of the 70 patients, 17 were HLA sensitized. All transplanted HS patients received antibody induction with alemtuzumab and were maintained on standard immunosuppression with tacrolimus, mycophenolate mofetil, and tapering prednisone. Rejection episodes, patient and graft survival, and eGFR up to 48M were compared between recipients of C1INH and placebo. Results: Ten patients received C1INH and 7 received placebo. Briefly, cPRA 99-100% and previous transplants were similar between the 2 groups. All patients were off dialysis by day 15 and no significant differences were seen in time to dialysis cessation between C1INH and placebo. No ABMR was seen in C1INH-treated patients vs. 2 patients (29%) in placebo. CMR was seen in 2 patients (20%) in C1INH vs. 1 patient (14%) in placebo. There were no graft losses among C1INH patients vs. 2 graft losses in placebo (p=0.07) (Fig 1). One death occurred in the C1INH group at 41 months post-transplant due to sepsis. A linear mixed effects model showed progressive deterioration in eGFR up to 48M in the placebo group (slope eGFR: -10.0 ml/min/1.73 m2, 95% CI: -18.5 to -1.48), whereas eGFR was stable among C1INH-treated recipients (slope eGFR: -0.08 ml/min/1.73 m2 per year; 95% CI: -6.97 to 6.81) (Fig 2). Conclusions: C1INH therapy was associated with sustained graft survival with no antibody-mediated rejection episodes. There was progressive deterioration in eGFR among placebo-treated patients, suggesting that the addition of C1INH may lead to long-term preservation of graft function in HS patients receiving donor allografts at risk for IRI/DGF.
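The eGFR trajectories above were compared with a linear mixed effects model. As a purely illustrative sketch of how such a random-slope model could be specified (this is not the authors' code; the dataset, column names, and the choice of Python/statsmodels are assumptions made only for illustration):

```python
# Hedged illustration: a random-intercept, random-slope mixed model of eGFR over
# time by treatment group, the general approach named in the abstract above.
# All names (egfr_longitudinal.csv, patient_id, months, egfr, group) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("egfr_longitudinal.csv")  # long format: one row per patient-visit

model = smf.mixedlm(
    "egfr ~ months * group",       # fixed effects: time, group, group-specific slopes
    data=df,
    groups=df["patient_id"],       # random effects grouped by patient
    re_formula="~months",          # random intercept and random slope for time
)
result = model.fit(reml=True)
print(result.summary())

# The 'months' coefficient estimates the eGFR slope in the reference group; the
# 'months:group' interaction estimates how much the slope differs in the other
# group, analogous to the per-group slopes (with 95% CIs) quoted in the abstract.
```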
CITATION INFORMATION: Vo A., Huang E., Ammerman N., Choi J., Kim I., Peng A., Najjar R., Sethi S.

We found enhanced spontaneous NETosis, regardless of whether patients had rejection, compared to the donors. Therefore, we decided to extend the study focusing on patients with acute antibody-mediated rejection (AAMR). The purpose of this exploratory study was to evaluate in vivo NETosis in plasma and tissue samples of adult KTR with AAMR (n=14) and compare them with KTR without any immunologic event (n=14). We used healthy donors (n=12) as the reference group. In vivo NETosis was evaluated by a sandwich ELISA to detect plasmatic citrullinated histone 3-DNA complexes. The amount of NETs was quantified with the Optical Density Index (ODI). In renal biopsies, the expression of myeloperoxidase (MPO) and citrullinated histone 4 (cH4) was evaluated by confocal microscopy. Results: KTR with AAMR had the highest amount of NETs (Figure 1). Only in 2 AAMR biopsies did we find netting neutrophils with decondensed chromatin colocalizing with cH4 in the glomeruli (Figure 2). Also, in the peritubular capillaries we detected polymorphonuclear cells with cytoplasmic MPO (Figure 3). Conclusions: KTR, particularly those with AAMR, show enhanced in vivo NETosis. Although more studies are needed, the higher amount of plasmatic NETs and the detection of netting neutrophils in tissue samples from AAMR suggest a role of NETosis in the pathogenesis of this type of kidney rejection and a state of permanent activation of neutrophils.

single dose of rituximab is enough to deplete CD19+ cells. Aim: To analyze CD19+ cell behavior during the 12 months following a 500 mg dose of rituximab and its correlation with clinical and histological outcomes. Methods: This is a prospective cohort study of 122 kidney transplant recipients with biopsy-proven ABMR who received a single dose of rituximab 500 mg as part of standard ABMR treatment between 2012 and 2018. Peripheral CD19+ cells were measured at baseline and at 15, 30, 90, 180, 270, and 360 days after rituximab infusion, and these data were correlated with clinical and histological outcomes. Results: 122 patients were included. Mean age was 27 y, 56.6% were female, in 93.4% of patients this was their first transplant, 71% were from living donors, and the median time to rejection was 6 y post-transplant. Treatment included plasmapheresis and IVIG in 75% of patients; 21% also received bortezomib. All patients received 500 mg of rituximab, except for 7.3% of patients who received 375 mg/m2 BSA. Patients were followed for a median of 21 months (0.1-83). Median allograft survival after ABMR was 5.5 years. CD19+ cell depletion patterns are shown in Figure 1. CD19+ cell depletion (<10 cells) at 1 month was associated with less IFTA at follow-up biopsy and improvement of proteinuria at last follow-up. Early CD19+ cell repopulation was associated with higher graft loss and death. Persistent CD19+ cell depletion (<10 cells) at 12 months was associated with better graft and patient survival. Conclusions: A single dose of rituximab achieved CD19+ cell depletion in more than 80% of patients, which lasted at least 6 months. CD19+ cell depletion was associated with improvement of proteinuria and less IFTA, and the persistence of CD19+ cell depletion at 12 months was associated with improved graft and patient survival.

KU Leuven, Leuven, Belgium, 2University Hospitals Leuven, Leuven, Belgium, 3

Purpose: The association between the kinetics of pretransplant DSA levels and outcome after kidney transplantation is currently not well known.
In this large cohort study (N=924 with 4260 post-transplant biopsies), we investigated the evolution and clinical significance of pretransplant donor specific antibodies (preDSA), positive with the single antigen beads assay but negative in complement-dependent cytotoxicity crossmatch. The donor specificity of the preDSA (N=107 patients with 500 biopsies) was determined by retrospective high-resolution genotyping of all donor-recipient pairs and evaluating all HLA A/B/C/DRB1/DRB345/DQA1/DQB1/DPA1 and DPB1 loci. Results: We found that in 52% of the patients with preDSA, the DSA spontaneously resolved within the first 3 months after transplantation, without receiving specific therapy for removal of the preDSA. PreDSA that persisted after transplantation had higher pretransplant MFI values (6143±4565 vs. 2874±2391, p<.0001) and more specificity against HLA class II (78.5%), especially against locus DQ (49%). Although patients with resolved and persistent preDSA both had a high incidence (53.6% and 58.8%, respectively) of histological picture of antibody-mediated rejection (ABMR h ) with similar histological appearance, the patients with preDSA that persisted after transplantation had worse 10-year graft survival compared to resolved preDSA and DSA-negative patients (43.9% vs. 81.2% vs. 87.4%, p<.0001) . Compared to cases without DSA, Cox modeling revealed an increased risk of graft failure in the patients with persistent preDSA, in the presence (HR=8.3, p<.0001) but also in the absence (HR=4.3, p=0.001) of ABMR h . In contrast, no increased risk of graft failure was seen in patients with resolved preDSA, again independent of the presence or absence of ABMR h (HR=1.5, p=0.47 and HR=1.4, p=0.62, respectively) . We conclude that persistence of preDSA after transplantation has a negative impact on graft survival, beyond the diagnosis of ABMR h according to the current Banff classification. Even in the absence of antibody-targeting therapy, low-MFI DSA, and non-DQ DSA often disappear early after transplantation and are not deleterious for graft outcome, despite the association with transient histological abnormalities. Purpose: Post-transplant diabetes mellitus (PTDM) is a highly prevalent condition following renal transplantation, associated with increased risk of cardiovascular disease and impaired patient survival. Sodium-glucose cotransporter-2 (SGLT2) inhibitors have recently been shown not only to improve glycemia in patients with type 2 diabetes mellitus (T2DM), but also to protect against cardiovascular events in patients at high cardiovascular risk and to delay progressive impairment in glomerular filtration rate (GFR) . The aim of this study was to investigate the safety and efficacy of using empagliflozin in renal transplant recipients with PTDM. Methods: From November 2016 to January 2018, 49 renal transplant recipients were included in an investigator-initiated, single-center, prospective, double blind study, and randomized to receive either 10 mg empagliflozin or placebo once daily for 24 weeks. Renal transplant recipients transplanted >1 year ago, diagnosed with PTDM, stable renal function with eGFR >30 mL/min/1.73m 2 and stable immunosuppressive therapy (tacrolimus/cyclosporine/everolimus in combination with mycophenolate and prednisolone) were eligible for inclusion. All variables were analyzed as median of differences in change from baseline to week 24 between the groups. Results: In total 44 renal transplant recipients (34 male, 10 female) completed the study. 
Median (IQR) changes in HbA1c and fasting plasma glucose were significantly reduced after 24 weeks of empagliflozin treatment compared to placebo. There were no significant differences between the groups in adverse events, trough levels of immunosuppressive drugs or eGFR after 24 weeks of treatment. Conclusions: Empagliflozin improves glycemic control compared to placebo after 24 weeks of treatment, with a concomitant reduction of body weight, in stable renal transplant recipients. The treatment was well tolerated with no serious adverse events. Thus, empagliflozin seems to represent a novel treatment option for renal transplant recipients with PTDM.

Purpose: Low physical activity and reduced physical functioning are common after renal transplantation, resulting in a reduced quality of life. The aim of this study was to evaluate the benefit of exercise rehabilitation, with or without diet counseling, on physical functioning, quality of life and post-transplantation weight gain in renal transplant recipients (RTR). The Active Care after Transplantation study was a multicenter randomized controlled trial. RTR from 3 Dutch university hospitals were randomized to usual care or to an exercise intervention (3 months of supervised exercise 2 times/week followed by 12 months of active lifestyle coaching), with or without additional dietary counseling (12 sessions over 15 months). General linear mixed models were used for intention-to-treat analyses. The exercise component was investigated with adjustment for diet (comparing exercise and exercise+diet versus controls) and the diet component with adjustment for exercise (exercise+diet versus exercise and control). Results: In total, 221 patients were included (age 51.8 years, 62% male, on average 5.0 months after transplantation, 26.3% pre-emptive). Quality of life in relation to physical functioning improved after the start of the intervention by 5, 5.8 and 2.7 units after 3, 6 and 15 months respectively in the control group, but improved considerably more after the exercise program, with on average 11.5, 9.7 and 8.3 units after 3, 6 and 15 months respectively (P for time×intervention = 0.022). Muscle strength improved more after the exercise program (T3 = +21%, T6 = +20%, T15 = +20%, P for time×intervention = 0.028), as did exercise capacity (T3 = +7.4 W, T15 = +11.9 W, P for time×intervention < 0.001), when compared to the control group. No significant changes were found in body composition, glucose homeostasis, plasma lipid profile and blood pressure (medication not yet taken into account). The dietary intervention was heterogeneous in its aims and did not seem to have a clear group effect on weight gain or cardiometabolic risk factors. Conclusions: Exercise rehabilitation followed by lifestyle counseling significantly improved quality of life, strength and exercise capacity in renal transplant recipients in the short term. Importantly, this was sustained over the long term.
During the study period, 195 patients with ESRD and 45 patients with CKD (stages 1 to 4) met NIH guidelines for metabolic surgery and underwent SG by a single surgeon. The average follow-up period was 2.3±1.5 years.

Methods: We performed a retrospective chart review of SOT recipients (aged >18 years) with diabetes. We identified 63 and 25 recipients on dulaglutide and liraglutide respectively and collected data at 6, 12 and 24 months. The primary endpoints were reduction in weight, BMI, and insulin requirement. Safety endpoints included hypoglycemia, GI side-effects, and cancers. Secondary endpoints were HbA1c and renal and liver function. Results: The percentage decrease in weight was 2%, 4% and 5.2% with dulaglutide (baseline mean weight 98.7 kg) and 0.09%, 0.87% and 0.89% with liraglutide (baseline mean weight 112.6 kg) at 6, 12 and 24 months respectively. BMI followed a similar trend, with percentage reductions of 2.4%, 6% and 8% with dulaglutide (baseline mean BMI 32.8 kg/m2) and minimal decreases of 0.24%, 1.4% and 0.54% with liraglutide (baseline mean BMI 36.8 kg/m2) at 6, 12 and 24 months respectively. The percentage reduction in insulin requirement was 26% with dulaglutide compared to 3.6% with liraglutide at the end of follow-up. The baseline insulin requirement was 23 units in the dulaglutide group compared to 50 units in the liraglutide group. Mean baseline HbA1c was 7.5 with both GLP-1 analogs. HbA1c trended downward throughout follow-up with dulaglutide, with percentage decreases of 10%, 5.3% and 8.4% at 6, 12 and 24 months respectively. In the liraglutide group, there was an initial decrease followed by an increase in HbA1c (decreases of 5.3% and 3% at 6 and 12 months, followed by an increase of 2% at 24 months). There was a 10% reduction in creatinine and a 15% increase in eGFR at the end of 24 months with dulaglutide (baseline creatinine 1.73 and eGFR 47). In contrast, creatinine increased by 7% at the end of 24 months with liraglutide, reflected in a decrease in eGFR of 8% at the end of the study period (baseline creatinine 1.85 and eGFR 42.48). There was no increased incidence of pancreatitis, transaminitis or cancer. Immunosuppressive agents remained unchanged with both GLP-1 analogs. Rates of hypoglycemia and other GI manifestations were low in both groups, and none required discontinuation of medications. There were one graft failure, one anginal episode, and two deaths in each of the dulaglutide and liraglutide groups during follow-up. Conclusions: Both agents exhibited a favorable side-effect profile without interference with immunosuppressants. There was a sustained reduction in weight, BMI, insulin requirement and HbA1c with dulaglutide when compared to liraglutide. Future prospective randomized trials are warranted.

Table 2). Moreover, compared with awaiting non-preemptive ABOc LDKT, preemptive ABOi LDKT was associated with a 48% decrease in mortality risk (aHR 0.52; 95% CI: 0.36-0.73). Conclusions: Among preemptive waitlist candidates, there was a survival benefit associated with desensitization and subsequent preemptive ABOi LDKT compared with awaiting ABOc DDKT or non-preemptive ABOc LDKT. Although identification of compatible transplants through kidney paired donation programs is ideal, for preemptive candidates who face long wait times and the need to initiate dialysis, pursuit of desensitization and ABOi LDKT should be considered. TGF.
The predictive performance of the LKDPI in the Canadian cohort of KTR was evaluated: discrimination was assessed using Harrell's C statistic and calibration was assessed graphically, comparing observed vs. predicted probabilities of TGF. Results: There were 645 KTR included in the cohort. The distribution of the LKDPI of the study cohort is shown in Figure 1. The median LKDPI was 13 (IQR 1.1; 29.9). There were 43 graft failures and 52 deaths with graft function over a median follow-up time of 5 years (IQR 2.7; 7.8). Higher LKDPI scores were associated with an increased risk of TGF (hazard ratio per 1-unit increase in LKDPI 1.01; 95% CI 1.0; 1.02; P value 0.02). Model discrimination was modest, with a C statistic of 0.55 (95% CI 0.48; 0.61). Calibration at 1 year and 5 years post-transplant is shown in Figure 2. Conclusions: In this Canadian cohort of KTR, there is an association between the LKDPI and the risk of TGF. However, in this external validation study, the LKDPI showed limited predictive performance, with modest discrimination and poor calibration, particularly after 1 year post-transplant. Validation of prediction models is an important step prior to clinical use in external populations. Potential re-calibration of the LKDPI or derivation of a new prediction model may be required prior to integration into the Canadian context.

A. Uffing1, T. TANGO-consortium2, P. Cravedi3, L. V. Riella1, 1Renal Division, Brigham & Women's Hospital, Harvard Medical School, Boston, MA, 2NA, Boston, MA, 3Department of Medicine, Icahn School of Medicine at Mount Sinai, New York, NY

Purpose: Primary focal segmental glomerulosclerosis (FSGS) is a glomerular disease that frequently recurs after kidney transplantation. Several circulating factors have been proposed to cause recurrence of FSGS in the allograft, but the pathophysiology remains largely unclear and treatment options are limited. Clinical predictors have not been validated, since cohorts are small due to the low incidence and large registries suffer from misclassification of disease and missing data. Methods: We established The Post-TrANsplant GlOmerular Disease (TANGO) study to comprehensively analyze the recurrence of glomerular diseases post-transplant in a large, international cohort of kidney transplant recipients. All consecutive subjects with biopsy-proven, primary FSGS who received a kidney transplant between 2005 and 2015 were retrospectively identified in 15 TANGO-study centers in Europe (n=5), the United States (n=6) and Brazil (n=4). Detailed serial patient information was collected at the time of transplant and every year after transplant. Serum samples from FSGS patients are currently being collected. Results: 552 kidney transplant recipients have been recruited to the study so far, of which 106 patients had a diagnosis of primary FSGS after stringent exclusion of all patients with possible secondary causes of FSGS. The incidence of recurrence was 30% (95% CI: 0.21-0.39). Contrary to previous studies, Cox regression analyses showed an increased risk of recurrence with older age at primary diagnosis (aHR 1.04 per year, p=0.005, 95% CI 1.01-1.08). Recipients of grafts from living donors had an aHR of FSGS recurrence of 2.56 compared to deceased donor recipients (p=0.026, 95% CI: 1.12-5.88). Pre-transplant plasmapheresis had no effect on the recurrence rate. However, in patients with a recurrence, pre-transplant plasmapheresis increased the chance of achieving complete remission with post-transplant treatment (aOR: 3.99, p=0.014, 95% CI: 0.81-7.17).
Within patients with recurrent FSGS, only 7 (15%) and 16 (33%) patients achieved complete or partial remission after treatment, respectively. There was no difference in the remission rates across different treatment strategies, including high dose steroids, plasmapheresis/IVIG and/or rituximab. 16 (33%) of patients with FSGS recurrence experienced graft failure within 23 months of recurrence. Conclusions: Our international multicenter cohort of kidney transplant recipients with primary FSGS allowed us to identify risk factors for recurrence and predictors of response to therapy. Complete remission was only achieved in 15% of patients independent of applied treatment strategies. Further studies to refine our understanding of the natural history of this severe condition and to explore the underlying mechanism with biobanked samples will be crucial to tackling this clinical challenge. Purpose: African Americans(AA) account for 13% of the US population, yet compose more than one third of dialysis patients. Inheriting two Apolipoprotein L1 gene (APOL1) renal risk alleles(G1/G1,G1/G2 or G2/G2) is associated with an increased risk of nondiabetic kidney disease in patients with African ancestry. Compared to non-AAs where APOL1 renal risk variants are absent, about 13% of AAs carry the high-risk genotype(2 risk alleles) and 39% carry 1 risk allele. In this study, we investigate the utility of APOL1 genotyping in potential AA living kidney donors. Methods: Since May 2017, all potential AA living kidney donors evaluated at our center underwent APOL1 genotyping. Individuals with low-risk genotypes(0 or 1 risk allele) were eligible for further donor evaluation. Baseline characteristics and follow-up data were retrospectively analyzed using descriptive statistics. Results: Twenty potential donors were genotyped. One(5%) had 2 risk alleles and was therefore declined. Eleven(55%) and eight(40%) had 1 and 0 risk alleles respectively. Baseline characteristics are presented in Table1. Six individuals with low-risk genotypes donated a kidney and had acceptable kidney function, no proteinuria and normal blood pressure at up to 6 months of follow-up(Table2). Conclusions: In our cohort, the frequency of one risk allele was higher than that reported in the general AA population. We attribute this to our sample that is enriched with AAs wanting to donate to first degree relatives with advanced kidney disease. This further supports the utility of APOL1 genotyping in such individuals. Shortterm outcomes for low-risk genotype donors were favorable which is consistent with the existing literature. We propose a role for systematically incorporating APOL1 genotyping in the evaluation and informed consent process of potential AA living donors particularly those of younger age. Figure 1 presents data for over 260 US kidney transplant programs for NL-LDCR and EL-LDCR. Marked variability in NL-LDCR was observed amongst programs varying from 0% to almost 60%. EL-LDCR demonstrated a 1-2 log lower rate of LDCR (between 0 -20%), and with a lesser degree of heterogeneity amongst transplant programs. Finally, regression analysis examining the relationship between NL-LDCR and EL-LDCR for individual programs demonstrated a relatively low correlation rate (r=0.34) that was significant however,, indicating heterogeneity in processes for achieving LD transplantation involve differing processes. 
Conclusions: Kidney transplant program efficiency at achieving LD transplantation can now be measured by two approaches, NL-LDCR and EL-LDCR, which show significant heterogeneity amongst US programs, indicating substantial room for growth in many programs. Importantly, achieving LD transplantation in a newly referred patient is a different process than achieving LD transplantation in a chronically waitlisted patient, and these data indicate that processes should be developed and widely adopted to help address the low LDCR in this population. Transplants were analyzed by era (prior to 1990, 1990-1999, 2000-2009, and 2010-2018), by high- vs. low-volume pediatric kidney transplant center (high volume defined as a program that has performed >100 kidney transplants in its history), and by UNOS region. Donor and recipient demographic data, survival, and outcomes were examined. A p-value of <0.05 was considered significant. Results: 9,900 pediatric kidney transplants were performed over this time period, of which 131 (1.3%) were from AKI donors. Based upon era, there was no difference in utilization rates. There was a significant difference in the use of these kidneys across regions, where Region 9 was the most likely to transplant an AKI kidney (3.3%) vs. Region 1 being the lowest utilizer (0.29%; p < 0.05). There were no significant differences in the share characteristics of the organs (local vs. regional vs. national; 80.1% vs. 10.7% vs. 9.1%; p = NS). There were also no significant differences in HLA mismatch between the AKI and non-AKI groups. Purpose: Although one purpose of the kidney allocation system (KAS) enacted in 2014 was to prioritize the transplantation of highly sensitized recipients, we explored the effect of this policy on pediatric graft quality, wait time, and early graft outcomes. Methods: A retrospective analysis of deceased-donor transplants from January 2008-January 2018 was conducted using the UNOS database. The mean donor KDPI was compared between highly sensitized, 0-mismatch (0-mm), and pediatric recipients pre- and post-KAS. Student's t-test and the chi-square test were used to compare continuous and categorical variables, respectively. Differences in wait times pre- and post-KAS, as well as annually post-KAS, were calculated. Delayed graft function and 1-year rejection rates were compared. STATA12 was used for all analyses. Results: The mean KDPI significantly improved among highly sensitized (39% to 37%, p<0.05) and pediatric allograft recipients (20% to 16%, p<0.05), yet worsened for 0-mm recipients (38% to 44%, p<0.05). While wait time improved for highly sensitized patients (1210 to 1140 days, p<0.05), it worsened for pediatric recipients (350 to 398 days, p<0.05). There were no observed differences in rates of DGF or early allograft rejection. In the 3 years since implementation of the KAS, pediatric waiting times have remained prolonged. Conclusions: Pediatric and 0-mm recipients are not receiving lower quality allografts, likely reflecting the close scrutiny transplant practitioners give these grafts prior to acceptance. Wait times have increased over the two eras. Despite the expected bolus effect from transplantation of highly sensitized patients, pediatric wait times have remained prolonged each year post-KAS. No differences have been observed in early graft outcomes such as DGF and 1-year rejection rate. Median changes (Z-score) in height (0.28 vs 0.23) and weight (0.60 vs 0.64) from randomization to M36 were comparable between the EVR+rTAC and MMF+sTAC arms.
Overall, the incidence of adverse events (AEs) and serious AEs was comparable between study arms, with higher drug discontinuation due to AEs in the EVR+rTAC arm vs the MMF+sTAC arm. Conclusions: Despite a higher drug discontinuation rate in the EVR+rTAC arm, the outcomes of this trial in pKTxRs showed that conversion to an EVR+rTAC regimen at 4-6 weeks post-Tx with early steroid withdrawal offers comparable efficacy, safety, and renal function vs MMF+sTAC at M36. The graft survival rates were similar between the two donor age groups, between en-bloc and single kidney transplants, and among donor types (DBD vs. DCD), both overall and in the single kidney groups. Twenty-eight potential de novo antibodies (6 Class I and 22 Class II) were detected at 3- to 4-week post-vaccination follow-up, originating from 3 patients (33.3%). Most (78.6%) potential emergent Ab were Class II (p<0.0001). After supporting antibody testing, only 2 of the initial 28 potential de novo Ab were determined to be HLA specific; the remaining 26 were deemed non-specific. Median MFI increase in nonspecific and de novo anti-HLA Ab is reported in Table 1. 26/28 (92.9%) nonspecific Ab achieved >25% MFI increase at 3- to 4-week follow-up. 8/9 patients included in the study proceeded to transplant. Conclusions: Vaccination resulted in a significant increase in median MFI from baseline to 3- to 4-week follow-up, although a majority of observed changes were deemed non-specific responses, possibly to denatured antigens on single antigen beads, as only two confirmed de novo anti-HLA antibodies were detected. These data suggest that there is limited clinical impact of vaccinations on the emergence of de novo anti-HLA Ab, although there appears to be an association with a nonspecific antibody response. Identification of nonspecific Ab responses detected by single antigen testing aids the definition of true de novo anti-HLA Ab, increasing the accuracy of donor selection. Purpose: End-stage liver disease is becoming more prevalent in the United States, with approximately 4.9 million adults carrying the diagnosis. It is estimated that 40,000 deaths will occur each year due to liver disease. Liver transplantation offers the only solution for this illness, and the mortality rate on the liver transplant waiting list is approximately 5-10%. With a growing demand for liver transplantation, expanded criteria donors are increasingly utilized. Although it varies by center, expanded criteria donors typically include donation after circulatory death and those over the age of 50-60 years. These donors represent an under-utilized resource in liver transplantation. Our center reviewed 10 patients who received livers from expanded criteria donors, specifically DCD (donation after cardiac death) donors over the age of 50 years. The mortality rate was zero at 1 year post-transplant. These donors are an important resource for patients on the waiting list, who may otherwise expire waiting for a liver, and we will show that their use can be as successful as standard criteria donors. Methods: We analyzed the last 10 recipients of livers from DCD donors over the age of 50. All donors received HTK as the flush solution, and the WIT (warm ischemia time) was on average below 30 minutes. We did not maintain a hard 30-minute cutoff in WIT, instead choosing to start the clock when the systolic blood pressure dropped below 60 mmHg and/or SpO2 decreased below 80%. Keeping with those parameters, all donor livers had a WIT below 30 minutes. In addition, we recently began using N-acetylcysteine when the recipient was anhepatic, in an attempt to reduce reperfusion injury.
The results of the last 10 recipients receiving a DCD donor over the age of 50 show 100% patient and graft survival at 1 year, and excellent graft function at discharge. The average length of stay was 12.9 days with the majority of the patients leaving the hospital in less than 7 days. In addition, there were no livers with primary non-function or with ischemic cholangiopathy. We believe that DCD donors over the age of 50 are a severely underutilized resource for liver transplant recipients. Certain factors may be considered for our success with these donors. Using HTK to flush the donor organs provides rapid cooling of the liver which would take considerably longer with Wisconsin solution. In addition, the rapidity with which the donor liver is excised and the portal vein is flushed and the liver further cooled contributes to the excellent graft function. Purpose: Identification of hard-to-place deceased donor livers is important due to its potential role in expedited placement. Determining hard-to-place must be based on data available at allocation (generation of the match run or time of offer). We investigated the utility of two sources of donor information: DonorNet and the Deceased Donor Registration (DDR). DonorNet has comprehensive information but may be impractical due to optional data fields. DDR information is required and more complete, but may be unavailable at match run. An Expedited Placement work group of transplant professionals provided guidance on data likely to be known by organ procurement organizations at the time of match run or offer. For each data source, a predictive model estimated the probability of a local or regional liver transplant from donors with any recovered organ. LASSO was used to improve predictions and create parsimonious models. Models were estimated with donors recovered January 1, 2016-December 31, 2016. The predictive performance of simple decision rules based on each model was investigated and compared to a decision rule that classified hard-to-place livers as only from donation after circulatory death (DCD) donors. Predictive performance was evaluated with donors recovered January 1, 2017-December 31, 2017. Results: Rules based on the DonorNet model had similar, although marginally better, correct classification rates than rules based on the DDR model (Table 1) . Rules based on the predictive models performed better than the DCD-only rule by better classifying donors without a locally or regionally transplanted liver as hardto-place, i.e., better specificity. Conclusions: Despite the similar or marginally better correct classification rate, the DonorNet model had 41 non-zero covariate effects, including 10 non-zero effects for missingness of certain covariates. The DDR model had only 20 non-zero covariate effects. Thus, a decision rule based on DDR information may provide a parsimonious and practical approach for the a priori identification of hard-to-place livers. Table 1 . Predictive performance of decision rules based on predicted probability of local or regional placement from the DonorNet and DDR models, and on donor DCD status. its characteristics, demonstrating how such a system could be implemented. Characteristics of the feasibility function will be decided through the OPTN policymaking process. Methods: In a continuous distribution system, candidates are offered organs after being sorted by a score. The score is the sum of two components: a medical priority score and a geographic feasibility score. 
The medical priority score is based on donor and candidate clinical characteristics. The MELD score estimates candidates' disease severity, and donors and candidates are matched by blood type. The geographic feasibility score represents medical and financial costs of shipping organs long distances. A simple score function ( Figure 1 ) gives preference to programs within driving distance of the donor hospital up to a threshold (proximal zone, 150 nm) with decreasing boost as distance increases thereafter. The maximum boost distance was 6000 nm for all comparison scenarios. Status 1 candidates were assigned a medical priority score of 50 MELD units. Results: Two parameters varied, representing the relative importance of geography: max boost and mid boost. The baseline for comparison was a simulation with no geographic feasibility score; i.e., national distribution sorted only on MELD score. Conclusions: Under the baseline (MELD only) sorting system, MELD score at transplant was similar across the country, with increasing variability as geographic feasibility scores increased. Changes toward more uniform MELD scores at transplant occurred due to more travel for the baseline and less travel for the scenarios with increasing geographic feasibility scores ( Figure 2 ). This protocol was adopted nationally for priority listing for LT though the experience with down-staging in other UNOS regions is largely unknown. In this multi-region prospective study, we aimed to examine down-staging success rates as well as intention to treat outcomes related to down-staging. Methods: Consecutive patients from 6 LT centers in 3 UNOS regions (2, 5, 9) with HCC meeting down-staging (UNOS-DS) eligibility criteria (1 lesion >5 cm and ≤ 8 cm, 2-3 lesions at least one >3 cm but ≤ 5 cm and total tumor diameter (TTD) ≤ 8 cm, or 4-5 tumors ≤ 3 cm with TTD ≤ 8 cm) were enrolled from 2015-2018 and prospectively followed. Results: Among 210 patients with tumor burden meeting UNOS-DS criteria, 79 (38%) were not considered for LT. The common reasons for exclusion were medical (40%) or psychosocial (33%) contraindications to LT, AFP >1000 ng/ml with decompensated liver disease (15%), and bilirubin >4 mg/dL (6%). The remaining 131 patients comprised the study cohort (median age 63, 85% male, 61% Caucasian, 61% HCV). Pre-treatment median MELD score was 9 (IQR 7-11), Child-Pugh score was 6 (5-6), TTD was 6.4 cm (5.6-7.2), and AFP was 16 ng/ml (5-88). 25% underwent a single down-staging treatment and 33% received >3 treatments. Transarterial chemoembolization (TACE) was used in 83%, Y-90 radio-embolization in 32%, and ablation in 18%. Cumulative probability of successful down-staging to within Milan criteria at 1 and 2 years from first down-staging procedure were 82% and 90%, respectively. Dropout occurred in 40 patients (30%), mostly from tumor progression (67%). Cumulative probability of dropout by competing risks (CR) at 1 and 2 years from first down-staging procedure were 25% and 36%, respectively. Multifocal disease 6 (67%) Median cumulative diameter was 8.5cm with median largest lesion 6.5cm. 6/9 (67%) had multifocal disease. Tumors were well differentiated in 1/9, moderately differentiated in 4/9, and poorly differentiated in 4/9. One pt died 14mo post-LT due to recurrence, with overall survival of 100%, 83%, and 83% at 1-,3-&5yrs. To date, 3/9 (33%) pts developed recurrence a median of 7.1mo after LT, with recurrence free survival of 83%, 50%, and 50% at . 
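To make the continuous-distribution scoring described above concrete, the sketch below combines a MELD-based medical priority score with a distance-based geographic feasibility boost: full boost within the 150 nm proximal zone, decreasing with distance, and no boost beyond the 6000 nm maximum; Status 1 candidates receive a priority of 50 MELD units. The linear taper between the proximal zone and the maximum distance, and all function and variable names, are illustrative assumptions rather than the simulated policy's exact form.

# Minimal sketch of a composite allocation score; the linear decay is an assumption.
PROXIMAL_NM = 150      # full geographic boost within roughly driving distance
MAX_BOOST_NM = 6000    # no geographic boost beyond this distance
STATUS1_PRIORITY = 50  # Status 1 candidates assigned 50 MELD units

def medical_priority(meld, status1=False):
    return STATUS1_PRIORITY if status1 else meld

def geographic_boost(distance_nm, max_boost):
    """Full boost inside the proximal zone, tapering (assumed linear here) to
    zero at the maximum boost distance."""
    if distance_nm <= PROXIMAL_NM:
        return float(max_boost)
    if distance_nm >= MAX_BOOST_NM:
        return 0.0
    frac = (MAX_BOOST_NM - distance_nm) / (MAX_BOOST_NM - PROXIMAL_NM)
    return max_boost * frac

def allocation_score(meld, distance_nm, max_boost, status1=False):
    # Candidates on a match run would be sorted by this composite score.
    return medical_priority(meld, status1) + geographic_boost(distance_nm, max_boost)

# Example: a nearby candidate vs. a sicker candidate far from the donor hospital.
print(allocation_score(meld=32, distance_nm=100, max_boost=10))   # 42.0
print(allocation_score(meld=38, distance_nm=4000, max_boost=10))  # ~41.4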
There were no discernable associations of recurrence with grade, stage, perineural, or lymphovascular invasion. recipients. The aim of our study was to determine whether patients weaned off immunosuppression have favorable changes in their molecular profiles in regard to malignancy risk. Methods: Liver transplant (LT) recipients having undergone direct conversion from tacrolimus to sirolimus monotherapy, followed by successful weaned off of sirolimus at a later date (tolerance), were included. Gene expression profile on the whole blood was performed with Affymetrix HT HG-U133 Plus PM platform at the three time points. Key dysregulated genes, their overlap with hepatocellular carcinoma (HCC), colon cancer and Hodgkin Lymphoma (HL) datasets from Gene Expression Omnibus (GEO) and pathway enrichment analysis were obtained. Results: Twenty non-immune, non-viremic patients (age 57.2±8; 3.5±2.1 years post-LT)were included and eight of them achieved tolerance. Two independent analyses were performed 1) sirolimus conversion vs tacrolimus monotherapy, 2) tolerance time point vs sirolimus. Genes identified as dysregulated in each analysis were identified as affected in HCC, colon and HL but frequently with an opposite modulation, in particular compared to HCC ( Figure 1A , C, D). These genes were involved in cell cycle (FDR p-value 2.28 e-12), DNA replication (FDR p-value 3.02e-10) and mitotic phase transition (FDR p value 1.85e-25). Figure 1 : Overlap of the dysregulated genes affected at the different time points with HCC, colon cancer and HL and their protein-protein interactions obtained by STRING. Genes affected by conversion from tacrolimus to sirolimus (A,B), and sirolimus to tolerance (C,D) were identified. Conclusions: This unique longitudinal study of patients progressing from tacrolimus to sirolimus, and finally to tolerance reveals a potential added benefit to achieving tolerance, i.e. decreased cancer susceptibility on the basis of favorable changes in their molecular profiles. Future validation will include assessment as to whether these gene expression patterns predict reduced malignancy risk in patients who achieve tolerance. Purpose: Spindle and kinetochore-related complex subunit 3 (SKA3) is a part of the spindle and kinetochore-related complexes, which are crucial for the proper timing of late mitosis. Up-regulation of SKA3 suggests poor prognosis in patients with renal cell carcinoma and colorectal cancer. However, the correlation between SKA3 and hepatocellular carcinoma (HCC) is not clear. We aimed to explore the mechanism of SKA3 in HCC. We analysed the datasets in TCGA and GEO database to find different genes. Then we performed immunohistochemical staining and qRT-PCR to certify the up-regulation of SKA3 in tumor tissues. Small interfering (si) RNA transfection was used to down-regulate SKA3 in LM3 and Huh7 cell lines. Cell count KIT-8 (CCK8) assay was used to analyze cell proliferation. Cell migration and invasion were measured by scratch wound healing test and migration test. The subcutaneous xenotransplantation model was used to investigate the role of SKA3 in tumor formation in vivo. Finally, we used co-immunoprecipitation to explore the correlations between SKA3, CDK2 and p53 phosphorylations. Results: Bioinformatics analysis revealed that the high expression of SKA3 in HCC suggests poor prognosis . Consistently, immunohistochemical staining of 95 pairs of tumors and adjacent normal liver tissues (ANLT) also showed up-regulation of SKA3 expression (Fig A) . 
Down-regulation of SkA3 significantly inhibited tumor proliferation and invasion in vivo and in vitro (Fig B) . Gene enrichment analysis (GSEA) showed that SKA3 may affect tumor progression through cell cycle and P53 signaling pathway. In addition, SKA3 knockout resulted in G0/G1 phase arrest and severe apoptosis, as demonstrated by inhibition of CDK2/p53 phosphorylation and down-regulation of p21 and BAX/Bcl-2 level in HCC cells ( Fig C) . Conclusions: It is suggested that SKA3 can promote the cell proliferation and inhibit apoptosis of tumor cells in HCC through interaction between CDK2 and p53 (Fig 2) . We have shown that PI3K/ mTOR signaling regulates the metabolic switch that controls the balance between glycolysis-driven effector and Treg cells, which rely more on mitochondrial oxidative phosphorylation (OXPHOS). Recent data has emerged in support of the fundamental role of Wnt/β-catenin pathway in the functional fitness of T cells. In this study, we investigated the metabolic effect of the Wnt/β-catenin signaling inhibitor FH535 in Treg cells and the potential interplay with the PI3K/mTOR pathway. Methods: Sorted CD25 + Treg cells were placed in culture with different doses of the β-catenin inhibitor FH535, the mTOR pathway inhibitor Rapamycin, alone or in combination. Multi-parametric FACS analyses were used to monitor the expansion rates, mitochondrial function and integrity, and suppressor function of the cells. The bioenergetics profiles of the drug-treated cells, including OXPHOS and glycolytic rates, were assessed on an extracellular flux analyzer. Results: Addition of FH535 to the Treg cell culture induced a rapid but temporary increase of cell expansion with a concurrent inhibition of their suppressor function. These effects were linked to substantial metabolic changes, including higher glycolytic rates and impaired mitochondrial respiration. Simultaneous inhibition of the mTOR pathway with Rapamycin prevented the effects of FH535. However, the sequential addition of FH535 followed by Rapamycin increased between 20 and 50% the levels of Treg cell expansion and promoted higher suppressor activity when compared to Rapamycin alone. These effects were linked to the metabolic changes consistent with glycolysis enhancement similar to that induced by FH535 alone, and OXPHOS increase similar to the Rapamycin-induced alone. Results: We show that in activated human Tregs granzymes (Gr) A and B leak from cytotoxic granules and lead to their apoptosis, despite an increase in endogenous inhibitor proteinase inhibitor 9. Confocal imaging shows GrA and GrB outside of the CD107 + granules both in the cytosol and in the nucleus of Tregs. The release of GrB in the cytosol significantly activates its protease activity, leading to multiple substrates cleavage. Gr knockdown in healthy human Tregs with shRNA-expressing lentiviruses protects them from undergoing apoptosis. We hypothesize that decreasing GrB expression in Tregs could protect them from succumbing to apoptosis via granzymes. TORC1 inhibitors have been known to reduce GrB expression in Tregs. In vitro studies show GrB production and self-inflicted damage in activated human Tregs that was reversed by TORC1 inhibition (~2 folds decrease, P*<0.05). We also utilize a humanized mouse model to study the effect of rapamycin treatment on transferred human Tregs overall survival. We observed in the rapamycin-treated group that (57% ±8%) of Tregs expressed GrB compared to 75% of the Tregs in the control group (n=6, p* < 0.05). 
Subsequently, we also observed that the apoptosis of Tregs from the rapamycin-treated group also decreased to 66% from (80% ±5%) in the control group (n= 6, p* < 0.05). Furthermore, we also observed a variation in GrB expression in Tregs isolated from unidentified human peripheral blood samples. This led us to perform genotypic analysis to identify two known GrB single nucleotide polymorphisms (SNPs), rs714436 and rs8192917, and correlate them with the corresponding GrB expression. However, we found no correlation between the identified SNPs and GrB production (n = 20). By using cytometry by time of flight (CyTOF), we observed an increase in GrB-expressing Tregs in the peripheral blood and renal allografts of transplant recipients undergoing rejection. Those GrB-expressing Tregs were significantly more apoptotic than non-GrB expressing Tregs (n=20, P*<0.05). We are currently in the process of identifying renal transplant patients on TORC1 inhibitors and performing CyTOF to observe their effects on Tregs homeostasis in these patients. We have previously shown an association between undernutrition and the development of viremia in transplanted children. While undernutrition is associated with impaired T cell immunity and leptin deficiency in non-immunosuppressed children, the mechanism in transplanted pediatric recipients is unknown. We aimed to determine how undernutrition impacts T cell immunity using a murine skin transplantation model. Methods: Fed and fasted C57BL/6 mice received skin grafts from either syngeneic or Balb/C (allogenic) donors. Graft survival was monitored, spleens were harvested for T cell flow cytometry analysis at Day 5, and serum was analyzed for leptin levels with ELISA at Day 2. Additional C57BL/6 mice were inoculated with murine cytomegalovirus (mCMV) and fasted for 48 hours, an established surrogate for undernutrition, or fed ad libitum. Mouse spleens were harvested at Day 5, stimulated in vitro with mCMV viral peptide, and T cell function examined by flow cytometry. Viral titers were quantified by mCMV qPCR. Results: Fasted C57BL/6 mice had prolonged skin allograft survival p<0.0001 ( Fig 1A) . Leptin levels were also significantly decreased in fasted compared to fed mice, p=0.01 ( Fig 1B) . Furthermore, the frequency of IFNγ producing CD8 T cells was lower in fasted mice (p=0.003), and the addition of leptin during fasting increased relative frequencies of these CD8 T cells, p=0.02 ( Fig 1C) . As undernutrition can be immunosuppressive, T-cell specific mCMV viral responses were examined. Fasted mice had increased viral burden 209030 ± 49367 compared to fed mice 44859 ± 11902, p=0.005 and had impaired mCMV-specific T cell responses compared to fed mice, p=0.0009 ( Figure 1D ). Conclusions: Undernutrition affects cytotoxic and effector T cell memory responses leading to prolonged allograft survival, while also impairing mCMV T cellspecific immunity. Signaling through the leptin pathway may modulate CD8 T cell responses, giving insight into the mechanism by which undernutrition decreases T cell alloreactivity, at the expense of impairing viral-specific T cell immunity and predisposing transplanted children to viremia. 1A) . Also in the (dxb/d) parent to F1 model, EPO-induced inhibition of T FH (-20.6±3.5%; p<0.01) and GC B cells (-3.6±0.8%; p<0.001) and this was associated with reduced autoantibodies, proteinuria and renal histological changes ( Figure 1B-D) . In vitro, EPO did not affect IgG class switch rate of stimulated B cells. 
Conclusions: EPO administration inhibits T FH and GC B cell formation, together reducing the clinical and histological expression of murine lupus nephritis in parentto-F1 models. Together with our published evidence that EPO promotes human Treg, the data support safety/efficacy testing of rEPO as an immunemodulating agent in to inhibit T FH and alloantibody formation. Percentages of splenic H2kD -CD4 + PD1 + CXCR5 + T FH and B220 + GL7 + Fas + GC B cells in (bxd)F1 mice injected with B6 T cells 2 weeks prior and treated with vehicle or EPO (A). Representative renal histology (B), urinary albumin/creatinine (C) , and anti-dsDNA IgG (D) Methods: DD Ktxs performed at UCSF (n=569) from 2013-16 were analyzed. Based on pre-tx DSA status, recipients were divided into 3 groups: DSA -ve (n=465), non-DP DSA +ve (n=81), and DP-DSA +ve (n=23). All recipients received standard immunosuppression, and DP-DSA +ve recipients received additional 1 dose IVIG at day-1 post-tx. Grafts were evaluated by 6 mo protocol biopsies (Bx) and cause Bx. HLA antibodies were tested using single antigen bead assay (One Lambda). Results: There were more highly sensitized patients (CPRA>97%) and well-matched Ktx in DP-DSA +ve group compared to the other 2 groups (Table) . All recipients in DSA -ve and non-DP DSA +ve groups had a -ve T and B cell pronase flowcrossmatch (FXM), while 30% (n=7) of DP-DSA +ve group that had a DP-DSA of >10,000 MFI, displayed a +ve B-FXM (MCS~200; 120 cutoff). There were no hyper acute rejection episodes in all 3 groups. The DP-DSAs were self-decayed overtime post-tx in all 23 recipients transplanted with pre-formed DP-DSAs ( Figure) . The Bx findings revealed no significant difference in the rate of antibody-mediated rejection (ABMR) or in acute cellular rejection (ACR) between the 3 groups. Two recipients transplanted with DP-DSAs but -ve XMs lost their grafts: one was a 46F re-tx recipient with 100% CPRA, 1/12 HLA-mismatch, DP4 DSA (MFI=6956), lost due to ACR in 4 months; another was 62M, 19% CPRA, 7/12 HLA-mismatch, DP1 DSA (MFI=4165), suffered Polyomavirus nephropathy at 5 months. Conclusions: Kidneys can be transplanted across DP-DSAs with no apparent effect on graft survival. This strategy increases the DD Ktx in highly sensitized patients without pre-tx desensitization. The long-term outcome studies are ongoing. -2206) or active site (GDC-0068) inhibitors reduced phosphorylation of c-Raf-1 Ser259. Importantly, the Akt allosteric inhibitor MK, but not GDC, caused over activation of ERK, suggesting that combined HLA-I and II Ab-mediated over activation of ERK is predominantly Akt-dependent. Furthermore, combined treatment of EC with MEK inhibitor UO126 and rapamycin abrogated combined Ab-stimulated activation of Akt and ERK, and further blocked EC proliferation. Conclusions: Inhibition of the PI3K/Akt/mTORC2 signaling axis suppresses a novel negative mTORC2/Akt-dependent feedback loop, leading to enhanced MEK/ ERK pathway activity following combined HLA-I/II Ab-activated signal networks in EC. This mechanism is different from the HLA-II antibody-mediated feedback loop, which is predominantly Akt-independent. Our data suggest that combined ERK and dual PI3K/mTORC2 inhibitors will be required to achieve optimal efficacy in controlling combined HLA I and II Ab-mediated AMR. Purpose: B-cell activating factor (BAFF) is a cytokine that plays a role in the survival, proliferation and differentiation of B-cells. 
The aim of this study is to develop an allosensitized mouse model using HLA.A2 transgenic mice, and to observe the effects of anti-BAFF monoclonal antibody (mAb) in this model. Methods: Wild-type C57BL/6 mice were sensitized with skin allografts from C57BL/6-Tg (HLA-A2.1)1Enge/J mice and were treated with intraperitoneal injections of anti-BAFF mAb (named Sandy-2) or control IgG1 antibody. Donor specific antibody (DSA) responses were observed by measuring anti-HLA.A2 antibodies in serum of the mice using the luminex assay. B-cell fractions in mice bone marrow and spleen were determined using flow cytometric analysis, and mRNA profiling was done using microarray analysis. (T-614) in the inhibition of DSAs following kidney transplantation. Methods: A rat renal transplant ABMR model was established. Levels of de novo DSAs following skin transplant and renal transplant were detected by flow cytometry assay. Levels of Bregs (CD19+Tim-1+), Tregs (CD4+CD25+ Foxp3+), as well as the Th17 cells (CD4+IL-17A+), in peripheral blood monocytes (PBMCs) collected from recipients were also examined. Various inflammatory cytokines, such as IL-2, IL-4, IL-10, and IL-17, were tested by ELISA assay. To further explore the mechanism involved, primary Bregs were extracted from the spleen of recipients and co-cultured with PBMCs of recipients in vitro. De novo DSAs were detected. Moreover, the Bregs and balance of Th17/Tregs were examined. We found that the administration of Iguratimod could induce the significant reduction of de novo DSAs in skin transplant and secondary DSAs following renal transplant. Moreover, the allograft function was remarkably improved in the treatment of Iguratimod. To further explore the related mechanisms, flow cytometry assay showed the significantly increased expression of Tregs, as well as the decreased expression of Th17 cells. Furthermore, our primary results from randomized controlled trials in our renal transplant center also supported the favorable efficacy of Iguratimod on decreasing the production of DSAs in sensitized renal transplant recipients, as well as the imbalance of Th17/Tregs. Results: Immunization with the model antigen NP or with allogeneic cells resulted in significantly reduced titers of antigen specific IgG by 3x fold in the DAF-TM+/+ animals vs. controls (C) . Analyses of GCs confirmed DAF downregulation on control GC B cells but persistent DAF expression on DAF-TM+/+ mice (B), and demonstrated 3x fold fewer GC B cells by flow cytometry (D). Surface expression of other complement regulators and receptors, including CR2 known to be involved in B cell activation, did not differ between genotypes (E) . Because DAF regulates C3 convertases and DAF deficiency augments C3a and C5a production, we tested whether C3a/C3aR and C5a/C5aR ligations on GC B cells is essential for their function. Flow cytometry showed that C3aR and C5aR expression are upregulated on GC B cells vs naïve B cells (F) . We next generated mice conditionally deficient in both C3aR and C5aR, and crossed them to the Cg1-Cre mice such that both receptors are specifically deleted GC B cells. Absence of GC B cell C3aR/C5aR prevented antibody formation and markedly reduced GC B cell differentiation following immunization with NP (C,D) or allogeneic cells (G), newly demonstrating that C3aR/C5aR signaling on GC B cells is required for GC function. Conclusions: Together, our data show that DAF down-regulation on GC B cells facilitates C3aR/C5aR signaling required for GC function. 
In addition to delineating a paradigm shifting role for DAF and complement as regulators of GC dynamics, the findings have therapeutic implications for preventing and treating alloantibody formation in transplant recipients. Methods: c57/B6 mice underwent kidney transplantation with Bm12 kidneys (minor MHC mismatch), a well-described model for cAMR where animals cannot make donor specific antibody but rather make antinuclear antibody. Following transplantation, animals received TACI-Ig (to block APRIL and BLyS) or no treatment. Animals were continued on treatment until harvest 4 weeks following transplant. Serum was analyzed for circulating antinuclear autoantibodies using HEp-2 indirect immunofluorescence. Spleen and transplanted kidneys were analyzed via H&E. Results: Antinuclear autoantibody production was significantly decreased in APRIL/ BLyS blockade treated animals (p<0.0001) (fig1). No significant difference in autoantibody production was found between syngeneic transplant control (B6 to B6) and APRIL/BLyS blockade treated animals (p=0.90). Additionally, disruption of splenic germinal center architecture was noted in the APRIL/BLyS blockade treated animals. Despite the significant decrease in autoantibody production and germinal center disruption, no significant difference in lymphocyte infiltration was noted in the transplanted kidney. Conclusions: APRIL/BLyS blockade resulted in a significant decrease of autoantibody production and disrupted splenic germinal center formation in a chronic kidney transplant model; however in this model no difference in kidney transplant pathology was seen, which may have to do with the absence of any T cell centric immunosuppression. Regardless, these findings suggest that APRIL/BLyS blockade may play a role in decreasing antibody formation long-term in kidney transplantation. Future investigations will use APRIL/BLyS blockade in conjunction with T lymphocyte depleting agents to determine its efficacy in chronic rejection. Purpose: Despite the progress in immunosuppressive regimens, antibody-mediated rejection remains a major contributor to renal allograft loss in recipients with donor-specific antibodies (DSA). IRE1a, an endoplasmic reticulum (ER) stress sensor, is an activator of XBP1 which is required for plasma cell differentiation and antigen-specific antibody production. The goal of this study is to determine the role of B cell-specific IRE1a in renal allograft rejection under sensitized and non-sensitized settings. Methods: In non-sensitized groups, renal allografts from BALB/c mice were transplanted into bi-nephrectomy B cell-specific IRE1a KO (CD19 cre/+ IRE1a flox/flox ) or their wild type (CD19 +/+ IRE1a flox/flox ) littermates. Cells from spleens and renal allografts were used for phenotypical and functional analyses. In sensitized groups, KO and WT mice were challenged with C3H skin grafts and received C3H kidney allografts 2 weeks later; the recipients were either untreated or treated with FK506 (3mg/kg/day) from day-1 to day10. Results: Lack of IRE1a in B cells significantly decreased donor-specific antibody production in either sensitized or non-sensitized recipients post-transplantation. In non-sensitized recipients, the WT renal allografts developed severe rejection with a median survival of 49 days, whereas most KO allografts survived over 150 days (P<0.0001, vs control). 
Despite the comparable T cell infiltration seen in both WT and KO allografts at POD12, the long-term-survived KO allografts showed significantly diminished cellular infiltration, improved renal function, and intact renal structure than the WT allografts. The prolonged survival in KO allografts was associated with decreased frequency of plasma cells and increased frequency of CD5 + CD1d hi regulatory B cells, aligned with the augmented IL-10 production seen in KO B cells stimulated with LPS in vitro. Furthermore, in sensitized recipients, the WT allografts with a transient FK506 therapy were rapidly rejected in 13.5 days. In contrast, the majority of the KO allografts with the same FK506 treatment had a significantly prolonged survival to over 60 days (P<0.01, vs control). These results suggest that deficiency of IRE1a in B cells inhibited DSA production, promoted regulatory B cells, and protected renal allografts from rejection in sensitized and non-sensitized recipients, thereby offering a potential therapeutic target for antibody-mediated rejection. We previously reported the novel activity of alloprimed CD8 + T cells that suppress post-transplant alloantibody production. The purpose of the current study was to investigate the expression and role of CXCR5 on antibody-suppressor CD8 + T cell (CD8 + T Ab-supp cell) function. Methods: C57BL/6 mice were transplanted (Tx) with FVB/N hepatocytes (Hc). Alloprimed CD8 + T cells were retrieved on day 7 from hepatocyte Tx recipients and CD8 + T cell subsets were sorted into CXCR5 + CXCR3 -(9.3±1.0%) or CXCR3 + CXCR5 -(8.4±0.3%) subsets by flow cytometry (both subsets were CD44 + IFN-γ + ). Results: Flow-sorted (CXCR5 + CXCR3and CXCR3 + CXCR5) alloprimed CD8 + T cell subsets were analyzed for in vitro cytotoxicity and capacity to inhibit in vivo alloantibody production following adoptive transfer (AT) into C57BL/6 or high alloantibody-producing CD8 KO hepatocyte Tx recipients. Alloprimed CXCR5 + CXCR3 -CD8 + T cells mediated in vitro cytotoxicity of alloprimed IgG + (antibody-producing) B cells (12.7±1.8%, n=10, p<0.0001) while CXCR3 + CXCR5 -CD8 + T cells did not (1.9±1.9%, n=10, p=ns). Only flow-sorted alloprimed CXCR5 + CXCR3 -CD8 + T cells (not flow-sorted alloprimed CXCR3 + CXCR5 -CD8 + T cells) suppressed alloantibody production after AT into C57BL/6 HcTx recipients (titer=60±10 compared to 120±20 in untreated control HcTx mice, N≥4 in each group, p=0.02). The lower alloantibody titer following AT of CXCR5 + CXCR3 -CD8 + T cells into C57BL/6 recipients also correlated with a 2-fold reduction in the quantity of germinal center B cells (p=0.001) and IL-4 + IL-21 + CD4 + T FH cells (p=0.003) compared to control recipients without AT. Adoptive transfer of CXCR5 + CXCR3 -CD8 + T cells also suppressed alloantibody production in high alloantibodyproducing CD8 KO HcTx recipients (titer= 90±40 vs.1,300±500 in control mice, N≥5 in each group, p<0.0001). CD8 KO hepatocyte recipients that received AT of CXCR5 + CXCR3 -CD8 + T cells had prolonged allograft survival (MST=day 32; N=5, p=0.002) compared to those that received AT of CXCR3 + CXCR5 -CD8 + T cells (MST= day 14, N=5) or those with no AT (MST= day 14, N=7). Adoptive transfer of CD8 + T cells retrieved from wild-type (but not CXCR5 KO) mice into CD8 KO HcTx recipients suppresses alloantibody production (titer= 300±70 vs. 1,250±250 in control mice p=0.004). 
Conclusions: These data support the conclusion that expression of CXCR5 by antigen-primed CD8 + T cells is critical for the function of CD8 + T Ab-supp cells. Abstract# 172 Purpose: Influenza A virus (IAV) infection is known to be an important cause of morbidity in solid organ transplant (SOT) recipients. The objective of our study was to analyze the detailed repertoire of antibody production in SOT patients after influenza infection or vaccination, using a microarray technique that simultaneously quantifies antibodies against multiple influenza antigens. Methods: Two groups of adult SOT recipients were included in this study: 1) patients with IAV infection and 2) patients who were vaccinated with standard dose trivalent seasonal influenza vaccine. Serum was either collected at the time of IAV infection diagnosis and four weeks after or immediately prior to vaccination and four weeks later. A diverse collection of influenza antigens (n=86), including those from various subtypes, host origins and geographical locations, were printed onto nitrocellulosecoated slides and probed using patient sera and fluorescently-conjugated IgG and IgA secondary antibodies. Following measurement of median fluorescent intensity (MFI), antibody responses (fold changes in MFI) were compared among naturally infected and vaccinated patients, using significance analysis of microarrays (SAM) with a false discovery rate <1%. Results: A total of 120 patients were included: 80 in the IAV infection group (40 A/ H1N1, 40 A/H3N2) and 40 in the vaccine group. The median time from transplant to infection/vaccination was 3.0 (IQR 0.8-8-1) years. All IAV infected patients were treated with oseltamivir. H1N1 infected patients showed a significantly higher IgG and IgA antibody response for 12 out of 26 H1N1 antigens (11 out of 12 targeting hemagglutinins) compared to vaccine patients ( Figure 1 ). Purpose: The High-dose (HD) influenza vaccine has been shown to induce better humoral immunity than the standard-dose (SD) vaccine in solid organ transplant (SOT) patients. We aimed to evaluate whether the HD vaccine also induced better cell-mediated immunity (CMI) than the SD vaccine. Methods: A subset of patients enrolled in a randomized trial of the HD vs. SD influenza vaccine were enrolled in CMI analysis substudy during the 2016-17 influenza season. Peripheral blood mononuclear cells (PBMC) were collected pre-and 4 weeks post-immunization. PBMCs were stimulated with 1µg of vaccine antigens corresponding to seasonal influenza A/H1N1, A/H3N2 or B viruses. CD4+ and CD8+ T-cell populations were analyzed using flow cytometry with intracellular cytokine staining for IFNγ, TNFα, IL2 and IL4. Results: We enrolled 55 patients in the CMI substudy, of which 30 and 25 received the HD and SD vaccine, respectively. Median age was 57 years and median time since SOT was 4.8 years. The most frequent organ transplant was kidney (40%), followed by liver (16%), lung (15%) and heart (13%) and combined (16%). CD4 + . Following stimulation with influenza B, patients who received the HD vaccine had a higher proportion of influenza-specific IFNγ + (p=0.009), IL4 + (p=0.001), IL2 + (p=0.017), TNFα + (p=0.003), IFNγ + TNFα + (p<0.001), TNFα + IL2 + (p=0.003), IFNγ + IL2 + (p=0.006), and IFNγ + TNFα + IL2 + (p=0.002) CD4+T-cells (Fig. 1A ). 
Following stimulation with influenza A/H1N1, patients who received the HD vaccine had a higher proportion of influenza-specific IFNγ + (p=0.003), IL4 + (p=0.049), TNFα + (p=0.016) and TNFα + IL2 + (p=0.014) CD4 + T-cells (Fig. 1B) . Following stimulation with A/H3N2, patients who received the HD vaccine had a higher proportion of influenza-specific IFNγ + (p=0.043), IL4 + (p=0.035), TNFα + (p=0.014) and TNFα + IL2 + (p=0.025) CD4 + T-cells (Fig. 1C ). CD8 + . Following stimulation with influenza B, patients who received the HD vaccine had a higher proportion of influenza-specific TNFα + (p=0.007) and IFNγ + TNFα + (p=0.022) CD8 + T-cells (Fig. 1A ). Following stimulation with A/H1N1, patients who received the HD vaccine had a higher proportion of influenza-specific IFNγ + (p=0.047] and TNFα + CD8 + Tcells (p=0.047) (Fig. 1B) . Following stimulation with A/H3N2, CMI between HD and SD was similar (Fig. 1C ). The HD vaccine displayed enhanced CMI when compared to the SD vaccine, and this was especially evident for CD4 + T-cell immunity. For optimal CMI, our study suggests that HD vaccine may be the preferred vaccine for SOT patients. Methods: We performed a retrospective observational study on all HIV-positive KTRs at Hahnemann University Hospital from 2001 to 2017 to determine the incidence of PJP by the end of follow-up. Based on our institution's protocol, all patients received a six-month PJP prophylaxis. Results: We identified 122 HIV-positive KTRs in the 16-year period. Eighty-two percent were male (n=101) and 83% were African American (n=102). The mean age at transplant was 48 ± 9 years. High rate of HCV coinfection was observed (37%, n=45). Eighty-nine percent of kidneys were from deceased donors (n=109); none of the donors were HIV positive. Mean pre-transplant CD4 count of KTRs was 461 ± 127 cells/µL. Majority received induction therapy with basiliximab with or without intravenous immunoglobulin, while one patient received antithymocyte globulin induction. Maintenance immunosuppression was with a calcineurininhibitor (tacrolimus or cyclosporine) an anti-proliferative agent (mycophenolate or sirolimus), with a steroid tapering regimen. In addition to PJP prophylaxis, all patients also received cytomegalovirus prophylaxis for 6 months. . Acceptability of an IRD organ also increased as the specified transmission risk of HIV or HCV decreased (p<0.001). Patients were also more likely to accept an IRD organ if they were educated on the benefits of IRD organs ( Figure 1 ) eg, knowledge that an IRD organ was better quality than non-IRD increased overall acceptance from 41.1% to 63.3% (p<0.001). Our survey provides insight into transplant candidates that would benefit from greater education on IRD organs. Strategies targeting specific educational points are likely to increase acceptability. Purpose: Opioid use before and after liver transplant is strongly associated with morbidity and mortality; yet data assessing opioid utilization has centered on sources with known inaccuracies. The purpose of this study was to evaluate the accuracy of various opioid use data sources compared to state-required opioid prescription data (PDMP, gold-standard), and to determine the impact that the source has on measured outcomes. Methods: This was a retrospective, single-center cohort study of opioid use in liver transplant recipients between 2010-6 assessing associated readmission outcomes. 
Opioid prescription data was obtained via medication reconciliation, a national pharmaceutical claims database, and the state-mandated PDMP. Results: Of 441 liver transplants within the study timeframe, 133 (30%) had PDMP-reported opioid prescriptions filled in the 3 months prior to transplant. Other opioid sources or combinations of opioid sources were able to accurately identify patients that had used opioids 54 to 78% of the time, yet also inaccurately identified 9 to 25% of patients (who did not fill a prescription for opioids during the time period) as opioid users ( Table 1 ). The associations between pre-txp opioid use and readmission rates varied significantly based on source of information used to assess opioid use ( Table 2) . In order to most optimally mitigate the opioid epidemic, researchers must be able to use the most accurate opioid source data to measure utilization and determine risk factors for COU. It is imperative for researchers to have the ability to match state PDMP data with patient-level characteristics to identify at risk patients within this population. We strongly encourage healthcare workers and researchers to work with their state-level politicians to make this possible. Results: Five classes of symptom experience were identified: "all symptoms low" (11% of patients), "sleep problems/fatigue" (31%), "sleep problems/fatigue + pain" (20%), "sleep problems/fatigue + pain + gastrointestinal distress + respiratory distress" (33%), and "all symptoms high" (5%). The figure shows the probability of experiencing select symptoms for each latent symptom experience class. Patients in the "all symptoms high" class had a high probability of experiencing each symptom, and patients in the "all symptoms low" class had a low probability of experiencing each symptom. In between, we identified groups of patients ("sleep problems/ fatigue," "sleep problems/fatigue + pain") with high probability of some symptoms but low probability others (e.g., gastrointestinal distress), as well as a group of patients with high probability of all symptoms from the "sleep problems/fatigue + pain" class, but adding higher probabilities of gastrointestinal and respiratory distress. Male patients were significantly more likely to be in the "sleep problems/fatigue" (b=0.97, p=0.04) and "all symptoms low" (b=2.42, p=0.002) classes, while age was significantly associated with membership in the "sleep problems/fatigue + pain + gastrointestinal distress + respiratory distress" class (b=0.03, p=0.05). Conclusions: This study improves upon previous research by identifying not only common symptoms, but also "phenotypes" of patients more and less likely to experience distinct groups of symptoms, painting a more comprehensive picture of post-LDKT health. In doing so, we found significant variation in symptom experience after LDKT, with older and female patients experiencing multiple bothersome symptoms. The results of this study can be used to target patients for symptom management interventions. In the BENEFIT trial (belatacept vs. cyclosporine), kidney txp recipients were assessed via the SF-36 (HRQoL) and MTSODS-59R (symptoms) at baseline, 12, 24, and 36 months post-txp. Latent class analysis was used to identify distinct trajectories of SF-36 general health scores over time ( Figure A) . Baseline symptom ratings and changes in symptom ratings from baseline to month 12 were examined for associations with general health trajectory. 
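Latent class analysis of binary symptom indicators, as used in the symptom-phenotype study above, amounts to fitting a Bernoulli mixture model, which can be illustrated with a plain expectation-maximization loop. The sketch below runs on simulated data; the numbers of patients, items, and classes are placeholders, and this is a generic illustration rather than the authors' analysis.

import numpy as np

def bernoulli_mixture_em(X, n_classes=5, n_iter=200, seed=0):
    """Fit a latent class (Bernoulli mixture) model to a binary matrix X
    (patients x symptom indicators) with standard EM updates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)              # class prevalences
    theta = rng.uniform(0.25, 0.75, size=(n_classes, d))  # P(symptom | class)
    for _ in range(n_iter):
        # E-step: posterior class membership (responsibilities) per patient.
        log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
        log_lik -= log_lik.max(axis=1, keepdims=True)
        resp = np.exp(log_lik)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update prevalences and per-class symptom probabilities.
        nk = resp.sum(axis=0) + 1e-9
        pi = nk / nk.sum()
        theta = np.clip((resp.T @ X) / nk[:, None], 1e-4, 1 - 1e-4)
    return pi, theta, resp

# Simulated data: 500 patients, 12 binary symptom indicators (not study data).
rng = np.random.default_rng(1)
X = (rng.random((500, 12)) < 0.4).astype(float)
pi, theta, resp = bernoulli_mixture_em(X, n_classes=5)
labels = resp.argmax(axis=1)  # most likely symptom-experience class per patient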
A TRS score was calculated by summing distress scores for selected items; this score was compared between treatment arms to demonstrate its potential utility as an endpoint, and examined as a predictor of txp outcomes. Results: After excluding patients with missing MTSODS-59R data, graft failure within the first year, and drug discontinuation within the first year, 394 patients were included in the analysis. Most patients grouped into one of three relatively stable trajectories defined by level of general health at baseline (similar to, 0.5 SD worse, or 2.0 SD worse than the general population) but a 4 th group began 2SD below the general population mean and improved dramatically between transplant and 1 year ( Figure A) . The TRS score (Cronbach's alpha=0.86) calculated from these items at baseline was not associated with graft failure but TRS change from baseline to month 12 was (HR: 1.11, 95% CI: 1.01-1.22, p=0.03, Figure B ). Neither baseline TRS or change from baseline to month 12 differed by treatment arm. However, TRS at month 12 differed by treatment, particularly for patients receiving the belatacept LI (less intense) regimen (β = -2.30, p = 0.02, Figure Purpose: New immunosuppressive agents in transplant should not only be efficacious but also help patients feel better (i.e. lower patient reported symptoms and improve health related quality of life (HRQoL). Herein, we analyzed patients in the BENEFIT and BENEFIT-EXT trials longitudinally to determine the relative symptom and HRQoL differences between cyclosporine (CsA) and belatacept. Methods: Patients were evaluated using the SF-36 (HRQoL) and the MTSOSD-59R (symptom) at baseline, 12, 24, and 36 months post-transplant for SF-36 (n=831) and MTSODS-59R (n=394, inclusive of patients who reported SF-36). We examined change from baseline and differences between treatment groups for all 8 subscales of the SF-36. The most distressing symptoms were identified and compared by treatment over time using ridit analysis. Overall, compared to CsA, belatacept treated patient had better HRQoL scores post-transplant across most SF-36 subscales ( Figure A) . Between treatment groups, there were a higher number of distressing symptoms (p<0.01) in CsA treated patients at 12, 24, and 36 months post-transplant relative to belatacept treated patients. The symptoms that highly differentiated cyclosporine from belatacept were trembling hands, feeling of warmth in hands and feet, swollen ankles, change in facial features, increased hair growth, muscle cramps, and swollen gums ( Figure B ). No symptoms appeared more distressing in patients treated with the belatacept less intense (LI) regimen. Conclusions: Active adherence monitoring and coaching is feasible and accepted by RT recipients. We found similar findings amongst groups, but a full analysis including electronic pillbox data and pharmacy records is underway to more completely assess the impact of various monitoring strategies + interventions on M-NA post RT. Identifying RT subgroups that may benefit from adherence interventions may minimize negative outcomes post-RT. Active M-NA interventions may have added benefit over passive alerts/reminders. Abstract# 186 The opioid epidemic has raised the question of how transplant teams should work with patients who are prescribed opioids. Benzodiazepine use has also received increased attention, as more than 30 percent of overdoses involving opioids also include benzodiazepines. 
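The TRS described above is simply the sum of selected symptom-distress items, and its internal consistency (Cronbach's alpha, reported as 0.86) follows directly from the item matrix. A minimal sketch, assuming a hypothetical patients-by-items array of distress ratings:

import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = patients, columns = symptom distress items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed score
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated distress ratings (0-4) for 394 patients and 10 selected items;
# random data, so this will not reproduce the study's reported 0.86.
rng = np.random.default_rng(0)
ratings = rng.integers(0, 5, size=(394, 10))
trs = ratings.sum(axis=1)  # the TRS is the row sum of the selected item scores
print(f"alpha = {cronbach_alpha(ratings):.2f}")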
Pre-transplant opioid use has been associated with an increased risk of graft loss, and both opioids and benzodiazepines are associated with a higher risk of mortality. However, there is no consensus or uniformity of practice regarding these issues when determining candidacy for lung transplant. Centers must determine how to evaluate patients for transplant with opioid, opioid substitution therapy (e.g., methadone or Suboxone), or benzodiazepine use. This survey of lung transplant centers sought to identify existing policies and practices regarding use of opioids, opioid substitution therapy (OST), and benzodiazepines in adult lung transplant candidates. Methods: Recruitment was done via an online survey posted on the American Society of Transplantation (AST) community of practice listservs. Transplant centers that did not respond to the AST post were contacted by email. Descriptive statistics were used. Results: Of 64 adult lung transplant centers, 34 (53.1%) responded. Most respondents were pulmonologists (41.7%) or psychologists (25.0%). Four centers considered opioid use an absolute contraindication (11.8%), while 82.4% considered it a relative contraindication. Regarding OST use, seven centers (20.6%) considered it an absolute contraindication and 58.8% considered it a relative contraindication. Only three centers considered benzodiazepine use an absolute contraindication (8.8%), while 64.7% considered it a relative contraindication. There was a range of evaluation practices, with considerable variability across centers. Most centers did not have written policies. Centers were most likely to have a written policy regarding opioid use (32.4%), and the majority of centers did not have written policies regarding OST or benzodiazepine use (79.4% and 70.6%, respectively). Over half of respondents (61.8%) believed that a national consensus policy is needed to address opioid and OST use in lung transplant candidates. Conclusions: Survey results highlight the range of policies and practices within the lung transplant community regarding opioid, OST, and benzodiazepine use, as well as the opportunity to develop a national consensus statement regarding these topics. Methods: We included all LTx recipients older than 45 years and excluded patients with retransplants, simultaneous transplants, or other previous cardiac surgery, and those whose status could not be assessed. Categorical variables were compared with Chi-square tests, and continuous variables with Wilcoxon tests, in R ver. 3.4.3. Outcomes were the cumulative incidence of overall mortality in CABG vs. non-CABG recipients, as well as mortality due to specific causes. Results: The study cohort included 14,093 patients, of whom 325 had previously undergone CABG (left, n=76; right, n=198; bilateral, n=51). The CABG group was older (median 65 y vs. 61 y, p<0.001), more often male, had a greater proportion of interstitial pulmonary fibrosis, and had higher creatinine than non-CABG patients. The cumulative incidence of all-cause mortality at one year was 23% for prior CABG and 14% for others. We identified major causes of death (COD) in the first year post-transplant in CABG vs. non-CABG recipients. COD were grouped into six categories: BOS/graft failure/pulmonary, infection, cardiovascular/cerebrovascular (CV), hemorrhage, malignancy, and other. The cumulative incidence of mortality at one year due to graft failure was 7% for CABG vs. 4.5% for others (overall log-rank p=0.024). There was no statistically significant increased risk of death from CV causes.
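The group comparisons in the CABG analysis above (Chi-square tests for categorical variables, Wilcoxon tests for continuous variables) can be reproduced generically with SciPy. The cell counts and distributions below are invented for illustration and are not the study data.

import numpy as np
from scipy import stats

# Invented 2x2 table, e.g. a diagnosis category by prior-CABG status.
table = np.array([[120, 3200],
                  [205, 10568]])
chi2, p_cat, dof, expected = stats.chi2_contingency(table)

# Invented continuous variable (e.g. age) compared between CABG and non-CABG groups.
cabg_age = np.random.default_rng(0).normal(65, 8, 325)
non_cabg_age = np.random.default_rng(1).normal(61, 9, 13768)
stat, p_cont = stats.ranksums(cabg_age, non_cabg_age)

print(f"Chi-square p = {p_cat:.3g}; Wilcoxon rank-sum p = {p_cont:.3g}")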
Risk of death from hemorrhage was 2% for CABG vs. 0.5% for Non CABG (overall log-rank p=0.001). Conclusions: Lung transplant patients with history of CABG are at increased risk of death due to graft failure at one-year post transplant and at risk of early death from hemorrhage after transplant. There is no significant difference in mortality due to cardio/cerebrovascular causes between two groups. Patient survival as well as freedom from bronchiolitis obliterans syndrome (BOS), treated for acute rejection, severe renal dysfunction (defined as post-TX dialysis, creatinine >2.5 mg/dl, or renal TX) were estimated using the Kaplan-Meier method and compared using the log-rank test. Cox regression analysis was used to determine the association between DCD TX and survival within 3 yrs in the presence of other characteristics. Results: As shown in Table 1 , the median recipient age at TX was similar between DCD and DBD TXs. DCD donors were older than DBD donors. Most DCD donors died of anoxia or head trauma. A higher proportion of DCD TXs were double lung procedures and DCD TXs had a longer ischemic time than DBD transplants. Median length of hospital stay was longer in DCD than DBD TXs. There was no significant difference in unadjusted patient survival rates (Figure 1 ), freedom from BOS and freedom from severe renal dysfunction between DBD and DCD TXs within 3 yrs after transplant. Rate of treatment for acute rejection within 3 yrs was higher in DBD than DCD TXs (44% vs. 37%; p=0.02). After adjustment for differences in donor, recipient, and transplant characteristics, 3-year survival after DCD and DBD TXs was similar (p= 0.80). Conclusions: Although the number of DCD lung TXs performed in the US is still relatively small, post-TX outcomes within 3 yrs are comparable to DBD TXs. Rate of treatment for acute rejection within 3 yrs is lower in DCD than in DBD LU TXs. These results suggest adoption of utilization of DCD donor lungs should be further expanded. Purpose: Lung transplantation (LTx) offers a survival benefit for patients with end-stage lung disease. When suitable donors are identified, centers must accept or decline the offer for a matched candidate on their waitlist. The purpose of the study was to evaluate the degree to which center-level variability in organ acceptance impacts candidate survival. We performed a retrospective cohort study of candidates aged ≥ 12 waitlisted for isolated LTx in the US using UNOS/OPTN data from 5/2007-5/2017. Centers that never had candidates ranked first on a match run >10 times in a year were excluded. Logistic regression was fit to assess the relationship of offer acceptance with donor-, candidate-, and geographic factors. Listing center was evaluated as a fixed effect to determine the adjusted per-center acceptance rate. Competing risks analysis employing the Fine-Gray model was undertaken to ascertain the relationship between the adjusted per-center acceptance rate and waitlist mortality. Results: Of 15,847 unique organ offers, 4,735 (29.9%) were accepted for the firstranked patients. After adjustment for important covariates, transplant centers varied markedly in acceptance rate practices (9% to 67%). Higher cumulative incidence of 1-year waitlist mortality was associated with lower acceptance rate ( Figure 1 ). For every 10% increase in adjusted center acceptance rate, the risk of waitlist mortality decreased by 36.3% (subdistribution hazard ratio 0.637; CI 0.592, 0.685). 
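As an illustration of the Fine-Gray competing-risks approach described in the offer-acceptance analysis above, the R sketch below fits a subdistribution hazard model for waitlist mortality, treating transplant as a competing event, and plots cumulative incidence by acceptance-rate group. It uses the cmprsk package and fully simulated data; the variable names, effect sizes, and package choice are illustrative assumptions, not the study's actual dataset or code.

```r
## Minimal sketch of a Fine-Gray competing-risks analysis (simulated data, hypothetical variables)
# install.packages("cmprsk")  # if needed
library(cmprsk)

set.seed(1)
n <- 2000
# Hypothetical candidate-level data: adjusted center acceptance rate (expressed per 10%),
# follow-up time (days), and event status (0 = censored, 1 = waitlist death, 2 = transplant).
accept_rate10 <- runif(n, 0.9, 6.7)
ftime   <- rexp(n, rate = 0.002 * exp(-0.45 * (accept_rate10 - mean(accept_rate10))))
fstatus <- sample(0:2, n, replace = TRUE, prob = c(0.3, 0.2, 0.5))

# Fine-Gray subdistribution hazard model for waitlist death (failcode = 1),
# with transplant (fstatus = 2) handled as a competing risk rather than censoring.
fg <- crr(ftime, fstatus, cov1 = cbind(accept_rate10), failcode = 1, cencode = 0)
summary(fg)   # exp(coef) is the subdistribution hazard ratio per 10% increase in acceptance

# Cumulative incidence of waitlist death by acceptance-rate tertile
tertile <- cut(accept_rate10, quantile(accept_rate10, c(0, 1/3, 2/3, 1)), include.lowest = TRUE)
ci <- cuminc(ftime, fstatus, group = tertile)
plot(ci)
```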
Importantly, high-acceptance centers that accepted ≥ 40% of their first-ranked offers had improved 1-year posttransplant survival compared to centers with adjusted organ acceptance rates <25% (88.7% vs. 82.7%, log-rank test p=0.003). Conclusions: Variability in center-level behavior potentially represents the largest modifiable risk factor for prolonged waitlist times and mortality in LTx. Further intervention is needed to standardize center-level organ offer acceptance practices and minimize waitlist mortality at the national level. Purpose: We proposed to evaluate physical frailty in lung transplant candidates to determine whether physical frailty correlated with LAS score and added to ability to predict patient outcomes including death on the waiting list. Methods: In this single-center prospective study, frailty was assessed in lung transplant candidates ≥ 60 years of age using Fried's Frailty Phenotype (FFP) and the Short Physical Performance Battery (SPPB). The FFP score is based on selfreported weight loss, fatigue, and physical activity, and measures of gait speed and grip strength. The SPPB score consists of measures of gait speed, balance, and lower extremity strength. Using previously established cutoffs, an FFP score of 0 = non-frail; 1-2 = pre-frail; and 3-5 = frail, while an SPPB score of 0-6 = frail; 7-9 = pre-frail; and 10-12 = non-frail. Results: Out of 44 subjects, 38 completed both SPPB and FFP assessments, while 6 completed only SPPB assessments. The median SPPB and FFP scores were 8 and 3 respectively ( Figure 1 ). By SPPB criteria 34% of subjects are frail, 41% are prefrail, and 25% are non-frail. By FFP criteria, 55.3% are frail, 39.4% are pre-frail, and 5.2% are non-frail ( Figure 2 ). The average LAS score is 44.2 (±14.2). Both the SPPB (r 2 =0.19, p = 0.01) and FFP (r 2 =0.33, p<0.01) are correlated with LAS score. 6 patients died on the waiting list. 16 candidates underwent transplantation. Conclusions: Although both physical frailty assessments correlated with LAS score, the FFP score is more strongly correlated. The SPPB and FFP assessments differed, resulting in discordant predictions of the prevalence of frailty in lung transplant candidates. Ongoing studies will determine whether SPPB data can add to the predictive value of LAS in terms of death on the waiting list and post-transplant outcomes for the growing numbers of older lung transplant candidates. The Results: Forty-eight SPK recipients with graft survival for more than 25 years transplanted between 1985 and 1993 were analyzed. Forty-seven patients underwent SPK with Bladder Drainage (BD). During the course, 8 patients lost their pancreas and kidney graft after lasting for at least 25 years. Forty still had a functional pancreas graft for more than 25 years as of 10/31/2018. Of these forty SPK recipients, transplanted between 1986 and 1993, 55% were male and all Caucasian. Thirtynine had BD. OKT3 was used in 33 patients for induction. Overall 3 had pancreas rejection. 27 required enteric conversion. Renal allograft failure occurred in 12 patients and 10 underwent a re-kidney transplant. Lower extremity amputation was required in 8 patients. Overall only 6 patients were on their original immunosuppressive regimen and 8 were not on a CNI based regimen. At last follow up, 39 recipients had hypertension, 27 had hyperlipidemia and 3 were on low dose insulin or oral agent for diabetes. The mean HgbA1c was 5.6 ± 0.6% and mean serum creatinine was 1.6 ± 0.6 mg/dl. 
Conclusions: With careful and detailed follow-up and attention to complications, some recipients of pancreas grafts have outstanding outcomes. Our report suggests, normalization of glucose control alone does not prevent secondary complications or perhaps immunosuppressive therapy causes secondary complications. As the number of pancreas transplant recipients with prolonged graft survival may be rising, health care providers should be aware of the management of complications associated with prolonged graft survival in this unique group of patients. Purpose: Outcome data following rejection in pancreas recipients are sparse. We sought to study the influence of C4d positivity and the presence of circulating DSA on the probability of becoming insulin dependent after initial rejection and the probability of recurrent rejection after initial rejection. Methods: A retrospective analysis of 272 primary simultaneous kidney and pancreas (SPK) and pancreas alone transplant (PAT) recipients from 1/1/2009 to 3/31/2018. Of these 53 recipients were treated for acute pancreas rejection. The presence of C4d staining on biopsy, circulating donor specific antibodies (DSA) and the date of return to insulin were identified. 8 subjects were excluded due to missing C4d staining. C4d status and circulating DSA grouped the subjects into 3 groups: C4d and DSA negative (-/-), C4d and DSA Positive (+/+), and either alone positive (+/-). Kaplan Meier curves were created to examine the probability of becoming insulin dependent after the initial rejection and the probability of recurrent rejection after initial rejection episode. Results: Out of 45 unique initial pancreas rejection episodes in SPK and PAT recipients, 27 recipients were in (-/-) group, 10 recipients were in (-/+) group, and 8 recipients were in (+/+) group. Estimates of insulin free survival at one-year post rejection were 93% (-/-), 55% (+/-), and 38% (+/+). There were significant differences in overall insulin free survival among groups (figure 1, log-rank p<0.001). Pairwise differences occurred for the both negative group vs. the positive and mixed groups (adjusted p<0.01). Estimates of recurrent rejection free survival at one-year postrejection were 85% (-/-), 60% (+/-) and 63% (+/+). There was no difference in overall recurrent rejection free survival (figure 2, log-rank p=0.29). In the settings of initial acute rejection, the absence of both circulating DSA and C4d positivity, compared to the presence of circulating DSA and/or C4d positivity, is associated with a lower risk of return to insulin dependence (surrogate of pancreas graft loss), but not the risk of recurrent rejection in the year following the initial episode. More studies are needed to identify and validate predictors of the point of no return to help inform the decision to treat and perhaps the choice of treatment. Purpose: Organ donors frequently develop hyperglycaemia in intensive care, which is managed with insulin. In islet and solid pancreas transplantation (PT) we reported that Donor Insulin Use (DIU) predicts worse beta cell function. Here, we aimed to: a) determine relationships between DIU and graft failure in PT; and b) describe donor phenotypes related to DIU predicting optimal outcomes. Methods: In data from the UK PT programme, regression models determined: a) associations between DIU and graft failure; and, b) the relationship between several donor phenotypes and graft failure, relative to an optimal donor phenotype. 
Net Reclassification Improvement assessed the added value of DIU as a predictor of graft failure. Results: 107 patients satisfied our selection criteria, 23 in Re-Ptx+ and 84 in Re-Ptx-. Most patients in both groups were male and Caucasian. Mean interval from SPK to pancreas failure was significantly shorter in the Re-Ptx+ compared to the Re-Ptx-group, 16.0 ± 31.9 vs 45.7 ± 47.0 months (p=0.005) respectively. There was no significant difference in kidney graft follow up post SPK between two groups (p=0.38). At last follow up, kidney graft failure was significantly higher in the Re-PTx-group compared to the Re-PTx+ group, 67% vs 43 % (p=0.04). Death-censored kidney graft failure was also higher in the Re-PTx-group, 48% vs 26% (p=0.06). In Kaplan-Meier survival analysis, kidney graft survival probability was significantly lower in the Re-PTx-group ( Figure 1 ). A similar pattern was seen after 1:1 matching for the interval between SPK and pancreas graft failure. The Re-Ptx+ and Re-PTxgroups had similar eGFR and serum creatinine at last follow up. In SPK recipients with pancreas graft failure, repeat pancreas transplant after pancreas graft failure is associated with better kidney graft survival. Even though Re-PTx patients take the risks associated with repeat pancreas surgery, the provider should discuss this option. Purpose: CNI based Immunosuppression is a well known risk factor for worsening renal function. We report our experience in converting Pancreas Transplant recipients from tacrolimus to Belatacept in order to avoid further worsening of kidney function. Methods: Chart review was performed on all patients with a pancreas transplant who were maintained on Belatcept maintenance immunosuppression. Results: Eight EBV IGG positive (7 PTA, 1 SKP, mean age at start of Belatacept= 52.4+/-7, 7 Females 7 Caucasians) patients initially maintained on tacrolimus, sirolimus and mycophenolate with biopsy proven chronic kidney fibrosis consistent with CNI toxicity were converted from Tacrolimus to Belatacept. Median Follow up of 53.7 months. Mean tacrolimus levels prior to switch were 6.4+/-1.4 ng/ml.Median time from transplant to conversion was 4.7 years (range0.3-9.5). Tacrolimus was weaned off over 4-6 weeks. Patients were maintained on a steroid free regimen of Belatacept, Sirolimus (level 3-6ng/ml) and Mycophenolate. Pre-conversion the mean MDRD eGFR was 29.6+/-9.9., which stabilized or improved over the follow up period to an eGFR of 35.4+/-9.8 ml/min. One patient could not tolerate the oral regimen due to GI side effects and another patient was non compliant. 1 case experienced elevation of Lipase requiring steroid therapy with subsequent successful response and continued on the same regimen with Belatacept. Subsequently at 21 months of Belatacept this subject (2) , 2008) , and to demonstrate the use and limitations of UNOS/OPTN data in ascertaining long-term outcomes in RCT participants. Methods: Astellas provided the original study data file containing identifying data for participants. UNOS used a multi-step process to link participants to UNOS/ OPTN data. A de-identified data set was produced and the original study data was returned to Astellas. Both intention to treat (ITT) and per protocol (PPR) analyses were performed: 70 CSWD and 73 CCS patients had discontinued the trial assigned treatment protocol by five years after the date of transplantation. 
We were unable to determine protocol discontinuation beyond five years because of absent data regarding maintenance immunosuppressant use. Results: Follow-up extended to fifteen years, and outcomes could be determined for 385/386 study participants in UNOS files, including all 196 of the patients randomized to CSWD at 7 days after transplantation and 191/192 of the patients randomized to CCS. The figure shows that time to allograft failure from any cause including death (a), death-censored allograft failure (b), and death with allograft function (c) were similar in patients randomized to CSWD or CCS (results were unchanged in PPR analyses). After 15 years, the MDRD estimated GFR was 55.7 ± 24.0 ml/min/1.73m2 in the CSWD group and 55.3 ± 27.2 ml/min/1.73m2 in the CCS group, with similar results in PPR analyses. The results were consistent in stratified analyses among African American (AA) and non-African American participants, and among living and deceased donor recipients (Table 1). Table 2 shows 10-year outcomes stratified by donor source compared to single-center data in patients treated with CSWD at the University of Minnesota (Rizzari et al., CJASN 2012) and contemporary national data (2002 SRTR ADR). Irrespective of treatment group, outcomes in trial participants were generally superior to single-center or national outcomes. Conclusions: Patients randomized to treatment with early CSWD or CCS had near-identical outcomes after 15 years of follow-up, with consistent results in AA and non-AA participants and among living and deceased donor transplant recipients. The UNOS registry can be successfully utilized to obtain long-term RCT outcomes, and this capacity should be expanded to promote future trials in transplantation. Purpose: There is a lack of data about the impact of immunization after transplantation on re-transplantation. The aim of this study was to examine the timing, epitope triggers, and long-term effects of HLA immunization, graft failure, or re-listing on re-transplantation. Methods: From 1997 to 2017, 267 kidney graft failures in adults were detected and retrospectively analyzed. DnDSA were detected by solid-phase assays. Epitope matching was performed to predict development of dnDSA using the PIRCHE algorithm. Furthermore, the waiting time between re-listing at Eurotransplant and re-transplantation, as well as graft survival and patient mortality depending on immunization, were analyzed by means of Cox proportional hazards regression. Landmark analysis was used to avoid immortal-time bias when assessing the effect of immunization on graft survival. Results: In total, the risk for graft failure was highest in the first two years after transplantation and then moderately decreased (Fig. 1). DnDSA were detected in 137 allograft failures (51.3%). The occurrence of dnDSA was associated with a higher PIRCHE score (Fig. 2). When immunization was diagnosed within one year (Fig. 3) or within three years (Fig. 4) after transplantation, the patients had a significantly higher risk of graft loss. The waiting time for re-transplantation was longer when patients were immunized before relisting at Eurotransplant (Fig. 5). Surprisingly, when dnDSA were diagnosed before the first graft failure (66 patients), their mortality was lower, especially within the first two years after graft failure (18.8% vs. 34.4%) (Fig. 6). Immunization appears to have a substantial impact on waiting time for re-transplantation and kidney allograft survival.
The PIRCHE algorithm may help to reduce the risk of dnDSA. The relationship between immunization and mortality needs further evaluation. Purpose: A decline in estimated glomerular filtration rate (eGFR) during the first 12 months following kidney transplantation is associated with poor outcome and increased graft loss. Assessing whether elevation in donor-derived cell-free DNA (dd-cfDNA) is predictive of 2 nd year eGFR decline may help to predict long term outcome. Methods: 173 patients identified from the DART study had dd-cfDNA (AlloSure®) and eGFR measured 1-10 times during the first-year post transplant and 1-6 times during follow up visits during the second year. The mean eGFR results from year 2 were compared to the mean from year 1 (day 90-365) in patients with ≥1 elevated dd-cfDNA (AlloSure ≥1%) in the first year vs. those without dd-cfDNA elevation. Association between elevated dd-cfDNA (≥1%) and future events, defined as an instance of low eGFR below a target level of 15 -30 ml/min/1.73m 2 , were also tested. dd-cfDNA was also explored as a risk factor in a Cox proportional adjusted hazards model. Results: 33 patients with ≥1 elevated dd-cfDNA in days 90 -365 were compared to 133 patients without dd-cfDNA elevation. 24/33 (73%) of patients with high 1 st year dd-cfDNA (≥1%) had a significant drop in eGFR in year 2 (median eGFR change -25%, IQR -46% to +2%) compared to 60/140 (45%) patients without elevated dd-cfDNA (median eGFR change +2%, IQR -18% to +45%), p = 0.002. dd-cfDNA ≥1% was associated with eGFR < 30 mL/min (p = 0.040, Chi Square test) and was a significant risk factor for eGFR below a threshold of 30 in the Cox model (p = 0.047), having a hazard ratio of 2.31 (95% CI 1.01 -5.28). The cause of eGFR decline remains multifactorial; however, elevations in 1st year dd-cfDNA (≥1%) are associated with declining eGFR in year 2. dd-cfDNA may have utility beyond detecting acute events, and regular surveillance may be a useful tool in prediction of long term outcomes. Purpose: Development of donor specific anti-HLA (DSA) although is a known risk factor for Chronic immune injury (CII), it is undetectable in many patients with CII, suggesting that immune response to non-HLA antigens may contribute to rejection. Previous reports have described antibody-mediated damage and poor allograft survival caused by immune response to several kidney self-antigens (SAgs) Collagen IV (Col-IV), Perlecan, and Fibronectin (FN)). The aim of this study was to determine if humoral immune responses to SAgs correlates with progression of CII changes at 1 year or 2 years (ci>1, cg>0). Methods: Patients who received a kidney transplantation at our institution over a 3 year period and had 1 or 2 year biopsies with ci>1, cg>0 were included in this study. We matched all cases to 2 controls, on recipient age, race, donor age, Living or Deceased and CV>0 on 4 month biopsy (figure 1.). Sera were obtained at 0, 4, 12, and 24 months and retrospectively analyzed for the presence of antibodies to SAgs using ELISA. The presence of antibodies to SAgs were compared at 0, 4, 12, and 24 months using conditional logistic regression. We identified a cohort of 214 kidney transplant recipients. Of these, we identified 33 cases (mean age: 52.4 years, mean KDPI: 66.5, Alemtuzumab induction: 58%, pretransplant DSA with >2000 MFI: 12.1%) and matched 66 controls. Logistical regression showed an odds ratio of 1 with the confidence interval crossing 1 for presence of response to SAgs at all the time points. 
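The self-antigen case-control comparison above relies on conditional logistic regression within 1:2 matched sets. The short R sketch below shows that structure using survival::clogit on invented data; the set sizes, variable names, and antibody prevalence are placeholders, not the study's data.

```r
## Minimal sketch of a 1:2 matched case-control analysis with conditional logistic regression
library(survival)

set.seed(3)
n_sets <- 33
d <- data.frame(
  set      = rep(1:n_sets, each = 3),            # one case matched to two controls per set
  case     = rep(c(1, 0, 0), times = n_sets),    # 1 = progression of chronic injury (case)
  anti_SAg = rbinom(3 * n_sets, 1, 0.35)         # antibody to a kidney self-antigen (0/1)
)

# Conditional logistic regression stratified on the matched set
fit <- clogit(case ~ anti_SAg + strata(set), data = d)
summary(fit)  # exp(coef) is the matched odds ratio; a CI crossing 1 indicates no association
```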
In conclusion, in this preliminary study, humoral immune responses to either SAgs alone or in combination with DSA was not associated with progression of CII changes at 1 and 2 years after kidney transplantation. (15) and recurrent TG (34) patients. Patient Characteristics were similar between groups. Results: There were 233 genes more than 2-fold differentially expressed between normal and TG patients. Of these 233 genes, majority of them (199) were upregulated. To investigate the possible role of circulating miRNAs in the gene expression in renal tissue, we performed upstream analysis using Ingenuity Pathway Analysis (IPA) and predicted miR21, miR-155 and miR-223 as regulators of these gene expression changes. We have validated the changes in miR-21, miR-155 and miR223 by RT-qPCR using TaqMan miRNA probes. As seen in Figure 1A , the genes in pink were up regulated in TG patients compared to normal and miR-155 downregulate these genes directly or indirectly. As expected, miR-155 was downregulated in TG compared normal serum samples based on RT-qPCR results ( Figure 1B ). This observation encouraged us to examine the global miRNA changes in serum of TG patients using next generation sequencing (NGS). Figure 1C shows the representative gel image showing the cloned library from circulating RNAs for NGS. Our initial results demonstrated correlation between the circulating miRNAs and gene expression in graft tissue. Further evaluation of these small RNAs will provide further critical information about TG recurrence post KT Figure Odds of receiving a KDPI≤20% organ were significantly higher in Region 6 than any other part of the country, and lowest in Region 9 [odds ratio 0.19 (0.13-0.28)] [ Figure 1 ]. Conclusions: Over a third of KDPI≤20% kidneys go to patients with EPTS>20% on account of multi-organ allocation and prioritization for sensitized patients. There is considerable unexplained geographic variation in the odds of obtaining a KDPI≤20% organ for candidates with EPTS≤20% at listing that needs further study. Since OPTN policy prescribes that the kidney "follows the life-saving organ (e.g., liver, heart)," we extended the methodology to quantify and track the access advantage afforded KI candidates also listed for a liver (SLK) or heart (HR-KI), in light of the recent policy change requiring evidence of renal insufficiency for SLK Tx. Methods: Poisson rate regression with 17 candidate covariates was applied to 22 quarterly, period-prevalent cohorts (1/1/13-6/3/18) of active KI waiting list (WL) registrations including SLK and HR-KI, using OPTN data. Factor-specific disparities were quantified as the standard deviation (SD) of log(Tx rate) among registrations, holding other factors constant. To compare relative advantages, we calculated deceased KI Tx incidence rate ratios (IRR) for groups of candidates who receive high priority in KI allocation. Results: Pre-SLK policy change IRRs of 4.97 and 5.60 for SLK and HR-KI suggest MOT kidney candidates have a ~5-fold advantage in access to Tx compared to KIalone adult candidates, akin to the pediatric advantage, but not as high as for prior living donors (IRR=23.95). The SLK IRR declined slightly to 4.22 post policy change (Fig 1) . Despite the high IRRs for MOT, Fig Conclusions: MOT KI candidates receive an approximately 5-fold advantage in access to KI Tx compared to an average KI-alone candidate, but not as great as some highly prioritized KI-alone candidates (e.g., prior living donors). 
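The incidence-rate ratios quoted above come from Poisson rate regression of transplant counts against waiting-list exposure time. The toy R sketch below shows that calculation with a log person-years offset; the groups, counts, and exposure times are made up, and this is only a simplified stand-in for the registry analysis with its 17 covariates and quarterly cohorts.

```r
## Toy Poisson rate regression for transplant incidence-rate ratios (fabricated counts)
d <- data.frame(
  group    = c("KI_alone", "SLK", "HR_KI", "prior_living_donor"),
  tx_count = c(1200, 150, 40, 60),      # deceased-donor kidney transplants in the period
  wl_years = c(24000, 600, 140, 50)     # active waiting-list person-years at risk
)
d$group <- relevel(factor(d$group), ref = "KI_alone")

# Counts modeled with log(person-years) as the offset; exponentiated coefficients
# are transplant incidence-rate ratios relative to kidney-alone candidates.
fit <- glm(tx_count ~ group + offset(log(wl_years)), family = poisson, data = d)
exp(cbind(IRR = coef(fit), confint.default(fit)))   # Wald 95% CIs
```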
The slight decline in SLK IRR is consistent with the modest decline in the number of post-policy SLK Tx. Population-weighted disparities associated with SLK and HR-KI are relatively small, primarily since MOT represent a small % of the WL, and have not changed appreciably post-SLK policy. Figure 1 illustrates the number of candidate offers by donor expected kidney yield in relation to donor KDPI and the actual number of kidneys transplanted. The observed patterns correspond to a weak positive association (r=0.20, p<0.001) and no significant association (r=-0.004, p=0.85) between number of candidate-level offers per match and expected kidney yield and KDPI, respectively. There was a moderately negative correlation between KDPI and expected kidney yield (r=-0.55, p<0.001). The Organ Center works with a large cohort of difficult-to-place kidneys, providing a unique opportunity to consider improved placement to allow more efficient allocation progression and improved organ utilization. Not only are high-KDPI donors a large portion of the OC's portfolio, but they are also coming to the OC after being offered and refused at both the local and regional level and less than half of these donors have an expected kidney yield less than one. This cohort proves a suitable subset of organs for defining parameters to identify and refine a systematic approach to expand to the community as a whole, focusing on increasing placement of these difficult-to-place kidneys. Purpose: Approximately 3,500 donated kidneys are discarded in the US each year, drawing concern from Medicare and advocacy groups. We hypothesised that more aggressive organ acceptance practices would provide substantial benefit to waitlisted kidney transplant candidates and increase the donor pool. Methods: We performed a nationwide study using registries from the US and France comprising comprehensive cohorts of deceased donors with organs offered to kidney transplant centers between 2004 and 2014. We compared practice patterns and used logistic regression as well as computer modelling of organ acceptance and discard practices in both countries. Based on actual survival data, we then quantified the number of years of allograft life that a redesigned US system would have saved. Purpose: International discussions in 2017 about kidney "discard rates" between transplant (TX) professionals in the US and UK led to the realization that the two countries' approaches for quantifying this rate may not be directly comparable. While both define this rate as the % of kidneys recovered for TX that aren't TXed, in the UK kidneys are almost never recovered prior to TX program acceptance, whereas pre-acceptance recovery is common in the US. We compared trends in several kidney utilization rate calculations and other recovery and usage patterns across countries to identify cross-country learning opportunities. Methods: We analyzed deceased donor kidneys (recovered; TXed) from 2006-2017 in the OPTN and UK Transplant registries. Donor age and Kidney Donor Risk Index (KDRI) distributions were compared. The utilization rate (UR) was defined as the number of transplanted kidneys divided by various denominators, to assess trends and attempt to provide cross-country comparisons. Rates were stratified by KDRI and other factors. Results: Regardless of definitions, kidney URs have been steady in both countries over the past 5 years (Fig 1) . Among kidneys recovered for transplant (solid curves), 90% are utilized in the UK vs. 
81% in the US, despite substantially higher KDRI and age distributions in the UK among both recovered and TXed kidneys (Fig 2) . In the US, 95% of kidneys with a documented final acceptance are utilized. In 2017, 41% of utilized kidneys in the UK were from DCD donors compared with 19% in the US. Only 0.8% of kidneys TXed in the UK were en bloc compared with 2.6% in the US. Table 1 . By univariate the only significant difference in those still alive compared to those who died was younger age at donation (p<0.001). By Cox PH regression for mortality there were no significant differences between groups. KM mortality curve is shown in Figure 1 . The incidence of mortality at 50 years post-donation was 63.9% overall with mean (SD) age at death of 74.2 (12.3): 25.2% CVD, 75.6 (12.6); 9.8% infection, 76.5 (11.6); 12.3% malignancy, 70.9 (6.0); and 23.8% other/unknown, 73.5 (14.1). Figure 2 shows the cumulative incidence of HTN, DM and proteinuria. HTN developed in 79.7%, mean age (SD) of 59.9 (14.8); diabetes in 19. 3%, 62 (19.4) , and proteinuria in 18.1%, 60.6 (18.2). Two donors (3.3%) developed ESRD; 1 hypertension and 1 hemolytic uremic syndrome. Conclusions: Of 66 donors who donated more than 50 years ago, 5 were lost to follow-up. In the LTFU cohort, 22 (36%) survived ≥ 50 years and 39 (64%) died. The main cause of death was CVD. HTN was the most common post donation comorbidity, with half of donors developing the condition by 36.5 years post. Two patients (3.2%) developed ESRD. steroid only induction. All patients received IV steroids for treatment for GVHD. 69% were also treated with Etanercept and 39% with photophoresis. Mortality was 62% with median follow-up of 115 days. Median time from diagnosis to death was 33 days. Survival was associated with only skin involvement on presentation and negative donor STR in the blood. Conclusions: This study aids in our understanding of a dismal complication after LT. In this cohort, the majority of patients presented with rash and fevers, the most common etiology of liver disease was NASH, and a donor-recipient age gap >20 years was commonly seen. Despite the high mortality with GVHD, limited skin involvement portends a favorable prognosis compared to all other presentations. Further research is planned to compare this cohort with non-GVHD control group in order to better understand risks factors and predictors for disease. . There were no significant differences in operative blood loss, transfusion requirements, operative time, cold ischemia time, warm ischemia time, intraoperative flow measurements, ICU or hospital length of stay between the 2 groups. Primary graft non-function was not observed in either group. Some postoperative complications are summarized in Table 1 . All cases of post-operative ascites resolved within 3 months of LT. The eGFR decreased by 23% in the DSVT group and by 21.5% in the NDSVT group at 5 years (p 0.3). Survival at 5 years was similar for patients with DSVT and NDVST (93.7% vs 96.8%, p=0.9). Conclusions: RPA is an established technique in the management of patients with DSVT during LT, with comparable outcomes to patients without thrombosis. Our report is the first to demonstrate similar long-term renal function in patients undergoing RPA for DSVT when compared to patients undergoing conventional LT. We suggest that RPA be considered in patients with DSVT with and without SRS as a method by which to negate alternative approaches to DSVT, such as cavoportal hemitransposition or multivisceral transplant. 
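Several of the comparisons above (donor survival over 50 years, DSVT vs. non-DSVT recipient survival) rest on Kaplan-Meier estimation with log-rank testing. The R sketch below shows that workflow on simulated data; the group labels, event rates, and 5-year horizon are illustrative assumptions rather than the reported cohorts.

```r
## Kaplan-Meier estimation and log-rank comparison between two groups (simulated data)
library(survival)

set.seed(5)
n <- 120
d <- data.frame(
  group = rep(c("DSVT", "NDSVT"), each = n / 2),
  years = pmin(rexp(n, 0.015), 5),      # follow-up truncated at 5 years
  died  = rbinom(n, 1, 0.05)            # 1 = death observed during follow-up
)

fit <- survfit(Surv(years, died) ~ group, data = d)
summary(fit, times = 5)                          # Kaplan-Meier survival estimates at 5 years
survdiff(Surv(years, died) ~ group, data = d)    # log-rank test between groups
plot(fit, col = 1:2, xlab = "Years after transplant", ylab = "Survival probability")
```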
Purpose: Non-alcoholic steatohepatitis (NASH) cirrhosis is an increasingly common indication for liver transplantation (LT) in the United States and obese patients continue to be at risk for recurrence of NASH following LT. Intensive lifestyle and medical induced weight loss may prevent NASH recurrence following liver transplantation (LT) but, similar to obesity without liver disease, has rarely demonstrated a sustained effect. In contrast, bariatric surgery is remarkably effective in producing sustained weight loss. Methods: A single-institution, prospectively maintained database from 2014-2018 was queried for patients undergoing laparoscopic sleeve gastrectomy (LSG) following LT. The selection criteria for surgery were consistent with current NIH guidelines and patients were required to be a minimum of 6 months from their LT. Results: We identified 15 patients who underwent LSG following LT during the study period. The patients were predominantly Caucasian (86.7%) and female (60%), with median age of 59.0 years. Median time from LT to LSG was 2.2 (1.4-5.4 ) years and median follow-up was 2.6 (0.6-3.9) years. Following LSG, BMI decreased from 43.0 to 35.9 kg/m 2 (p<0.001). In 12 patients with at least 1 year of follow-up, the percent total body weight loss was 20.6 % (14.5-27.6). Average daily insulin requirement decreased from 26.9 to 9.9 units/day following LSG (p<0.05). Four of six patients resolved their insulin dependence following LSG. The number of anti-hypertensive medications remained unchanged. No patients required conversion to an open procedure, one patient required ICU monitoring due to intra-operative blood loss, but did not require a transfusion. The median length of stay was 2 (1) (2) days. There was one post-operative complication that consisted of a surgical site infection. There were no postoperative deaths. In the largest series to date, we demonstrate the safety and efficacy of laparoscopic sleeve gastrectomy in post-liver transplant recipients along with its pronounced effects on the resolution of T2DM. Our experience highlights the relatively low morbidity and mortality in the delayed approach compared to the previously reported outcomes in simultaneous LSG/LT. (1) assess the efficacy of proximal SAE for the treatment of RA and RH caused by portal hyperperfusion (PHP) after LT and to (2) Purpose: This study describes the comparative safety and efficacy of direct acting oral anticoagulants (DOACs) relative to warfarin following liver transplantation, at a large academic transplant center. Methods: This was a single-center, retrospective cohort review of adult liver transplant recipients prescribed either a DOAC or warfarin between January 2014 and January 2018. Patients were excluded if they had active cancer or discontinued anticoagulation therapy prior to 60 days for a reason other than acute bleeding or thrombosis. Patients receiving DOACs were matched with warfarin treated controls using an exact greedy matching algorithm based upon the following clinical parameters: donor type, age, history of hepatocellular carcinoma, indication for anticoagulation, HAS-BLED score, timing of anticoagulation, and duration of anticoagulation. Matched patients were then followed from the time of anticoagulation initiation, until treatment discontinuation or study conclusion. The primary endpoint for this review was the incidence of clinically significant bleeding between the two treatment groups. 
Secondary endpoints included incidence of major bleeding, incidence of new thrombotic complications, and change in calcineurin inhibitor trough/dose ratios at 30-days post-DOAC initiation. Results: A total of 27 patients prescribed DOACs were identified for inclusion in the study. Among these patients, there were 4 episodes of clinically significant bleeding and 4 instances of new or recurrent thrombosis over a median of 351 (61-784) anticoagulation days. Exact warfarin matches could be found for 20 out of the 27 patients originally managed with DOACs. Demographic and clinical characteristics were similar between the two treatment groups at baseline. At the conclusion of the study review period, the DOAC treated patients were found to have significantly lower rates of clinically significant bleeding (3 events vs. 10 events; p = 0.01) and major bleeding (0 events vs. 4 events). No statistically significant differences were found in the rates of new or recurrent thrombotic events between the DOAC and warfarin arms (4 events vs. 3 events; p = 0.67), and no differences were observed in tacrolimus trough/dose ratios at 30-days following DOAC initiation. After logistic regression, treatment assignment with warfarin continued to be associated with a significantly higher odds of clinically significant bleeding (OR = 12. The primary objective is to describe the impact of bezlotoxumab on 90-day CDI recurrence in the SOT population at a single academic center. We further provide a description of rCDI risk factors, incidence of acute kidney injury (AKI), severe CDI, and 90-day mortality. Methods: This is a retrospective chart review of adult SOT recipients with CDI diagnosed between June 30, 2017 and September 30, 2018. Patients received standard-of-care antibiotics with or without bezlotoxumab per physician discretion. Results: A total of 21 patients received treatment for CDI during the study period; n = 12 in the bezlotoxumab group, n = 9 in the control group. As noted in Table 1 , baseline characteristics were similar. 90-day recurrence rates in patients that received bezlotoxumab was 9% (n = 1 of 11*) numerically lower than the control group recurrence rate of 33% (n = 3 of 9), however there was no statistical difference (p = 0.368). There were no deaths within 90 days of CDI. ) of all inpatients with a positive CD screen were treated with oral vancomycin within 48hrs of swab collection. In the SOT cohort, 22% (9/41) with positive CD screens were treated with oral vancomycin within 48hrs. 56% (5/9) of SOT patients who received oral vancomycin did not have true infection (defined as documentation of diarrhea in the nursing record or the physician notes). Conclusions: At our institution, SOT recipients were more likely to have CD colonization detected by peri-rectal screening than the general inpatient population. SOT and non-SOT patients were treated with oral vancomycin at similar rates in response to the positive screen. Over half of the oral vancomycin use in SOT recipients was likely overtreatment, but this finding is limited by the low number of patients in this cohort. Conclusions: Based on ICD-9 or ICD-10 hospital discharge codes, the overall prevalence of PCP and toxoplasma infections was low, but with significant morbidity and all-cause mortality in pediatric transplantation recipients. 
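The DOAC-versus-warfarin comparison above matched treated patients to controls exactly on a set of clinical parameters before comparing bleeding. The R sketch below approximates that step with the MatchIt package and a weighted logistic model on simulated data; the covariates, prevalences, and use of MatchIt's exact matching are assumptions for illustration and are not necessarily the tools the authors used.

```r
## Sketch of exact matching followed by a matched outcome comparison (simulated data)
# install.packages("MatchIt")  # if needed
library(MatchIt)

set.seed(6)
n <- 300
d <- data.frame(
  doac        = rbinom(n, 1, 0.2),                    # 1 = DOAC, 0 = warfarin
  donor_type  = sample(c("living", "deceased"), n, replace = TRUE),
  hcc_history = rbinom(n, 1, 0.3),
  hasbled_hi  = rbinom(n, 1, 0.4),                    # dichotomized HAS-BLED score
  bleed       = rbinom(n, 1, 0.15)                    # clinically significant bleeding
)

# Exact matching of DOAC recipients to warfarin controls on the listed covariates
m  <- matchit(doac ~ donor_type + hcc_history + hasbled_hi, data = d, method = "exact")
md <- match.data(m)                                   # matched sample with matching weights

# Weighted logistic regression for the bleeding endpoint in the matched sample
# (quasibinomial avoids warnings from non-integer matching weights)
fit <- glm(bleed ~ doac, data = md, weights = weights, family = quasibinomial)
exp(coef(fit))                                        # odds ratio for bleeding, DOAC vs. warfarin
```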
Purpose: This study will assess whether a higher number of medication changes from pre- to post-transplant leads to worse clinical outcomes and higher healthcare utilization in kidney transplant (KT) recipients. Methods: Retrospective cohort study of adult KT recipients between 07/2015 and 07/2017. All patients who had an admission medication reconciliation verified by a pharmacist during their transplant hospitalization and were seen in clinic by a pharmacist within three days of discharge were included. Medication lists were compared pre- and post-transplant to identify the number and type of changes. Results: There were 344 patients included. The median net number of medication changes was 15, with ≥18 in the highest quartile. Patients with ≥18 medication changes had significantly higher BMIs and higher rates of preexisting DM, re-transplantation, and delayed graft function (Table 1). Ninety-day (45% vs 22%, P < 0.001) and 1-year readmissions (54% vs 34%, P = 0.001) were significantly higher in those with ≥18 medication changes. Overall healthcare utilization at 90 days post-transplant was significantly higher in those with ≥18 medication changes (Table 2). Time to first readmission was significantly shorter in patients with ≥18 medication changes (Figure 1). Conclusions: Eighteen or more medication changes during the transplant hospitalization in KT recipients may be a risk factor for increased readmission rates, decreased time to first readmission, and greater overall healthcare utilization. Patients with a high number of medication changes should be identified as high-risk patients, prompting transplant teams to optimize education and dedicate additional clinical resources to decrease readmissions and overall utilization in this patient population. Changes in HRQoL scores from baseline to 1 and 3 years post KTx are illustrated in the figure. Compared to baseline, the scores improved after one year for most dimensions, except for social function (SF). At three years the SF score had improved significantly, both clinically and statistically, but the physical function (PF) score declined to baseline level, probably due to ageing. The scores for the other dimensions remained relatively unchanged (Table). Conclusions: Our data suggest that HRQoL improves after KTx, also among older recipients, and that the improvement remains after three years. The most remarkable change is a marked improvement of social function, illustrating that the patients are finally "getting their lives back". RBANS population age-adjusted mean scores improved from baseline (84±14) to 3 months (87±15), though this was not statistically significant, p=0.33. Thirty-four percent of patients at baseline and 30% at 3 months had impaired global cognition (score <78). For the TRAILS tests, 34 and 32 patients underwent baseline and 3-month post-transplant testing, respectively. Although TRAILS A mean t-scores increased from baseline (41±10) to 3 months (45±11), this was not statistically significant, p=0.16. TRAILS B mean t-scores improved significantly from baseline (41±11) to 3 months (48±12), p=0.016. The proportion of patients with cognitive impairment (score ≤35) decreased from baseline to 3 months in both TRAILS A (38% to 22%) and B (29% to 16%). of patients were able to maintain serum phosphorus less than 5.5 for three months prior to transplant.
Of the 101 patients studied, only 11 were able to maintain all three parameters [minimal variance from treatment time, potassium less than 5.5, and phosphorus less than 5.5] for three consecutive months pre-transplant; see Table 1 for characteristics of these patients. There were no differences in one year graft and patient survival in patients who were able to maintain any or all of the parameters. Conclusions: Pre-transplant patients overwhelmingly do not maintain diet and dialysis adherence consistently. The utility of using potassium, phosphorus and variance from prescribed dialysis treatment time in recipient selection is unclear. loaded with a ligand of the AHR have been shown to promote the generation of Tregs by DCs and reduce disease in murine models of EAE and diabetes. We propose the use of these tolerogenic nanoparticles in models of allogenic transplant as a novel method of targeted immunosuppression via the AHR. Methods: PLA/PEG NPs approximately 50 nm in size were manufactured and loaded with ITE, an AHR ligand, and peptide antigen to the alpha chain of I-E (Eα-peptide) or ovalbumin to evaluate for antigen-specific response. BMDCs generated from AHR+/-and AHR-/-mice were exposed to free ITE, NP-ITE, NP-peptide/ITE. Cells were evaluated for uptake into DCs by flow cytometric analysis of Cy5.5-labeling of NP, and ability to activate AHR determined by RT-PCR expression of CYP1A1 and IDO1. DCs isolated from OT-II splenocytes were generated and exposed for 4 days to ova, NP-Ova, NP-Ova/ITE for evaluation of proliferative response and effect on cytokine production of IL-17 and IFN ;. NPs were utilized in-vivo by injecting CFSE-labeled OT-II splenocytes IV into B6 mice, treating the following day with ova peptide or NP, and performing flow cytometric analysis of proliferation after 3 days. Purpose: Mechanistic target of rapamycin complex 2 (TORC2) deficiency in conventional dendritic cells (DC) results in their enhanced pro-inflammatory and T cell allostimulatory activity; however, underlying mechanisms remains unclear. Methods: A Seahorse XFe96 Bioanalyzer was utilized to measure metabolic flux in real-time for bone marrow-derived DC generated from wild-type control (Ctrl) C57BL/6 (B6) or CD11c-CreRictor f/f (herein referred to as TORC2 -/-DC) B6 mice, in the absence or presence of rapamycin. ATP concentrations and mitochondrial mass were determined using an ATP determination kit and flow cytometry. The Golgi apparatus was visualized by confocal and transmission electron microscopy. A mouse nCounter immunology panel was used to analyze the transcriptional profiles of TORC2 -/-DC versus Ctrl DC. Results: TORC2 -/-DC used an altered metabolic program compared to Ctrl DC, characterized by enhanced baseline glycolytic function, increased dependence on glycolytic ATP production and higher viability following LPS stimulation. TORC2 -/-DC exhibited an increased spare respiratory capacity (SRC) compared to Ctrl DC. This metabolic phenotype corresponded with increased mitochondrial mass and failure of TORC2 -/-DC mitochondria to depolarize following stimulation. TORC2 -/-DC displayed more compact Golgi stacks with less perinuclear localization compared with Ctrl DC. Rapamycin-mediated inhibition of mTORC1 activity in TORC2 -/-DC led to loss of their enhanced SRC and glycolytic activity. 
The altered metabolic activity of TORC2 -/-DC could be ascribed to enhanced TORC1 activity, namely increased expression of multiple genes upstream of Akt/TORC1 activity, including the integrin alpha IIb, protein tyrosine kinase 2/focal adhesion kinase, IL-7R and Janus kinase 1, culminating in increased expression of the transcription factors peroxisome proliferator-activated receptor gamma and sterol regulatory element-binding transcription factor 1. Purpose: The expression of the Aire (Autoimmune regulator) gene was observed to inhibit the maturation and activation of dendritic cells (DCs) and induce transplantation tolerance. The bone marrow-derived DCs were cultured in vitro, and the Aire gene was transferred into DCs by means of green fluorescent protein (GFP) fluorescent adenoviral vector. The Aire gene-modified immature (imDCs) was detected by flow cytometry and enzyme-linked immunosorbent assay (ELISA). The phenotypic and functional changes of DCs before and after lipopolysaccharide (LPS) stimulation were established. A mouse model of abdominal heterotopic heart transplantation was established. Three days after the operation, the receptors were administered to the dorsal vein of the penis for injection of phosphate buffered saline (PBS) and imDCs. DC-GFP and DC-Aire were used to observe the heart graft activity and survival time of each group of mice. At the same time, the grafts were taken for hematoxylin-eosin (HE) and immunofluorescence on the 11th and 100th day after operation. The graft rejection intensity of each group was compared. Mean speed and displacement of OT-I and OT-II cells significantly decreased over time after immunization while AC and mean CT significantly increased. B cell mean speed, displacement and AC increased after immunization. These data are consistent with B cell activation and productive T cell-DC interactions and mirror previously reported data in secondary lymphoid organs. Here, we show that TLO form in and contribute to allograft rejection. We provide first evidence that TLO provide a local structure for T and B cell activation that might propagate anti-graft immune responses in the setting of chronic rejection. Further studies will elucidate the formation and maintenance of TLO and the consequences of local T and B cell activation. Data will be applicable to other conditions with TLO formation such as autoimmune disease and cancer. We established the molecular causes of increased GBM thickness by STORM. While Collagen a3/4/5(IV) did not appear to change much in the thickened GBM areas, staining with collagen a1/1/2(IV) showed clear increases in intensity at the endothelial side of the GBM that correlates with the thickened GBM, and often appeared as finger-like protrusions extending from the endothelial side of the GBM towards the podocyte side. Sometimes collagen a1/1/2(IV) patterns resembled the duplicated GBM patterns seen by electron microscopy. The use of super-resolution techniques allowed us to identify molecular changes that constitute the increased GBM thickness in TG. Our data suggest that disease starts from the endothelial side by secretion of extra collagen a1/1/2(IV), which contributes to the increase in GBM thickness. It also suggests that Collagen a1a1a2(IV), when reaching the podocyte side, might send an injury signal to the podocytes that leads to foot process effacement. Purpose: Infection remains a major cause of morbidity and mortality after kidney transplantation (KT). 
However, reliable biomarkers to predict the occurrence of post-transplant infection are lacking. In this study, we aimed to investigate the predictive values of pre-and post-transplant levels of T-cell immunoglobulin and mucin domain-3 (Tim-3) and Galectin-9 (Gal-9), two pleiotropic immunomodulatory molecules that involve in the negative regulation of T cell response, for the early prediction of infection after KT. Methods: Serum Tim-3 and Gal-9 were measured with ELISA and Bio-Plex® suspension array system, respectively, at pre-transplant and 30 days after transplantation in 95 KT recipients (KTRs). The decline rates of Tim-3 and Gal-9 were calculated relative to pre-transplant levels. Receiver-operating characteristic curves (ROC) and Cox regression models were utilized to assess the predictive performances and the impacts of Tim-3 and Gal-9 on the occurrence of infection during first two years after KT. Results: KTRs with infection history had significantly higher levels of Tim-3 and Gal-9 on day 30, and lower relative decrease rates of Tim-3 and Gal-9 compared to non-infected recipients, while no difference was observed between two groups regarding pre-transplant levels (Fig.1) . The AUCs for predicting 2-year posttransplant infection were 0.673 and 0.728 for Tim-3 and Gal-9 on day 30, 0.684 and 0.694 for relative decrease rates of Tim-3 and Gal-9, respectively ( Fig.2a-2b) . After adjusting for potential confounders, Tim-3, Gal-9 on day 30 and relative decrease rates of Gal-9 remained as independent risk factors for post-transplant infection during follow-up period (Hazard ratios: 3.06, 2.76 and 2.76, respectively)( Fig.2c-2f) . Post-transplant Tim-3 and Gal-9 levels on day 30 were novel and potentially valuable predictors for infection during first 2 years after transplantation, while pre-transplant Tim-3 and Gal-9 showed no predictive power to such complication. In addition, the relative decrease rate of Gal-9, not Tim-3, was useful in identifying the KTRs with high risk of infection. Purpose: Kidney graft failure is a major concern in patients with kidney transplantation. Patients who suffered graft failure are at a higher risk for mortality after returning to dialysis; therefore, monitoring graft function and early detection of transplant complications is crucial to maintain long-term graft function and survival. Urinary extracellular vesicles (UEVs) are currently being studied as a potential source of biomarkers due to their role as signaling molecules during both normal and diseased state. Recurrence of focal segmental glomerulosclerosis (FSGS) in the allograft occurs in 25-35 % of patients, and it is associated with poor renal allograft survival. Recent data support the hypothesis that circulating permeability factors play a crucial role in podocyte injury. This study aims to assess the level of podocyte-specific UEVs in kidney transplant subjects with FSGS recurrence using flow cytometry. Methods: UEVs were isolated from urine samples of control subjects (CTL) and subjects with FSGS recurrence after kidney transplantation. Isolated UEVs were stained with anti-CD63 and Annexin V to identify exosomes and microparticles, respectively. Podocyte-specific UEVs were identified using podocalyxin antibody. Stained samples were analyzed using flow cytometry. 
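The predictive performance of day-30 Tim-3 and Gal-9 reported above was summarized with ROC curves and AUCs. The R sketch below reproduces that type of evaluation with the pROC package on simulated values; the biomarker distribution, event rate, and package choice are assumptions for illustration only, not the study's measurements or code.

```r
## Sketch of ROC/AUC evaluation of a post-transplant biomarker (simulated values)
# install.packages("pROC")  # if needed
library(pROC)

set.seed(7)
n <- 95
infection <- rbinom(n, 1, 0.3)                            # infection within 2 years (1 = yes)
gal9_d30  <- rnorm(n, mean = 10 + 2 * infection, sd = 3)  # day-30 serum Gal-9 (arbitrary units)

roc_obj <- roc(response = infection, predictor = gal9_d30)
auc(roc_obj)                      # area under the ROC curve
ci.auc(roc_obj)                   # 95% CI for the AUC
coords(roc_obj, "best", ret = c("threshold", "sensitivity", "specificity"))  # Youden-optimal cutoff
plot(roc_obj)
```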
Purpose: While the negative impact of anti-HLA donor specific antibodies (DSA) on graft outcomes has been extensively investigated, little is known about early changes in the transcriptional profile and phenotype of circulating B and T cells before DSA development. We performed time-of-flight mass cytometry (CyTOF) and RNAseq on prospectively collected PBMC from pediatric kidney transplant recipients who developed (n=10) or did not develop (n=11) DSA within the first year after transplantation. PBMC were obtained at 2 months post-transplant, 3 months prior to DSA development, and at first DSA detection (PBMC from DSA NEG controls were collected at the same time-points). Results: DSA POS and DSA NEG patients had similar baseline characteristics and comparable B and T cell frequencies across the different time-points. Within DSA POS patients, only those that went on to develop antibody-mediated rejection (AMR; n=5) had increased B cells with an antibody-secreting phenotype (CD27 + CD38 + , Figure 1A , B p=0.0042) and a memory phenotype (CD27 + CD95 + IgD -CD38 -, Figure 1A , C p=0.0005) compared to DSA NEG and DSA POS AMR NEG recipients at the time of DSA detection. RNAseq analyses revealed differences in transcriptional programs involved in immunoglobulin formation and B cell activation (Gene Ontology enrichment) between DSA NEG and DSA POS patients at 3 months prior to DSA development, specifically in antibody-secreting B cells (CellCODE analysis). There was no difference in DSA titers between patients who developed AMR and those who did not (13,687±4,159 vs. 11,375±1,894 MFI, respectively; p=0.63). Conclusions: Our extensive CyTOF phenotypic and transcriptional analysis shows that circulating memory and antibody-secreting B cells expand and initiate a specific transcriptional profile months prior to the development of DSA and >1 year prior to biopsy-proven AMR. Purpose: Granulocyte-colony stimulating factor (G-CSF) and granulocytemacrophage CSF (GM-CSF) are known to mobilize immune cells to the peripheral blood. We evaluated regulatory T cells (Treg) percentages, phenotype and function in leukapheresis products obtained from G-CSF/GM-CSF-treated monkeys. Methods: Juvenile rhesus monkeys (n=6; 5-7kg) were treated with recombinant human GM-CSF (10 mg/kg/day for 4 days), then recombinant human G-CSF (10 mg/kg/day for 4 days), followed by leukapheresis. Absolute numbers of lymphocytes in the peripheral blood were evaluated before and after treatment. Leukapheresis products were evaluated for CD4 + CD25 hi Foxp3 hi Treg percentages and absolute numbers. Pre-(peripheral blood) and post-(leukapheresis product) treatment CD4 + CD25 + Treg were isolated, followed by polyclonal expansion up to 12 days only. Pre-treatment, post-treatment, and expanded Treg were evaluated for Tregspecific markers, chemokine receptors, transcription factors and suppressive function. Results: Following G-CSF/GM-CSF administration, peripheral blood lymphocyte numbers increased significantly (p<0.05). Notably, the percentage CD4 + CD25 hi Foxp3 hi Treg increased significantly from 2.6% to 5% (p<0.01). In leukapheresis products, the mean absolute number of Treg was 26x10 6 (4x10 6 /kg). No significant differences in the expression of Foxp3, CTLA4, CD27, and chemokine receptors (CCR4/CCR7/ CXCR3) by Treg were observed after treatment. However, Helios expression was significantly increased (p<0.05). 
While no significant difference was observed in the expression of the transcription factors T-bet and RORγt by Treg, GATA3 expression was significantly decreased (p<0.05). After polyclonal stimulation, the expansion capacity of pre-treatment Treg was 4-26 fold on day 6 and 380-660 fold on day 12, while the expansion capacity of post-treatment Treg was 6-26 fold on day 6 and 311-314 fold on day 12. Both pre-and post-treatment expanded polyclonal Treg exhibited comparable phenotype and efficiently suppressed allogeneic T cell proliferation in response to αCD2/CD3/CD28 stimulation. we utilized a mouse model of CsA acute nephrotoxicity. Mice received a 0.1% NaCl diet for 7 days, followed by CsA 90 mg/kg IP daily for 1 week. Mitochondrial function and structure were determined using Seahorse™ analyzer and western blot analysis of major components in mitochondrial electron chain complexes. Mouse kidney homogenates and PTECs were also analyzed for expression of mTOR, β-catenin, AMPK, ERK1/2 and PI3K/Akt signaling. Results: Exposure of PTECs to CsA resulted in dose dependent decrease in mitochondrial oxygen consumption rates (OCR). Notably, significant decreases of OCR basal, maximal, ATP-linked, and mitochondrial reserve capacity were elicited by non-apoptotic concentrations of CsA ( Figure 1 ). The CsA-mediated effects were associated with significant decrease of NDUFB8 (complex I), FeS comp I (complex II), and subunit I of complex IV (Figure 2 ). Such loss of mitochondrial indices was accompanied by bioenergetic reprogramming of PTECs. While oxidative phosphorylation was diminished, we observed an increase of extracellular acidification rates (ECAR), indicative of enhanced glycolysis. Metabolic reprogramming and remodeling was accompanied by significant activation of mTOR, along with activation of anabolic and pro-remodeling signaling related to MAPK ERK1/2 and PI3K/Akt phosphorylation and accumulation of β-catenin. Importantly, similar alterations, including mitochondrial, bioenergetic remodeling and epithelial injury, were observed in a mice model of CsA-mediated nephrotoxicity. Our results indicate that acute exposure of PTECs to CsA results in a loss of oxidative phosphorylation, with bioenergetic reprogramming to glycolytic metabolism as an initial adaptive (pro-survival) response. However, these responses culminate in metabolic and bioenergetic maladaptation and renal epithelial cell dysfunction. Disruption and return to cellular homeostasis may be one strategy to ameliorate CsA nephrotoxicity. Thus, in addition to the intended effect of blocking CD28-mediated costimulation, CTLA4-Ig also has the unintended effect of blocking CTLA-4-mediated coinhibitory signaling. Recently, anti-CD28 domain antibodies (dAb) that selectively target CD28 while leaving CTLA-4 signaling intact were shown to more effectively inhibit alloimmune responses and prolong graft survival in both murine models and nonhuman primates. However, the impact of selective CD28 blockade on protective immunity has not yet been investigated. Methods: We sought to compare the impact of CTLA-4Ig vs. anti-CD28dAb on murine CD8 + T cell immunity in the setting of a transplant-relevant pathogen, a murine homolog of Epstein-Barr virus. C57BL/6 mice were infected with 2×10 3 PFU of a YFP-labeled recombinant murine gammaherpesvirus-68 (MHV-68) and treated with saline control, CTLA-4Ig, or anti-CD28 dAb. 
Splenocytes from infected mice were harvested at 14 and 28 days post infection, and anti-viral immune responses and viral load were assessed via flow cytometry. Results: While anti-CD28dAb treatment resulted in a decrease in virus-specific CD8+ T cell frequencies and numbers as compared to CTLA-4Ig, effector function in the form of IFN-gamma production was not different between treatment groups. Similarly, viSNE analysis revealed that the expression of CXCR3 and CD27, surface markers indicating high-quality effector function, was also similar between the anti-CD28dAb- and CTLA-4Ig-treated groups. Additionally, CITRUS analysis of co-stimulatory and co-inhibitory marker expression (PD-1, TIM-3, and TIGIT) on MHV-specific CD8+ T cells demonstrated no difference between the two treatment groups. Importantly, MHV viral load was not significantly different between the anti-CD28dAb and CTLA-4Ig treatment groups. We demonstrate that preserved CTLA-4 co-inhibition limits MHV-specific T cell accumulation, but the virus-specific CD8+ T cell population that remains retains sufficient effector function to control viral burden. These data indicate that use of selective CD28 blockade, in the form of non-crosslinking anti-CD28 dAb, is equivalent to CTLA-4Ig in its ability to control viral load during gammaherpesvirus infection, further highlighting the clinical promise of this therapy.

Expression of CD28 on the surface of T cells was undetectable in the CFZ+CD28dAb group for 4 weeks after desensitization. Survival in the CFZ+CD28dAb group was significantly improved compared to the control group and similar to our experience using CFZ+Belatacept (see Figure). None of the CFZ+CD28dAb NHPs have shown any significant reactivation of CMV. This dual targeting regimen may translate into a clinical desensitization protocol to improve outcomes for highly sensitized transplant recipients.

We have previously demonstrated that administration of a heat shock protein 90 (HSP90) inhibitor prolonged allograft survival in murine skin and heart transplantation. In addition to effects on antigen presentation and proliferation of lymphocytes in the host, HSP90 is associated with the production of damage-associated molecular patterns and regulation of the expression of MHC molecules. These mechanisms in the graft may influence the host alloimmune response. We therefore hypothesized that preconditioning of the graft with an HSP90 inhibitor might suppress the alloreaction and prolong graft survival in solid organ transplantation. Methods: Murine heterotopic heart transplantation was performed using C57BL/6 (H-2b) and BALB/c (H-2d) mice as donors and recipients, respectively. In the graft preconditioning group, the graft was perfused in situ with 1 mL of a cold solution of alvespimycin, an HSP90 inhibitor, dissolved in heparinized Ringer's lactate solution at a concentration of 250 µg/ml. Graft survival was compared between the graft preconditioning group and the control group. In addition, cardiac allografts of the recipients were collected on days 3 and 5 for histopathological evaluation and quantitative RT-PCR. Results: Median allograft survival was 14 and 7 days in the graft preconditioning group and control group, respectively (log-rank p<0.001). Histologic examination of the grafts revealed that cell infiltration was milder in the graft preconditioning group than in the control group.
On immunohistochemical study, infiltration of neutrophils and dendritic cells into the graft was suppressed in the graft preconditioning group on day 3. Subsequently, expression of IL-2 mRNA and IL-12p40 mRNA in the graft was downregulated in the graft preconditioning group on day 5 as compared with the control group. The results of this study suggest that preconditioning of a graft with alvespimycin may suppress the adaptive immune response, leading to inhibition of acute allograft rejection. Preconditioning of the graft targeting HSP90 may be a promising strategy in solid organ transplantation.

Purpose: Metabolic reprogramming has been identified as a critical regulator of immunity. Little is known about changes in T-cell metabolism with aging and how to utilize metabolic targets for immunosuppression. Here, we show that T-cell metabolism changes drastically in aging. Most notably, targeting T-cell metabolism provides an age-specific and effective immunosuppression. Methods: Naïve CD4+ T cells were collected from C57BL/6 mice (3 and 18 mths) and activated with anti-CD3 and anti-CD28 for 24 hrs. Oxidative phosphorylation (OXPHOS) and aerobic glycolysis were assessed by oxygen consumption rate (OCR) and extracellular acidification rate (ECAR) using a Seahorse XFe96 extracellular flux analyzer. Full thickness skin grafts were transplanted from young (3 mth DBA/2 mice) to young and old C57BL/6 recipients (3 and 18 mths). Immune profiling was performed at sequential time intervals using a FACSCanto II flow cytometer. Results: Old, but not young, CD4+ T cells demonstrated compromised metabolic rates with significantly lower OCR and ECAR (p<0.0001). Moreover, old CD4+ T cells demonstrated a significant mitochondrial impairment with compromised respiratory and glycolytic capacities (p<0.0001). These results indicated a limited capacity of old CD4+ T cells to respond to metabolic stress. Glutaminolysis has been identified as a critical metabolic pathway in T-cell immunity. 6-Diazo-5-oxo-L-norleucine (DON) is an analog of glutamine that inhibits critical enzymes of glutaminolysis. DON treatment inhibited IL-2 production in both young and old CD4+ T cells.

Liver tissue was collected (n=3) for detailed immunohistochemical analysis. Slides were derived from predetermined anatomical locations within each hemiliver and were quantitatively analyzed with Aperio ImageScope software. Results: CD46 staining confirmed the separation of each islet type into its distinct hemiliver. Tissue factor staining was weak and limited within the islets at 1 hour, increased in the islets, and was significantly increased in the GKO islet areas at 24 hours (P=.026). CD31 and tissue factor dual stains demonstrated CD31 expressed on platelets adherent to islets and on endothelium, with tissue factor-positive cells in the islets and surrounding the venules at 24 hours. There was a large amount of platelet and factor XIIIa accumulation detected in and surrounding the islets. The platelet and factor XIIIa positivity in hCD46/GKO islets was significantly lower compared to GKO islets at 24 hours (P=.0025 and .0378, respectively). Conclusions: These data show that neonatal porcine xeno-islet transplantation activates the coagulation cascade and inflammation immediately after transplantation, and this is further augmented at 24 hours. Platelets and tissue factor are prominent, as is the final stage of the clotting cascade, factor XIIIa accumulation, on the islets and surrounding tissue. This may have a strong relationship with early islet loss. CD46 expression significantly reduces these factors.
CD46 is an important cofactor controlling complement-mediated injury; thus, CD46 gene-modified organs and early anticoagulation therapy may provide new approaches to improve transplant outcomes.

Lungs from GalTKO.hCD46 pigs carrying additional genetic modifications, including HLA-E, A20 and humanized vWF, were used in 49 single lung transplants into baboons; 6 also included β4GalKO. An evidence-based "platform drug regimen" consisted of steroids, sC1Inh, thromboxane synthase inhibition, anti-histamine, and αGPIb Fab. Donors were treated with Desmopressin (DDAVP) to deplete pig endothelial vWF. Immunosuppression consisted of αCD20, ATG, MMF, αCD40, usually with αIL6R moAb and/or Alpha-1 Antitrypsin. Xenograft function was assessed intra-operatively by transplant blood flow measurements and life-supporting lung capacity, and postoperatively by radiographs. Results: Few GalTKO.hCD46 pig lungs with 1, 2, or 3 additional genetic modifications (3/4/5-GE) exhibited durable life-supporting xeno lung function, and baboon recipients rarely survived for >12h due to systemic inflammation and/or lung xenograft failure. Multiple 6-GE lungs and a few 2-, 3- or 4-GE lungs (e.g. β4GalKO, hvWF) usually fully supported recipient gas transport and hemodynamics; overall, 9/49 exhibited recipient survival >2d; occasional recipient baboons survived for up to 8d (hCD55.hEPCR.hTBM.hCD39) and 9d (hEPCR.hTBM.hCD47.HO-1), and one 7-GE (hEPCR.hTBM.hCD47.HO-1 with β4GalKO) lung recipient survived for 31d. Lung xenograft failure beyond 4d was usually associated with rebounding anti-pig antibody titer and loss of lung vascular barrier function. Conclusions: Multi-transgenic pig lungs designed to modulate anti-pig antibody binding and other previously identified pro-inflammatory mechanisms, combined with evidence-based, mechanism-directed drug treatments, significantly prolong life-supportive xenolung function and recipient baboon survival. Accumulated evidence suggests that, in addition to anti-non-Gal antibody, complement, and coagulation pathway dysregulation, recipient NK cells (HLA-E) and donor macrophage activation (thromboxane, histamine) play important roles in driving residual inflammation. Controlling these pathways with additional targeted lung donor gene modifications (or drug treatments) appears likely to successfully protect lung xenografts and further advance lung xenotransplantation towards clinical application.

Purpose: Costimulation blockade strategies targeting the CD40/CD154 pathway are highly effective in preventing xenograft rejection in pig-to-nonhuman primate (NHP) transplantation models. The aim of this study was to assess the relative therapeutic efficacy of tacrolimus, a clinically applicable immunosuppressive agent, compared to either anti-CD40 or anti-CD154 therapies. Methods: Rhesus macaques (n=17) with low pre-transplant xenoreactive antibody titers were selected following recipient screening. Selected recipients underwent bilateral nephrectomy and life-sustaining porcine renal xenotransplantation using GGTA1 KO/CD55 transgenic donor pigs (NSRRC, Columbia, MO). Animals underwent T cell depletion and were randomized to one of three maintenance treatment regimens: tacrolimus (target trough 8-12 ng/mL), anti-CD40 (clone 2C10R4), or anti-CD154 (clone 5C8), plus mycophenolic acid and steroids. Results: Recipients treated with anti-CD154 therapy (n=6) experienced the longest survival (MST=235 days, p=0.0031), including three rhesus macaques with survival over 300 days (406, 400, 310 days).
Recipients treated with anti-CD40 therapy (n=4) exhibited a moderate prolongation in survival (MST=29.5 days), whereas tacrolimus-treated recipients (n=6) experienced the shortest survival (MST=11.5 days). Graft failure was associated with an increase in serum creatinine. Conclusions: Here we demonstrate that immunosuppression with anti-CD154 or anti-CD40 therapy is associated with prolonged survival of kidney xenografts relative to tacrolimus in a porcine-to-NHP xenotransplantation model. These data provide further rationale for clinical translation of these costimulation blockade reagents, which demonstrate the strongest survival benefit for xenotransplant recipients.

Purpose: This study assessed (1) whether expression of human CD47 (hCD47) on porcine cells reduces phagocytosis by baboon macrophages, and (2) if transgenic (Tg) expression of hCD47 caused negative effects via a CD47/TSP-1 pathway following pig-to-baboon K+VT Tx. Methods: Study 1: Phagocytosis of porcine endothelial cells (EC) as well as podocytes with/without hCD47 was assessed in co-culture assays with baboon macrophages. Study 2 (to assess the effect of hCD47 Tg in vivo): Five baboons received porcine kidney plus vascularized thymus (K+VT) grafts in which hCD47 was expressed at high levels. All 5 received ATG and rituximab followed by anti-CD40 or anti-CD40L mAb + MMF. One of the 5 additionally received anti-IL6 receptor ab weekly from the 3rd week. Graft renal function, immunologic assays, as well as graft expression of hCD47 and TSP-1 were assessed. Results: Co-culture phagocytosis assays: Statistically significant reductions of phagocytosis of both porcine EC and podocytes were observed when hCD47 was expressed on porcine ECs or podocytes. Expression of hCD47 and TSP-1: One animal expressed hCD47 only on glomerular cells while the others (n=4) expressed hCD47 on glomerular cells as well as vascular ECs and arterial medial layers (widespread hCD47). Widespread hCD47-expressing kidneys also expressed TSP-1, although weakly, in vascular medial layers, while no TSP-1 expression was observed on kidneys with glomerular hCD47 or on naïve pig or baboon kidneys. Following XK+VT Tx: Historic controls of GalTKO K+VT without hCD47 Tg uniformly developed proteinuria 2+ within 14 days (n>10), while all recipients of hCD47 Tg GalTKO K+VT displayed minimal (1+ or 0) urinary protein (Uprot). However, although the recipient of a kidney graft in which hCD47 was expressed only in the glomeruli maintained function >4 months, three of 4 animals that received widespread hCD47 K+VT were euthanized due to systemic edema, with evidence of up-regulation of TSP-1 in grafts, without evidence of immunologic rejection within 8 weeks. These had circulating pig CD3+ T-cell thymic emigrants >5% following Tx, with increased serum levels of IL-6. One animal that received weekly anti-IL6r ab maintains stable renal function without systemic edema (Cr<1 mg/dl, currently on POD 86). These results demonstrate that (1) hCD47 expression protects porcine cells from phagocytosis by baboon macrophages, and (2) widespread hCD47 expression can have negative effects in vivo, associated with up-regulation of TSP-1 and systemic edema.

Purpose: This study assessed (1) whether additional gene knockouts in triple-knockout (TKO) pigs expose new antigens recognized by preformed baboon natural antibodies (nAb) and (2) if these new antigens were immunogenic in vivo. Methods: Preformed nAb against GalTKO or TKO PBMC in 10 naïve baboon sera were assessed by FCM Ab binding assays using anti-human IgM or IgG ab. Among these baboons, five received pig kidneys and VT grafts from porcine CMV-negative TKO pigs. All baboons received anti-CD40 mAb, ATG and Rituximab. Graft renal function (sCre and histology), thymic emigrants from pig thymic grafts (chimerism), as well as immunologic assays (anti-pig ab and cellular assays) were performed. Results: Among 10 naïve baboon sera, two had less Ab binding against TKO than GalTKO cells. Four baboon sera had similar binding to TKO and GalTKO cells.
Notably, 4 baboon sera had higher Ab binding to TKO than GalTKO cells, suggesting that new antigens were revealed in association with the additional KO. Although all baboons (n=5) had stable renal function in the first 11 days (Cre <1.5 mg/dl), two baboons with higher non-Gal preformed IgG against TKO than GalTKO (high preformed nAb against TKO) rejected their kidney plus VT grafts at POD 20. In contrast, three baboons, including one that had similarly high IgG binding against both GalTKO and TKO, maintained renal function >86, 43*, and >30 days with Cre <1.0 mg/dl (>, ongoing; *, died from an anesthetic complication). Chimerism (thymic emigrants from VT grafts) after the first 7 days was markedly lower in baboons with high preformed nAb against TKO than in those without. Immunohistology showed predominant IgG binding in the excised rejected kidneys. These data indicate that TKO may induce new antigenic specificities that led to delayed vascular xenograft rejection in a pig-to-baboon kidney plus VT model. Prescreening for preformed IgM and IgG nAb using both GalTKO and TKO PBMC is essential to avoid early loss of pig xenografts.

Purpose: To evaluate the efficacy of FDA-approved immunosuppressive agents in pig-to-baboon kidney xenotransplantation, and to compare the results with those achieved using an anti-CD40mAb-based regimen. Methods: Ten kidney transplants were carried out in baboons using GTKO/CD46 pigs with various other genetic manipulations aimed at controlling coagulation dysregulation. Immunosuppressive therapy consisted of induction with ATG and anti-CD20mAb, and maintenance with either (i) regimens using only FDA-approved drugs (tacrolimus + abatacept [CTLA4-Ig], or tacrolimus + rapamycin or mycophenolate mofetil) (GpA, n=5) or (ii) an anti-CD40 mAb + rapamycin regimen (GpB, n=5). (All 10 baboons received corticosteroids, IL-6R blockade [tocilizumab], and TNF-α blockade [etanercept].) Baboons were followed by clinical/laboratory monitoring of kidney function and coagulation/inflammatory/immune parameters. At euthanasia, morphologic and immunohistochemical studies were performed on the kidney grafts. Results: One GpB baboon was euthanized on day 4 for acute gastric dilatation and was excluded from analysis. The mean survival in GpB (n=4) was 181±40.5 days (median 186 days), which was significantly longer than in GpA (15±5.0 days; median 13 days) (p<0.01). All GpA baboons were euthanized for antibody-mediated rejection, whereas 3 of 4 GpB baboons were euthanized for infectious complications. At euthanasia, the serum creatinine was not significantly different between the two groups (GpA 1.78±0.45 vs GpB 2.18±0.91 mg/dl). Only GpA baboons developed features of consumptive coagulopathy. On histopathological examination of the grafts, the incidence of thrombotic microangiopathic glomerulopathy, glomerular/interstitial thrombi, arterial vasculitis, and peritubular capillary inflammation was significantly higher in GpA than in GpB (p<0.05). There was a significant negative correlation between all of the histopathological features of thrombotic microangiopathy and changes in coagulation parameters (platelet count and plasma fibrinogen). Conclusions: Our data suggest that regimens based on current FDA-approved immunosuppressive agents may be insufficient to prevent immune-mediated injury of genetically-engineered pig kidney grafts, even when the grafts express a human coagulation-regulatory protein.
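Several of the xenotransplantation abstracts above summarize group differences in graft survival using median survival times and log-rank p-values. The sketch below illustrates how such a two-group comparison is typically computed, assuming the Python lifelines package is available; the survival times are purely hypothetical placeholders, not the reported data.

    # Illustrative sketch: compare graft survival between two treatment groups
    # with Kaplan-Meier estimates and a log-rank test. All data are hypothetical.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    df = pd.DataFrame({
        "days":  [11, 9, 14, 12, 10, 13, 150, 186, 190, 220],  # survival time (hypothetical)
        "event": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],                # 1 = graft failure/euthanasia observed
        "group": ["GpA"] * 6 + ["GpB"] * 4,
    })

    kmf = KaplanMeierFitter()
    for name, sub in df.groupby("group"):
        kmf.fit(sub["days"], event_observed=sub["event"], label=name)
        print(name, "median survival:", kmf.median_survival_time_)

    a, b = df[df.group == "GpA"], df[df.group == "GpB"]
    result = logrank_test(a["days"], b["days"],
                          event_observed_A=a["event"], event_observed_B=b["event"])
    print("log-rank p-value:", result.p_value)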
Purpose: A significant hindrance to the successful use of xenotransplantation has been the presence of preformed antibodies against the donor antigens, known as xenoantigens. Using modern genetic engineering tools, researchers have been able to eliminate expression of the most commonly identified xenoantigens from pig strains utilized for organ transplantation. However, experiments show that antibody binding to these genetically modified organs persists at a low level. Here, we perform a proof-of-concept experiment to demonstrate that the antibody-cleaving protease IdeS is capable of eliminating residual IgG binding to genetically modified porcine cells. Methods: Human or cynomolgus macaque serum was treated with the IdeS enzyme (FABricator, Genovis Inc.) for 30 minutes per manufacturer guidelines to cleave IgG into separate Fc and F(ab')2 fragments. Treated and untreated serum were utilized to stain cultured primary porcine aortic endothelial cells (pAEC) from WT pigs or genetically modified pigs lacking the xenoantigen αGal. Using flow cytometry, fluorophore-conjugated secondary antibodies against either IgG or IgM were utilized to assess the degree of intact antibody binding to the pAEC. Results: Genetically modified pAEC lacking αGal showed a roughly 90% reduction of both IgG and IgM binding from human serum compared to WT pAEC (Figure 1A). IdeS treatment further reduced the level of IgG binding to modified pAEC to background levels for human serum, and to a similar but lesser extent for monkey serum (Figure 1B, C). As expected, because IdeS is an IgG-specific protease, the treatment had no effect on the level of IgM binding. Conclusions: IdeS treatment of serum is capable of eliminating residual binding of functional IgG to genetically modified porcine cells, suggesting that it is a potential tool to decrease antibody binding to xenotransplant organs and eliminate antibody- and complement-mediated graft injury.

Purpose: Ischemia-reperfusion injury (IRI) is a major source of morbidity in renal transplantation and is without targeted therapy. In renal transplantation, IRI contributes to poorer outcomes and early graft loss. Histone deacetylases (HDACs) are enzymes responsible for the epigenetic modification of histones and other nuclear proteins, thereby altering gene expression and regulating diverse cellular processes. We have previously demonstrated that class I HDACs (HDAC1 and HDAC2) have reciprocal effects on renal IRI. In mouse models, HDAC1 deletion increases vulnerability to IRI while HDAC2 deletion or inhibition provides significant protection against IRI. Understanding whether this protection is due to extra-renal inflammatory modulation or whether effects are intrinsic to renal tissue is critical to the development of specific targets in the mediation of IRI. Methods: Renal tubule-specific tamoxifen-inducible HDAC2 knockout (HDAC2-pax8 KO) mice and tamoxifen-treated wild-type female (WT) control mice were used. Mice were subjected to 28 minutes of temperature-controlled warm renal IRI through unilateral clamping of the renal pedicle and contralateral nephrectomy. Creatinine and BUN were examined at 24, 48, 72, and 96 hours post-IRI. Sirius red fibrosis scoring was performed at 30 days post-IRI. Results: HDAC2-pax8 KO mice developed significantly less renal injury after renal IRI than controls, with significantly decreased post-operative BUN and Cr (Figure 1, p<0.005).
HDAC2-pax8 KO mice developed significantly less fibrosis relative to controls at 30 days (Figure 2, p=0.005). Conclusions: Renal tubule-specific HDAC2 knockout appears to be protective in a standard model of renal warm IRI. This demonstrates that the benefit of HDAC2 deletion is specific to the renal tissue. This finding has important translational potential and provides guidance for identifying renal-specific targets for clinical use.

Purpose: Glycogen synthase kinase 3 (Gsk3) has α and β isoforms. They are constitutively active in cells and inhibited upon stimulation by N-terminal serine phosphorylation. Although the roles of active Gsk3 in ischemia reperfusion injury (IRI) have been well appreciated, the significance of Gsk3 inhibitory phosphorylation has not been fully understood. The potential functional difference of Gsk3 isoforms in the disease processes has never been properly addressed. Methods: In a murine liver partial warm ischemia model (90 min ischemia), we compared Gsk3 wild-type (WT) and α and β double or single phosphorylation-resistant (Gsk3 αS21A and/or βS9A) mutant knock-in (KI) mice to study whether and how Gsk3 inhibitory phosphorylation regulated liver IRI. Results: Liver Gsk3 phosphorylation was transiently downregulated by ischemia in both its isoforms and gradually increased by reperfusion up to 6h. The Gsk3 α but not β phosphorylation-resistant mutation protected mice from liver IRI, as evidenced by lower sALT levels and better preserved liver histological architecture at 6h and 24h post reperfusion in Gsk3 α single and αβ double mutant KI mice, compared with β single mutant KI and WT counterparts. To determine the regulatory mechanism of Gsk3 inhibitory phosphorylation in the disease process, we studied macrophage activation and hepatocyte death in vitro. Gsk3 mutant KI macrophages, both Kupffer cells and bone marrow-derived macrophages, produced more TNF-α and IL-6, but less IL-10, upon TLR4 stimulation. Hepatocytes of the same genotype, however, were indeed protected from TNF-α- and stress-induced cell death. To confirm that the Gsk3 phosphorylation-resistant mutation plays distinctive roles in liver parenchymal vs. non-parenchymal cells, we studied Gsk3 mutant KI bone marrow (BM) chimeras. Only mutant KI recipients of WT BM were protected from liver IRI, while WT recipients of KI mutant BM suffered more severe liver injuries. Liver autophagy was enhanced in Gsk3 mutant KI mice upon IR. Inhibition of autophagy abrogated liver protection from IRI in these mice. The Gsk3 mutant KI resulted in increased activation of HIV-1 TAT-interactive protein 60 (TIP60) and AMP-activated protein kinase, but inhibition of mammalian target of rapamycin complex 1, leading to enhanced autophagy induction and increased resistance to inflammatory cell death in hepatocytes.

Methods: This study combines data obtained from pre- and post- (t=45 min) reperfusion renal tissue biopsies with sequential arterio-venous blood sampling over the transplanted graft. Three study groups were defined on the basis of clinical outcome: grafts from living donors (reference), and deceased donor grafts with (+DGF) and without (-DGF) delayed graft function. Magic angle NMR was used for tissue analysis, and MS-based platforms for organic acids, acylcarnitines, amino acids and purines were used for the arterio-venous plasma samples. Results: 1-year graft survival was 100%. Integration of the metabolic data identified a profile that is fully discriminatory for future DGF.
This metabolome is characterized by ongoing tissue damage and post-reperfusion ATP/GTP catabolism, indicated by persistent (hypo)xanthine production. Failing recovery of high-energy phosphates occurred despite activated glycolysis, fatty acid oxidation, glutaminolysis and autophagy, and was related to a defect at the level of the oxoglutarate complex in the Krebs cycle. This study shows that in human organ ischemia and reperfusion, functional tissue damage (future DGF) is preceded by an instantaneous metabolic collapse at the level of the Krebs cycle. The observed metabolic defects fully contrast with those reported for rodents, thereby providing a rationale for the poor translatability of preclinical findings. Efforts to quench clinical IRI should focus on the preservation of metabolic competence, either by preserving the integrity of the Krebs cycle and/or by recruiting metabolic rescue pathways.

We and others have reported on benefits of endoplasmic reticulum (ER) stress modulation and autophagy enhancement to mitigate ischemia-reperfusion injury (IRI) in orthotopic liver transplantation (OLT). Although gut microbiota may influence skin/cardiac allograft rejection, its role in OLT recipients is unknown. We employed a mouse allo-OLT model and analyzed human OLTs to determine whether and how recipient antibiotic pretreatment (Abx Rx) may influence ER stress-autophagy crosstalk and OLT outcomes. Methods: In the experimental arm, mouse recipients (C57BL/6) with or without Abx Rx (amoxicillin, x10 days) were transplanted with allogeneic (Balb/c) cold-stored (18h) livers, followed by liver/blood sampling at 6h post-reperfusion. In the clinical arm, 264 human OLT recipients were retrospectively divided into Abx (Abx Rx ≥10 days, n=156) vs control (Abx-free or Abx <10 days, n=108) groups and analyzed by logistic regression, while OLT biopsies (Bx; n=52) collected at 2h after reperfusion were analyzed by Western blots (WB).

We have previously reported on kidney injury in hibernating ground squirrels (GS) (Transplantation, 2011). The kidneys of the hibernators display no histological injury or renal tubular epithelial cell (RTEC) apoptosis. We have also shown that mouse RTECs subjected to in vitro cold storage at 4°C followed by rewarming (CS/REW), and mouse kidneys subjected to cold storage at 4°C followed by transplantation (CI+Txp), demonstrate significant RTEC apoptosis that can be partially blocked with a pan-caspase inhibitor, Q-VD-OPh (Nydam T, Transplantation, 2018). Since Q-VD-OPh only partially blocks apoptosis, we hypothesized that caspase-independent apoptosis may be activated during CS/REW and CI+Txp in mice. Furthermore, we hypothesize that GS RTECs are protected from caspase-independent apoptosis. Methods: GS and mouse RTECs were subjected to cold storage in UW solution followed by rewarming (CS/REW) in normal media as previously described (Jain S, Transplantation, 2015). Donor mouse kidneys were subjected to CI followed by kidney transplant (CI+Txp) or transplanted without CI. Apoptosis was quantified by TUNEL assay or morphologically. AIF and Endonuclease G (EndoG) were examined in mitochondrial and cytosolic fractions by immunoblot and IF. Results: In vitro experiments: Mouse RTECs exposed to CS/REW had significantly increased apoptosis vs. squirrel RTECs (Table 1). Furthermore, mouse RTECs subjected to CS/REW had significantly increased mitochondrial AIF translocation to the cytosolic fraction vs. squirrel RTECs (Fig 1). In contrast, EndoG translocation could not be detected.
Mouse kidney transplant experiments: AIF translocation to the cytosol was detected only in mouse kidneys subjected to CI+Txp, whereas mouse kidneys that were not subjected to CI did not demonstrate cytosolic translocation of AIF. Furthermore, EndoG cytosolic translocation was not detected. Our data suggest that hibernators capable of surviving several days at 4°C suppress both caspase-dependent and caspase-independent apoptosis. In contrast, in vitro cold storage, and cold ischemia followed by transplantation, in mice are characterized by both caspase-dependent apoptosis and evidence of caspase-independent apoptosis. The latter is mediated by AIF rather than EndoG. Complete blockade of RTEC apoptosis in clinical transplantation will therefore likely require inhibition of both caspase-dependent and -independent pathways.

Purpose: We aimed to study the health economic effects of kidney transplantation in a population of kidney transplant candidates >65 years of age. Methods: We performed a prospective single-center study comparing costs, quality-adjusted life years (QALYs) and self-perceived health during the last year before and the first year after kidney transplantation in 289 patients ≥65 years at listing. The patients completed an SF-36 survey every six months while on the waiting list, and at two, six and twelve months post-transplant. A health index was derived for each patient at each visit as a measure of self-perceived health. QALYs were computed based on individual health indexes and survival data from the same cohort. Average costs of continued dialysis compared to immunosuppression/follow-up after engraftment were imported. Results: At the time of data retrieval, 205 patients had received a transplant, 29 of them (14%) from a living donor. Median age at transplantation was 71.6 (65.0-83.6) years. Both self-perceived health (Figure 1) and QALYs (Table 1) improved significantly after transplantation. At one year, the costs/QALY were substantially higher for transplantation ($101,000 vs. $88,000). Preliminary analyses including 3-year post-transplant data from 64 patients do, however, suggest that this is completely reversed already after two years. We conclude that compared to remaining on the waiting list, kidney transplantation is associated with improved health but also with increased costs during the first year after transplantation. Preliminary analyses clearly suggest positive long-term cost-effectiveness, but this needs to be confirmed.

Conclusions: A significant downward trend, accompanied by decreased variability, in the expected probability (adjusted for patient and donor characteristics) of patient or graft survival at 1 year is observed for all organs considered in this analysis. Across the 13 PSRs analyzed there clearly has been a 'regression to the mean' throughout the entirety of the solid organ transplant effort in the United States. It is plausible that the observed effect is due to a reaction by centers to increasing outcomes oversight and regulation by the Organ Procurement and Transplantation Network/Membership and Professional Standards Committee, and the Centers for Medicare and Medicaid Services.
Numerous publications over the years raise the concern that very low failure estimates may be contributing to the continued high risk of donated allograft non-use and to more cautious selection of riskier candidates.* It has been argued that traditional outcome regulation, with its near-singular reliance on SRTR expected patient and graft survival values, has resulted in perhaps 'too good' outcomes when considering the larger potential donor and recipient pools that are likely being underserved by programs due to regulatory concerns. Only changing the harshness of the expectations and/or regulatory oversight, and also taking into account risk factors currently not considered, will allow centers to use more of the currently discarded donated organs and consider riskier candidates.

The Collaborative Improvement and Innovation Network (COIIN): A. Wey, S.

Purpose: Of 2.5M deaths in the US every year, only 10,000 are recovered for transplantation; recovery of organs from older donors is rare although similar donors are routinely used in Europe. Current OPO performance metrics may disincentivize OPOs from pursuing marginal donor candidates. We present an OPO aggressiveness index to identify OPOs that aggressively pursue marginal kidney donors. Methods: Using SRTR data from 2013-2017, we evaluated the proportion of eligible deaths in each OPO that fell into one of 7 categories of "marginal donor candidate" (age>60, BMI>35, KDPI>85%, serum creatinine (SCr)>2, hepatitis C, infectious risk donor (IRD), and donor after cardiac death (DCD)) and evaluated the correlation between these proportions and deceased donors per 1000 total deaths in each OPO in 2015 ("overall donor rate"). We used decile scores of four of these categories (age>60, BMI>35, KDPI>85%, SCr>2) to construct an index of OPO aggressiveness ("aggressiveness score"), and compared this and current OPO performance metrics to overall donor rate. Results: Of the 7 categories of marginal donor candidate, four were correlated with total donor rate (r>0.4) (Table 1A). The aggressiveness score demonstrated substantial variation between OPOs (Figure 1A). OPO aggressiveness score was more strongly correlated with overall donor rate (r=0.61) than were the standard OPO performance metrics of standardized donation ratio (r=0.49), organs transplanted per donor (r=0.54), or observed:expected ratio (r=0.59).

… highest for programs with above- and below-average transplant rates, and lowest for programs with average transplant rates. One additional tier at listing in the waitlist mortality rating was associated with lower patient mortality in kidney transplantation (HR 0.96, 95% CI 0.94-0.99), and one additional tier at listing in the posttransplant rating was associated with lower patient mortality in lung transplantation (HR 0.94, 95% CI 0.90-0.98). The relative importance of pretransplant and posttransplant program-specific ratings at listing to subsequent candidate mortality was organ-specific. Thus, public reporting should clearly emphasize the metrics important for patient survival after listing for a given organ.

Overall, 50.9% of new registrants lived in a state that adopted ME on 1/1/2014, 11.6% lived in a state that adopted ME between 4/2014-7/2016, and 37.5% lived in a state that did not adopt ME. The proportion of new registrants insured with Medicaid increased from 9.5% in 2012-2013 to 11.2% in 2014 and 12.9% in 2015-2017 in states that adopted ME on 1/1/2014 (chi-square p<0.001), but remained constant in states that did not adopt ME (4.9%/5.1%/4.9%, p=0.3) (Figure).
Among new registrants who received living donor KT within 18 months of registration, the proportion insured with Medicaid increased from 5.8% in 2012-2013 to 6.7% in 2014 and 9.1% in 2015-2017 (p<0.001); in all eras this proportion was 3.3% in states that did not adopt ME (p=0.9). Compared to states that did not adopt ME, transplant centers in states that adopted ME had a 21% increase in new adult registrants insured via Medicaid in the first year following ME (IRR 1.21, 95% CI 1.17-1.26; p<0.001) and a 40% increase thereafter (IRR 1.40, 95% CI 1.36-1.45; p<0.001). Additionally, centers in states that adopted ME had a nonsignificant 10% increase in new pediatric registrants insured via Medicaid in the first year following ME (IRR 1.10, 95% CI 0.95-1.26; p=0.2) and a 26% increase thereafter (IRR 1.26, 95% CI 1.13-1.40; p<0.001). In states that adopted ME, the proportion of new waitlist registrants insured via Medicaid increased substantially, and transplant centers in these states had large increases in new Medicaid registrants. ME may have expanded access to waitlisting and transplantation in states that adopted ME.

The Scientific Registry of Transplant Recipients semi-annually assigns transplant programs to one of five performance tiers for 1-year graft survival in its program-specific reports (PSRs). Concerns about the historical variability in tier assignment over time have been expressed. However, historical analyses have two limitations: they cannot separate variability in program effects over time from random variability in outcomes, and they cannot compare the observed tier assignments to the correct tier assignments. To overcome these limitations, a simulation study was performed to assess the reliability of tier assignments when program effects are known and held constant. Methods: Simulation parameters approximated the cohort of adult recipients of deceased-donor kidneys in the January 2018 PSRs. One hundred PSR cohorts were simulated for 11,600 simulated programs (50 batches of 232) to produce 20 temporally independent tiers for each program. The intraclass correlation for the 20 independent tier assignments was 0.39 (95% CI 0.38-0.40), which suggests that 39% of the overall variability in tier assignment was explained by the program. Intraclass correlations were 0.20 (0.19-0.21), 0.35 (0.34-0.36), and 0.57 (0.56-0.58) for programs with 0-3, 3-10, and >10 expected events, respectively. Conclusions: A true tier was derived from the program effect for each simulated program. Figure 1 shows the simulated tier assignments by the number of expected events and the true tier. The assigned tier matched the true tier 37.5% of the time. The assigned tier matched the true tier 32.5%, 37.8%, and 44.4% of the time for programs with 0-3, 3-10, and >10 expected events, respectively. The assigned tier was within 1 tier of the true tier 83.6% of the time, and was within 1 tier of the true tier 79.0%, 83.7%, and 90.0% of the time for programs with 0-3, 3-10, and >10 expected events, respectively.

No differences were detected in postoperative pain scores, the proportion of patients requiring analgesics, analgesic dose, or time to first flatus between the two groups. Of note, patients who underwent SMIKT reported significantly higher incision satisfaction scores and lower Vancouver scar scores at 1, 2, and 3 months after transplantation. Conclusions: MIKT is a safe surgical procedure that did not increase operative time while significantly reducing incision length, with a better short-term cosmetic result and greater patient satisfaction.
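The tier-reliability study above rests on simulating repeated outcome cohorts for programs whose true effects are held constant, and then summarizing agreement of the assigned tiers with an intraclass correlation. The sketch below is a minimal illustration of that idea; the program counts, event rates, effect sizes, and tier cut-offs are illustrative assumptions, not the SRTR simulation settings.

    # Minimal sketch (not the SRTR implementation): simulate repeated tier
    # assignments for programs with fixed "true" effects, then estimate the
    # intraclass correlation (ICC) of the assigned tiers.
    import numpy as np

    rng = np.random.default_rng(0)
    n_programs, n_replicates, expected_events = 232, 20, 8.0

    true_effect = rng.normal(0.0, 0.3, size=n_programs)   # fixed program effect (log scale)
    tiers = np.empty((n_programs, n_replicates))

    for r in range(n_replicates):
        observed = rng.poisson(expected_events * np.exp(true_effect))  # random outcomes per replicate
        ratio = observed / expected_events                             # observed/expected failure ratio
        # assign 5 tiers by quintile of the ratio (low failure ratio -> tier 5, best)
        tiers[:, r] = 5 - np.searchsorted(np.quantile(ratio, [0.2, 0.4, 0.6, 0.8]), ratio)

    # one-way ANOVA estimate of the ICC: between-program vs. within-program variability
    program_means = tiers.mean(axis=1)
    ms_between = n_replicates * program_means.var(ddof=1)
    ms_within = tiers.var(axis=1, ddof=1).mean()
    icc = (ms_between - ms_within) / (ms_between + (n_replicates - 1) * ms_within)
    print(f"ICC of tier assignments: {icc:.2f}")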
We evaluated longitudinal patterns in the use of laparoscopic gastric bypass, sleeve gastrectomy, and adjustable gastric banding procedures for patients with ESRD. We then estimated median length of stay and risk-adjusted complication and reoperation rates for patients with and without ESRD. Results: The number of ESRD patients undergoing any bariatric procedure increased 9-fold, from 50 in 2006 to 475 in 2016 (N=2,698 total). Surgical practice patterns changed significantly over this period as well. For example, the proportion of ESRD patients undergoing sleeve gastrectomy increased from less than 1% in 2006 to 84% in 2016 (Figure). Outcomes were similar between ESRD patients and other Medicare beneficiaries for all procedures. For sleeve gastrectomy, ESRD patients experienced a complication in 3.6% (95%CI 2.6-4.6%) of cases compared to 3.7% (95%CI 3.5-3.8%) for non-ESRD patients. Complication rates were nearly twice as high for ESRD patients undergoing gastric bypass (7.0%, 95%CI 5.3-8.7%). Reoperation rates following sleeve gastrectomy were also similar between ESRD (0.2%, 95%CI 0.1-0.3%) and non-ESRD patients (0.2%, 95%CI 0.1-0.4%). Both patient populations had a median length of stay of 2 days (IQR 1-5). Conclusions: Due in part to a more favorable complication profile, national practice patterns for bariatric surgery in patients with ESRD have shifted dramatically towards sleeve gastrectomy. As the transplant community moves beyond questions of safety and feasibility, transplant centers should create opportunities that improve the longitudinal care of obese patients by facilitating greater access to bariatric surgery.

Results: Pre-clinical data showed accelerated graft rejection in obese recipients when compared to lean control animals (p<0.005); notably, bariatric surgery (sleeve gastrectomy, SGx) extended graft survival compared to both DIO and lean mice (p<0.006); bariatric surgery reduced body weight significantly (from 45 g to 30 g, p<0.001) and reversed insulin resistance (glucose level <200 mg/dl). Next, we performed a broad metabolomic analysis that indicated restored systemic levels of taurodeoxycholic acid (TDCA) and valine following SGx. When treating DIO mice with TDCA and valine, we observed dramatic and long-lasting weight loss (30%), an ameliorated alloimmune response, and prolongation of allograft survival that was comparable to animals that underwent SGx (p=ns). Of note, TDCA/valine treatment was associated with a reduction of Th1 responses and increased IL-10 cytokine production (p<0.01); CD4+CD25+FOXP3+ T cell frequencies remained unchanged. In addition, TDCA/valine treatment was associated with a reduction of CD11b+CD11c+ activation in vivo and reduced DC activation and proliferation when cultured in the presence of LPS (p<0.001). Next, we tested the effects of TDCA/valine in metabolic chambers and confirmed that the application of TDCA and valine resulted in a loss of appetite while impacting neither energy expenditure nor oxygen consumption. These findings were supported by a robust decline of melanin-concentrating hormone (MCH), a potent orexigen, in the lateral hypothalamic area in TDCA/valine-treated DIO animals (MCH mRNA >4-fold decrease compared to DIO mice, p<0.001). Conclusions: Collectively, our study underscores a novel metabolic pathway that may pave the way for a therapeutic approach targeting weight loss and alloimmune responses.
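The complication rates and 95% confidence intervals quoted in the bariatric-surgery abstract above are proportions with binomial uncertainty. As a minimal, unadjusted sketch (the published figures were risk-adjusted, which is not shown), and assuming the statsmodels package is available, such rates and a simple two-group comparison could be computed as follows; the counts are hypothetical placeholders.

    # Illustrative sketch: unadjusted complication rates with Wilson 95% CIs
    # and a two-proportion z-test. Counts are hypothetical, not registry data.
    from statsmodels.stats.proportion import proportion_confint, proportions_ztest

    complications = [36, 3700]   # complication events: ESRD vs non-ESRD (hypothetical)
    cases = [1000, 100000]       # procedures performed in each group (hypothetical)

    for label, k, n in zip(["ESRD", "non-ESRD"], complications, cases):
        lo, hi = proportion_confint(k, n, alpha=0.05, method="wilson")
        print(f"{label}: {k/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")

    stat, p = proportions_ztest(complications, cases)
    print(f"two-proportion z-test p-value: {p:.3f}")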
Abstract# 293. Conclusions: Kidney donors were discharged with 60 MEQ per day, which is greater than the CDC recommendation of 50 MEQ. Compared with similar procedures, both kidney and liver donors received more than twice the recommended MEQ at discharge. Aggressive post-operative multimodal pain regimens and education on opioid prescribing patterns are critical to help decrease the amount of opioids prescribed to this patient population.

For procedure-related factors such as mean procedure time, hospital stay, ICU stay, blood loss, and need for transfusion, the embolization group showed significantly better results (Table 1). Only one patient had a complication after embolization, a puncture-site hematoma. Two patients had removal-site hematomas and one patient experienced postoperative ileus after nephrectomy. There was no procedure-related mortality. Ten of 22 patients (45.5%) in the nephrectomy group underwent percutaneous embolization 1 day before surgical removal. Pre-embolization patients had shorter procedure times, less blood loss, and shorter hospital stays compared with patients who underwent surgical removal directly without pre-embolization. Conclusions: Embolization of symptomatic failed renal grafts has considerable efficacy and safety, with less morbidity and no serious complications compared to graft nephrectomy, and pre-embolization before nephrectomy can be an effective option to reduce operation-related risk.

Purpose: Analysis of combined heart-kidney transplants (HKTx) reveals equivalent or improved patient survival and reduced allograft rejection rates as compared to single-organ transplants. The rate of HKTx is growing rapidly compared with heart transplants alone. We seek to provide an updated comprehensive analysis of risk factors and outcomes in the HKTx cohort, which may help guide future selection processes and therapeutic interventions. Methods: HKTx patients from 2010 until 2017 were identified from the United Network for Organ Sharing (UNOS) database. Univariate Cox proportional hazard models were used to determine risk factors for patient overall survival. Cut point analysis was then performed using the %findcut macro to determine the optimal dichotomization of continuous risk factors for Cox regression analysis. Hazard ratios (HR) and their 95% confidence intervals (CI) as well as the log-rank p-value are presented for overall survival and rejection-free survival. Survival estimates were calculated using the Kaplan-Meier method. All hypotheses were two-sided with p<0.05 considered statistically significant. Analyses were performed in SAS v9.4 (SAS Institute; Cary, NC). Results: One-, 5-, and 10-year overall survival estimates for the entire 1148-patient cohort were 86.9%, 76.7%, and 59.7%, respectively. One-, 5-, and 10-year rejection-free survival estimates for the entire cohort were 83.9%, 71.8%, and 54%, respectively. Of the 1148 patients, 69% experienced no rejection in either organ, 1.7% experienced only heart rejection, 3.3% experienced only kidney rejection, and 26% had dual organ rejection. Donor age was found to be significantly associated with survival (HR = 1.02, CI = 1.

Analysis of survival by score revealed that a UTS score of 1 or higher was associated with significantly higher survival post-transplant (10-year survival 51.4% vs 56.8%, p<0.0001), as shown in the figure.
It is conceivable that the superior survival of donors with a positive UTS score is due to careful donor selection (i.e., a donor with drug use but with very little other co-morbidity). The MTS score did not predict post-transplant survival. Even utilization of donors with multiple positive toxicology findings resulted in similar survival curves. Analyzing survival for patients post-2007 by both UTS and MTS score showed similar outcomes across all score levels. The information collected about drug use in donors has evolved over time. Prior to 2007, when only limited binary descriptive fields were available, donors with at least one positive finding had higher survival than those without. In the recent era since 2007, drug use by history or toxicological results did not affect survival, even in cases of multiple drugs. While the improved survival of some drug-positive donors may reflect selection bias, it is important to realize that drug use did not lead to negative survival effects across a wide spectrum of drugs. This should lead to increased use of carefully selected heart donors with a history of drug and alcohol use.

At the time of harvest, a sample for quantitative HCV PCR was obtained that was also positive and quantifiable, albeit with a titer too low to allow successful genotyping. Three months post-transplant, HCV remained undetectable, diagnostic of non-transmission. Patient 2 followed a course similar to other patients at our center who received a heart transplant from an HCV-positive donor. All donor-derived HCV infections at our center have either achieved SVR-12 or are in process to achieve SVR-12 (or non-transmission). Conclusions: Cardiac donors with HCV viremia are being utilized in multiple centers, creating competition for this scarce resource. HCV-positive donors are not a panacea for the donor shortage. Although patients are counseled to expect transmission of HCV from an HCV-viremic donor, the infectivity rate has been <100%. In contrast to other centers reporting incomplete infectivity, the trial has provided the opportunity to confirm HCV viremia both by nucleic-acid testing and quantitative PCR, providing additional data against false-positive assessment of viremia. Evidence of occasional non-transmission supports the safety and efficacy of pre-emptive antiviral treatment rather than universal prophylaxis against HCV post-transplant, for prevention of resistance and for cost-effectiveness.

1987-1996, 1997-2006, and 2007-2018. Additionally, we utilized multivariate nominal logistic regression to examine the factors predictive of utilization of heart donors for transplant. Results: From 1987 to March 2018 there were a total of 204,999 organ donors in the United States. Of these, 71,470 hearts (34.9%) were accepted for transplant. The three tables below list donor factors, divided in two ways: first, by whether the donor was utilized ("taken") or not, and then by the 3 eras in the database. All comparisons were highly significant (p<0.0001). In Table 1, donors utilized had consistently lower age, a male preponderance and smaller BMI. More donors over time were of Black race and had a history of hypertension. Anoxia was less common as a cause of death in the first era but rose significantly over the next 2 periods. CVA was quite common in the oldest era but became the least common in the final era.
On Table 2 , we see that donor angiography has been much more common and left ventricular hypertrophy and valve abnormalities have become modestly more common with greater increases for donor antihypertensives and tattoos. In Table 3 , we note that utilization of donors who smoked has dropped dramatically while alcohol use has not changed and cocaine and other drug use has become much more common, along with Public Health Service Increased Risk status. For logistic regression, we included 29 UNOS fields, including donor factors such as era of transplant, age, gender, BMI, blood type, serum creatinine, cause of death, ethnic group, history of clinical infection, history of hypertension, diabetes, and valve disease. We examined whether the donor was considered Public Health Service Increased Risk, a history of tattoos, smoking, IV drug use, alcohol, cocaine or other illicit drugs. We looked at region of the country, donor need for inotropic support, and whether the donor had a coronary angiogram done as part of organ donation evaluation. The strongest predictors of the donor heart being used for transplant were donor age, donor coronary angiogram being performed, donor valve disease, left ventricular hypertrophy, cause of death, donor weight and gender. None of the social variables such as drugs, alcohol or smoking were significant predictors. Era of transplant was not a significant predictor of utilization of organs. We suspected that coronary angiogram was a surrogate for donor age (though some young drug donors might undergo coronary angiography). The mean age was 36.4 ± 18.9 without coronary angiogram and 46.4 ± 9 years with angiography (p<0.0001). We then removed the coronary angio term from the multivariate analysis since it was co-linear with donor age. Donor age remained the strongest predictor, with even higher strength but no new variables emerged as significant. The strongest predictor of heart transplant utilization regardless of era is donor age, with size of the donor and anatomic quality of the donor also highly associated with donor utilization. Given the young mean donor age, there is a significant opportunity to use donors of somewhat older age, particularly in those patients who may be willing to accept this compromise since young, perfect donors are relatively rare. Purpose: Facial vascularized composite allotransplantation (VCA) has emerged as a groundbreaking reconstructive solution for patients with severely disfiguring facial injuries. It allows simultaneous "like with like" restoration of both aesthetic and functional deficits in a single-stage procedure with superior results. We report on the 1st Canadian and 42nd worldwide face transplant performed in Montreal in May 2018. A 64-year-old male sustained a gunshot wound in 2011. The injury involved the lower two-thirds of the face and resulted in extensive midface bony and soft tissue damage. Three years of planning and numerous cadaveric dissections preceded the operation. The allograft was procured from a beating-heart donor with similar blood type and partial match in HLA. The transplant consisted of Le Fort III and BSSO osteotomies and skin from the lower 2/3 of the face and neck. Virtual surgical planning was used to fabricate osteotomy guides and stereolithographic models. Microvascular, facial nerve (3 branches) and infraorbital nerve anastomoses were performed bilaterally. Immunosuppression protocol was initiated during procurement and consisted of thymoglobulin, tacrolimus, mycophenolate and solumedrol. 
Maintenance consisted of tacrolimus, mycophenolate and prednisone. Results: At 6-month follow-up, the aesthetic outcome is excellent. Partial restoration of light touch sensation has been observed over the majority of graft. Electromyography has confirmed muscle activity in the facial musculature bilaterally and the patient is able to produce mild spontaneous smile. Although significantly affected, speech, mastication and deglutition are continuously improving with intensive daily therapy. Despite these limitations, the patient reports high satisfaction with the procedure and has reintegrated in the community. Post-operative complications were mainly infectious, including mucormycosis of the left thigh, treated with surgical resection and anti-fungal therapy. Two Grade I episodes of acute rejection successfully treated with solumedrol. Conclusions: At 6 months follow-up, we report an overall good functional outcome in Canada's 1st and world's oldest face transplant recipient to date. Physical condition and mental readiness are more important factors than age alone. Meticulous planning, a multi-disciplinary approach and use of technology is paramount to the success of this procedure. Continuous long-term follow-up is mandatory for surveillance of immunosuppression-related complications and functional assessment of the graft. Methods: Over a 2-year period we have performed 14 uterus transplantations in our institution using grafts from both deceased (n=2) and living donors (n=12). The recipients were followed regularly with gynecological examination, cervical biopsies and sonograms. The main outcome measurements were hospital stay, postoperative complications, and success rate. Results: Five recipients had their uterine grafts removed within the first two weeks after surgery. Vascular complications, related to both inflow and outflow problems, were identified as the primary reason for three of the graft losses and organ-related factors for two. Four out of the seven recipients with ongoing grafts have experienced vaginal strictures at the vagina-vagina anastomosis necessitating surgical interventions. Four recipients were treated for rejection (grade 1, 2 and borderline) with steroid recycling with complete resolution. One of the treated women with borderline rejection delivered a healthy baby. All other borderline episodes resolved without treatment. One recipient developed cervical intraepithelial neoplasia 1 (CIN 1), but no action was required. Four of the recipients had a total of 11 embryo transfers resulting in four pregnancies. One recipient had a decreased kidney function during pregnancy which led to a decrease in her immunosuppression levels up until delivery and hysterectomy. Conclusions: Uterine transplantation is a promising treatment of uterine factor infertility that affects 1-5% of women. The lessons learned from our initial 14 recipients have been instrumental to our success, and we aim to share our conclusions and build on knowledge in the field of uterus transplantation. The side effects of chronic immunossupression limit the more extensive use of Vascularized Composite Allotransplantation (VCA), and because reliable pre-clinical animal models will be instrumental in the development of safer immunosuppressive regimens, the aim in this study was to assess the relevance of murine and porcine VCA models by comparing their clinical and histologic rejection patterns to those of human VCA patients. 
The VCA models compared in this study included murine and porcine hindlimb transplant models as well as human patients with upper extremity transplants. Clinical rejection patterns were compared by photographic documentation, and histologic rejection patterns were compared based on H&E stains of graft biopsies. Due to the similarities between porcine and human VCA rejection, porcine rejection patterns were further characterized according to immunologic cell types based on immunohistochemical (IHC) staining. Analysis of IHC slides involved cell-counting software (ImageJ) and percent-area calculations (Fiji). Results: Porcine clinical skin rejection is similar to human clinical rejection and progresses from mild erythema to necrolysis. In contrast, murine rejection has unique features such as fur loss and early edema. Porcine histologic rejection is also similar to human rejection based on H&E characterization, whereas murine histologic rejection follows a unique pattern of early, severe edema and a lower density of inflammatory infiltrates. Finally, IHC analysis of porcine rejection reveals distinct differences in cellular components between rejection grades. T cells rise in grades 2 and 3, B cells rise in grade 3, and macrophages rise in grade 4. Regulatory T cells rise in grade 3 and markedly decrease in grade 4. This pattern resembles previously published data on human VCA rejection. Conclusions: While the murine model is a cost-effective way to test initial theories and perform high-throughput studies, the porcine model is superior due to its strong correlation with human VCA rejection patterns. Based on IHC analysis, during the porcine rejection process the presence of T

Purpose: Kidney transplant recipients with failed allografts suffer from high rates of humoral sensitization. Immunosuppression (IS) strategies are highly variable in these patients, with many discontinuing IS, often resulting in HLA antibodies that complicate re-transplantation. Despite this risk, common practice has maintained that the risks of infection and the metabolic toxicities of IS outweigh the potential benefit of antibody prevention. The advent of belatacept, with its favorable toxicity profile and ability to prevent alloantibodies, renders it an appealing IS option to minimize sensitization and facilitate re-transplantation in kidney transplant recipients with failed allografts. We designed a single-center, randomized controlled trial to evaluate the impact of belatacept on HLA antibody formation in patients with failed renal allografts. Kidney transplant recipients (>1 yr post-transplant, 18-70 yrs) with failing (GFR<20, PreD) or failed allografts (PostD) were randomized to receive belatacept (BELA) or IS discontinuation (ISD) and followed for 36 months. Group assignment was made according to a computer-generated randomization table. PreD subjects not yet on dialysis were converted to belatacept or maintained on their current IS, and once on dialysis continued on belatacept monotherapy or had their IS discontinued. PostD subjects already on dialysis were converted to belatacept monotherapy or had their IS discontinued. Outcomes measured were the development of donor-specific antibodies (DSA) and degree of sensitization as measured by flow-based panel reactive antibody (PRA) assessment, Luminex-based HLA single antigen bead specificity testing, and calculated PRA (cPRA).

Purpose: Kidney transplantation remains limited by the chronic toxicities associated with calcineurin inhibitors (CNIs) and steroids.
The objective of this clinical trial was to determine whether CNIs and steroids could be avoided in a costimulation-based regimen. Methods: We enrolled recipients of live (n=30) and deceased (n=10) donor kidneys in an IRB-approved trial using intraoperative alemtuzumab induction followed by maintenance therapy with monthly belatacept and daily sirolimus. The trial was designed to offer patients the option to reduce immunosuppressive therapy in years 2 and 3 and wean to belatacept monotherapy. All patients (median 47 years, range 20-69; 27M:13F, 24C/15AA/1Asian) have been followed for over 48 months. Phenotypic lymphocyte analysis was performed longitudinally for 36 months post-transplantation. Results: All patients tolerated belatacept; 28 patients tolerated sirolimus, with 6 patients converted to MMF permanently and 6 patients requiring transient conversion to MMF. There was no patient death or kidney allograft loss within 48 months. Two patients have experienced graft loss, due to light chain disease and recurrent hypertensive nephropathy respectively, at 5 years. At 6, 12, 24, and 36 months, the mean serum creatinine was 1.3±0.4, 1.2±0.6, and 1.4±0.8 and the mean eGFR was 83±19, 85±21, 82±22, and 80±22, respectively. There was no clinical rejection in the first year, though subclinical rejection (Banff 1A or greater) was detected by protocol biopsy in 2 patients at 6 months and 3 patients at 1 year. Nineteen patients attempted weaning to belatacept monotherapy and 12 patients were successfully maintained with belatacept monotherapy. Biopsies showed Banff 1A to 2B in 3 patients after weaning sirolimus, which responded to bolus methylprednisolone and return to protocol therapy. Patients who weaned or attempted to wean to belatacept monotherapy did not develop DSA. DSA was detected in 5 patients: three with sirolimus intolerance, 1 removed from belatacept, and 1 removed from trial therapy for pregnancy. One patient developed histoplasmosis and one patient developed cutaneous Kaposi's sarcoma; both were cured with therapy. CMV and EBV reactivation was not seen, though transient BK nephritis was detected in 4 patients. Phenotypic lymphocyte analysis demonstrated that alemtuzumab produced profound lymphocyte depletion followed by slow T cell repopulation and more rapid B cell reconstitution, with a repertoire deviated toward naïve T and B cells. A significant increase of CD4+CD25+Foxp3+ regulatory T cells and regulatory/transitional B cells was evident in the first year. Conclusions: Our data suggest the efficacy and limitations of a novel belatacept-based regimen. The regimen effectively prevents belatacept-resistant rejection, enriches the immune repertoire for naïve cells and regulatory cells, and is permissive for control of rejection with belatacept monotherapy in selected patients. Purpose: Prior studies on the conversion of tacrolimus (calcineurin inhibitor; CNI) to belatacept have been limited in scope due either to the absence of post-conversion surveillance biopsies, which could underestimate rejection rates, or to a case-controlled design that makes attribution of improvement in kidney function to belatacept difficult. Methods: 53 adult kidney transplant (KT) patients were converted to belatacept from tacrolimus for allograft dysfunction. The slope of eGFR post-conversion was compared to a 1:3 propensity-matched cohort of patients maintained on CNI (N=159; matched on clinical and histologic variables) derived from the Paris Transplant Group (PTG) registry.
Most patients (N=37; 70%) underwent a post-conversion surveillance biopsy (at a median of 6 months). Thirty (57%) consecutive patients also underwent transcriptome analysis of pre- and post-conversion biopsies using the molecular microscope (MMDx). Results: Patients (mean age: 48 years) were switched to belatacept at a median of 6 months post-KT. Many were sensitized (32%; cPRA range 29-100%), were re-grafts (23%), had delayed graft function (60%), and had a history of recent acute rejection (15%). Death-censored graft survival was 85% at a median follow-up of 2.5 years post-conversion. Seven (13%) patients had TCMR (at a median of 6 months) post-conversion. Of these, two were subclinical TCMR. Overall, renal function improved (p<0.001) from a peak mean eGFR of 31±15 to 38±17 mL/min/1.73m2 by 6 months post-conversion, and then remained stable until the most recent follow-up. This eGFR improvement was also observed (p<0.001) in comparison to the matched PTG cohort, in which eGFR did not improve over time but remained stable (mean ~32 mL/min/1.73m2) over a comparable time frame of 2 years. Paired pre- and post-conversion histologic analysis did not reveal worsening of inflammation or chronicity. Paired biopsy gene expression analysis also did not reveal significant changes in inflammation or acute kidney injury, although the atrophy-fibrosis score worsened (mean 0.28 to 0.44; p=0.005). An in-depth analysis of individual genes also did not reveal any patterns of improvement after belatacept conversion. Conclusions: In this study we report that belatacept conversion was safe for kidney transplant patients, including those at high immunologic risk. Improvement in renal function with belatacept conversion was seen early and then sustained, in comparison to a matched control cohort in which renal function neither improved nor worsened over time. Of interest, the withdrawal of CNIs did not lead to any reduction of molecular injury features. Moreover, we were unable to show any molecular signals that could be related to CNI administration and regressed after withdrawal. Purpose: The Molecular Microscope diagnostic system (MMDx), based on microarray gene expression, uses ensembles of machine learning classifiers rather than single genes, gene sets, or classifiers, to maximize the accuracy of rejection diagnoses and injury assessment. We tested its accuracy and stability, and developed an automated system for generating molecular reports on kidney transplant biopsies. Methods: We evaluated the ensembles' accuracy (agreement with histology) and stability (correlation of predictions based on multiple training sets). A total of 1679 kidney transplant biopsies were repeatedly split at random into two training sets (N=600 each) and a test set (N=479). Classifiers were developed in each training set, and predictions for ABMR and TCMR were made in the test sets. Twelve separate machine learning methods and their median were evaluated. In a separate analysis, a random forest classifier was used to predict the report sign-outs of an expert clinician. Results: There was considerable variation between the 12 classifier methods for any given biopsy (Figure 1A and B). The median had a higher accuracy than any of the individual classifiers, and was among the most stable (highest correlation between predictions from separate random training sets; Figure 1C and D). A random forest classifier was used to predict the sign-out of an expert evaluator (Figure 1E and F; abbreviations are explained on the MMDx report).
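For readers unfamiliar with the ensemble approach described in the MMDx methods above, the sketch below illustrates the general idea of taking, for each biopsy, the median of several classifiers' predicted probabilities. It is a minimal illustration only and not the MMDx implementation; the specific classifier choices and the variable names (expr_train, labels_train, expr_test) are assumptions.

```python
# Minimal sketch of a "median of classifiers" ensemble (not the MMDx code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def median_ensemble_scores(expr_train, labels_train, expr_test):
    """Per-biopsy median probability of rejection across several classifiers."""
    models = [
        LogisticRegression(max_iter=1000),
        RandomForestClassifier(n_estimators=500, random_state=0),
        GradientBoostingClassifier(random_state=0),
        SVC(probability=True, random_state=0),
    ]
    probs = []
    for model in models:
        model.fit(expr_train, labels_train)              # fit on one training split
        probs.append(model.predict_proba(expr_test)[:, 1])  # probability of rejection
    return np.median(np.vstack(probs), axis=0)           # median across classifiers
```

Because the different methods make different kinds of errors, the per-biopsy median tends to discard the worst individual estimates, which is the intuition the abstract appeals to.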
Accuracies for the expert's molecular TCMR and ABMR diagnoses were ~98% and 97%, respectively. Most disagreements were in biopsies near diagnostic thresholds. Considerable disagreement with histology persists, which is expected given the noise in histology assessments. The balanced accuracy of MMDx sign-outs for pathology diagnoses of TCMR and ABMR was about 75%. Conclusions: In our data set, ensembles of machine learning classifiers generate diagnoses that are both more accurate than the best individual classifiers and nearly as stable as the best. This result is expected from the machine learning literature, since different methods will tend to make different types of mistakes, and taking the median cancels out the worst estimates. Similar ensembles (random forests) can be used to create automated report sign-outs that agree with an expert observer 97-98% of the time. Disagreement with histology will persist, largely due to the known noise in histology assessments (ClinicalTrials.gov NCT01299168). Purpose: Chronic antibody-mediated rejection (cAMR) is a major contributor to graft loss. However, prognostic heterogeneity has not been addressed. We investigated whether mechanistically informed prediction of allograft loss could identify patients with different outcomes. We prospectively enrolled all adult kidney recipients from 2 centers at the time of the first graft biopsy showing cAMR according to Banff criteria (2005-2016). All patients were assessed for cAMR phenotype based on histology, immunochemistry and gene expression, using NanoString technology on formalin-fixed paraffin-embedded tissue for 209 pathogenesis-selected genes previously identified for their relevance to transplant diagnosis. All patients underwent anti-HLA antibody, GFR and proteinuria evaluation at cAMR diagnosis, and were followed up to September 2018. Results: Among 151 patients meeting inclusion criteria, cAMR was diagnosed at a median time of 23 (IQR, 10-58) months post-transplant. Cross-validated supervised principal component analysis identified 2 independent linear combinations of 20 genes associated with graft loss, which were mainly related to B cell immunity and immunoglobulin-mediated immune response. We built a prognostic gene score based on the 2 components that exhibited good optimism-corrected calibration and greater time-dependent accuracy (5-year AUC of 0.80) than clinical and histological parameters for predicting graft loss (5-year AUC for GFR, proteinuria and cg score: 0.67, 0.70 and 0.65, respectively). The prognostic gene score identified 3 groups of patients: low-score patients (n=22) with 5-year graft survival of 95%, intermediate-score patients (n=65) with 5-year graft survival of 64%, and high-score patients (n=64) with 5-year graft survival of 29% (p<0.001). In multivariable analysis, the prognostic gene score predicted graft loss independently of clinical, immunological and histological parameters (adjusted HR: 1.70, 95% CI: 1.22-2.36, p=0.002). Conclusions: By addressing prognostic heterogeneity in cAMR at the molecular level, we defined a pathogenesis-based prognostic score that outperformed clinical, immunological and histological parameters for predicting graft loss. We are currently validating the performance of the prognostic gene score in an external validation cohort of kidney recipients with cAMR from 3 other centers.
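As an illustration of how a two-component prognostic gene score of the kind described in the cAMR study above could be combined with clinical covariates in a survival model, the following sketch uses Python with pandas and lifelines. The column names, the equal component weights, and the tertile split into low/intermediate/high groups are assumptions made for illustration; the authors' actual weighting and group thresholds are not specified here.

```python
# Hedged sketch only: build a gene score from two component scores, stratify patients,
# and fit a multivariable Cox model (assumed column names, not the authors' code).
import pandas as pd
from lifelines import CoxPHFitter

def score_and_stratify(df: pd.DataFrame, w1: float = 1.0, w2: float = 1.0):
    """df columns assumed: component1, component2, gfr, proteinuria, time, graft_loss."""
    df = df.copy()
    df["gene_score"] = w1 * df["component1"] + w2 * df["component2"]
    # Tertiles used here as a stand-in for the study's low / intermediate / high groups.
    df["risk_group"] = pd.qcut(df["gene_score"], q=3,
                               labels=["low", "intermediate", "high"])
    cph = CoxPHFitter()
    cph.fit(df[["gene_score", "gfr", "proteinuria", "time", "graft_loss"]],
            duration_col="time", event_col="graft_loss")
    return df, cph   # cph.hazard_ratios_ gives adjusted HRs per covariate
```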
In this first report, we find that in patients with cABMR who were not recently subjected to intensive immunosuppressive therapies, belatacept conversion was associated with an improvement in renal function. These results are further bolstered by molecular evidence of improved microvascular inflammation and rejection scores on follow-up biopsies, as well as an improvement in renal function when compared to a propensity-matched control group maintained on CNI therapy. Purpose: Poor physical function in kidney transplant candidates is associated with frailty, sub-optimal transplant outcomes, and higher health care utilization. We need an instrument to objectively assess physical function in patients on the kidney waitlist. Our center began in-person re-evaluations of waitlisted patients with high allocation priority based on the new Kidney Allocation System. We used the 6-minute walk test (6MWT) as a measure of cardiorespiratory fitness and the 1-minute sit-to-stand test (STS) as a measure of lower extremity strength. We hypothesized that the 6MWT result is strongly correlated with the STS result and incorporates information beyond co-morbidities. This included 10,616 living donor (LD) and 30,992 deceased donor (DD) renal transplants. Renal allograft survival rates were computed by the Kaplan-Meier method, and Cox proportional hazards models were used to estimate the impact of potentially influential variables. Results: The proportion of obese diabetic renal transplant recipients increased from 16.8% in 1987-1992 to 49.6% in 2013-2017 (p=0.0001; Figure 1). The SPPB is a composite measure of balance, gait speed and chair stand time; a score ≤10 was defined as lower extremity impairment. Elevated hemoglobin A1c (HbA1c) and low serum albumin were defined as levels ≥8.0% and <3.5 g/dL, respectively. Results: Our study cohort had a mean age of 62 ± 9 years; 62% were men, 80% were Caucasian, 59% had diabetes and 57% were on dialysis. Diabetes and low serum albumin were significantly associated with lower extremity impairment but not with frailty. Among patients with diabetes, clinical risk factors independently associated with lower extremity impairment included female gender, peripheral neuropathy, prior stroke, peripheral vascular disease and low serum albumin. In contrast, the only clinical risk factor significantly associated with frailty on multivariate analysis was female gender. No relationship of age, body mass index, Caucasian race, history of dialysis or elevated HbA1c with either lower extremity impairment or frailty was observed. Conclusions: In a cohort of high-risk KT candidates, diabetes and low serum albumin were associated with a greater prevalence of lower extremity impairment than frailty. Among diabetics, clinical risk factors for lower extremity impairment and frailty differed. Further studies are needed to determine whether nutrition and exercise interventions can improve physical performance, and ultimately health outcomes and survival, in KT candidates with diabetes. Results: We included a total of 392 SOT recipients, 181 (46%) CMV D+/R- and 211 (54%) CMV D-/R-, with a mean age of 47 years and 297 (76%) males. There was a total of 151 (39%) liver, 188 (48%) kidney, 45 (11%) pancreas and 8 (2%) other types of transplants. Patients in the CMV D+/R- cohort were slightly older (51 versus 48 years, p=0.032), but other variables, including cardiovascular risk factors and pre-transplant TEs, were similar.
Post-transplant primary CMV infection occurred in 108 (60%) recipients in the CMV D+/R- group versus only 2 recipients (<1%) in the CMV D-/R- group. Overall, TEs occurred in 38 (21%) patients in the CMV D+/R- group versus 22 (10%) in the CMV D-/R- group, corresponding to a two-fold higher cumulative incidence at 5 years in the former group (18% versus 9%, respectively). Among CMV D+/R- recipients, the rate of TEs at 5 years was not significantly different between patients with and without CMV infection, including those with high (≥4 log IU/mL), low (<4 log IU/mL) or negative CMV DNAemia. After adjusting for potential confounders with a Cox regression model, CMV D+/R- status at transplantation was independently associated with increased risk of TEs over 5 years (HR 1.95, 95% CI 1.096-3.482). Conclusions: CMV D+/R- status at transplantation is associated with increased risk of thrombotic events post-transplantation. Conclusions: More advanced heart failure before HT may be associated with a higher risk of CMV infection in CMV-positive recipients. Larger numbers are needed to confirm this trend. Purpose: Although only approved for primary prophylaxis of CMV among allogeneic stem cell transplant recipients, there is great interest in the use of letermovir (LET) among solid organ transplant (SOT) recipients with drug-resistant CMV, given the toxicity of alternative agents and its novel mechanism of action. We sought to describe our experience with LET for SOT recipients. Methods: All adult SOT recipients with drug-resistant CMV who received LET in 2018 for treatment or secondary prophylaxis were evaluated. Patient and CMV characteristics as well as outcomes were obtained. Results: Nine SOT recipients received LET for drug-resistant CMV. Five received LET 720 mg daily for treatment of active infection and four received LET 480 mg daily for secondary prophylaxis. At both doses, LET was well tolerated, without significant bone marrow suppression, hepatotoxicity or nephrotoxicity. Among the patients on the treatment dose, two achieved undetectable CMV PCR (<35 IU/mL) and two achieved a reduction in CMV PCR to a persistently low level (<500 IU/mL). One patient (C) had ganciclovir added at 21 days. No patients developed new or worsening tissue-invasive disease after LET was initiated. One recipient on secondary prophylaxis experienced breakthrough viremia during week 2. Conclusions: LET for treatment or secondary prophylaxis among SOT recipients appears to be safe and possibly effective in this ongoing cohort study. Given the urgent need for well-tolerated, effective treatment for resistant CMV in SOT recipients, LET and combination therapies should be studied. Purpose: The present study represents the largest series of donor livers with moderate macrosteatosis and provides a detailed analysis of the effects of post-reperfusion syndrome (PRS) and its impact on both short- and long-term outcomes. PRS is an immediate post-reperfusion complication characterized by hemodynamic instability, asystole or the need to start vasopressor infusion during the post-reperfusion period. This study aims to describe the real peri-operative dangers of utilizing donor livers with significant macrosteatosis. CD4+ T cells (Panel 2b, p=0.02) at 6 months post-transplant. Conclusions: ATG induction in pediatric patients favors a mature CD4 T cell phenotype, which correlates with fewer non-EBV infections.
This advantage is not seen for EBV infections, possibly due to an increase in CD4+ PD1+ T follicular helper cells, which are necessary for efficient maturation of B cells and EBV replication. These observations support the tailored use of ATG in pediatric kidney transplantation and provide further links between ATG induction and risk of post-transplant lymphoproliferative disorder. Methods: A retrospective analysis of deceased-donor transplants between January 2008 and January 2018 was conducted using the UNOS database. Baseline characteristics of recipients of KDPI<35 kidneys in the pre- and post-KAS periods were compared using the Student's t-test or χ² analysis for continuous and categorical variables, respectively. Purpose: Lymphocytic bronchitis (LB) precedes chronic lung allograft dysfunction (CLAD). Peri-vascular (A-grade) and peri-bronchiolar (B-grade) lymphocytic inflammation are traditional markers of acute cellular rejection. The relationship of LB (endobronchial or "E-score" rejection) to A- and B-grade pathologies is unclear. We hypothesized that LB would be associated with a distinct gene signature. We studied LB in two partially overlapping lung transplant recipient cohorts. Cohort 1 was defined by large airway brushes (6 LB cases and 18 post-transplant referents). Differential expression using DESeq2 was used for pathway analysis and to define an LB-associated metagene, defined as the normalized sum of counts for transcripts upregulated with a false discovery rate-adjusted p-value of <0.1. Cohort 2 consisted of eight biopsy pairs for each of the A-, B-, and E-pathology subtypes, matched with a pathology-free biopsy from the same subject. These biopsies were analyzed by multiplexed digital counting of immune RNA transcripts (NanoString). Metagene scores from biopsies with and without rejection were compared by paired t-tests for this LB metagene and previously published allograft rejection signatures. Results: Compared to referents, LB demonstrated upregulation of genes, including allograft rejection pathways, characterizing an LB-associated metagene (Cohort 1, Figure 1A). We observed statistically increased expression for this LB-associated metagene, as well as three other established allograft rejection metagenes, in rejection vs. paired non-rejection biopsies for both E-grade and A-grade subtypes, but not in the B-grade pathologies (Cohort 2, Figure 1B). Purpose: End-stage renal disease is associated with premature ageing of the T-cell immune system. At the time of kidney transplantation the average biological age of the circulating T cell system is increased by 15-20 years over the calendar age, but inter-individual variation is substantial. We tested the hypothesis that advanced immunological T-cell ageing increases the long-term mortality risk after kidney transplantation. Methods: Patients from a well-defined cohort (N=210), transplanted with a kidney from a living donor between 2010-2013, were included. All recipients received induction therapy with basiliximab and prednisone/MMF/tacrolimus. Circulating T cells were analyzed before and at 3, 6 and 12 months after transplantation. The number of CD31-expressing naive T cells (identifying recent thymic emigrants, a marker for thymic function), telomere length and differentiation status of the circulating T cells were determined. Naive T cell numbers (deceased: 37 cells/µl, p=0.001) were significantly lower in the deceased group prior to transplantation. Numbers of naive CD31+ T cells were inversely related to age (r=0.56, p<0.001).
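Returning to the lymphocytic bronchitis study above, the metagene it describes (the normalized sum of counts of transcripts differentially expressed at an FDR-adjusted p-value below 0.1) can be sketched as follows. This is an illustrative stand-in, not the authors' code: the differential-expression step (DESeq2, run in R) is assumed to have been performed separately, counts-per-million is used as a simple placeholder normalization, and the variable names are hypothetical.

```python
# Hedged sketch of a metagene score: normalized sum of counts of significant transcripts.
import pandas as pd

def metagene_score(counts: pd.DataFrame, deseq_results: pd.DataFrame) -> pd.Series:
    """counts: genes x samples raw counts; deseq_results: indexed by gene, with a 'padj' column."""
    sig_genes = deseq_results.index[deseq_results["padj"] < 0.1]
    # Simple library-size normalization (counts per million) as a stand-in.
    cpm = counts / counts.sum(axis=0) * 1e6
    return cpm.loc[cpm.index.intersection(sig_genes)].sum(axis=0)  # one score per sample
```

Per-sample metagene scores computed this way can then be compared between paired rejection and non-rejection biopsies with a paired t-test, as described in the abstract.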
However, the average numbers of naive CD4+CD31+ and CD8+CD31+ T cells in the deceased patient group were at the level of patients >75 years, independent of age. In a multivariate proportional hazards analysis including recipient age, the number of naive CD4+ T cells remained associated with all-cause mortality (HR 0.98, CI 0.98-0.99, p<0.001). The lowered number of naive CD4+ T cells in the deceased patient group was primarily caused by decreased thymic function (fewer CD31+ naive T cells). In addition, a compensatory increase in CD31− naive T cells, which is normally observed with age-related loss of thymic function, was not observed. Within the first year after transplantation, the number and characteristics of naive T cells remained remarkably stable. All other immunological parameters assessed were not related to patient survival after transplantation. Conclusions: Advanced immunological T-cell ageing at the time of transplantation, defined by a severe reduction in thymic function, is highly associated with all-cause mortality after kidney transplantation. Purpose: Interpreting kidney transplant biopsies by histology or molecules involves assigning scores for separate pathologic categories: rejection, acute injury (molecules only), and chronic parenchymal loss (atrophy-fibrosis). However, it would be useful to have a global assessment of the dominant phenotype. We performed unsupervised archetypal analysis of 1745 kidney transplant indication biopsies using transcripts selected for their relationships to rejection (ABMR, TCMR), AKI, and atrophy-fibrosis. We hypothesized that inclusion of rejection, injury, and fibrosis would generate a dominant rejection/injury/fibrosis (RIF) phenotype for each biopsy to supplement the assessment of separate pathologies. Methods: Indication biopsies from 22 centers were processed by histology and microarrays (INTERCOMEX, ClinicalTrials.gov NCT01299168). Unsupervised analysis was used to assign 7 archetype scores to each of the biopsies. The scores add to 1.0, and the dominant phenotype is defined as the archetype with the highest score within each biopsy. The RIF phenotypes were: 1) normal, 2) TCMR, 3) early injury/no scarring (AKI), 4) early-stage ABMR (EABMR), 5) fully developed ABMR (FABMR), 6) late-stage inactive (burned out) ABMR (BOA), and 7) atrophy-scarring/no rejection (IFTA). The average archetype scores vs. time post-transplant are plotted in Figure 1. Each biopsy was assigned to a group based on its highest score: Normal, N=566 (32%); TCMR, 138 (8%); AKI, 218 (12%); EABMR, 158 (9%); FABMR, 226 (13%); BOA, 76 (4%); and IFTA, 363 (21%). The main differences between RIF and our purely rejection-based phenotypes were in dividing EABMR into those with and without AKI, and those with late ABMR into active (FABMR) vs. inactive (BOA). These categories better reflect the relative risk of graft failure compared to normal kidneys (Figure 2). Conclusions: In addition to the usual biopsy phenotyping (enumerating individual classes of rejection and injury abnormalities), a parallel dominant phenotype classification can, when added to a pure rejection-based phenotyping system, more closely identify the actual risks of each biopsy, and has the advantage of identifying the most normal tissues. The RIF phenotype of "molecularly normal" identifies kidneys with an excellent chance of long-term survival, and in this sense is superior to previous estimates of relative risk of failure.
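The dominant-phenotype rule used in the RIF analysis above, seven archetype scores per biopsy that sum to 1.0 with the dominant phenotype being the highest-scoring archetype, is simple enough to state in code. The sketch below is illustrative only; the archetype scores themselves come from the unsupervised analysis and are assumed as input.

```python
# Minimal sketch (assumed inputs) of assigning each biopsy its dominant RIF phenotype.
import numpy as np

ARCHETYPES = ["Normal", "TCMR", "AKI", "EABMR", "FABMR", "BOA", "IFTA"]

def dominant_phenotype(scores: np.ndarray) -> list:
    """scores: (n_biopsies, 7) array of archetype scores, each row summing to ~1.0."""
    assert np.allclose(scores.sum(axis=1), 1.0, atol=1e-2)   # sanity check on inputs
    return [ARCHETYPES[i] for i in scores.argmax(axis=1)]    # highest score wins
```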
Purpose: To utilize a novel technology involving siRNA conjugated iron oxide nanoparticles to protect against ischemic damage to islet cells before transplant, therefore minimizing the amount of excised pancreas needed to allow for living donor islet cell transplantation to be a much more clinically feasible option in the treatment of diabetes mellitus (DM). Methods: After donor pancreatectomy, the islets were cultured for 2 days with siRNA conjugated nanoparticles that targeted the pro-apoptotic genes Caspase 3, Caspase 8 and Fas. In Study 1, the protective effects on the islets were assessed in vitro by direct islet counts before and after culture. Apoptosis was also directly analyzed via DNA apoptotic ladder assays. In Study 2, the insulin secreting effects of the islets were assessed in vivo by transplanting a marginal number of islets into diabetic recipients in a non-human primate model. After culture the harvested islet cells were transplanted via direct infusion into the recipient's portal circulation in a similar manner to current clinical practice. The animals were maintained on a standard immunosuppressive protocol, and their blood glucose levels were analyzed and tracked. Results: It is generally accepted that greater than 10K/IEQ is required to maintain stable blood glucose levels in human and non-human primates. We have shown a dramatic reduction in insulin requirements by utilizing a marginal number of islet cells (4,400 -7,800IEQ) compared to control cases without siRNA. Also, we have demonstrated the anti-apoptotic effects of the SiRNA conjugated islets in vitro with apoptotic ladder assays, as well as directly with minimal loss of islets after 2 day culture when compared to controls. We have also shown histologic evidence of positive insulin staining cells from the liver in recipients of direct portal infusion. Conclusions: It has been theorized that the reason for failure of islet transplantation has been due to ischemic injury of the islets after transplant. Our data have demonstrated for the first time ever the protective effects of magnetic nanoparticles as carriers for siRNA that targets pro-apoptotic genes in pancreatic islet cells before transplant. The use of siRNA conjugated iron oxide islets will further minimize the amount of islets needed for transplant, as well as decrease the amount of pancreas required to normalize blood glucose levels in the treatment of DM. Here, we have tested two costimulation blockade-based regimens in an NHP islet xenotransplantation model using only clinically available agents. Methods: Rhesus macaques (3.2-6.8kg) were used as xenograft recipients. Diabetes was induced with Streptozotocin (STZ, 1250 mg/m 2 , IV). Neonatal porcine islets (NPI) were isolated from GKO or GKO/hCD46 transgenic piglets (Revivicor, Inc.) by a modified Korbutt technique. All recipients underwent laparotomy and islet cell suspensions (≥50,000 IEQ/kg) were subsequently infused into the portal vein. Recipients were induced with either basiliximab at 0.3 mg/kg on days 0, 2 (n=6) or rhesus anti-thymocyte globulin (rhATG, NHP Reagent Resource) at 4mg/kg/day on days 0, 1, 2, 3, 4 (n=6). All recipients were maintained on CTLA4Ig (20mg/kg, on days -2, 0, 4, 7, 14, 21 and biweekly afterwards), tacrolimus (0.1mg/kg/d) from days -2 to 21, sirolimus (0.05mg/kg/d) from day 22 onwards, and mycophenolate mofetil daily (25mg/kg/d). All recipients received daily CMV prophylaxis with ganciclovir followed by valganciclovir. 
Following transplantation, graft function was monitored with daily morning fasting glucose, biweekly intravenous glucose tolerance testing (IVGTT), and biweekly serum porcine C-peptide levels. Results: Twelve rhesus macaques successfully underwent NPI xenotransplantation. Of the six receiving basiliximab induction, engraftment was achieved in 4, and the median graft survival, defined by the last day of detectable porcine C-peptide in plasma, was 14 days. Of the six receiving rhATG induction, all achieved sustained lymphocyte depletion following induction. All engrafted, and the median graft survival was 40.5 days, significantly longer than in the basiliximab group (p=0.0337). Graft rejection was confirmed by immunohistochemistry, demonstrating islets with a dense lymphocytic infiltrate. In summary, we demonstrated partial islet engraftment using two clinically available immunosuppression regimens. Depletional induction followed by costimulation blockade appears to significantly prolong graft survival. Purpose: Islet transplantation can cure Type 1 diabetes; however, multiple donors are needed due to extensive perioperative loss of islets when removed from their native blood supply. We recently discovered that CD34+45− vascular endothelial cells from parathyroid glands (PTG), another richly vascular endocrine organ, have an extraordinary ability to promote beta cell engraftment and function. This study aimed to generate a human embryonic stem cell (hESC)-derived source of CD34+45− vascular endothelial cells that could equitably protect islet grafts. Methods: hESC-derived β-cells (eBCs) and CD34+45− (scCD34+) cells were differentiated from the same stem cell line and were co-transplanted into the subcutaneous (SQ) and intramuscular (IM) sites of immunodeficient mice. scCD34+ cells were FACS purified. Results: Using the same hESC line that we have used to produce eBCs, we succeeded in generating CD34+CD45− cells after 9 days of in vitro differentiation. These scCD34+ cells lack the hematopoietic marker CD45 and express the endothelial precursor marker CD146 (Fig. A). Moreover, these cells efficiently formed tubular structures within 4 hours in an in vitro angiogenesis assay, suggesting that they are vascular endothelial progenitor cells (Fig. B). Transplantation of scCD34+ cells in an in vivo angiogenesis model, using mouse skin flaps, resulted in increased vascular density and number of vessel junctions compared to sham and to hematopoietic CD34+45+ cells differentiated simultaneously. Co-transplantation of scCD34+ cells with eBCs in SQ and IM sites resulted in significant improvement of graft mass preservation when compared with eBCs transplanted alone. Moreover, we developed a novel co-clustering method to combine scCD34+ cells with eBCs prior to transplantation, creating vascularized beta cells, which showed further engraftment protection in the SQ and IM sites by luciferase imaging when compared to control. While the beta-cell-protective activity of PTG CD34+ cells is remarkable, a renewable source of CD34+ cells that possesses the same activities as PTG CD34+ cells is highly desired to enable wide application of beta-cell replacement therapy. This study shows that vascular progenitor cells can be generated from stem cells and used to support the engraftment of stem cell-derived beta cells. This tolerance is initially dependent on Foxp3+ cells that are concentrated in the graft in distinctive Treg-rich organized lymphoid structures (TOLS).
Recipients of allograft kidneys develop systemic tolerance to tolerance-resistant heart allografts in pigs, non-human primates, and mice. The plan of this study is to identify the cellular and molecular mechanisms of kidney-induced islet allograft tolerance (KI2AT). Using a co-transplantation model of kidney and islet in mice, we aim to test whether spontaneous tolerance of a mouse kidney graft is able to induce systemic tolerance to islet grafts in the recipient. We present preliminary data showing that spontaneously accepted DBA/2 kidney allografts in bilaterally nephrectomized recipients can confer tolerance to DBA/2 islets, transplanted under the kidney capsule of the renal allograft, resolving hyperglycemia in streptozotocin (STZ)-induced diabetic B6 mice. Results: We show that the spontaneous acceptance of kidney allografts confers systemic tolerance to islet allografts (Fig. 1). At day 50, the kidney allografts were removed and analyzed pathologically. H&E staining of the kidney/islet allografts showed a massive aggregation of leukocytes at the kidney-islet junction (Fig. 2). Immunohistochemical staining showed that, while CD3+ and Foxp3+ cells are present within these aggregates, the majority of these cells are B cells (Fig. 3). The findings from this project provide a model that will lead to more therapeutic regimens that can be translated to the clinic, incorporating cellular and molecular components shown to be essential for the development of systemic tolerance resulting in the acceptance of islet allotransplants, independent of co-transplantation with kidney allografts. Purpose: In the current standard clinical transplant approach, islet cells are administered intravenously into the portal system, where they lodge in the liver. This approach exposes transplanted islets to a suboptimal environment due to the immediate blood-mediated inflammatory response, hypoxemia and high levels of immunosuppressive drugs. Furthermore, this technique often requires multiple transplants to gain a sufficient engrafted mass to secure euglycemia, due to loss of the transplanted beta cell mass and poor islet survival and function. Because of the omentum's rich blood supply, capacity to accommodate a large islet mass, convenient and minimally invasive route to implant and retrieve grafts, portal drainage and potential immune privilege, this site may provide a suitable alternative site for islet transplantation. This approach can successfully reverse chemically induced diabetes in rat models and achieved insulin independence in 1 reported patient for up to 1 year, but failed to cure diabetes in NHP models. We report a successful, clinically relevant islet transplantation model whereby allogeneic islets are implanted in an omental pouch and reverse diabetes long-term in diabetic nonhuman primate (NHP) recipients. Methods: As part of a JDRF-sponsored study, allogeneic naked islets from a single donor were transplanted to the omentum of STZ-induced diabetic cynomolgus monkeys using a clinically relevant immunosuppressive regimen consisting of Thymo (5 mg/kg x 4) and Rituxan (20 mg/kg x 2) induction with rapamycin-only maintenance therapy. Animals were followed for three months to ensure that reproducible long-term graft survival and function could be achieved in this site. Results: Three diabetic NHPs were transplanted with 98,400-105,000 islet equivalents (IEQs) and each promptly achieved normoglycemia.
Glycemic control and insulin independence were maintained for the entire 90-day study duration, except that one animal was electively terminated on day 48 post-transplant due to a study-unrelated complication. IVGTTs performed at 1 and 3 months post-transplant showed normal provocative glucose homeostasis, especially at the 3-month time point, where blood glucose dynamics were as robust as in a normal healthy animal. Insulin and C-peptide were greater than 1.5 mU/L and 2.0 ng/mL, respectively, for all animals. All 3 recipients demonstrated evidence of Treg expansion by flow cytometry and donor hyporesponsiveness by in vitro ELISPOT assay during the period of normoglycemia. Examination of the omentum at autopsy showed well-preserved islet structures with strong insulin staining and only rare lymphocytic infiltration. Conclusions: Long-term euglycemia and insulin independence were achieved by implantation of islets in the omental pouch. This study is the first report of successful transplantation of single-donor islets to the omentum in NHPs, which is a highly relevant pre-clinical animal model to develop strategies for beta cell replacement, including stem cell-derived beta cells and bio-engineered cells. Our protocol is attractive as the omental pouch and immunosuppressive regimen we use are clinically applicable. Purpose: Human islets are exposed to a range of inflammatory stimuli upon transplantation that lead to poor graft survival. Identifying the various signaling events may provide strategies for intervention. Therefore, we studied the types of isletokines produced by islets and released via exosomes in response to stress. Methods: Human islets were subjected to hypoxic and inflammatory conditions in vitro. Exosomes were isolated from the medium and characterized by TEM and surface marker expression analysis. Luminex multiplex assays were performed to identify isletokines released via islet exosomes. Western blot analysis was performed to identify the activation of signaling pathways in islets in response to hypoxia and cytokines. Withaferin A (WA), a plant-derived compound known to block nuclear factor kappa B (NF-κB), was used to prevent isletokine release via islet exosomes, and the effects of WA on other stress-signaling components were determined. Results: Human islets exposed to hypoxia and proinflammatory cytokines showed significant activation of the ER stress sensors IRE1α/XBP1/CHOP as well as NF-κB signaling pathways, resulting in release of exosomes containing IL-6, IL-8, MCP-1, and CXCL10. Prolonged release of exosomes containing isletokines induced significant c-Casp-3/Casp-3-mediated apoptosis in human islets. Pretreatment of islets with WA significantly suppressed the activation of canonical NF-κB RelA/p65 and ER stress signaling, and also prevented release of isletokines in the exosomes in response to hypoxic and pro-inflammatory stimulation. Furthermore, we identified other stress-signaling components in islets, including AKT/mTOR/GSK3β, which were activated in response to hypoxia and proinflammatory cytokines and blocked by WA, except for GSK3β. Conclusions: Human islets exposed to pro-inflammatory and hypoxic stress release exosomes carrying inflammatory isletokines. Inhibition of NF-κB p65 with WA significantly suppressed isletokine release via exosomes and also the activation of AKT/mTOR/IRE1α.
These findings indicate that stress-induced signaling can be targeted to prevent early innate immune destruction of islets during transplantation. Results: 14d after allo-immunization and anti-TIM-1, TAM-treated B-IL-4RαKO mice had 50% fewer TIM-1+ B cells than TAM-treated Cre-neg XIL-4Rαfl/fl littermate controls. In addition, B cells from these B-IL-4RαKO mice expressed 50% less IL-4 and 40% less IL-10 than littermate controls. Next, we examined islet GS. Untreated B6 B-IL-4RαKO and control mice rejected BALB/c islets with MST 28d and MST 25d, respectively. Surprisingly, anti-TIM-1 treatment accelerated rejection in B-IL-4RαKO mice (MST 20d; p=0.01), while 75% of control mice treated with anti-TIM-1 achieved long-term GS (>100d; p=0.03). To try to understand this marked difference in GS, we examined the cytokine profile of T cells from B-IL-4RαKO vs. control mice. Splenic CD4 cells from B-IL-4RαKO mice expressed 33% more IFNγ and 33% less IL-10, and splenic CD8 cells expressed 20% more IFNγ, than littermate controls. Conclusions: Collectively, our results suggest that B cell IL-4 signaling helps induce IL-10-secreting TIM-1+ Bregs, which then reduce inflammatory T cell cytokines. Whether IL-4 produced by B cells themselves has an autocrine role and/or directly inhibits inflammatory T cell cytokines is currently being examined. Regardless, targeting B cell IL-4 signaling can have a major impact on transplant outcomes. Purpose: To investigate the impact of a high fibre (HF) diet or dietary supplementation with the short chain fatty acid (SCFA) sodium acetate (SA) on kidney allograft rejection in a murine model of kidney transplantation. Methods: Kidney transplants were performed from BALB/c to C57BL/6 mice as allografts. Allograft mice were fed an HF diet for 2 weeks prior to and throughout experiments (WT+HF), or received SA 200 mg/kg IP for 14 days post-transplantation, then SA 150 mM solution orally (WT+SA; GPR43-/-+SA). Allograft controls received normal chow only (WT+NC). Gut microbiota composition was assessed by 16S rRNA sequencing. Results: WT+HF allografts had prolonged survival compared to WT+NC allografts (Fig. 1, p<0.01), and were protected from acute (day 14: lower creatinine (p<0.01), less tubulitis (p<0.001)) and chronic (day 100: lower creatinine (p<0.05), less proteinuria (p<0.01) and less glomerulosclerosis (p<0.001)) rejection. Transplantation led to dysbiosis in WT+NC mice, with a loss of normal gut microbial diversity at day 14, but not in WT+HF mice, where microbial diversity was enhanced (Fig. 2, p<0.01). Clostridiales species (p<0.0001), known to produce SCFAs and to promote Treg development, and Bifidobacterium (p<0.05), known to produce SA, were both increased post-transplant in WT+HF mice yet diminished in WT+NC. In mechanistic experiments, WT+SA allografts exhibited superior survival to WT controls (p<0.05), were protected from rejection and exhibited donor-specific tolerance, confirmed by acceptance of donor-strain but rejection of third-party skin grafts (p<0.01). The survival benefit conferred by SA was abolished by depletion of CD25+ Tregs using an anti-CD25 mAb, and SA was ineffective in GPR43-/- allograft recipients (p<0.05). Conclusions: An HF diet prevented transplant-associated dysbiosis and afforded protection against allograft rejection. Protection was mediated, at least in part, by SCFAs and was dependent on a CD4+CD25+Foxp3+ regulatory mechanism and signalling via GPR43. Dietary manipulation of the microbiome warrants evaluation in human transplantation.
Purpose: Transient mixed chimerism induced with non-myeloablative conditioning in monkeys and humans induces tolerance to about 70% of donor kidney grafts transplanted at the time of bone marrow transplantation. To improve the reliability of tolerance induction, we have infused polyclonally expanded recipient Tregs at the time of BMT. We have achieved prolonged but not permanent chimerism with this approach in cynomolgus macaques. To determine the robustness of tolerance, we waited 120 days before grafting donor kidneys, as such grafts are routinely rejected by animals receiving BMT without Tregs. Methods: Ex vivo expanded polyclonal recipient Tregs were grown from sorted (CD4+CD25hi) PBMCs and cryopreserved. Conditioning included total body irradiation (125 cGy), thymic irradiation (7 Gy), anti-CD154 mAbs, and horse ATG, followed by allogeneic MHC-mismatched BMT and 1 month of rapamycin. Four cynomolgus macaques received this protocol, of which two received Treg infusions. To test tolerance, donor kidney Tx was performed on day 120 after BMT without immunosuppression, after peripheral chimerism had been lost. Results: Transient peripheral blood macrochimerism was seen for 20 and 50 days after iliac crest BMT in the 2 animals receiving Tregs. Kidney Tx in the two Treg recipients functioned for 35 days and >170 days without IS. In the longest survivor, MLR assays showed anti-donor hyporesponsiveness that was restored in CD25-depleted MLR. Histopathologic analysis showed no major signs of graft rejection 152 days after transplant. Kidney transplants are upcoming in two control animals. Conclusions: Administration of autologous ex vivo expanded Tregs in combination with iliac crest BMT results in prolonged survival of donor kidneys grafted after peripheral chimerism is lost. The addition of Tregs to the mixed chimerism protocol may allow for the induction of tolerance to kidneys even in the absence of permanent chimerism. Ex vivo expanded, polyclonal Tregs given at the time of BMT not only enhance donor chimerism but may induce a regulatory environment that is associated with robust operational tolerance to the donor. A. Dangi1, E. B. Thorp2, X. Luo1; 1Medicine, Duke University, Durham, NC; 2Pathology, Northwestern University, Chicago, IL. Purpose: It is well known that certain infections can impair the stability of transplantation tolerance in recipients who have had prolonged functional grafts, resulting in graft loss. However, the underlying mechanism of pathogen-induced abrogation of tolerance is unclear. Cytomegalovirus (CMV) infection is highly prevalent and has long been associated with allograft rejection. Herein, we aimed to investigate the impact of CMV infection on allograft tolerance in previously tolerant recipients, and the underlying mechanism of tolerance impairment. A fully mismatched mouse model of pancreatic islet transplantation (BALB/c to C57BL/6J) was used. Tolerance was induced by intravenous infusion of donor apoptotic splenocytes (treated with ethylene carbodiimide; ECDI-SP) on days -7 and +1. Tolerant recipients with normoglycemia for >90 days were infected with murine CMV strain Δm157 (10^8 PFU). In some experiments, tolerant recipients were treated with anti-FR4 antibody (TH6) to deplete FR4+ anergic CD4 T cells prior to MCMV infection. Results: Infusion of ECDI-SP induced allograft tolerance, and these recipients carried functional grafts indefinitely. MCMV infection in these recipients, however, rescinded tolerance, resulting in graft loss within 2-4 weeks.
Concurrently, allografts from uninfected tolerant recipients were enriched with CD44+ CD4 T cells displaying an anergic phenotype (FR4hi CD73hi PD1+ CTLA4+). Interestingly, post-MCMV infection, the anergic phenotype of intragraft CD4 T cells was lost, and this correlated with tolerance impairment and graft rejection. These data suggested that intragraft anergic CD4 T cells might contribute to tolerance maintenance in these recipients. However, depletion of anergic CD4 T cells (~80%) using anti-FR4 antibody did not impair allograft tolerance. In fact, depletion of anergic CD4 T cells from the tolerant recipients prior to MCMV infection prevented the disruption of transplantation tolerance, resulting in prolonged allograft survival. Conclusions: Collectively, our data suggest that (1) Purpose: Modulation of the immune system using cellular therapy is an increasingly attractive option to prevent solid organ graft rejection while at the same time maintaining immune function, sufficiently so to prevent most complications associated with general immunosuppression. We have previously shown that murine myeloid progenitor cells (mMPC), cryopreserved after expansion in vitro, not only induce robust tolerance for matched organ grafts when preconditioning includes lethal whole-body irradiation and third-party hematopoietic stem cell (HSC) transplantation, but also when non-lethal pre-conditioning is used. Methods: BALB/c mice were pre-conditioned with a combination of depleting antibodies, chemotherapeutic agents and immunosuppressants, as well as infusion of 3 to 6 million mMPC expanded from HSC in vitro using a 10-day expansion protocol. mMPC were injected i.v. one week after skin graft placement. Skin grafts were either matched or not matched to the mMPC. These experiments looked at survival following the treatment without HSC transplantation, engraftment by mMPC-derived cells, tolerance to mMPC-matched, but not unmatched, skin grafts, generation of anti-graft MHC antibodies, and anti-graft T cell responses in blood. Results: A preconditioning protocol that combines anti-thymocyte serum, anti-CD3 antibodies, busulfan, and rapamycin over a two-week period allowed mMPC to be infused up to two weeks after the skin transplants. Forty out of 44 mMPC-matched grafts tested using this protocol survived for more than 5 months. mMPC-mismatched grafts were all rejected. When looking at the effects on T and B cell responses, the results show that administration of mMPC under these conditions significantly reduces the T-cell response against mMPC-matched stimulator cells, but not against third-party stimulator cells, in MLR assays. Circulating antibody levels directed against mMPC-matched MHC molecules did not change following mMPC administration, compared to untreated controls. Animals that received an mMPC-matched skin graft did not show increased levels of anti-graft MHC circulating antibodies. However, when animals received third-party skin grafts, mismatched to both mMPC and host, a significant increase in the levels of circulating antibodies recognizing the skin graft MHC was seen. Conclusions: mMPC administration can contribute to tolerance induction without the need for lethal preconditioning or allogeneic hematopoietic stem cell transplantation. Under these non-lethal conditions, mMPC can induce tolerance for solid organ grafts by modulating the T and B cell responses directed against the graft.
In particular, the role that melanocortin receptors (MCRs) play in the modulation of T cell activity has gained much attention. Methods: Here, we examined the expression of MCRs on leukocyte subsets and dissected their contributions to the function of regulatory T cells (Tregs). We also carried out heart transplant studies using adrenocorticotropic hormone (ACTH) to assess the role of MCRs in transplant rejection. Results: We isolated human CD19+ B cells and CD3+ T cells, and we found that the expression of melanocortin 5 receptor (MC5R) at both the mRNA and protein levels was higher than that of other MCR subsets in these cells. Immunofluorescence staining of CD3+ T cells demonstrated the presence of MC5R on the plasma membrane. Following induction or expansion of Tregs, the expression of MC5R was increased. Addition of ACTH to a human Treg induction assay amplified the production of Tregs. We then tested the hypothesis that administration of ACTH would improve the survival of heart allografts in CD28-/- mice, a strain afflicted by impaired Treg function. The mean survival times (MST) of heart allografts in the untreated CD28-/- group and the CD28-/- group treated with ACTH were 12 days vs. 43 days, respectively (p<0.05). Moreover, ACTH was found to increase the synthesis of Tregs in a Treg induction assay using lymphocytes from CD28-/- mice. As cytotoxic T-lymphocyte-associated protein 4 immunoglobulin (CTLA4-Ig) has been shown to abrogate Tregs, we tested the synergistic effects of ACTH in combination with CTLA4-Ig on the promotion of tolerance in a complete MHC-mismatched heart transplant model. The MST of the mice treated with CTLA4-Ig alone and those treated with CTLA4-Ig and ACTH were 54 days and >60 days, respectively. Further investigations are underway to assess the impact of the promotion of Tregs by ACTH in these in vivo models. In summary, these data indicate the importance of MC5R in the regulation of alloimmunity and the potential utility of stimulating the MC5R pathway to induce transplant tolerance. Purpose: Acute graft-versus-host disease (GvHD) is a rare but frequently lethal complication after solid organ transplantation (SOT). In intestinal transplant recipients, anti-HLA donor-specific antibodies (DSA) were found to hasten the clearance of donor lymphoid cells in the blood and the allograft. We thus hypothesized that transfusion of plasma with high levels of antibodies targeting at least one donor-mismatched HLA antigen could control GvHD in this setting. Methods: We faced a therapeutic dead-end in an immune-deficient child with severe steroid-resistant GvHD after kidney transplantation. Despite high-dose steroids, cholestatic hyperbilirubinemia worsened and profound marrow aplasia occurred. An urgent nationwide search among 3800 registered blood donors with known anti-HLA immunization identified 5 potential donors, 3 of whom underwent plasma donation. We aimed at achieving an in vivo DSA mean fluorescence intensity (MFI) between 2,500 and 5,000 through a 4-fold dilution of the collected plasma. The DSA was measured at 4439 and 3907 MFI units in the 4-fold diluted plasmas #1 and #2. Results: The patient, with an estimated plasma volume of 600 mL, received 200 mL of plasma #1 and plasma #2, three days apart. Rapid DSA adsorption on donor cells was supported by a rapid drop in MFI post-infusion. Plasma transfusions were remarkably well tolerated, with no discernible kidney allograft toxicity.
The patient had been experiencing severe neutropenia (<0.5 G/L) and major hyperbilirubinemia (>10 mg/dL) for 15 and 6 days, respectively. The day following the infusion, the white cell count rose sharply while the bilirubin dropped. At day 6 post-infusion, the neutrophil count peaked at 4.4 G/L, while total bilirubin decreased to 3.8 mg/dL. Within a week, the general status dramatically improved. Diarrhea completely and durably resolved, and the child gained 2 kg over 5 weeks. Steroid doses were progressively tapered and stopped on POD311. Before the first plasma transfusion, roughly 99% of the circulating CD3+ T cells were donor-derived. An unusual flow cytometry staining pattern was noticed in a subset of donor T cells that co-expressed the recipient-specific HLA-A2 molecule, yet at a lower level compared to recipient T cells. Imaging of these cells unveiled that this pattern resulted from recipient-derived extracellular microvesicles bound to donor T cells. Strikingly, this T cell subset sharply decreased as early as 3 days after the first infusion and was barely detectable thereafter. Conclusions: DCD remains a vital opportunity to expand the donor pool; concerns about delayed graft function and ischemic cholangiopathy limit its enthusiastic utilization. Complications from DCD donors are linked to warm ischemic times. Lowering the observation period from 5 to 2 minutes represents an evidence-based opportunity to improve the quality of DCD allografts without compromising care of potential donors. Purpose: Liver transplantation (LT) has become a standard therapy for the management of end-stage liver disease, while early allograft dysfunction (EAD) remains a serious problem. Chronic liver inflammation causes fibrosis (collagen deposition) that may lead to cirrhosis, whereas early-stage hepatic fibrosis develops without symptoms, recognizable clinical variables or macroscopic abnormality of the liver surface. Because of a myriad of ongoing metabolic and environmental stressors that may trigger hepatic inflammation, we hypothesized that some current donor liver grafts have inapparent fibrosis, which in turn may affect susceptibility to the challenge of LT. By retrospectively analyzing collagen deposition in LT biopsies (Bx), this study aimed to determine whether occult hepatic fibrosis at the time of procurement influences clinical outcomes. Methods: Liver Bx were obtained during LT surgery under an IRB protocol (n=38). Hepatic Bx were stained with Sirius red to detect collagen deposition, and the Sirius red-positive area (SRA) was calculated by dividing the positively stained area by the whole tissue area. In addition to SRA, clinical parameters including donor age/gender/BMI, recipient age/gender/BMI, donation after cardiac death, MELD score, cold ischemia time and warm ischemia time were evaluated. A multivariate analysis based on a step-wise logistic regression model and a Cox regression model were used to identify predictive factors of EAD and rejection-free recipient survival, respectively. Results: Of 38 cases, the median SRA was 11.7% (range: 1.1%-29.9%), while SRA in reference normal mouse livers was 0.9±0.4%. SRA positively correlated with sALT levels at POD1 (r=0.321, p=0.049). To evaluate the predictive power of SRA for EAD, a cut-off value of 16% was chosen based on the ROC curve (AUC=0.703, sensitivity=0.800, specificity=0.788), and the 38 cases were then divided into low-SRA (SRA<16%, n=27) and high-SRA (SRA>16%, n=11) groups.
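The ROC-based choice of the 16% Sirius red-area cutoff described above can be illustrated with a short sketch. The study reports the cutoff together with its AUC, sensitivity and specificity but not the exact selection rule, so maximizing Youden's J is shown here purely as one plausible, assumed criterion; `sra` and `ead` are hypothetical per-case arrays (SRA percentage and a 0/1 EAD label).

```python
# Hedged sketch: pick an SRA cutoff for predicting EAD from the ROC curve
# by maximizing Youden's J (sensitivity + specificity - 1).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def choose_cutoff(sra, ead):
    fpr, tpr, thresholds = roc_curve(ead, sra)   # SRA is the score, EAD the outcome
    auc = roc_auc_score(ead, sra)
    youden_j = tpr - fpr                         # J at each candidate threshold
    best_cutoff = thresholds[np.argmax(youden_j)]
    return best_cutoff, auc
```

With a cutoff chosen this way, the cohort can be dichotomized into low- and high-SRA groups and carried into the logistic (EAD) and Cox (rejection-free survival) models the abstract describes.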
The high-SRA cases had increased mRNA levels coding for αSMA (a fibrosis marker, 1.9±1.6 vs 0.4±0.1) and experienced more frequent EAD compared to the low-SRA cohort (36.4 vs 3.7%, p=0.019), and a multivariable analysis identified high SRA as the sole independent predictive factor of EAD (odds ratio=14.9 [95% CI: 1.4-155.0], p=0.024). The high-SRA cases exhibited inferior post-LT rejection-free survival (3 years: 44.6% vs 85.2%, p=0.008, median follow-up=1258 days), and a multivariable analysis identified high SRA as the only independent predictive factor of rejection-free survival (hazard ratio=4.4 [95% CI: 1.3-14.5], p=0.015). This study documents the critical influence of occult liver fibrosis at the time of procurement on clinical LT outcomes. By shedding new light on previously unappreciated inapparent fibrosis development in hepatic donor tissue, our findings provide a rationale for future studies with larger LT patient cohorts. Purpose: There remains a disparity between the number of patients awaiting heart transplantation (HTx) and the availability of donor hearts. This is exacerbated by relatively low rates of donor heart utilization, partly because of a reluctance to accept extended criteria organs. Many of these so-called extended criteria have been shown not to impact outcomes. We sought to determine whether extended criteria have a cumulative effect on recipient outcomes. Methods: Between 2012 and 2017, we assessed 626 HTx extended criteria donors, defined as donor age >50 years, left ventricular (LV) hypertrophy >1.2 cm, LV ejection fraction (LVEF) <50%, ischemic time >4 hours, donor-transmitted coronary artery disease (CAD), female-to-male gender mismatch, and donor-to-recipient weight ratio <0.80. We then divided recipients into four groups according to the number of criteria present: 0 (n=350), 1 (n=220), 2 (n=76), ≥3 (n=15). We assessed each group for 3-year actuarial survival, freedom from cardiac allograft vasculopathy (CAV), freedom from any-treated rejection, and freedom from non-fatal major adverse cardiac events (NF-MACE: MI, CHF, stroke, and need for angioplasty or pacemaker/ICD). Results: There was no difference in 3-year actuarial survival, 3-year freedom from CAV and 3-year freedom from any-treated rejection between the groups. However, there was progressively worse freedom from NF-MACE as the number of extended criteria increased (82.5% vs 74.8% vs 59.3% vs 54.6%; p=0.035). The most common extended criterion in group 2 was donor age (56/220), the most common combination of 2 criteria in group 3 was age and CAD (19/76), and the most common combination of 3 criteria in group 4 was LV hypertrophy, gender mismatch and CAD (4/15). In an attempt to expand the donor pool, numerous single-center series have demonstrated good outcomes for extended criteria organs. We show acceptable outcomes for donor organs with multiple extended criteria. However, we found that the risk of NF-MACE incrementally increased with the number of criteria. These findings suggest that donors with one or even two or more extended criteria are acceptable for use, although complication rates may be higher. Purpose: Obesity and nonalcoholic fatty liver disease (NAFLD) are major health issues affecting 20-50% of the US adult population and are increasingly prevalent in potential organ donors. We studied organ utilization in biopsy-proven NAFLD donors to understand the effect on discard and recipient outcomes.
Methods: Scientific Registry of Transplant Recipients (SRTR) data were used to determine outcomes in adults transplanted with livers that had undergone pre-recovery donor liver biopsy between 2004 and 2018. NAFLD was defined as >5% macrosteatosis. Co-existing donor risk factors included donor age >60 years, terminal transaminases >500 U/L, hepatitis B (HBV) core antibody positivity, hepatitis C (HCV) antibody positivity, donation after cardiac death (DCD), donor ICU length of stay >7 days, and distance to transplant center >150 miles. Donor and recipient demographic data were examined, and survival and outcomes were calculated. A p-value of <0.05 was considered significant. Results: 10,275 of 38,964 (26%) livers with a pre-recovery biopsy were not used from 2004-2017. NAFLD was noted in 6,661 (65%) of these discards and was the primary reason for discard in 35% of donors. 21% of NAFLD donors without any additional risk were discarded. Older age (>60 years), obesity (BMI >30), heavy alcohol use, prolonged donor ICU course, and HCV resulted in small increases in rates of discard; however, the co-existing risk factors of DCD status (44.6%), elevated terminal transaminases (35.5%), and distance to transplant center (40.8%) resulted in a high likelihood of discard. Allograft survival with and without NAFLD was not significantly different. Use of NAFLD livers with co-existing risk factors did not universally lead to worse outcomes, with only cold ischemia, DCD, and donor ICU course combining to significantly lower long-term outcomes (Figure 1). Interestingly, there is much discordance between the combinations driving discard rates and their impact on graft survival when used clinically. Conclusions: Biopsy findings of NAFLD, even when combined with additional co-existing donor risk factors, should not trigger reflex liver discard. While there is likely selection bias underlying successful use, the data indicate that with appropriate recipient selection, balancing of risk can lead to good long-term outcomes. (Shaheen et al, 2016). However, there has not been significant research on the risk factors and incidence in the brain dead organ donor population. Complications secondary to IAH and ACS can limit the organs available for transplantation through end-organ dysfunction and gross hemodynamic instability in the donor. Aim: Understanding the risk factors and incidence of IAH and ACS in the brain dead organ donor population will help to promote early identification of these processes and subsequent intervention, which may maximize the number of organs suitable for transplantation. Methods: A retrospective review of data from January 2012 through April 2018 was completed to understand the incidence and impact of IAH and ACS in Midwest Transplant Network's (MTN) brain dead donor population. An MTN-created risk factor checklist was used to decide in which patients to monitor intra-abdominal pressure (IAP). Results: There were 1,410 recovered organ donors during this time period. 212 (15%) screened positive and underwent IAP monitoring, with 13 (0.9%) requiring invasive intervention (decompressive laparotomy or abdominal drain placement). 11 donors had successful intervention; 2 did not, due to lack of surgeon availability and family restriction. In the IAP monitoring group (no invasive intervention), the overall observed vs expected organs transplanted ratio was 1.07. In the invasive intervention group, the overall observed vs expected organs transplanted ratio was 0.99.
In the group where intervention was indicated but unable to be completed, the overall observed vs expected organs transplanted ratio was 0.71. A. L. Friedman, E. Marquez, J. A. Lewis, LiveOnNY, New York, NY Purpose: A retrospective review of cases from one large organ procurement organization (OPO) was performed to determine whether the delivery of RRT as a component of donor management led to successful procurement and transplantation of deceased donor organs. Methods: All cases from 1/1/16-11/20/18 were searched for patients with consent for organ donation, a peak creatinine >3 mg/dL, and any history of RRT. These individual records were searched to identify whether RRT was delivered during the terminal hospitalization and which organs were procured and transplanted. Posttransplant outcomes were obtained from TIEDI. Results: 17 patients were identified; all were brain dead. 16 livers, 2 hearts, 2 lungs, and 2 kidneys were transplanted into 21 patients. Follow-up data were available for 20/21 recipients. Most donors (11/17, 65%) had preexisting end-stage renal disease (ESRD), with a mean age of 52.9 years; 6 (35%) had acute kidney injury, with a mean age of 46.5 years. The mean length of stay was 7.9 days (range 3-20). Organs were procured and transplanted from 16 of 17 patients; cirrhosis was newly identified intraoperatively in 1 patient from whom no organs were recovered. Follow-up data were available for 15 of 16 liver recipients; 80% (12/15) were alive with graft function at 30 days. Both kidneys exhibited delayed graft function but were working 6 months post-transplant. Both lungs (1 recipient) were functioning 2 months post-transplant. Both heart recipients were alive 6 months post-transplant. Conclusions: CMS requires hospitals to "maintain potential donors while necessary testing and placement of potential donated organs, tissues and eyes take place". However, renal replacement therapy is often withheld due to futility for the index patient. With the length of stay for most donors >7 days, and an established mean survival of 7.4 days following dialysis withdrawal, it is apparent that RRT should be delivered to preserve the option of organ donation in suitable patients. The wonderful paradox of utilizing these RRT resources to save the lives of transplant candidates, despite the expected deaths of these organ donors, should be emphasized to the hospitals, intensivists, and nephrologists whose collaboration is sought. Purpose: Our objective was to survey health care professionals (HCPs) responsible for the care of potential organ donors (PODs) on attitudes toward organ donation, knowledge about the procurement process, and potential lung donor (PLD) management nationwide. Methods: A 39-item anonymous questionnaire was designed and distributed online among HCPs previously registered with an Argentinian intensive care society between March and September 2018. The questionnaire contained demographic data and questions regarding knowledge, attitude toward organ donation, and professional experience. Results: There were 736 responses, 61% female, with a mean age of 41 (9) years. Of the surveyed HCPs, 66% worked in an adult intensive care unit (ICU) and 16% in a pediatric ICU. Sixty-one percent were physicians, 22% nurses, and 18% physiotherapists. The attitude towards organ donation was favorable in 88%, of whom 78% were registered organ donors. Knowledge regarding organ donation was classified as "adequately informed" by 78% of the HCPs, and 80% of them expressed knowledge of POD management.
Only 27% received organ donation information during professional training and 66% while completing a graduate degree. Seventy-one percent of the HCPs considered a neurocritical patient to be POD when brain death criteria were present. The majority of the HCPs (97%) reported having worked with a POD and 70% participated actively in the procurement process. Eighty percent of the physicians had reported a POD to the federal organ procurement organism. The criteria the POD must meet to be considered a PLD were correct in 71% of the answers, however, only 19% had a PLD management protocol. After diagnosing brain death, 51% made no changes to any of the ventilator parameters, and nearly a quarter (23%) were not aware of which parameters to select for a PLD. In case of hypoxemia, 84% would implement some strategy to improve oxygenation. The most frequently used strategies were: positive end expiratory pressure titration, endotracheal suctioning with a closed-circuit and recruitment maneuvers. Brigham and Womens Hospital, Boston, MA, 2 Cardiac and Thoracic Surgery, Brigham and Womens Hospital, Boston, MA, 3 Pulmonary, Brigham and Womens Hospital, Boston, MA, 4 Thoracic Surgery, Brigham and Womens Hospital, Boston, MA, 5 Infectious Diseases, Brigham and Womens Hospital, Boston, MA Purpose: Prior to the development of direct acting antiviral (DAA) therapy for hepatitis C virus (HCV), there was concern that patients who received lungs from HCV positive donors may have worse graft and patient survival, possibly due to transmission of a virus that led to liver disease and exposure to intravascular foreign material from intravenous drug use. Now, in the era of well tolerated and highly efficacious DAA, we compared pathologic findings attributable to donor exposures and clinical outcomes in patients who received lungs from HCV positive donors to patients who received lungs from HCV uninfected donors, stratified by donor risk profile. Methods: Surveillance transbronchial biopsies in lung transplant recipients within the first year post-transplant were evaluated for the presence of intravascular polarizable foreign material consistent with intravenous injection, anthrasilicosis and other chronic pathologic changes attributable to donor disease. The transplant recipient biopsy findings were compared among 3 groups based on donor risk profile: active HCV infection (NAT+), increased risk, non-HCV, and non-increased risk, non-HCV. Donor and recipient baseline characteristics and clinical outcomes were analyzed. Results: 52 lung recipients had adequate biopsy tissue for evaluation: 18 HCV NAT+ donors, 7 increased risk, non-HCV donors, and 27 non-increased risk, non-HCV donors. 3 of 18 (17%; 2 mild, 1 moderate) HCV NAT+ had intravascular polarizable foreign material compared to 1 of 7 (14%; 1 mild) increased risk, non-HCV and 0 of 27 non-increased risk, non-HCV. Anthracosis was a common finding in all groups, with 10 of 18 (56%; 3 moderate, 7 mild) HCV NAT+, 4 of 7 (57%; 1 moderate, 3 mild) increased risk, non-HCV and 16 of 27 (59%; 4 moderate, 12 mild) nonincreased risk, non-HCV demonstrating this smoking or exposure associated finding. There was a numerical increase in the number of patients treated for acute cellular rejection (ACR) in the HCV NAT+ group (11 of 18, 61%) and the increased risk, non-HCV group (4 of 7, 57%) compared to 11 of 27 (41%) in the non-increased risk, non-HCV group. 
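The treated-ACR comparison just above is a three-group comparison of proportions. A minimal sketch using the counts reported in the abstract (11/18, 4/7, 11/27) is below; note that the abstract itself reports only a numerical increase without a test, and with counts this small Fisher's exact test may be preferable to chi-square.

from scipy.stats import chi2_contingency

#              ACR treated, not treated
table = [
    [11, 18 - 11],   # HCV NAT+ donors
    [4,  7 - 4],     # increased-risk, non-HCV donors
    [11, 27 - 11],   # non-increased-risk, non-HCV donors
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")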
Other clinical outcomes including primary graft dysfunction at 72 hours, length of hospitalization, graft survival, and overall patient survival were similar among the 3 groups. Conclusions: There was not a significant difference in the degree of intravascular foreign material between HCV NAT+ and increased risk, non-HCV lungs in this selected population, but both appear to be greater than that seen in the non-increased risk, non-HCV lungs. The percentage of donor lungs with anthracosis and the severity thereof was similar in all groups. Further investigation is warranted to better understand whether the presence of exogenous material in the donor lung is associated with early ACR. The similarity of the other clinical outcomes across the 3 groups, suggests that the excess intravascular particulate material likely did not have a significant effect in the increased risk (HCV NAT+ and non-HCV) groups. Fig A) . Survival at 1-, 3-and 5-years was 92%, 69%, and 52%, respectively. Compared to non-sensitized patient at the center between 2013 and 2017, graft survival rates between desensitized and non-sensitized patients were similar (Fig B) . Conclusions: By using one protocol in a large number of LuT patients, we feel that we have strong evidence that we can achieve equivalent post-transplant results in those LuT patients who are sensitized (compared to unsensitized patients). Our laboratory demonstrated the presence of circulating exosomes with lung self-antigens (SAgs), Collagen V and K-α Tubulin, and donor HLA in lung transplant recipients (LTxR) undergoing rejection. Since respiratory viral infections (RVI) is a known risk factor for development of chronic rejection following LTx, the goal of this study is to determine whether RVI leads to induction of circulating exosomes with SAgs and to demonstrate the presence of viral DNA/RNA in exosomes isolated from LTxR with RVI. Methods: Exosomes were isolated using ultracentrifugation and purity was confirmed by sucrose cushion gradient. DNA and RNA were isolated using kits and quantified on the Nanodrop. Libraries were generated using Kapa Biosystem's library kit. The raw Illumina 2x150bp pair-end reads were checked on FastQC and were aligned to the human and viral genome build from CHIPseeker Database. Validation was done using Abs and primers to respiratory syncytial virus, coronavirus, and rhinovirus. To determine the role of exosomes in inducing stress and DNA damage, we incubated airway epithelial cell line, KCC266, with exosomes from LTxR with RVI or stable. Results: Viral nucleic acid sequences were demonstrable in exosomes from LTxR with RVI. Comparing the sequences with human genome, we identified the presence of DNA sequences specific to defensins and GTPase pathways, 195 unique in LTxR with RVI specific to apoptotic cleavage, NMDA receptor activation and stable had 91 sequences specific to MAP kinase and cell death signaling. We also identified H. influenzae sequences in exosomes isolated from LTxR with influenza. Further, we demonstrated increases in proteins associated with endoplasmic reticulum stress, i.e, PERK, ATF4 and BiP in cells incubated with exosomes isolated from LTxR with RVI. However, significant increase in the levels of STING and IRF3 at the protein levels were not detected. 
Conclusions: Based on these preliminary results we conclude that RVI LTxR leads to induction of circulating exosomes having unique nucleic acid sequences of viral etiology suggesting that these nucleic acids of viral origin may have functional consequences including upregulation of stress markers. Purpose: In lung transplants (LTx), chronic rejection remains a major cause of morbidity and mortality following transplantation, mainly from chronic lung allograft dysfunction (CLAD). Two phenotypes exist; bronchiolitis obliterans syndrome (BOS) and restrictive allograft syndrome (RAS). Improved methods of CLAD diagnosis are necessary. Cell-free DNA (cfDNA) consists of fragments of nucleic acids that circulate in biofluids. Studies have shown that plasma and urine cfDNA is elevated in transplant patients undergoing allograft rejection. We explored cfDNA and CXCL10 derived from bronchoalveolar lavage (BAL) as potential biomarkers of CLAD and survival in LTx patients. Methods: Thirty-four LTx patients with BAL specimens were studied. Patients were diagnosed per standard definitions as stable (n = 14), BOS (n = 6), or RAS (n = 14). BAL was used for measurement of cfDNA and inflammatory markers. ANOVA and Cox proportional hazards regression were performed. Results: Cox proportional hazards regression was performed to assess the degree of association of cfDNA, CXCL10, and the interaction of cfDNA and CXCL10 with overall survival. Likelihood ratio tests revealed a significant association of overall survival and cfDNA (p < 0.01), CXCL10 (p = 0.01), and the interaction of cfDNA and CXCL10 (p = 0.02) ( Figure A) . Mean differences in log(cfDNA) and log(CXCL10) levels between stable and CLAD diagnoses were significant (ANOVA; p = 0.004 and 0.02, respectively). Pairwise mean comparisons of cfDNA resulted in a significant difference between BOS and stable cfDNA values (p = 0.001), and RAS and stable cfDNA values (p = 0.04) ( Figure B) . Pairwise mean comparisons of CXCL10 resulted in a significant difference between RAS and stable CXCL10 values (p = 0.04) ( Figure C) . Despite the small size of our study, results strongly suggest that cfDNA and CXCL10 measurements in BAL are highly synergistic biomarkers that are predictive of overall survival in LTx patients. Larger prospective studies in LTx patients utilizing cfDNA and CXCL10 are currently being executed to validate the utility of these biomarkers in diagnosis of CLAD and as predictors of overall survival. Purpose: Club cell secretory protein (CCSP) levels have been employed as a biomarker for respiratory stress in athletes, asthma patients, and in experimental acute and chronic lung injury models. A significant reduction in bronchoalveolar lavage fluids (BALF) CCSP levels and club cell numbers has been reported in lung transplant recipients (LTxRs) who develop bronchiolitis obliterans syndrome (BOS) compared to stable LTxRs. Goal of this study is to determine decline in CCSP and its correlation with induction of de novo development of Abs to lung-associated self-antigens (SAgs), Kα1 Tubulin (Kα1T) and Collagen V (Col-V), in LTxRs diagnosed with BOS. Methods: BALF from LTxRs diagnosed as having BOS (ISHLT criterion) were obtained for this study. CCSP levels in cell-free BALF were analyzed by ELISA (BioVendor, Asheville, NC). Development of Abs to lung SAgs, Kα1T-and ColV-, were determined by ELISA. 
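Returning to the BAL cfDNA and CXCL10 analysis described above, the reported associations come from Cox models with an interaction term assessed by likelihood ratio tests. The following is a minimal sketch of that approach with the lifelines library, on simulated data and hypothetical column names rather than the study measurements.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 34
df = pd.DataFrame({
    "log_cfdna":  rng.normal(0, 1, n),
    "log_cxcl10": rng.normal(0, 1, n),
    "time_yrs":   rng.exponential(3, n),
    "death":      rng.integers(0, 2, n),
})
df["interaction"] = df["log_cfdna"] * df["log_cxcl10"]

reduced = CoxPHFitter().fit(df[["log_cfdna", "log_cxcl10", "time_yrs", "death"]],
                            duration_col="time_yrs", event_col="death")
full = CoxPHFitter().fit(df, duration_col="time_yrs", event_col="death")

# Likelihood ratio test for the interaction term (1 degree of freedom)
lr_stat = 2 * (full.log_likelihood_ - reduced.log_likelihood_)
print("LRT p-value:", chi2.sf(lr_stat, df=1))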
Cytokines (i.e., G-CSF, GM-CSF, IFN-α, IFN-γ, IL-1β, IL-1RA, IL-2, IL-2R, IL-4, IL-5, IL-6, IL-7, IL-8, IL-10, IL-12, IL-13, IL-15, IL-17 and TNF-α), chemokines (i.e., Eotaxin, IP-10, MCP-1, MIG, MIP-1α, MIP-1β and RANTES), and growth factors (i.e., EGF, FGF-basic, HGF and VEGF) in BALF were analyzed by a 30-plex Luminex panel (ThermoFisher, Waltham, MA). Results: LTxRs with BOS (n=22) demonstrated a significant decline in BALF CCSP levels (58.6±13.2 ng/ml) (p<0.05) as compared to stable patients (n=43) (65.6±11.9 ng/ml). BOS patients (n=5) who developed Abs to lung SAgs had lower levels of CCSP (58.6±12.2 ng/ml) (p<0.05) in BALF as compared to the CCSP levels (68.0±11.3 ng/ml) of stable patients (n=25) who did not develop Abs to SAgs. LTxRs with BOS (n=7) who had lower levels of CCSP (43.2±12.3 ng/ml) demonstrated increased levels of IL-6, IL-8, IL-1RA, IP-10, MCP-1, and RANTES, though these increases were not statistically significant. However, BOS LTxRs (n=7) who had lower levels of CCSP (43.2±12.3 ng/ml) also had significantly lower levels of VEGF (1.07±0.33 pg/ml) as compared to stable LTxRs (n=7) (VEGF: 1.42±0.58 pg/ml) (p<0.05). We propose that a progressive decline in CCSP has a role in initiating the cascade of inflammatory processes and immune responses to the lung SAgs Col-V and Kα1T, which may contribute to the pathogenesis of chronic rejection following lung transplantation. while there are no current guidelines to address CYP3A4*22 variances. We aim to assess the clinical differences between the loss-of-function alleles CYP3A5*3, *6, *7 and CYP3A4*22 in a cohort of kidney transplant recipients (KTR) receiving standardized weight-based dosing of TAC. Methods: In a single-center retrospective longitudinal cohort study, KTRs expressing CYP3A homozygous or heterozygous loss-of-function alleles over a 24-month period (2013-2015) who received initial weight-based dosing of oral TAC (0.1 mg/kg/day) as part of maintenance immunosuppression were analyzed for time to therapeutic TAC trough (TTT, defined as 7-11 ng/mL), percentage of undetectable TAC troughs at hospital discharge (PUT), actual TAC troughs (ATT) at hospital discharge, and mg/kg therapeutic total daily dose requirements (TDD). Results: A total of 171 KTR were included in the study, comprising homozygous CYP3A5*3/*3 (*3-only group, n=53) vs heterozygous CYP3A5*3/*6 or *3/*7, or homozygous *6/*6 or *7/*7 (mixed group, n=19) expressers, and CYP3A4*22 (n=6) vs non-*22 expressers (n=157). The mixed group had a longer TTT (11 days vs 6.7 days, p=0.02) and a lower ATT (3.8 vs 6.3 ng/mL, p=0.006) compared to the *3-only group. There was a trend toward a higher PUT (16% vs 4%, p=0.11) and a higher TDD (0.11 vs 0.09 mg/kg/day, p=0.06) compared with the *3-only group. There were no statistically significant differences in outcomes between CYP3A4*22 vs non-*22 expressers with the exception of TDD (0.08 mg/kg/day vs 0.12 mg/kg/day, p=0.01). However, there were clinically important differences between *22 vs non-*22, including TTT 7.7 vs 11.6 days, PUT 0% vs 22%, and ATT 6 ng/mL vs 4.1 ng/mL. All *6 or *7 expressers were African American (AA), and 5 out of 6 *22 expressers were Caucasian. Conclusions: We confirm that predominantly Caucasian patients who express CYP3A4*22 require the lowest TAC TDD (0.08 mg/kg/day) and may have clinically significant differences in TAC outcomes; however, studies with larger numbers are needed.
In contradistinction to their *3 counterparts, AA patients who express *6 or *7 are slower to reach TAC goals, are often discharged from the hospital with an undetectable TAC level and require a higher initial TAC dose. Figure 1 ). Final mixed effects models revealed that recipient ABCB1, age, donor CYP3A5, diabetes were associated with CL/F, dose, AUC/dose, and C trough /dose. Age, diabetes, hematocrit, and BSA (post-hoc) were associated with some of the parameters (Table 1) . Race, MMF, steroids were not associated with changes in PK parameters. Models were able to explain up to 75% of the variance in the data. In stable liver transplant recipients more than 6 months from transplant, recipient ABCB1 (gut) and donor CYP3A5 (liver) were significantly associated with changes in PK parameters, whereas race was not. Interactions between recipient (gut) and donor (liver) genotypes explain significant variability in stable liver transplant recipients. Methods: Blood samples were collected and total DNA was extracted. Target sequencing based on next-generation sequencing was used to detect all single nucleotide polymorphisms (SNPs) in 9 genes related to the sirolimus metabolism in vivo (CYP3A4, CYP3A5, CYP2C8, CYP2C19, ABCB1, POR, PPARA, UGT1A8, UGT1A9, UGT2B7). Medical records related to the sirolimus metabolism were also collected. Logistic regression analysis adjusted by the confounding factors was conducted to identify the potential associations of all detected SNPs with the sirolimus concentrations on 7 days and 1 month after the administration of sirolimus within the first 3 months after kidney transplantation. A clinical score was designed by simplifying regression coefficients of the independent variables. Cutoff levels were chosen based on the clinical score, and positive and negative response rates were calculated. An evaluation of the model was performed in a second group of recipients containing 100 recipients. preformed or first appeared prior to the 3 rd month post-transplant. Development of dnDSA was measured using single antigen beads (One Lambda, CA) at 1, 3, 6, and 12 months then biannually. dnDSA was considered positive if the MFI was ≥1000. All dnDSA was in the first 2 years post-transplant. All patients had at least 3 tacrolimus levels between 3-18 months post-transplant. . Despite the small sample size, the PK parameters obtained from both groups were consistent and with low variability. In the LFG316+IVIg group, the mean value for AUCinf/D decreased by 34%, clearance increased by 63% and t1/2 decreased by 41% compared with the LFG316 alone group. This difference in clearance/elimination characteristics between the treatment arms was evident from 3 days to 21 days where faster elimination of LFG316 was observed. This data is shown in figure 1 below. In the subsequent weeks of the observation period, the influence on clearance by the IVIg infusion diminished. LFG316 suppressed the complement pathway as measured by the Wieslab assay and CH50 in serum. The rapid elimination of LFG316 in the LFG316+IVIg group allowed more rapid recovery of complement activity. The infusion treatments LFG316 alone and LFG316+IVIg were found to be safe and well tolerated. High dose IVIg infusion administered immediately before infusion of LFG316 has a significant impact on the pharmacokinetics and pharmacodynamics of LFG316 resulting in a shortening of the period in which full inhibition of complement activity is maintained. 
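The pharmacokinetic parameters reported above for LFG316 (AUCinf/D, clearance, t1/2) are the standard outputs of a non-compartmental analysis. The sketch below shows those calculations on a hypothetical concentration-time profile; the dose, sampling times, and concentrations are illustrative, not study data.

import numpy as np

# Hypothetical serum concentration-time profile for a monoclonal antibody (not study data)
t = np.array([0.25, 1, 3, 7, 14, 21, 28, 42, 56])          # days post-infusion
c = np.array([600, 550, 480, 390, 280, 200, 140, 70, 35])  # mg/L
dose = 3000.0                                               # mg, illustrative

auc_0_last = np.trapz(c, t)                 # linear trapezoidal AUC to the last sample
lam_z = -np.polyfit(t[-4:], np.log(c[-4:]), 1)[0]  # terminal log-linear slope
auc_inf = auc_0_last + c[-1] / lam_z        # extrapolate AUC to infinity
t_half = np.log(2) / lam_z                  # terminal half-life (days)
clearance = dose / auc_inf                  # CL = Dose / AUCinf (L/day)

print(f"AUCinf/D={auc_inf / dose:.3f}, CL={clearance:.3f} L/day, t1/2={t_half:.1f} d")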
The effect of high dose IVIg on LFG316 clearance is most profound during the first 2 weeks after IVIg infusion. These findings suggest a significant impact of IVIg on the clearance of monoclonal antibodies that limits efficacy. This is likely through occupation of the neonatal Fc receptor (FcRn) by high dose IVIg, increasing clearance of monoclonals. These findings also reveal an important therapeutic pathway for IVIg administration in enhancing elimination of pathogenic antibodies in humans. Purpose: While outcomes of pancreas transplants are improving, there is persistent hesitancy to use pancreas organs procured in the context of donation after cardiac death (DCD) for pancreas transplant alone (PTA) because of concern about graft thrombosis and primary non-function. We report the outcomes of the first 43 DCD PTA in the United States. Methods: We examined Organ Procurement and Transplantation Network (OPTN) data for patients with type 1 diabetes mellitus (T1DM) who received PTA from donation after brain death (DBD, n=1906) or DCD (n=43) donors (Figure 1). Region 7 performed the highest number of PTA (582 DBD, 26 DCD), followed by Regions 10 (413 DBD, 4 DCD), 2 (325 DBD, 2 DCD) and 3 (63 DBD, 0 DCD). Pancreas graft survival of DCD PTA was not significantly different from DBD at 1 year (75% vs 75%), 3 years (60% vs 64%), and 5 years of follow-up (50% vs 55%) (P=0.79) (Figure 2). Compared to DBD PTA, recipients of DCD PTA did not have a significantly increased risk of pancreas failure (aHR 1.22, 95% CI 0.75-1.99) or patient death (aHR 0.91, 95% CI 0.36-2.31). Conclusions: These data support that selected DCD PTA performed by experienced surgeons may be a successful option for pancreas transplantation, offering outcomes equivalent to pancreata procured after brain death. Some of those reasons are organ edema as well as the potential for ischemic injury not visible at the time of procurement. The purpose of this abstract is to determine whether the utilization of pancreata from donors with renal insufficiency is a safe practice. Methods: We identified 5,534 adult deceased donor pancreas-alone transplant (PTA) recipients between 2000 and 2018 from SRTR data. We further categorized the recipients into a "discarded" group (pancreata from donors whose kidneys were discarded), a "high creatinine" group (pancreata from donors whose kidneys were transplanted and who had a TC>1.5), and a "low creatinine" group (pancreata from donors whose kidneys were transplanted with TC<1.5). We compared all-cause pancreas graft failure and mortality between the three groups. Cox regression was used to adjust for donor and recipient characteristics. Results: Among 5,534 pancreas recipients, 164 (3.0%) had PTA from donors with discarded kidneys, and 664 (12.0%) had PTA from high creatinine donors. At five years post-transplant, graft failure was 34.5%, 34.1%, and 34.0% for discarded, high creatinine, and low creatinine PTA recipients, respectively (Figure 1). After adjustment, the discarded group (aHR 1.05, 95% CI 0.82-1.32; p=0.7) and high creatinine group (aHR 0.97, 95% CI 0.85-1.11; p=0.6) had a similar risk of graft failure compared to the low creatinine group (Table 1). Mortality at five years was 13.7%, 14.2%, and 12.0% for discarded, high creatinine, and low creatinine PTA recipients, respectively (Figure 2). After adjustment, the discarded group had an 18% higher risk of mortality (aHR 1.18, 95% CI 0.87-1.62; p=0.3), which represents a similar risk based on the CI, compared to the low creatinine group.
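Adjusted hazard ratios of the kind reported above are typically estimated with a multivariable Cox model. The sketch below is a minimal illustration using the lifelines library, with simulated data and illustrative covariate names rather than the SRTR variables.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "time_yrs":        rng.exponential(5, n),
    "graft_failure":   rng.integers(0, 2, n),
    "high_creatinine": rng.integers(0, 2, n),   # exposure of interest
    "donor_age":       rng.normal(35, 12, n),   # adjustment covariates (illustrative)
    "recipient_age":   rng.normal(42, 10, n),
    "cold_ischemia_h": rng.normal(12, 4, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_yrs", event_col="graft_failure")
# Adjusted HR and 95% CI for the exposure, holding the other covariates constant
print(cph.summary.loc["high_creatinine", ["exp(coef)",
                                          "exp(coef) lower 95%",
                                          "exp(coef) upper 95%"]])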
However, the high creatinine group had a 29% higher risk of mortality (aHR 1.29, 95% CI 1.09-1.53; p=0.003) than the low creatinine group (Table 1). Conclusions: Donor renal insufficiency did not confer a higher risk of pancreas graft failure among PTA recipients, but may confer a higher risk of post-transplant mortality. In selected scenarios, pancreata from donors with renal insufficiency may be a safe mechanism to expand the donor pool. Table 1. Only 2 SPLK (living donor kidney) recipients received ALM. During the first 3 years of follow-up, MMF doses were comparable between groups and TAC levels were higher in the rATG group only at 3 and 36 mos. More patients in the rATG group were on low-dose prednisone during the first 6 mos posttransplant. The primary outcome was acute T-cell mediated rejection (TCMR) in the pancreas allograft detected on for-cause biopsies. The incidence of pancreas TCMR was similar between the two induction groups (see figure). Similarly, pancreas graft survival was comparable between the two cohorts. Serum creatinine levels were comparable between groups, except being slightly higher in the rATG group at 3 and 6 mos. prospectively captured all candidate details used in the OPOM. Each valid candidate timepoint was evaluated using the previously calibrated classification tree to categorize 3-month drop-out status. Data were coded into the 5th layer of the OPOM with 32 leaf nodes. Using a rounding threshold of 0.5, the classification of the Toronto data was compared to the risk prediction of the OPOM. The AUC of the OPOM was also calculated and compared to MELD and MELD-Na. Results: A total of 22,111 discrete candidate timepoints in 2,376 evaluable patients were assessed. The differences between the actual and predicted outcomes at the OPOM leaf nodes were assessed (Figure 1). The predictive accuracy was 90.0%, with a positive predictive value of 39.6% and a sensitivity of 34.6%. The OPOM had an AUC of 0.81 for 3-month drop-out, which was superior to MELD (0.77) and MELD-Na (0.78) (Figure 2). Purpose: Liver ischemia-reperfusion injury (IRI) represents a major risk factor for early graft dysfunction and predisposes to rejection crises in orthotopic liver transplantation (OLT). We and others have reported on the regulation of hepatic macrophage activation by Notch1, a highly conserved transmembrane receptor involved in cell fate decision, proliferation and regeneration, in a warm hepatic IRI model. However, little is known as to its function in the pathogenesis of liver IRI in OLT. We produced myeloid-specific Notch1-deficient (mNotch1-KO) mice (C57BL/6) to study the putative role of Notch1 in myeloid cells in IR-stressed OLT recipients. Methods: WT (C57BL/6) livers subjected to extended (18 hr) cold storage were transplanted into groups of WT and mNotch1-KO syngeneic recipients; OLT and blood samples were collected at 6 h after reperfusion. HMGB1/histone-enriched liver flush (LF) was collected after infusing cold-stored WT livers with physiological saline. Bone marrow-derived neutrophil cultures from WT and mNotch1-KO mice were stimulated with LPS/LF, while mouse primary hepatocyte cultures (WT) were treated with or without IL22. Figure 2 shows the evolution of dnDSA by C1q over time after first detection in the posttransplant period. We compared renal survival in patients whose C1q positivity was eliminated with therapy to those with persistent C1q positivity. Rejection therapy was per institution standard of care. However, treatment was intensified if DSA by C1q was not eliminated. C1q was eliminated in 78% of patients (n=66).
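For the OPOM validation summarized earlier in this passage, the reported metrics (accuracy, positive predictive value, sensitivity, AUC) follow directly from the predicted leaf-node risks and the observed 3-month drop-out status. The sketch below illustrates those calculations on simulated labels and probabilities, not the Toronto cohort.

import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

rng = np.random.default_rng(3)
y_true = rng.integers(0, 2, 200)                                  # actual 3-month drop-out
y_prob = np.clip(y_true * 0.4 + rng.uniform(0, 0.6, 200), 0, 1)   # leaf-node risk estimates
y_pred = (y_prob >= 0.5).astype(int)                              # rounding threshold of 0.5

print("accuracy   ", accuracy_score(y_true, y_pred))
print("PPV        ", precision_score(y_true, y_pred))
print("sensitivity", recall_score(y_true, y_pred))
print("AUC        ", roc_auc_score(y_true, y_prob))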
Conclusions: Persistent C1q-positive DSA in spite of therapy is a strong predictor of poor graft outcome. Patients with a persistently positive C1q DSA should be considered for more aggressive therapy and participation in clinical trials. Purpose: FK506 is one of the most commonly used immunosuppressant drugs in organ transplantation and has been demonstrated to cause glucose metabolism disorders and to alter the gut microbiota. Glucose metabolism, in turn, has been reported to be influenced by the gut microbiota and its metabolites, especially short-chain fatty acids (SCFAs). In this study, we examined the relationship between FK506-altered gut microbiota and glucose metabolism in mice and found that one of the SCFAs, butyrate, improved the FK506-related glucose metabolism disorder. Methods: Mouse models were established using C57BL/6 mice, with FK506 used to induce the glucose disorder model. The FK506 group (n=4) was given 10 mg/kg of FK506 by gavage each day; the butyrate-treated group (But T group, n=4) was given 150 mM butyrate in drinking water on top of the established model; the control group (Con group, n=6) was given 10 ml/kg water by gavage. Blood glucose was assessed by fasting glucose and oral glucose tolerance testing. The gut microbiota was compared by pyrosequencing of the 16S rRNA genes. LEfSe (linear discriminant analysis effect size) and PCA (principal component analysis) were used to visualize significant differences in taxa between groups. PICRUSt (Phylogenetic Investigation of Communities by Reconstruction of Unobserved States) analysis was performed to predict the functional capacity of the gut microbiota. Results: The glucose disorder model was established at the 6th week (Figure 1A, 1B). The FK506 group showed a significant difference from the Con group in phylogenetic classification (Figure 1C). The differences in overall community composition between the FK506 and Con groups were shown by LEfSe and PCA (Figure 1C, 1D). PICRUSt analysis was performed based on the 16S rRNA OTUs. Table I shows the constructed prognostic score system. Survival was significantly different (p < 0.001) for those at low risk (0-4 points), medium risk (5-7 points), and high risk (8+ points) (Figure 1). Furthermore, survival was equivalent between low-risk pediatric re-transplant recipients and pediatric primary liver transplant recipients (p = 0.46) but significantly worse for medium-risk (p < 0.001) and high-risk (p < 0.001) re-transplant recipients. Conclusions: With simple clinical characteristics, this scoring tool can modestly discriminate between those children at high risk and those at low risk of poor outcomes after liver re-transplantation. If validated by future studies, this scoring system could provide prognostic guidance to the family and patient. While the factors contributing to posttransplant non-adherence are complex, noncompliance with the dialysis prescription is often used by transplant physicians as a surrogate marker for posttransplant non-adherence. Dialysis noncompliance is often factored into decisions regarding transplant candidacy, yet the association between the two is tenuous. In this study we examine the association between measures of dialysis non-compliance and posttransplant outcomes. We performed a retrospective cohort study using statistically de-identified 2004-2014 data from a large dialysis provider linked to the OPTN dataset. The cohort was limited to 9,543 hemodialysis patients with evaluable lab data who were transplanted.
The primary exposure was dialysis non-compliance defined as hyperkalemia (K>5.2 mEq/L), hyperphosphatemia (Phos>5.5mEq/L), intradialytic weight gain (IDWG) >5kg, treatment shortened by >30min or missed dialysis treatment (not hospitalized or vacation). Multivariable cox regression models were constructed for the outcomes of death and graft loss. Results: Noncompliant patients were younger, male, more frequently African American or lived in an urban area. Median household income was higher in the group with hyperkalemia or hyperphosphatemia but lower in those with large IDWG, shortened or missed treatments. In adjusted models accounting for age, sex, race, cause of ESRD, income, insurance, distance to the transplant center, comorbidities, PRA, HLA mismatch, KDPI and immunosuppression, only large IDWG, shortened and missed treatments were associated with death. All 5 non-compliance measures were associated with an increased risk of graft loss but only shortened treatments were associated with acute rejection (AR) within the first year, see Table 1 . Conclusions: Pretransplant dialysis behaviors have a limited association with patient survival or acute rejection. Shortened treatments, but not hyperkalemia or hyperphosphatemia, may identify patients at increased risk for poor outcomes Nephrology, Department of Internal Medicine, University Medical Center Groningen, Groningen, Netherlands, 2 Department of Internal Medicine, University Medical Center Groningen, Groningen, Netherlands, 3 Department of Epidemiology, University Medical Center Groningen, Groningen, Netherlands, 4 Department of Cardiology, University Medical Center Groningen, Groningen, Netherlands Purpose: Galectin-3 has been associated with renal fibrosis and decline of renal function. We aimed to determine the association of galectin-3 with long-term risk of death-censored graft failure in a large cohort of extensively phenotyped renal transplant recipients (RTR Conclusions: In RTR, galectin-3 is elevated and independently associated with long-term risk of death-censored graft failure. Galectin-3 may be helpful to assess prognosis and guide existing therapy. Whether a novel galectin-3-targeted therapy may represent an opportunity to decrease the burden of long-term graft failure in stable RTR requires further studies. Abstract# 411 (Table 1) . Patients with fDGF but without dDGF had a comparable outcome with patients who fulfilled both definitions. Moreover, the absence of fDGF in patients who needed dialysis after transplantation (dDGF) was associated with better outcomes (Figure 1 ). Conclusions: the definition of fDGF provided better and supplementary information about graft outcomes compared to the classical definition of dDGF in a modern series of kidney transplant recipients. and eculizumab therapy (HR=0.23; p=0.003) were independently associated with increased and decreased risks of graft loss, respectively, while preformed DSA (HR=3.11; p=0.051) fell short of statistical significance. Eculizumab prophylaxis significantly reduced the rates of recurrence in both high-(p<0.001) and moderaterisk (p=0.02) transplantations. More importantly, graft survival was significantly improved by eculizumab prophylaxis in the high-risk transplantation group (p<0.02). Eculizumab discontinuation did not lead to subsequent relapses in the moderate-risk (0/10), unlike in the high-risk (2/7), transplantations group. 
The present study demonstrates that the outcome of kidney transplantation in aHUS patients has dramatically improved since eculizumab approval and supports individualized risk stratification based on complement investigations and medical history. Purpose: There is no consensus on whether very young donor kidneys should be used en bloc or as single kidneys. We studied the association of KDPI with outcomes. Methods: Using data from UNOS/OPTN as of 05/2018, deceased donor pediatric transplants from 2000-18 were stratified into en-bloc (E, n=2,691) and single (S, n=20,219) kidney transplants (DCD and foreign donors were excluded). An adult donor cohort (A, n=32,212), including donors in the 25th-75th percentile of age and weight, was created. Ten-year death-censored graft survival (DCGS) was compared among the E, S, and A cohorts. A sub-analysis stratified E and S by weight (<10, 10-15, 15-18, and >20 kg) and KDPI (<60, 60-80, >80%). Survival analysis and Cox regression were used to examine the outcomes. Further research should assess the outcomes of the multi-organ transplant allocation process to ensure the maximum utility of deceased donor kidneys. Purpose: Introduction of KAS in 2014 prioritized access to DDRTx for highly sensitized candidates, although it was unclear whether the outcomes would be acceptable in this high immunologic risk group, many of whom have prior transplants, multiple comorbidities, and prolonged time on dialysis. Methods: Using the 2018 UNOS STAR file, we compared national patient and graft outcomes between high PRA (cPRA≥97%) recipients transplanted after KAS (4,726 DDRTx, 2015-2017), low cPRA (≤10%) recipients after KAS (25,800 DDRTx, 2015-2017), and high cPRA recipients before KAS (1,252 DDRTx, 2011-2013) (Table 2). Among 67 clinical, histologic, and molecular variables, 11 were significant at the 0.05 level: molecular nephron injury (IRITD3, IRRAT30/AKI transcripts; damage-associated molecular patterns (DAMP)); atrophy-scarring (molecular cigt1 classifier and histologic ci>1), and GFR. Histologic v-, i-, and t-scores were lower in the failure group (inversely related to failure), presumably because of scarring. Molecular T cell burden was lower in kidneys that failed, although MMDx TCMR and rejection scores were similar (not shown). Determinants of failure after TCMR biopsies are predominantly recent molecular parenchymal injury and stage (atrophy-scarring), not molecular rejection and inflammation scores. Conclusions: Early TCMR is declining, but late TCMR remains a serious issue, often in kidneys with extensive parenchymal damage. Late TCMR is probably often due to non-adherence. Graft loss after late TCMR episodes is associated with severe nephron injury and parenchymal loss (atrophy-fibrosis) and with low GFR at biopsy, but not with molecular rejection and inflammation. Histologic i-, t-, and v-scores or molecular inflammation are actually lower in kidneys at risk of failure (probably due to scarring). The fact that v-lesions did not predict prognosis indicates changing concepts of risk. Mean Banff ti-score in both groups was similar at baseline. At 6 months, a higher proportion of patients in the TCZ group than in the control group had a decline in the ti-score (10/16 vs. 3/14, p=0.026). Mean ti-score declined (-47%) in the TCZ group & increased (+21%) in the control group (p=0.13). Functional assays have been completed in a subset of patients (n=20) (see figure). Mean frequency of CD4+CD25+Foxp3+ Tregs was similar in the 2 groups at enrollment (4.3% vs. 5.1%, p=0.58).
At 6 months, the Treg frequency increased (+50%) in the TCZ group and decreased (-22 .5%) in the control group (p=0.012). The frequencies of naïve, central memory, effector memory, & TEMRA cells in both CD4/CD8 T cells remained stable in both groups. Patients in the TCZ group showed a profound decline in IFN-γ (-31%) and IL-17 (-51%) production by CD4+ T cells when compared to control group at 6 months. IFN-γ/ IL17 double producing CD4+ T cells also decreased significantly in the TCZ group (-60% vs. +48%, p=0.02) at 6 months. Conclusions: TCZ treatment for 6 months was well tolerated and associated with significantly increased circulating Tregs, decreased effector cell function, and decreased graft inflammation in kidney transplant recipients. Blockade of IL-6 is a novel treatment option to regulate the alloimmune response. We hypothesized that inflammation from kidney injury could lead to enhanced expression of HLA and rejection risk in the kidney allograft after transplantation. This study's aims were to assess associations between injury biomarkers in deceased donor urine and the following 1-year recipient outcomes: (1) a composite of biopsyproven rejection, graft failure and death; and (2) acute rejection alone, censored for graft failure and death. Methods: We assembled a prospective multicenter US cohort in which we measured a panel of urinary biomarkers from deceased donors at kidney procurement and then ascertained kidney transplant outcomes for 1137 recipients through on-site, detailed chart review at 13 transplant centers. We focused primarily on injury biomarkers including interleukin-18 (IL-18), kidney injury molecule-1 (KIM-1), and neutrophil gelatinase-associated lipocalin (NGAL). We fit multivariable Cox regression models with adjustment for donor, recipient and allograft characteristics. Figure] . Multivariable regression also did not show significant associations between any biomarker and the outcomes. In a secondary analysis, we found no association between AKIN-defined injury in the donor and recipient outcomes. In this large cohort, AKI was common among deceased donors but neither donor injury biomarkers nor donor AKI were associated with either a composite of death/graft failure/acute rejection or rejection alone. The findings support the idea that cautious expansion of the organ pool using kidneys with AKI will likely not increase the risk for adverse outcomes such as acute rejection. Purpose: There is a need to develop non-invasive approaches to detect ongoing kidney graft injury and distinguish immune from non-immune inflammation to guide therapy. Ideally the approach should be accurate to the cause of the injury, sensitive, and rapid. Currently, biopsies are required for differential diagnosis and monitoring of suspected kidney graft injury. We used the Nanostring platform to test whether gene expression patterns in urine sediment will distinguish T cell mediated acute rejection from BK virus nephropathy. Thus, not only white substance users but all AAs (even non-users) were less likely to be listed than the referent group. If other racial/ethnic minority patients did not use substances, they were as likely as the referent group to be listed. However, if they used substances they were least likely of all groups to be listed. Conclusions: Both substance use and race/ethnicity affect likelihood of listing. The combination of these factors may lead to unique disadvantage for minority patients, but AAs face disparities in listing even without substance use. 
Transplant professionals should be alert to such effects and seek ways to eliminate them. Opioid sparing protocols were piloted in 2016 with wider implementation 10/2017. An educational in-service was given to all transplant nursing staff on pain assessment and management. Patient education was also revised to address patient expectations on post-operative pain management. Inpatient opioids were rapidly titrated and discontinued with the intention of utilizing acetaminophen for post-operative pain management on discharge from the transplant admission. Percent of patients discharged on opioids were compared pre-and post-protocol implementation. Opioids prescribing was also assessed across time and between open and robotic RTx. Results: Overall, 376 adult RTx patients were included. Patients were 63% male, 46% African American, and were an average of 50.8 (SD +13.5) years old (Table 1) . A majority of patients underwent open RTx (60.3%) and received deceased donor RTx (53.5%). Opioid prescribing on transplant discharge was significantly lower after opioid minimization protocol and education (pre 68.3% vs. post 11.1%, p < 0.001). Transplant length of stay was significantly shorter post-opioid minimization protocol (p=0.03). There was no difference in opioid prescribing based on RTx surgery type pre (p=0.78) or post-protocol (p=0.33) implementation. Table 2 details outcome differences. Over time, there was a significant downtrend in opioid prescribing on discharge ( Figure 1 ). If opioid therapy was required post-protocol, tramadol (7/13, 53.9%) was the predominant agent prescribed. Conclusions: Opioid minimization and non-opioid prescribing is feasible and welltolerated within an adult RTx population with the proper patient and staff education. was LDKT readiness: "how ready are you to pursue living donor kidney transplant," categorized into "not considering", "considering" and "taking actions" for LDKT. Sociodemographic and clinical characteristics were also tabulated. Results: The mean age (SD) of the sample (n=262) was 54 (13) Purpose: Living donor kidney transplantation is widely accepted to be the treatment of choice for end-stage kidney disease, and great efforts are undertaken to expand living donor kidney transplantation programmes. However, in recent years data have emerged that have led to an increased awareness of the potential long-term risks for kidney donors. Kidney donors are now widely counselled of the increased risk of end-stage kidney disease. In line with changes in the general population, an increased proportion of live donor candidates are now overweight or obese. Obesity has been shown to be an independent risk factor for the development of ESRD, as well as for the development of type 2 diabetes. Our living donor programme concentrates much effort on lifestyle modification in live donor candidates, focusing on weight loss, smoking cessation and the benefits of exercise. No data are currently available as to whether these recommendations are followed by live donors after donation. Methods: We conducted a retrospective analysis of all live donors who proceeded to donation in the years 2014-15, and who were followed up for at least 2 years in our centre. Comparisons in donors' BMI over this period were made with matched pair analysis from initial assessment to preoperative assessment, 1 year and 2 years after donation. Conclusions: These data underline the difficulties in maintaining lifestyle modifications, even in highly motivated and selected individuals. 
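A matched-pair comparison of donor BMI across follow-up, as described in the Methods above, can be run as a paired non-parametric test. The sketch below uses hypothetical paired BMI values, not the study's measurements.

import numpy as np
from scipy.stats import wilcoxon

bmi_initial = np.array([27.1, 31.4, 24.9, 29.8, 26.5, 33.0, 28.2, 25.7])
bmi_2yr     = np.array([28.0, 32.2, 24.6, 31.1, 27.3, 33.8, 29.0, 26.4])

stat, p = wilcoxon(bmi_initial, bmi_2yr)   # paired signed-rank test on the differences
print(f"median change = {np.median(bmi_2yr - bmi_initial):+.1f} kg/m2, p = {p:.3f}")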
The impact of weight gain on long-term donor risk needs further evaluation, and live donor programmes should consider continuing to provide support with lifestyle modification after donation. Purpose: Post-operative renal recovery after nephrectomy is a substantial problem to be addressed in living kidney donors. Herein, we explored factors associated with renal recovery and progression to advanced chronic kidney disease (CKD) in living donors. Methods: A total of 1,588 kidney donors who underwent donor nephrectomy from 1982 to 2016 were retrospectively reviewed. We included donors who had an estimated glomerular filtration rate (eGFR) measured at 1 month after kidney donation and a follow-up period of over one year. The percent change in eGFR from the initial value to one month after donation was calculated. Sub-optimal renal recovery was defined as recovery of eGFR to less than 70% of the initial value. The development of advanced CKD, defined as a latest eGFR <60 ml/min/m2, was the clinical end-point. We used continuous values for laboratory variables except uric acid, which was dichotomized by sex-specific criteria for hyperuricemia (male ≥7.0 mg/dL, female ≥6.0 mg/dL). Cox regression and logistic regression analyses were used to determine the risk factors related to sub-optimal renal recovery and progressive CKD. (n=30), "Loop V-Plasty" technique (n=5), single oval ostium technique to form a common outflow (n=2), "Patch-plasty" using an ePTFE graft (n=3), "bridging conduit plasty" to reconstruct IRHVs (n=53), and inferior vena cava reconstruction using an ePTFE graft (n=6). MHV reconstruction using ePTFE vascular grafts was performed for 563 liver allografts. Aspirin (100 mg) was given to all recipients from the 5th postoperative day and continued for 2 years. Results: The 2nd-month patency rate of the ePTFE grafts was 100%, with no patient developing acute thrombosis or infectious complications directly related to the ePTFE grafts. Primary non-function requiring re-transplantation occurred in none of the recipients. Thrombotic occlusion of the outflow resulting in Budd-Chiari syndrome occurred in 1 patient (0.38%) at the 24th postoperative month. Graft migration into the second portion of the duodenum occurred in 8 patients (1.42%) and was successfully treated by surgical removal of the ePTFE vascular grafts. Among the 6 recipients who underwent IVC reconstruction using an ePTFE graft, 2 patients died at the 10th and 13th postoperative months due to recurrence of the primary disease. Conclusions: Venous outflow reconstruction of right liver allografts with venous variations using ePTFE vascular grafts contributes significantly to allograft outflow, not only preventing congestion of the graft but also increasing the ease of graft implantation. The ePTFE vascular grafts have a wide safety margin; however, potentially fatal complications may occur, and these can be effectively treated if a timely diagnosis is made and appropriate management is instituted. Reconstruction of IRHVs using ePTFE vascular grafts is a relatively new concept in LDLT and proved to be a safe alternative to multiple veno-caval anastomoses, which may increase the warm ischemia time. To summarize the findings of this initial experience, we utilized 63% (10 of 16) of the livers from MSUD patients. Another six livers (37%) from domino candidates were unused due to lack of appropriate candidates for those livers at our center at that time, and in one case due to unfavorable anatomy.
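For the living-donor renal-recovery analysis above, the key derived variable is the percentage of baseline eGFR recovered at 1 month, flagged as sub-optimal below 70%. A minimal sketch follows, assuming the percentage is taken relative to the pre-donation value and using hypothetical eGFR numbers.

import pandas as pd

donors = pd.DataFrame({
    "egfr_baseline": [102.0, 95.0, 88.0, 110.0],   # pre-donation eGFR
    "egfr_1mo":      [68.0, 72.0, 55.0, 80.0],     # eGFR 1 month after donation
})

donors["pct_recovery"] = 100 * donors["egfr_1mo"] / donors["egfr_baseline"]
donors["suboptimal_recovery"] = donors["pct_recovery"] < 70   # study's cut-off
print(donors)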
We split 7 of 16 deceased donor livers to further increase the number of transplants yielding a 94% increase in transplants performed (31 transplants from 16 deceased donor organs). In one case a split/domino transplant was performed in which we received one side of the liver, and performed domino with our patients' liver, while the other side of the deceased donor liver was transplanted at another center (inter-center sharing). Seven of the 16 deceased donor livers were deemed acceptable to be split. Two were split with both sides used while additionally performing domino liver transplantation with the full metabolic liver. Three additional MSUD patients received left lobe deceased donor grafts while the right lobe of the liver was discarded as no patient had been matched to these lobes (logistics), a missed opportunity for 3 cases. Of the total possible transplants that could be yielded from our experience combining split and domino transplants, we missed 9 opportunities. With perfect execution it could have yielded a total of 40 transplants from the 16 cadaveric organs donated. This would represent a 250% increase in transplants from this selected donor pool. *Three additional segments from split grafts were not utilized due to lack of suitable candidates pre-selected for these grafts. Conclusions: Transplant swaps are utilized in kidney transplantation to provide organs that are blood type incompatible between donor and recipient matches as previously published. This technique has not been utilized in liver transplantation before, but may be added to our paradigm as shown below. Scenario a. In this case, a blood type incompatible parent wants to donate to their child, but it would increase the risk of rejection to cross the ABO blood group barrier. Rather, this child is offered the domino transplant from an MSUD patient in exchange for bringing the live donor entering the swap as is done in kidney transplantation. Scenario b. An altruistic donor is chosen to provide a partial liver graft for a patient with MSUD. The recipient's liver then undergoes domino transplantation into another individual. In this case, both transplants are accomplished without any cadaveric donor. The metabolic disease candidate has a non-compatible donor, who gives to another candidate in order to receive the altruistic donor in the swap. 18.8%, p=0.50). Additionally, no significant differences in secondary outcomes were seen (Table 1) . Among KTRs with ASB who were treated with antibiotics, a median of 7 days was prescribed and de-escalation to a narrower spectrum agent was possible in 45.3% of patients. Conclusions: KTRs with untreated ASB have similar outcomes as those prescribed antibiotics, even in the early post-transplant period. Our findings suggest a role for antimicrobial stewardship by discouraging antibiotic use in this population. where donor lung procurement is unsuccessful and therefore a potential resource cost to a transplant center. We propose a decision analysis tree and hypothesize a strategy using DCD and DBDs as compared to DBD alone is an effective strategy to increase access to transplant. Methods: A decision analysis model was developed using TreeAge Pro 2018, R1.1. Data was obtained through a systematic review of MEDLINE for all reports on DCD and DBD in lung transplantation. Analysis was from the viewpoint of a transplant center with a hypothetical 50 year old male patient with pulmonary fibrosis waitlisted for lung transplant for one month. 
Low clinical acuity was a Lung Allocation Score (LAS) less than 50 and high acuity was an LAS greater than or equal to 50. A high acuity patient has a greater probability of being transplanted, and an increased risk of both waitlist and post-transplant death when compared to a low acuity patient. The outcomes for each strategy were transplant, dry run, or remain on the waitlist. One way sensitivity analysis was performed to test the model strength based on variations in the range of values reported in the literature. The cost per additional transplant was explored relative to a transplant center's willingness to pay for an additional transplant and the center's willingness to accept the risk of a dry run. The estimated dry run rate for a DBD was 20% and for a DCD 40%. Probabilistic sensitivity analysis was performed to assess the robustness of the results to changes within the model and its variables. Results: A strategy using both DCD and DBD was favored over a strategy using DBD alone, even though there is a higher rate of dry runs with DCD organs. The expected value was 0.21 for the DBD only strategy, and 0.23 for the DBD and DCD strategy. One way sensitivity analysis showed that the DCD and DBD strategy was dominant across the range of probabilities for dry runs and DCD donors. Conclusions: A strategy using DCD and DBD donors can improve access to lung transplant. This strategy remained dominant across probabilities of dry runs and DCD donors tested. Further work is needed in assessing center level implementation strategies for increasing the use of DCD donors. Methods: We conducted semi-structured interviews about candidates' preferences for providing informed consent for organ intervention research, and willingness to accept a research organ and participate in intervention research. Results: Sixty-one candidates participated (44% participation rate). Most candidates (93%) agreed that organ research is worthwhile because it could increase organ availability, provide faster access to organs, provide better quality organs, and advance science. Most candidates (81%) were likely to accept a research organ if organ quality was good (defined as donor age 30), but far fewer candidates (26%) would accept a research organ if quality was only moderate (i.e., donor age 50). Most candidates (79%) desired being informed that the organ offered was a research organ before accepting it. Most candidates (82%) desired giving informed consent before accepting a research organ without further involvement in the research study. Giving informed consent before accepting a research organ was perceived by most candidates as important for making an informed decision, but was perceived by a few candidates as not necessary because they felt desperate to accept an organ and trusted their physician's judgment. Half (57%) desired giving informed consent before accepting a research organ and allowing their medical record to be reviewed post-transplant to track the outcome of the transplant for research purposes. Consent for also allowing post-transplant medical chart review was perceived by some candidates as important for protecting their privacy, but perceived by others as unimportant because they did not view medical chart review as invasive or harmful. The majority (62%) desired giving informed consent before accepting a research organ and allowing researchers to collect post-transplant information from the candidate via clinical lab tests. 
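For the donor-strategy decision analysis described earlier, each strategy's expected value is the probability-weighted utility over the possible outcomes (transplant, dry run, remain on the waitlist). The sketch below shows that arithmetic for a simplified two-branch tree; all probabilities and utilities are illustrative placeholders except the dry-run rates quoted in the abstract (DBD 20%, DCD 40%).

def expected_value(p_offer, p_dry_run, u_tx=1.0, u_wl=0.1):
    # Offer pursued: completed transplant or dry run (return to waitlist);
    # no offer: remain on the waitlist.
    return p_offer * ((1 - p_dry_run) * u_tx + p_dry_run * u_wl) + (1 - p_offer) * u_wl

dbd_only = expected_value(p_offer=0.25, p_dry_run=0.20)
dcd_share = 0.4                                            # hypothetical fraction of offers that are DCD
blended_dry_run = (1 - dcd_share) * 0.20 + dcd_share * 0.40
dbd_plus_dcd = expected_value(p_offer=0.35, p_dry_run=blended_dry_run)
print(f"DBD only: {dbd_only:.2f}, DBD + DCD: {dbd_plus_dcd:.2f}")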
Deconvolution analysis revealed higher enrichment of stromal cell score in the biopsies compared to urine, whereas various immune cell types were enriched in the urine (Figure 2 , Blue=Biopsy and Yellow=urine) Conclusions: Gene set enrichment analysis of RNA sequencing data from urinary cells and kidney allografts along with deconvolution demonstrate enrichment of genes related to immune cells in urine that is undiluted by the stromal component. Our data support urine as an excellent biospecimen for biomarker discovery and development as well as for deciphering the anti-allograft repertory. of donor pDCs. Using mass cytometry to profile the recipient immune response to VCA, we find evidence that pDC treatment and allograft prolongation correlates with upregulation of myeloid-derived suppressor cells. To determine factors unique to the tolerogenic phenotype of pDCs we performed a microRNA (miRNA) microarray, and our results show that six miRNAs of the miR-181 family are upregulated in pDCs compared to cDCs. Likewise, pDCs deficient in miR-181a fail to prolong allograft survival. Transcriptome analysis of wild-type versus miR-181a deficient pDCs revealed statistically significant downregulation of multiple genes in miR-181a deficient mice that are related to immune co-stimulation and signaling. We report that Semaphorin 4a, which is involved in immunomodulation and is required for the function and stability of regulatory T (Treg) cells, is decreased in miR-181 deficient pDCs. Purpose: When antibodies bind to MHC antigens on endothelial cells a cascade of responses is initiated including exocytosis of adhesion molecules followed by attachment and activation of platelets and macrophages. The initiating mediators of antibody-mediated rejection are not fully defined. We have modeled these events by passively transferring donor specific antibodies (DSA) to renal allograft recipients. The influence of T cells was eliminated by using Rag deficient (Rag-/-) mice as recipients. Specifically, B10.A (H-2 a ) kidneys were transplanted to C57Bl/6 (H-2 b ) Rag-/-mice. After 7 days, when postoperative inflammation subsided, a mixture of IgG1, 2a and 2b monoclonal antibodies to H-2 a (or isotype controls) were transferred. Immunohistology, ELISA and NanoString analyses were performed on allografts 1 hour after antibody treatment. Results: By 1 hour, C4d deposited on peritubular and glomerular capillaries in a strong, diffuse pattern. Extensive aggregates of P-selectin positive platelets co-localized with C4d. Capillaries also contained platelet-monocyte conjugates. The platelets released large amounts of platelet factor 4 (CXCL4) and serotonin in the graft (10-25 ng/mg tissue). To further characterize the macrophage infiltrates mRNA arrays were performed by NanoString on kidney allografts. Passive transfer of DSA upregulated expression of genes associated with macrophages. Cd68 and Itgam (Mac1) were 3-and 4-fold higher at 1 hour following transfer of DSA and remained higher at 5 hours compared to isotype controls. However, the most highly upregulated gene was Gzmk (granzyme K). Gzmk was upregulated about 35-fold at 1 hour. Gzmk has recently been reported to be expressed by classically activated human macrophages and in separate experiments we confirmed that Gzmk is expressed by macrophages isolated from mouse renal allografts. To determine the effects of platelets on the expression of Gzmk, a group of mice was pretreated with platelet depleting antibody 1 hour before transfer of DSA. 
Depletion of platelets decreased Gzmk expression by half. Results: Endogenous memory CD8 T cells within highly ischemic allografts expressed both IL12Rβ1 and β2. However, most proliferating CD8 T cells did not express the IL12Rβ1 needed for p40 HD-mediated signaling. Moreover, endogenous memory CD8 T cells isolated from allografts did not proliferate ex vivo when cultured with p40 HD. These results suggested that p40 HD may provoke memory CD8 T cell proliferation within the higher-risk allografts by indirectly binding to IL-12Rβ1-expressing cells, leading us to investigate other candidate factors driving this memory CD8 T cell proliferation. qPCR analysis indicated increased mRNA expression of IL2Rα (CD25), β (CD122) and IL15Rα in infiltrating CD8 T cells purified from allografts subjected to prolonged vs. minimal CIS on day 2 post-transplant. ELISA indicated that longer CIS times markedly increased IL-12p40 and IL-15 protein in the allografts, and these increases were dependent on recipient CD4 T cells. Blocking p40 or IL-15 signaling with anti-p40 or anti-CD122 mAb inhibited endogenous memory CD8 T cell proliferation within high-risk cardiac allografts in CTLA-4Ig-conditioned recipients and extended the survival of the high-risk allografts from days 16-23 to days 40-72 (anti-p40 mAb) and days 24-97 (anti-CD122 mAb). Conclusions: These results suggest that p40 HD produced by graft DCs stimulated the production of proliferative cytokines such as IL-15, which directly stimulates endogenous donor-reactive memory CD8 T cell proliferation within allografts subjected to 8 hours of CIS. Purpose: The mechanisms of rejection of peripheral nerve and optic nerve remain largely unknown. The inability to prevent rejection of the optic nerve has thwarted the successful execution of an eye transplant. Here, we aimed to establish a model for understanding the mechanisms behind the rejection of peripheral and optic nerves and to identify optimal immune therapeutic strategies to halt nerve rejection. Methods: Optic and sciatic nerve grafts were transplanted under the kidney capsules of mice, grafts were characterized fully for immune infiltrates, and peripheral immune responses were assessed. Results: Surprisingly, the optic nerve autografts were better preserved than the sciatic nerve autografts. We confirmed axonal presence inside the optic nerve implant by identifying a positive GFP signal in the nerve after intravitreal injection of a GFP-labeled AAV2 virus. The optic and sciatic nerve allografts showed immune cell infiltrates as early as 3 days post-transplantation. Both allografts contained more CD11b+ cells than either CD3+ T cells or B220+ B cells. Notably, the optic nerve allografts contained more of these cells 7 days post-transplantation as compared to the sciatic implants. The spleens of the sciatic nerve recipients also contained a higher percentage of regulatory B cells than those of optic nerve recipients (0.1950% vs. 0.1250%; p=0.0304). Using OVA-expressing nerve implants in Rag-deficient hosts injected with OT-1 and OT-2 cells, we noted greater CD8+ than CD4+ alloreactivity. To gain more insight into the immunogenicity of these nerves, we assessed their cellular compositions. Flow cytometric analysis revealed that the optic nerve contained a lower percentage of MHC class-II+ cells (0.0044% vs. 0.0578%; p<0.0001) and a higher percentage of PD-L1+ cells (0.049% vs. 0.034%; p<0.0001), and sciatic nerve cells induced more alloreactivity than optic nerve cells in an MLR assay (p<0.0001).
Given the higher infiltrates in optic nerve in vivo, we speculate the immunogenicity of optic nerve increases following ischemic injuries. Finally, immunosuppressive treatment with mCTLA4IgG, Rapamycin, or Anti-CD3 failed to protect both allografts from rejection. The optic nerve allograft may be subjected to a more pronounced alloimmune response than the sciatic nerve allograft. Both grafts appear resistant to standard immunomodulatory drugs that confer tolerance towards other types of transplanted organs. counties had observed death rates 30% higher than those in more fortunate (CHS 0-10) areas (2.23 vs. 1.75 deaths per million person-years, p < 0.001), and were 27% less likely to be waitlisted than CHS 0-10 counterparts (p < 0.001, Figure 1 ). The presence of at least one highvolume transplant center in-state corresponded with greater waitlist access ( Figure 2 ). Outcomes on the waitlist and at 1 and 3 years post-transplant did not differ by county CHS or presence of a high-volume in-state center. Conclusions: Children with liver disease in underprivileged areas have lesser access to the liver transplant waitlist and a higher death rate from liver disease. Greater distance to a transplant center is associated with lesser waitlist access. Further research is necessary to identify means by which to reduce disparities in waitlisting and facilitate receipt of care for patients in remote and underserved communities. Methods: A 34 question survey was designed to address common aspects of donor selection, donor evaluation, surgical variations, and recipient considerations. The WHO Transplant Observatory was analyzed to determine global prevalence of LDLT. The survey was distributed globally to individuals associated with LDLT programs identified via the OPTN database, Pubmed authorship, and professional references between 7/2018-9/2018. Results: There were 125 survey respondents. The U.S. Program (USP) response rate was 97.7%. At least one respondent was obtained from 94.9% of countries with ≥10 LDLT cases in 2016 (International Programs, "IP"). 74% of programs performed pediatric LT, and 12% were exclusively pediatric. IP were more likely to consider LD of any blood group (66.7%) compared to USP (36.9%, p=0.02). Only 32.8% of all programs will consider LDLT for fulminant pediatric patients. Most programs (72%) do not have a PELD limit when considering LDLT. Pediatric programs are less likely to have a defined donor age limit when compared to adult programs (p=NS between USP and IP). Overall, 68% of programs have a donor BMI cutoff (median 18-32), and the mean acceptable macrosteatosis cut off was higher for IP (19.0% vs. 14.9%, p=0.02). Most pediatric programs were willing to consider first degree relatives of patients with Alagille Syndrome or Metabolic Disorders (p=NS between USP and IP). USP were more likely to consider nondirected, anonymous donors (65.1% vs. 36.6%, p=0.003). There were no differences in willingness to consider complex anatomical considerations. Overall, 79.5% of programs perform LD surgery via an open approach (p=NS between USP and IP). Conclusions: This study represents the first comprehensive global analysis of living donor selection and utilization in pediatric liver transplantation. While there are considerable global variations in LDLT practice patterns largely due to availability of deceased donor organs, this study has identified key aspects of donor selection criteria and utilization that can establish the standard of care for this procedure. 
Purpose: Progression to chronic kidney disease (CKD) is highly prevalent in adults following liver transplantation (LT), with a profound impact on patient survival; we assumed that the prevalence of renal dysfunction is also considerably high in children. We report our experience in a large paediatric transplant center and aim to identify risk factors leading to a decline in renal function. Methods: We retrospectively analyzed all 161 LT patients at our hospital from 2010 to 2017 (84 female). Medical records were reviewed for demographic, laboratory and clinical data. Patients were stratified into groups of <5 kg, 5-10 kg and >10 kg. Change in renal function (slope) was defined as loss/increase of GFR using the Schwartz formula, and progression to renal failure within the first 4 weeks after LT was staged according to the pRIFLE criteria. Mean observation time was 31 months (6 to 93 months). Results: Of 161 patients, 134 were younger than 6 years at the time of LT (74%). Five-year survival was 84%. CKD preexisted in 27 children (16.8%), twelve of whom were undergoing dialysis before LT (7.5%). 3/149 progressed to ESRD over 5 years (2%). Other patients developed pRIFLE stage I (19.5%), II (17%) or III (12%) within the first 4 weeks after LT. Average GFR within 4 weeks after LT was lower in the <5 kg group (66 ml/min/1.73 m²) compared to 81 (5-10 kg) and 114 (>10 kg) ml/min/1.73 m². All GFR slopes remained stable and were not significantly different from each other (<5 kg: 0.01 vs. 5-10 kg: 0.01 vs. >10 kg: -0.35 ml/min/1.73 m²/28 days). Over the entire observation time, all GFR slopes showed a significant increase (<5 kg: 14 vs. 5-10 kg: 4 vs. >10 kg: 3 ml/min/1.73 m²/year). Regarding potential risk factors, 35 patients (22%) had ABPM mean blood pressure values above the 95th percentile and 4 patients (2%) above the 75th percentile; all other patients remained below the 50th percentile. Proteinuria >30 mg/dl occurred in 33% (n=53) of all patients and improved over the years. Independent risk factors for a decline of GFR to ESRD were GFR decline within the first 28 days, days on ventilation, liver rejection, high MELD score, catecholamine and antihypertensive drug dosage, biliary leakage, proteinuria and high blood pressure. Conclusions: In contrast to adults, progression to ESRD in children following LT was rare (<5%) and most patients developed a normal GFR. Independent risk factors are rare and need further prospective analysis. However, renal function, hypertension and proteinuria should be monitored regularly. Purpose: Pediatric living donor liver transplant (LDLT) patients sometimes develop graft fibrosis, even when their liver function tests are within the normal range. We have therefore performed serial protocol biopsies. Recently, the Mac-2 binding protein glycosylation-modified isomer (M2BPGi) was developed as a new blood-based marker of progression of hepatic fibrosis. We performed this study to examine the relationship between serum M2BPGi levels and liver histological findings in patients after LDLT. Methods: Patients under 19 years old who received LDLT at our institution and were followed for at least 1 year after LDLT were included. All patients received our standard tacrolimus-based immunosuppression. Protocol biopsies were performed on a yearly basis. Eighty-nine patients were enrolled in this study. Pathological findings were assessed on the last available biopsy with Hematoxylin-Eosin and Masson trichrome stains.
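For the paediatric renal-function abstract above, the eGFR estimate (Schwartz formula) and the per-interval GFR slope reduce to simple arithmetic. The sketch below is editorial and illustrative only; the abstract does not state which Schwartz coefficient was used, so the updated bedside value k = 0.413 and the example inputs are assumptions.

```python
# Minimal sketch of the eGFR and GFR-slope calculations described above.
# The abstract does not state which Schwartz constant was used; the updated
# "bedside" coefficient k = 0.413 is assumed here for illustration.

def schwartz_egfr(height_cm: float, serum_creatinine_mg_dl: float, k: float = 0.413) -> float:
    """Estimated GFR in mL/min/1.73 m^2 (bedside Schwartz formula)."""
    return k * height_cm / serum_creatinine_mg_dl

def gfr_slope(egfr_values, days_between_measurements):
    """Crude GFR slope: change in eGFR per 28-day interval."""
    delta = egfr_values[-1] - egfr_values[0]
    return delta / (days_between_measurements / 28.0)

# Illustrative example: 90 cm child with serum creatinine 0.6 mg/dL
print(round(schwartz_egfr(90, 0.6), 1))   # ~62 mL/min/1.73 m^2
print(gfr_slope([66, 67], 28))            # +1 mL/min/1.73 m^2 per 28 days
```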
Fibrosis was staged with the METAVIR scoring system. Serum M2BPGi levels were compared with the pathological fibrosis score. Student's t-test and Pearson's chi-square test were used for analysis. Results: All patients received LDLT from relatives. The original diseases were biliary atresia (n=61), metabolic disorders (n=8), fulminant hepatitis (n=7), hepatoblastoma (n=5) and others (n=8). Conclusions: M2BPGi is a novel marker of liver fibrosis in patients after pediatric LDLT. It is especially useful for following up pediatric LDLT patients to support liver biopsy interpretation. Further follow-up is required to determine the relationship between fibrosis progression and M2BPGi. Purpose: Substantial weight gain and sedentary behavior are common after transplantation. Behavioral economic strategies with financial incentives and reminders may promote healthy behaviors in the general population, but have not been tested in transplantation. Methods: LIFT was a prospective, randomized, single-blind pilot trial to evaluate the effect of a behavioral intervention using a wearable accelerometer on post-transplant physical activity and weight change. 127 adult liver (LT: n=62) and kidney transplant (KT: n=65) recipients 2-24 months post-transplant were recruited at one center. Patients were randomized 1:1:1 to a 12-week intervention with 3 arms: Arm 1, control with no device; Arm 2, device only; Arm 3, intervention with device and incentives. After a 2-week run-in, the intervention arm included daily monitoring of step counts and feedback regarding step goals (a targeted 15% step increase every 2 weeks, up to a threshold of 7000 daily steps) and biweekly health engagement questions about healthy diet and physical activity. Patients who achieved step targets and correctly answered questions received small financial incentives. Results: Mean age was 52 (SD=13), 42% were White, 27% were Black, and 9% were Hispanic. One third had diabetes; mean BMI was 28 (SD=8). The study retention rate was 92%. Figure 1 shows step data for the 76 participants with accelerometers (Arms 2 and 3). The mean end-of-study step count was 7337 (SD=3494) in Arm 2 and 8532 (SD=3907) in Arm 3, 1263 steps higher in the intervention arm. Active intervention was associated with a higher proportion of participant-days reaching the 7000-step target (OR 1.99, 95% CI: 1.03-1.87, p=0.04) versus the device-only arm. The median change in weight [IQR] was +0.9 kg [-1.0 to 5.4] in Arm 1 (control, no device), +2.74 kg [-0.5 to 5.4] in Arm 2 (device only), and -0.5 kg [-1.4 to 3.4] in Arm 3 (intervention); p=0.05. Reaching 7000 steps was associated with a trend towards weight loss when adjusting for study arm (beta -6.2, 95% CI -15.3 to 2.8, p=0.17). Conclusions: In a 12-week randomized pilot trial, financial incentives were effective in increasing post-transplant walking among KT and LT recipients. A trend towards weight loss was observed among those with a greater degree of physical activity. Larger and longer prospective studies focused on physical activity are needed to avoid weight gain and sedentary behavior after transplant. Purpose: The opioid crisis in the US and the related increase in donors who died from drug intoxication are well documented. We set out to estimate the proportion of drug-overdose deaths having donation potential and to characterize the variation in recovering organs from these potential donors.
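For the LIFT intervention arm described above, the biweekly step-goal escalation (a 15% increase every 2 weeks, capped at 7,000 daily steps) can be written out explicitly. A minimal sketch; the baseline step count is an illustrative assumption, not trial data.

```python
# Minimal sketch of the biweekly step-goal escalation described for the
# intervention arm (15% increase every 2 weeks, capped at 7,000 daily steps).
# The run-in baseline value is illustrative, not taken from the trial.

def step_goals(baseline_steps: float, weeks: int = 12, cap: int = 7000):
    """Return the daily step target for each 2-week block of the intervention."""
    goals = []
    target = baseline_steps
    for _ in range(weeks // 2):
        target = min(target * 1.15, cap)
        goals.append(round(target))
    return goals

print(step_goals(4500))   # [5175, 5951, 6844, 7000, 7000, 7000]
```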
Methods: Multiple cause-of-death data from the National Center for Health Statistics and ventilation probabilities based on Nationwide Inpatient Sample data were used to estimate donor potential by employing the OPTN Deceased Donor Potential Study (DDPS) filtering methodology, including the age<=75 filter. OPTN data were used to identify actual drug-intoxication donors (DID). Exact rates with 95% Poisson confidence intervals compared the actual number of DID to the estimated number of drug-or alcohol-overdose (DAOD) deaths with donation potential at the national and Organ Procurement Organization (OPO) levels. Results: During 2011-2016, DAOD deaths increased by 51.7% (Figure 1 ) while all-cause deaths increased by only 9.1%. Opioid-specific drug-overdoses comprised 52.3% of DAOD deaths in 2011 and increased to 63.2% in 2016. The estimated number of potential DAOD donors nearly doubled between 2011 and 2016, from 2,802 to 5,482. Actual DID increased 157% from 2011 to 2016. As well, the ratio of actual DID to potential DAOD donors increased from 18.2% in 2011 to 23.9% in 2016. Figure 2 shows that the ratio of actual DID to potential DAOD donors ranged from 0.02 to 0.76 across the 57 OPOs (Puerto Rico excluded) in 2016. This ratio increased from 2011 to 2016 for 40 OPOs (four increased significantly) and decreased for only 17 OPOs (none statistically significant). Abstract# 506 Studies of other outcomes have shown that disparities are partially attributable to variation in quality between hospitals. We assessed whether one-year survival disparities in LTx differed by center quality. Methods: Patient data and center-level quality ratings (1 = lowest, 5 = highest) for 1-year patient survival came from SRTR. Patients were included if they were black or white and received a liver transplant between January 1, 2011 and December 31, 2016. We used Cox regression to estimate black-white hazard ratios for one-year post-transplant survival, both overall and by tier. We also estimated hazard ratios for each tier, relative to the lowest rating, stratified by patient race. Results: Lower rated centers had a higher proportion of black liver transplant recipients (Table 1) . Overall, black patients had a 21% increased risk of mortality one year after liver transplant compared to white patients (95% CI: 1.10, 1.34). Racial disparities in outcomes increased in a stepwise manner with increasing tier, with the exception of the Tier 5 centers, where no disparity was apparent. Increasing center rating conferred improved survival for white patients, but not for black patients, with the exception of Tier 5 centers Conclusions: Black patients experience increased disparities in survival in higher quality transplant centers and do not derive the same benefits as white patients from increased center quality, with the exception of Tier 5 centers. Further research is needed to elucidate this relationship in order to inform center-based interventions to reduce disparities. Purpose: Frailty, a syndrome distinct from comorbidity and disability, is clinically manifested as a decreased resistance to stressors and is present in up to 35% of endstage renal disease (ESRD) patients. It is associated with falls, hospitalizations, poor cognitive function, and mortality. Also, frailty is associated with poor post-kidney transplant (KT) outcomes including graft loss and mortality. It is likely frailty is associated with decreased access to KT, given its association with poor outcomes on dialysis and post-KT. 
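For the drug-intoxication donor analysis above, the comparison of actual donors to estimated potential DAOD donors used exact Poisson confidence intervals. The sketch below shows one common construction (the Garwood chi-square interval) under illustrative counts; the counts are not OPO data, and the abstract does not specify which exact-interval construction was used.

```python
# Minimal sketch of an exact (Garwood/chi-square) Poisson confidence interval
# for a count of actual drug-intoxication donors, and of the ratio to the
# estimated number of potential DAOD donors. Counts are illustrative only.
from scipy.stats import chi2

def exact_poisson_ci(count, alpha=0.05):
    """Exact two-sided (1 - alpha) CI for a Poisson count via chi-square quantiles."""
    lower = 0.0 if count == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * count)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (count + 1))
    return lower, upper

actual_did = 24        # hypothetical actual drug-intoxication donors in one OPO
potential_daod = 100   # hypothetical estimated potential DAOD donors

lo, hi = exact_poisson_ci(actual_did)
print(f"ratio {actual_did / potential_daod:.2f} "
      f"(95% CI {lo / potential_daod:.2f}-{hi / potential_daod:.2f})")
```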
Yet, clinicians have difficulty identifying which patients are frail; therefore, we sought to quantify whether frail KT candidates had similar access to KT as nonfrail candidates. Methods: We studied 7,078 KT candidates (2009-2018) in a three-center prospective cohort study of frailty. Fried frailty (unintentional weight loss, grip strength, walking speed, exhaustion, and activity level) was measured at outpatient KT evaluation. We estimated time to listing and transplant rate by frailty status using Cox proportional hazards and Poisson regression, adjusting for demographic and health factors. Results: The mean age was 54 years (SD=13; range 18-89), 40.2% were female, 33.9% were African American, and 21.1% were frail. Frail participants were 38% less likely to be listed for KT (hazard ratio: 0.62, 95% CI: 0.56-0.69, p<0.001) compared to nonfrail participants, independent of age and other demographic factors. Furthermore, frail candidates were transplanted 35% less frequently than nonfrail candidates (incidence rate ratio: 0.65, 95% CI: 0.55-0.77, p<0.001). Conclusions: Frailty is associated with decreased listing and a decreased rate of transplant and is a potentially modifiable risk factor. Thus, assessment at kidney transplant evaluation could improve patient counseling and motivate strategies to improve pre-KT outcomes for frail candidates of all ages. Rapid Growth (1) is a scenario where the rate of Voucher donors grows rapidly at a constant rate over 50 years. Slow Growth (2) is a scenario where the rate of Voucher donors grows slowly at a constant rate over 50 years. Rapid Growth, then Rapid Decline (3) is the worst-case scenario, where Voucher donors grow rapidly for the first 20 years, then decline rapidly for the next 30 years. Slow Growth, then Slow Decline (4) assumes Voucher donors grow slowly for the first 20 years and then decline slowly for the next 30 years. Lastly, Rapid Growth, then Plateau (5) is a scenario where the rate of Voucher donors grows each year for the first 20 years, then remains constant for the next 30 years. In all 5 simulations, the coverage ratio never dipped below 1.13. This indicates that there will be more chain starts than chains needed for Voucher redemptions. Conclusions: This study demonstrates that an expansion of the NKR's Advanced Donation Program to include up to 5 Vouchers for immediate family members can be easily accommodated. Fifty-year simulations using actual ESRD probabilities assigned to each of the categories of potential Voucher holders establish that there will be at least twice as many kidneys available for Voucher redemptions as are required in all scenarios modeled. This new program challenges the standard definition of a Non-Directed Donor when the opportunity for a Family Voucher may be available. Purpose: Poverty prevents more kidney transplants (KTs) worldwide than any other barrier. Global Kidney Exchange (GKE) is a mechanism to overcome this barrier. Methods: In high-income countries (HIC), KTs generate significant savings compared with paying for ongoing dialysis. GKE proposes paying for KTs (and donor and recipient follow-up care) for some low/middle-income country (LMIC) patients who face financial barriers to transplantation. In so doing, LMIC patients receive life-saving treatment, additional KTs are produced for HIC patients, and the savings from avoided dialysis exceed the cost.
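The frailty abstract above estimates time to listing with Cox proportional hazards and the transplant rate with Poisson regression. A minimal sketch of both models on toy data follows; the data frame, column names and reduced covariate set are assumptions for illustration, not the study's three-center cohort.

```python
# Minimal sketch of the two access analyses on toy data: a Cox model for time
# to listing by frailty status and a Poisson model for the transplant rate
# with person-time as exposure. All values are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "frail":        [1, 0, 0, 1, 0, 1, 0, 0],
    "age":          [61, 48, 55, 70, 39, 66, 52, 44],
    "days_to_list": [400, 120, 200, 700, 90, 650, 150, 180],
    "listed":       [1, 1, 1, 0, 1, 0, 1, 1],   # 1 = listed, 0 = censored
    "transplants":  [0, 1, 1, 0, 1, 0, 1, 1],
    "person_years": [1.1, 0.8, 1.0, 1.9, 0.6, 1.8, 0.9, 0.7],
})

# Time to listing by frailty status (Cox proportional hazards)
cox = sm.PHReg(df["days_to_list"], df[["frail"]], status=df["listed"]).fit()
print(np.exp(cox.params))   # hazard ratio for listing, frail vs nonfrail

# Transplant rate by frailty status (Poisson regression, person-time exposure)
pois = sm.GLM(df["transplants"],
              sm.add_constant(df[["frail", "age"]]),
              family=sm.families.Poisson(),
              exposure=df["person_years"]).fit()
print(np.exp(pois.params))  # incidence rate ratios
```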
Results: Between January 2015 and June 2018, GKE produced 5 chains and 2 cycles that have allowed 7 international patients (3 from the Philippines, 3 from Mexico, and 1 from Denmark) to be transplanted, as well as 29 KTs for patients in the United States (US). The lengths of the GKE chains were 12, 7, 6, 2 and 4, and the cycle lengths were 3 and 2. GKE-1 began with a US non-directed donor giving to a Filipino recipient and ended with a donation to the deceased donor waitlist. GKE-2 extended a chain with a blood type (BT)-A bridge donor (BD) to produce 3 transplants and a BT-A BD. This BD began GKE-5 to produce 4 additional KTs. GKE-3, -4, and -5 each have a BD pending; GKE-6 and -7 were completed as cycles. GKE chains have involved 17 transplant centers, and 38.9% of recipients were minorities. Five US recipients had BT-A, 20 BT-O, 3 BT-B, and 1 BT-AB; 5 international recipients had BT-A and 2 had BT-O. The PRA was 0-20% for 13 patients, 21-79% for 13 and >80% for 10 (4 international). International pairs were funded by a combination of self-pay and approximately $500K of philanthropy. Transplanting 29 US patients saved US healthcare payers $5-7M vs. dialysis. International recipients have 100% graft survival (longest 3.75 years) and all international donors have normal creatinine and blood pressure. Conclusions: GKE provides a mechanism to overcome financial and immunological barriers to transplantation. Savings from avoided dialysis offer scalability, but transparency of international pair selection, emphasis on donor safety, and assurance of long-term immunosuppression for recipients are prerequisites for sustainability. Purpose: To reduce unwanted side effects of tacrolimus (Tac), improvement of dosing is warranted. Therapeutic drug monitoring (TDM) of Tac based on trough concentrations (C0) is a compromise for AUC monitoring. The use of self-collected capillary microsamples of Tac at home makes it possible for patients to provide 3 samples within a dosing interval without going to the hospital. Application of a limited sampling strategy (LSS) makes it clinically possible to accurately predict individual AUCs, using a population pharmacokinetic-based computer dosing tool. This would improve Tac TDM significantly, and the aim of the present study was to evaluate the use of 3 microsampled Tac concentrations to predict individual Tac AUCs. Methods: Twelve-hour Tac concentration-time profiles (13 samples) were obtained from 27 renal transplant recipients on twice-daily Tac. Blood samples were obtained using finger-prick microsampling (10 μL Mitra®, Neoteryx, LLC) 3 ± 1 weeks after transplantation. A non-parametric population model (Pmetrics) was used to calculate the reference AUC0-12h based on all 13 measured concentrations. In addition, AUC0-12 LSS was predicted from the 3 samples collected at 0, 1 and 3 hours, and AUC0-12 C0 was estimated from the single trough concentration (0-hour) with the same population model. The predictive performance of the 3-point LSS and the C0-estimated AUCs was evaluated by comparison with the reference AUC0-12h. Percentage relative bias was calculated. Results: Mean dose, trough concentration and systemic exposure of Tac (AUC0-12h) Conclusions: We conclude that Vtac is improved with conversion to srTac, in combination with Aza, in AYA RTRs. Therefore, in this at-risk population, once-daily dosing appears to improve adherence. Furthermore, such conversion does not appear to put graft function at risk.
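For the tacrolimus microsampling abstract above, a reference AUC0-12h and a 3-point limited-sampling estimate can be illustrated numerically. The study derived individual AUCs from a nonparametric population PK model (Pmetrics); the trapezoidal reference, the multilinear LSS coefficients and the concentration values below are editorial assumptions for illustration only.

```python
# Minimal sketch of a reference AUC0-12h (trapezoidal rule over 13 sampled
# concentrations) and a 3-point limited-sampling estimate. The concentration
# values and LSS coefficients are illustrative; the study used a Pmetrics
# population PK model, not this formula.
import numpy as np

times = np.array([0, 0.5, 1, 1.5, 2, 3, 4, 5, 6, 8, 10, 11, 12])    # hours
conc  = np.array([5.1, 9.4, 14.8, 16.2, 15.0, 12.1, 10.3, 9.0,
                  8.2, 7.0, 6.1, 5.8, 5.4])                           # ng/mL

auc_ref = np.trapz(conc, times)                 # reference AUC0-12h (ng*h/mL)

# Hypothetical multilinear LSS using the 0-, 1- and 3-hour samples
b0, b1, b2, b3 = 10.0, 3.5, 1.2, 4.0
auc_lss = b0 + b1 * conc[0] + b2 * conc[2] + b3 * conc[5]

bias_pct = 100 * (auc_lss - auc_ref) / auc_ref  # percentage relative bias
print(round(auc_ref, 1), round(auc_lss, 1), round(bias_pct, 1))
```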
Identifying these at-risk AYAs for such intervention remains challenging with screening tools and clinical evaluation, underscoring the importance of ongoing multimodal assessment. Results: There were 1404 kidney transplants analysed in the study period, and 58 patients were diagnosed with biopsy-proven BKVAN. The cumulative incidence was 4.1% during follow-up: 2%, 3.1% and 4% at 6, 12 and 24 months after transplantation, respectively. Median time from transplantation to BKVAN diagnosis was 187 (61-1275) days. BKVAN was associated with recipient male gender (p = 0.042), deceased donor (p = 0.007) and preexisting diabetes (p = 0.017). Twenty-one (36.2%) patients had at least one acute rejection episode before BKVAN, and twelve (20.6%) after BKVAN diagnosis and IS reduction. Graft survival was inferior for BKVAN compared to non-BKVAN patients (p = 0.019). Five-year survival was inferior for Banff stages B/C (69.7%) compared to stage A (80.8%) and non-BKVAN patients (88.0%) (p = 0.017). Thirteen (22.4%) BKVAN patients lost their grafts, 9 (15.5%) attributed to BKV infection. Three patients with BKV-associated graft loss underwent a successful second kidney transplant, with no evidence of viral replication (follow-up of 7, 12 and 31 months). Conclusions: The cumulative incidence of BKVAN in our study was 4.1%. Despite immunosuppression reduction, graft survival was significantly lower in BKVAN, especially in stages B and C. Methods: Single-center, retrospective cohort analysis of patients with CKD referred for cardiac PET-CT as part of pre-RTx ischemic risk assessment. CFR was calculated as the ratio of peak myocardial blood flow (MBF) with pharmacologic vasodilation to resting blood flow. MBF was assessed with Siemens PET-CT/Syngo.via. The study cohort was compared to age/gender-matched controls without CKD. MACE were assessed at 60 days, 1 year and 2 years post PET-CT; MACE was defined as all-cause death or coronary revascularization. Results: As expected, CKD RTx candidates had a higher proportion of DM vs. the control group. Versus the control group, CFR was lower among CKD patients with and without DM. Furthermore, CFR was significantly lower among CKD patients with and without DM who suffered MACE (p-values for MACE events: 1) CKD patients with versus without DM: 0.07 at 60 days, 0.006 at 1 year, 0.009 at 2 years; 2) control patients with versus without DM: <0.001 at 60 days, 1 year and 2 years). Conclusions: CKD is associated with decreased coronary flow reserve in patients with and without diabetes. Among patients referred for renal transplant evaluation, reduced global coronary flow reserve is associated with increased risk of near- and long-term MACE events. The results support the utility of PET-CT CFR in candidate evaluation and waitlist management. Methods included a proportional hazards model adjusted for risk group, age, gender, race, living donor and cardiac work-up (catheterization, Lexiscan or dobutamine ECHO) between referral and transplant. Results: The cumulative incidence of cardiac events was highest in the DM+/CAD+ group (15%) and lowest in the DM-/CAD- group (2%). Overall, the incidence of events was significantly different between groups (log-rank p<0.001, Figure 1). In the multivariate analysis, the risk of CV events was five times higher in the DM+/CAD+ group compared to DM-/CAD- (HR 5.02 (2.10, 11.97), p<0.001). Deceased donor recipients had higher risk, but only at the 10% significance level (Table 2).
Conclusions: Post-transplant cardiovascular events are common among diabetics with history of coronary artery disease. Cardiac work up between referral and transplant was not a predictor of outcome in our study. These results call for revisiting the guidelines to focus on the high-risk group and perhaps limit testing for low and intermediate risk groups. Methods: Over a five month period between March 2018 and August 2018, renal transplant patients were referred for cardiac catheterization over stress testing using the following criteria: history of peripheral vascular disease, >10 years of diabetes mellitus, > five years of hemodialysis treatment and an estimated pulmonary systolic pressure > 60 mmHg on echocardiogram. Data was collected regarding positive findings including percutaneous cardiac interventions, referral for cardiac surgery, referral to pulmonology, referral to heart failure service and referral to cardiac electrophysiology for diagnosis and treatment of arrhythmia issues found on catheterization. Results: Of the 32 patients referred for catheterization over stress testing, 41% had positive findings requiring referral or treatment. The patients with positive results required the following treatments: 46% were referred for cardiac surgery; 15% required percutaneous cardiac interventions; 15% had confirmed significant pulmonary hypertension and were referred to pulmonology for treatment; 15% were referred to the heart failure service and 9% were referred for cardiac electrophysiology testing and treatment. Our center has identified ways to spread awareness for LD and develop an active HIV+ LD kidney program. We sought IRB approval and UNOS variance 1yr ago, affording time to begin exploring educational opportunities within the HIV+ community. To date, we have identified 3 different HIV+ living donors, with 3 different viable routes to transplant. Methods: A critical review of our current LD practices was performed. The following aspects were considered: (1) How do we spread awareness of LD for people living with HIV (PLHIV), while maintaining high ethical standards? (2) What are the possible legal and medical pathways by which PLHIV could donate? (3) What operational changes are required within our center to ensure success? (4) What medical differences should be considered for an HIV+ living donor? Results: Our current transplant center protocols were adapted to meet the unique needs of PLHIV according to HOPE. Strategies were developed to expand this new LD pool, resulting in growth of the donation program. Specifically: (1) HIV clinicians now introduce living donation to potential donor candidates earlier; (2) Education classes expanded to include options for PLHIV to donate; (3) Outreach evenings to nephrology and HIV providers now include education regarding how to embrace a donor population that had historically been excluded; (4) HIV+ recipients volunteer to mentor and speak in public venues about LD. Furthermore, 3 key pathways to LD in PLHIV were identified and tested: Nondirected donation: A PLHIV approached our center to discuss donation as a result of seeing a news segment discussing local HOPE Act patients. A multi-disciplinary team continues to evaluate him, for possible D+/R+ donation, to someone at our center or beyond. Directed donation: A potential HIV+ donor, the partner of an HIV+ candidate, approached our center for evaluation. 
Additional testing and appointments required by the HOPE Act led to medical and ethical concerns, and ultimately, a D-/R+ transplant occurred. Paired kidney exchange: A PLHIV requested evaluation in 2018 as a potential participant in our Kidney Paired Donation (KPD) program. The intended recipient (wife) remains HIV-, so our team agreed that we would process this pair as any other facing an incompatibility. Evaluation is ongoing. In each case, unique ethical challenges occurred, and modeling future renal function decline in the potential donor remained challenging. Conclusions: As our experience grows and our broader community becomes increasingly aware of the options for HIV+ LD, we believe we have created viable, safe and efficient pathways to transplant. Future research will focus on evaluation of these pathways, message penetration into the HIV+ community, donor selection, and transplant outcomes. Purpose: The aim was to establish the optimum model using preoperative factors that can predict eGFR at a year after kidney donation. We investigated the significance of preserved kidney volume in combination with other variables. Methods: We retrospectively reviewed the medical records of 101 living-related kidney donors. The evaluated preoperative factors were baseline eGFR, age, BMI, the use of antihypertensive drugs, and kidney volume calculated from 3-dimensional reconstruction of thin-slice contrast-enhanced computed tomography (CT). Results: Total kidney volume calculated from CT data was significantly correlated with preoperative eGFR (Spearman's correlation 0.317, p = 0.0029). Univariate analysis revealed that age and BMI were significantly negatively correlated with eGFR at a year post-donation (correlation coefficients = -0.353 and -0.788, p = 0.0002 and 0.0118, respectively). Preoperative eGFR and preserved kidney volume were significantly positively correlated with eGFR at a year post-donation (correlation coefficients = 0.610 and 0.086, p < 0.0001 and 0.0076, respectively). eGFR at a year post-donation in the group taking antihypertensive drugs was significantly lower than in the group without antihypertensive drugs (Mann-Whitney U test, p = 0.0132). Multiple regression analysis employing sex, age, BMI, preoperative eGFR, preserved kidney volume, and the use of antihypertensive drugs revealed that BMI, preoperative eGFR, and preserved kidney volume were significant predictive factors (R² = 0.752, p < 0.0001). The predictive equation was as follows: predicted post-donation eGFR = 20.22 + 0.557 × preoperative eGFR - 0.694 × BMI + 0.057 × preserved kidney volume (cm³). We then evaluated this predictive model in a validation cohort of 42 kidney donors. A significant correlation between the predicted and observed postoperative eGFR was also seen when the predictive equation was applied in the validation cohort (R² = 0.585, p < 0.0001). In conclusion, donor kidney function a year after kidney donation can be predicted by the combination of BMI and preoperative eGFR with preserved kidney volume. This prediction may be useful for both donor candidates and their physicians to ensure that kidney donation will not imperil donors' health for the rest of their lives. Purpose: Thymoglobulin® (r-ATG) has been used as induction therapy in liver transplantation (LT).
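As a worked example of the predictive equation reported in the kidney-donor abstract above (coefficients taken directly from that abstract), the calculation for an illustrative donor is shown below; the input values are hypothetical.

```python
# Worked example of the reported predictive equation for donor eGFR one year
# after nephrectomy (coefficients taken from the abstract):
#   predicted eGFR = 20.22 + 0.557 x preoperative eGFR - 0.694 x BMI
#                    + 0.057 x preserved kidney volume (cm^3)
# The input values below are illustrative, not a study case.

def predicted_post_donation_egfr(preop_egfr: float, bmi: float, preserved_volume_cm3: float) -> float:
    return 20.22 + 0.557 * preop_egfr - 0.694 * bmi + 0.057 * preserved_volume_cm3

# e.g. preoperative eGFR 85 mL/min/1.73 m^2, BMI 23 kg/m^2, preserved volume 150 cm^3
print(round(predicted_post_donation_egfr(85, 23, 150), 1))   # ~60.2 mL/min/1.73 m^2
```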
No prospective, randomized, controlled trial (RCT) has been performed to evaluate the effect of r-ATG induction and delayed initiation of CNI on long-term renal function after LT. Methods: 110 patients at 4 transplant programs were randomized to r-ATG induction with initiation of CNI delayed for 10 days after LT, or to a standard CNI group. The eGFR and delta eGFR were measured at the 1-, 3-, 6-, 9-, and 12-month milestones for analysis. Results: The median age, MELD score, baseline creatinine, baseline eGFR, and gender were similar between the two groups (all P>0.05). The median baseline, post-1, -3, -6, -9 and -12 month eGFR in the CNI vs r-ATG groups were 88. The median delta eGFR at 12 months after transplant was better in the r-ATG group, although the difference did not reach statistical significance (-21 vs -12, P=0.23). The time-weighted averages of CNI levels were similar between the two groups (CNI group: 7.2 ng/mL; r-ATG group: 7.5 ng/mL; P=0.60). The chronological changes in the median and interquartile range of eGFR and delta eGFR according to pre-transplant GFR (≥ or <80) are shown in Figure 1 and Table 1. The protective influence of r-ATG was more pronounced (approaching significance) in patients with lower initial GFR at LT. Conclusions: Early initiation of CNI has been shown to affect long-term renal function after LT. Induction with r-ATG and delayed initiation of CNI appears protective of long-term renal function in LT. This effect is seen at every follow-up milestone but is more pronounced in patients with an initial degree of renal dysfunction, especially in the era of LT in high-MELD recipients. Purpose: Chronic kidney disease (CKD) is the end manifestation of persistent renal damage, a pathology distinct from reversible acute kidney injury (AKI). This distinction is critical because a single measure of creatinine, as included within the MELD score, does not differentiate between AKI and CKD. Despite this important distinction, little is known of the burden, the determinants, and the impact of CKD among patients with cirrhosis. Methods: UNOS data for all patients listed for LT in the U.S. between 2002 and 2017 were analyzed. Patients listed as status-1 or with MELD exceptions were excluded. The primary outcome was the development of CKD pre-LT: eGFR <60 ml/min for 90 days with a final eGFR of <30 ml/min OR >72 days of hemodialysis. Competing-risk analyses determined the factors associated with CKD development, accounting for LT and waitlist death. Cox regression determined the factors associated with post-LT mortality. The proportion of the adjusted effect of pre-LT CKD on post-LT mortality that was attributable to SLKT was determined by comparing coefficient estimates, and confidence intervals for this effect were calculated using bias-corrected percentile bootstrap methods. Results: Of 68,147 LT candidates, the proportion with CKD at last follow-up increased from 5.5% in 2002 to 11.7% in 2016 (p<0.001, test for trend) (Figure). In adjusted analysis, the factors associated with the development of CKD included NASH (sHR: 1.09), diabetes (sHR: 1.24), age (sHR: 0.98 per year), ascites (sHR: 1.17), hepatic encephalopathy (sHR: 1.38), MELD at listing (sHR: 0.89), and region median MELD score at LT grouping (High Median MELD Group vs. Low: sHR: 1.50).
Among the 36,203 patients who underwent LT, pre-LT CKD was significantly associated with post-LT mortality (HR 1.16, 95CI 1.07 -1.25, p<0.001) even after adjusting for donor risk index, age, final MELD, etiology of cirrhosis, presence of hepatic encephalopathy, receiving a SLKT, and presence of diabetes. There was no significant mediating influence of SLKT on the effect of pre-LT CKD on post-LT survival: 1-year proportion of the attributable effect of SLKT was 10.9% (bias corrected 95CI -30.3 -70%); 5-year proportion of the attributable effect of SLKT was 22.2% (bias corrected -12.6 -138.2%). The rising burden of pre-LT CKD is significantly associated with increasing co-morbidities (e.g., NASH, Diabetes, Age, ascites, hepatic encephalopathy) and longer waiting time (e.g., region risk, MELD at Listing). Pre-LT CKD has a significant deleterious impact on post-LT outcomes -an impact that cannot be corrected by receiving a SLKT. These findings highlight the need for the earlier identification of CKD where the implementation of preventative measures may optimize post-liver transplant outcomes. There were no long-term differences in creatinine (p=0.9449) or eGFR (p=0.7097) at 1 and 2 years between DBD and DCD SLK groups ( Figure 1 ). There were no differences in patient (p=0.3669), liver allograft (p=0.9675), or kidney allograft (p=0.9090) survival ( Figure 2 ). Conclusions: With experience, good outcomes can be achieved with DCD SLK transplants. Consideration should be given for using DCD donors for SLK recipients with lower MELD scores who are otherwise disadvantaged in organ allocation. The use of DCD donors for SLK transplants affords the opportunity to obtain good quality kidney allografts. ) were randomized (RND) to receive EVR+rCNI or MPA+sCNI with steroids. Efficacy assessments were incidence of binary composite of tBPAR or eGFR <50 mL/min/1.73 m 2 , incidence of tBPAR, graft loss (GL), or death, and evolution of eGFR up to M24; safety assessments were incidence of adverse events (AE) and infections. Results: Of 2037 RND patients,1693 received Bax and 342 received rATG. Consistent with previous findings, the EVR+rCNI regimen was noninferior (P<0.001) to MPA+sCNI for the binary endpoint at M24 for both induction groups. At M24, incidences of tBPAR, GL, and death were comparable between EVR+rCNI and MPA+sCNI arms regardless of induction type (Table) . Compared to Bax group, incidence of tBPAR was lower in both arms of rATG group. Mean eGFR was stable from Week 4 to M24 and comparable between arms at M24. Though the incidence of AE leading to study drug discontinuation was higher in EVR+rCNI arm, the incidence of AE leading to dose adjustment/interruption was higher in MPA+sCNI arm. Incidence of viral infections was lower in EVR+rCNI vs MPA+sCNI arm in both induction groups. Consistent with previous reports, incidence of CMV (Bax:8.4% vs 22.6%; rATG:10.1% vs 20.5%) and BKV (Bax:10.1% vs 14.1%; rATG:10.7% vs 20.5%) infections was lower in EVR+rCNI vs MPA+sCNI arm in both induction groups. Conclusions: Irrespective of the induction type, EVR+rCNI regimen offers comparable efficacy and safety and stable renal function to that of MPA+sCNI regimen up to M24 post-RTx. Conclusions: Elderly kidney transplant recipients may be selected for nondepleting induction therapies based on the perception that they are at increased risk of harm from overimmunosuppression and at a decreased risk of rejection due to immunosenescence. 
Contrary to this, our center experience shows an unacceptably increased risk of acute cellular rejection and allograft loss with BI in elderly recipients compared to TCDI. The current confidence that older age protects against rejection is unwarranted, and older recipients still benefit from lymphocyte-depleting induction. Purpose: Our study aimed to investigate the potential associations between single nucleotide polymorphisms (SNPs) in NFATC1 and the risk of T cell-mediated rejection (TCMR) in kidney transplant recipients. Methods: A total of 200 patients were enrolled in our cohort and blood samples were collected. Total DNA was extracted and examined by targeted sequencing (TS) based on next-generation sequencing technology. SNPs were then genotyped and analyzed. Multivariable analysis was performed to identify potential confounding factors. Logistic regression adjusted for confounding factors was carried out to evaluate the association between SNPs in NFATC1 and the risk of TCMR in our cohort. Haplotype analysis was also performed. SNPs significantly associated with TCMR were further investigated against the Banff score and rejection grade based on pathological examination. Furthermore, to confirm the biological function of significant SNPs, we constructed wild-type and mutant plasmids, which were each co-cultured with T cells in vitro. T cell proliferation and inflammatory cytokines were tested. Flow cytometry (FCM) was carried out to explore the effects of the SNP mutations on T cell subpopulations. NFATC1 mRNA and protein were also examined by PCR and western blotting. Results: 55 SNPs in NFATC1 were detected by primary sequencing. After minor allele frequency and Hardy-Weinberg equilibrium (HWE) tests followed by linkage disequilibrium analysis, a total of 14 tagging SNPs with 4 haplotypes were considered for further analysis. Four SNPs were then identified as significantly associated with the risk of TCMR following renal transplantation. No significant association was observed for the 4 haplotypes. Mutation of rs2290154 was significantly correlated with Banff scores in the TCMR cohort. In vitro, we found that the mutant rs2290154 plasmid increased the expression of NFATC1 mRNA and protein in CD3+ T lymphocytes and promoted T cell proliferation. Moreover, we found accelerated secretion of IL-2 in the cell culture supernatant. FCM suggested that the numbers of CD3+CD4+ and CD3+CD8+ T lymphocytes, as well as the CD4+/CD8+ ratio, were significantly increased with the mutant plasmid. In conclusion, our in vivo and in vitro study suggests that the NFATC1 SNP rs2290154 could significantly increase the risk of TCMR in recipients following kidney transplantation by promoting the activation and proliferation of T lymphocytes and accelerating the polarization of CD3+ T lymphocytes. (TFH), germinal center (GC) B cell and DSA responses as compared to CTLA-4Ig following transplantation. TFH cells differentially express large amounts of CTLA-4 in response to alloantigen, and several studies have implicated CTLA-4 as a key regulator of their differentiation and function. Therefore, we hypothesize that the enhanced DSA inhibition observed with selective CD28 blockade is dependent on the preservation of critical CTLA-4 coinhibitory signaling.
Methods: Thus, we utilized a full MHC mismatch BALB/c to B6 and minor antigen (OVA) mismatch TCR transgenic murine skin allograft models to determine whether improved inhibition of the Tfh cell-mediated alloresponse compared to CTLA-4-Ig is CTLA-4 dependent. Skin-grafted mice received anti-CD28 domain antibody (dAb) with or without anti-CTLA-4 mAb (9H10) treatment for the analysis of Tfh cell-mediated responses and DSA formation. Results: Anti-CD28 blockade in the presence of CTLA-4 blockade abrogated the inhibition of DSA formation observed with anti-CD28 blockade alone, with detectable anti-donor antibodies 28 days post-transplant (Fig.1A) . Flow cytometric analysis of graft-draining lymph node T FH , GC-T FH , GC B, and plasma cells following skin transplantation did not reveal significant differences between treatment groups at 10 days post-transplant. Interestingly, examination of these compartments at day 24, immediately prior to the observed DSA breakthrough revealed significant development of T FH (3.35% vs. 0.46% p=0.007), GC-T FH (36% vs. 25% p=0.015), GC B (63% vs. 9.7% p=0.007) and plasma cell (5.3% vs 4% p=0.03) responses in anti-CTLA-4 treated mice compared to CD28 dAb monotherapy (Fig.1B) . Conclusions: Thus, selective CD28 blockade inhibition of Tfh cell-mediated humoral alloimmunity is CTLA-4 dependent beyond the day 10 peak Tfh cell response, supporting that preserved CTLA-4 signaling may lead to improved control of donor-specific immunity through superior inhibition of T FH and GC B cell responses following transplantation. Purpose: Induction of long term transplant survival by "costimulation blockade" (CoB) regimens is impaired by inflammatory responses. Despite the identification of type-1 interferons (TI-IFN) as mediators of this effect in multiple models, their target population and specific pathway used remain undefined. To better understand how TI-IFN could interfere with the induction of transplant tolerance, we studied their impact on the immunomodulatory properties of IL-10. Methods: Full mismatch (Balb/c into C57BL/6) skin transplant recipients received a peri-transplant regimen based on donor specific transfusion (DST) and anti-CD154 mAb (MR-1) +/-anti-IL-10R. Mouse T cells were isolated by negative-selection and Tmem and Treg subsets identified by flow cytometry. IL-10R expression and phospho-STAT3 induction after IL-10 or IL-6 stimulation in T cells were measured via flow cytometry. The gene expression profile of T cell subsets exposed to TI-IFN was assessed by microarray and quantitative PCR analysis. Protein levels were measured by Western Blot. Results: Our result showed that IL-10 has a fundamental role in the protective effect of CoB (transplant survival -MST 105d with CoB vs 47d with anti-IL-10R administration). We then studied the impact of T cell exposure to TI-IFN on IL-10 signaling. Following 48h of in vitro incubation with IFN-β, memory T cells (Tmem) and Tregs presented a dramatic defect in the production of phospho-STAT3 in response to IL-10. This effect was very selective, as IL-6 signaling (post IFN-β exposure) induced normal levels of phospho-STAT3. The reduced accumulation of Results: FcγRIIB is the sole inhibitory Fcγ receptor and is known to be expressed on B cells, DCs, and macrophages. We found that CTLA-4Ig-treated FcγRIIB -/animals exhibited accelerated graft rejection relative to WT controls (MSTs 20 and 33 d, p=0.0002). 
Interestingly, this was not associated with enhanced donorspecific antibody, but instead was associated with increased donor-reactive CD8 + T cell responses. We then discovered that FcγRIIB is upregulated on a subset of CD44 hi CD62L lo effector CD8 + T cells following transplantation. To determine if FcγRIIB plays a cell-intrinsic role in inhibiting CD8 + T cells, we generated a CD8 + T cell conditional KO system and observed enhanced accumulation of FcγRIIB -/-CD8 + T cells at 14 (p<0.05) and 21 (p<0.01) d post-transplant relative to WT controls. RNAseq analysis of FACS-sorted, donor-reactive CD8 + T cells revealed an enrichment of apoptosis-related genes in FcγRIIB + vs. FcγRIIBcells. Mechanistically, FcγRIIB -/-CD8 + cells exhibited lower expression of active caspase 3/7 on day 16 following transplant compared to WT cells (p=0.0315), suggesting that FcγRIIB signals may function intrinsically to induce CD8 + T cell apoptosis. To determine if FcγRIIB on donor-reactive CD8 + T cells impacted graft rejection, recipients of WT or FcγRIIB -/-CD8 + T cells were treated with anti-CD28 domain antibody. Animals that received FcγRIIB -/-CD8 + cells exhibited accelerated graft rejection following withdrawal from immunosuppression relative to those that received WT CD8 + T cells. Given these pre-clinical findings, we then interrogated gene expression profiles in renal transplant recipients enrolled in the CTOT09 study in which patients were weaned from tacrolimus immunosuppression. Compellingly, results indicated that FcγRIIB was one of only 7 genes that were significantly upregulated in PBMC isolated before withdrawal from patients that experienced freedom from rejection following tacrolimus withdrawal vs. those that did not. CellCODE analysis from RNA of patient PBMC revealed tighter associations of FcγRIIB with CD8 + T cell transcripts than with B cell, DC, monocyte, CD4 + T cell, or NK cell transcripts, indicating that CD8 + T cells were most strongly associated with differential expression of FcγRIIB in the stable vs. rejection patients. Conclusions: Based on these experiments, we conclude that FcγRIIB is a novel, cell intrinsic CD8 + T cell regulatory pathway that could be therapeutically targeted to promote CD8 + T cell apoptosis and improve outcomes following transplantation. Microbiology and Immunology, U. Maryland, Baltimore, Baltimore, MD Purpose: Lymphotoxin (LT) α1β2 is crucial for lymphatic organ development and orchestration of immune responses. LTα1β2 is preferentially expressed by regulatory T cells (Treg) for afferent lymphatic migration. The LT beta-receptor (LTβR) is highly expressed in lymphatic endothelial cells (LEC), and signals predominantly via non-classical NFκB (NIK) pathways. Here we show that IL-2R and CD3-mediated signaling pathways preferentially increase LTαβ expression in Treg, and test the hypothesis that this stimulates LTβR signaling to LEC to regulate the migration of leukocytes from inflamed tissues. Methods: Murine primary LEC were used in biochemical, phenotypic, and functional analyses of LTβR signaling. Murine naïve, activated, and regulatory CD4 T cells were isolated and migrated across LEC in vitro and lymphatic vessels in vivo. LTβR signaling was assessed with western blots, gene activation assays, and specific pharmacologic and genetic blockade. Results: IL-2R signaling induced stronger LTα1β2 expression on induced Treg (iTreg) and natural Treg (nTreg), compared to T cell receptor (TCR) signaling. 
IL-2 induced classical NFκB and mitogen-activated protein kinases (MAPKs) activation in iTreg and nTreg, but these subsets differentially utilized these signaling pathways. Blocking NFκB abolished both IL-2 and TCR driven LT expression in nTreg, while blocking c-Jun NH2-terminal kinase (JNK) and extracellular-signal-regulated kinase (ERK) inhibited LT expression to a lesser extent, indicating NFκB plays a major role in nTreg. In iTreg, blocking ERK abolished both IL-2 and TCR-mediated LT increases, while NFκB blockade had less inhibitory effect, suggesting iTreg were more affected by ERK and JNK activities (Table) . Activated iTreg and nTreg had the highest levels of LTαβ, and also showed the most efficient lymphatic migration. More efficient Treg migration prolonged islet graft survival by enhancing the resolution of inflammation through mobilization of inflammatory cells out of the inflamed graft. Conclusions: IL-2R and TCR signal through NFκB and MAPKs pathways to promote LT expression in Treg. There are differential signaling patterns in patrolling Treg in homeostasis versus iTreg inflammation. During inflammation, the highest levels of LTαβ in activated Treg endow these cells with the ability to harness LTβR signaling on LEC, thus modulating LEC structure and function and thereby condition the local environment for inflammation resolution and immune suppression. We have previously shown that CD28 blockade effectively inhibits de novo Tfh cell-mediated humoral alloresponses, but little is known about the role of memory Tfh (mTfh) cells in alloimmunity. Thus understanding if mTfh cells accelerate humoral responses in transplantation and whether they are susceptible to costimulation blockade will guide strategies to combat HLA antibodies. Methods: To examine the mTfh cell alloresponse we utilized a full MHC mismatch BALB/c to B6 murine skin allograft model. Naïve B6 recipients were grafted BALB/c skin, allowed to reject, and then re-grafted BALB/c skin 4-6 weeks after rejection of the primary graft. Mice were then left untreated or treated with CTLA4Ig, and serial draining lymph node (DLN) and serum analyses were performed to measure cellular and humoral recall responses. To determine whether memory responses depend on endogenous memory B cells, we also adoptively transferred 3x10^6 BALB/c-sensitized CD45.2 CD4+CD44+ memory T cells into naïve CD45.1 B6 mice that were then grafted BALB/c skin and their DLNs examined post-transplant. Results: DLN analyses of BALB/c sensitized mice following skin graft re-challenge revealed an accelerated mTfh cell response compared to the primary Tfh cell response in skin-grafted naïve controls 5 days post-transplant (8.9% vs. 1.8%, p=0.005). The germinal center (GC) B cell response was also more rapid 7 days post-transplant Purpose: Mouse pluripotent stem cell (PSC)-derived endoderm precursors (EPs) have small size, endoderm lineage gene expression signatures, and high proliferation rates, that allow engraftment in quiescent liver and create significant biomass without injury to the liver. This holds promise for the treatment of critical metabolic errors, where the structurally normal liver cannot tolerate reduction of hepatic function, however the analogue in human cell-based systems is not fully developed. Here we demonstrate amalgamated culture/selection conditions for derivation human EPs with potential clinical utility. 
Methods: We achieve precise and reproducible maintenance of pluripotency and differentiation over a time course spanning 3 days (EP), 6/7 days (hepatic progenitor, HP), and 19 days (hepatocyte-like cells, HLC). RT-qPCR and western blotting measure lineage-specific biomarkers at the population level, while high-throughput quantitative microscopy and flow cytometry measure these on a per-cell basis. Cell proliferation states and viability are assessed using the trypan blue exclusion assay, and short-term engraftment into the murine liver is detected by in vivo imaging with the IVIS system. Results: Both EP and HP cells can be generated at greater than 90% efficiency from human pluripotent stem cells, and these cells proceed to become HLCs in vitro. Additionally, day 3 EP cells and day 6 HP cells maintain a high proliferation rate (30 h and 22 h, respectively). In contrast, day 6 HP cells are significantly larger than day 3 EPs. Early transplant analyses indicate EP cells can engraft in the undamaged murine liver parenchyma and persist for 8 days (see figure); however, observation of HP-derived cells at this timepoint depends on a 1/3 partial hepatectomy. Additionally, a high rate of mortality was observed with HP transplants, which may be due to their larger size leading to portal thrombosis. The high efficiency of EP differentiation and evidence of persistence in the undamaged mouse liver suggest these cells may be superior transplant candidates. We find human EPs recapitulate many of the desirable characteristics we have identified using mouse EPs, which can engraft and persist long-term in the undamaged quiescent liver. Pending longer-term analysis of in vivo engraftment.
A CMV Vaccine Based on Non-Replicating Lymphocytic Choriomeningitis Virus Vectors Expressing gB and pp65 is Safe and Immunogenic in Healthy Volunteers and Entering a Phase 2 Trial in Kidney Transplant Recipients
Characterizing the Dominant Phenotype in Kidney Transplant Biopsies: Improving the Prediction of Risk by a Global View of Rejection, Injury, and Atrophy-Fibrosis
Quantifying Donor Effects on Transplant Outcomes Using Kidney Pairs from Deceased Donors
Unos Analysis Of Deceased Donor Renal Transplant (DDRTx) Outcomes In Sensitized Patients After Implementation Of The Kidney Allocation System (KAS)
Urinary Injury Biomarkers in Deceased Donors and Acute Rejection Post Kidney Transplant
followed by glomerulonephritis (23.0%), diabetes (17.9%), other diagnoses (11.3%), and unknown (14.0%). Diabetes was most common among Hispanic (25.5%) and other race (33.3%) PLDs. Hypertension was the most common diagnosis 10-20 and 20-30 years post-donation (38.1% and 33.7%, respectively), while the most common diagnoses 30-40 years post-donation were diabetes.
Racial Variation in ESRD Diagnosis Patterns among Prior Living Kidney Donors on the OPTN Kidney Waiting List
Quantifying Differential Infection Risks in HLA- and/or ABO-Incompatible Kidney Transplant Recipients
Identification Of Pre-transplant Biomarkers Of Recipient Immune Dysfunction That Predict Recipient Risk Of Death Following Liver Transplant: The Immune Frailty Index
and NKp44− ILC3 (p=0.88) in fresh and in stable ITx recipients, indicating an imbalance between protective and proinflammatory ILC subsets in fresh but not stable ITx recipients. By intracellular cytokine staining, we confirmed that NKp44+ ILC3s produced protective IL-22, while ILC1s and NKp44− ILC3s produced pro-inflammatory IFN-γ, TNF-α and IL-17. Importantly, serial, early clinical complications
Abstract# 483 Local IL-33 Regulates Heart Transplant Infiltrating Myeloid Cells Metabolism and Differentiation to Protect against Chronic Rejection
Disparities in Pediatric Liver Transplant Waitlist Access
Out of those matched, 100 patients (57 HLA I, 15 ABO I and 28 transplantation
Abstract# 513 Kidney Paired Donation Transplant Outcomes: Experiences from the First Ten Years of the National Kidney Registry
We estimated the risk of death-censored graft failure (DCGF) and mortality using inverse probability of treatment weighted Cox regression. Results: NKR recipients were more likely to be female, African-American, older, on public insurance, have PRA>80, spend longer on dialysis, and be previous transplant recipients (all p<0.001). NKR recipients were followed for a median 3.2 years (max=10.3 years). NKR recipients had similar DCGF (log-rank p=0.2) and mortality (log-rank p=0.6) incidence compared to non-NKR recipients.
After adjustment for donor, recipient, and transplant factors, there was no detectable difference in DCGF.
Using convenience sampling, participants were recruited from five transplant centers in the Midwestern and southern US. The 6-month SystemCHANGE™ intervention supported patient-designed, interventionist-guided, small experiments to assess personal daily life systems, how systems influence medication taking, and to determine improvement solutions. A 6-month maintenance phase was included. Results: An intention-to-treat analysis, censored for non-protocol-related drop out.
Using a single trough concentration, the predicted AUC showed too high a bias for clinical application. On the other hand, taking advantage of microsampling and utilizing a 3-sample LSS method provided good predictions of individual systemic exposure of Tac, suitable for optimal individualization of Tac doses in the clinic.
(LCPT; Envarsus XR) Dosing Considerations in De Novo Kidney Transplant Recipients
Methods: A post-hoc analysis of studies comparing LCPT to IR-Tac was conducted. LCPT was initiated at 0.14 (lower dose, or "LCPT-LD") or 0.17 mg/kg/day (higher dose, or "LCPT-HD"), while IR-Tac was initiated at 0.1 mg/kg/day, divided. Tacrolimus was initiated within 48 hours of transplant. Time to target range (defined as tacrolimus concentrations of 6-11 ng/mL), percent of patients within target range, and frequency of dose changes during the first weeks post-transplant. Both LCPT groups exhibited significantly shorter time to obtain a tacrolimus concentration of at least 6 ng/mL compared to IR-Tac. More LCPT-LD patients remained within the target range during the first two weeks compared to the LCPT-HD patients, whereas more LCPT-HD patients reached trough concentrations above 11 ng/mL compared to LCPT-LD or IR-Tac immediately after transplant (Figure 1). Conclusions: LCPT-HD resulted in fewer patients with subtherapeutic tacrolimus concentrations early post-transplant but more patients with levels >11 ng/mL vs IR-Tac. LCPT-LD may represent a feasible initial dosing strategy that allows for rapid attainment of tacrolimus blood concentrations ≥6 ng/mL while exposing fewer patients to early trough concentrations >11 ng/mL versus LCPT-HD, though further evaluation in larger patient populations is needed to confirm this finding.
Compared to NMA, SC-BKVAN had higher Banff acute injury scores (P ≤ 0.001 for tubulitis, endarteritis, and interstitial inflammation) but similar chronic injury scores. SC-BKVAN was associated with over 4-fold increased hazard for the composite endpoint after adjustment for race, donor type, repeat transplant, and pre-transplant diabetes. Conclusions: We detected SC-BKVAN in 6% of high-risk recipients at 6 months post-transplant, much higher than previously reported in lower-risk cohorts. SC-BKVAN was associated with subclinical inflammation and inferior 3-year outcomes. This study illustrates the importance of underlying immunologic risk on the impact of subclinical BKV infection.
Abstract# 531 Deceased-Donor Acute Kidney Injury Associates with Less BK Viremia in Kidney Transplant Recipients
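For the 3-sample limited sampling strategy (LSS) described in the tacrolimus microsampling results above, a minimal sketch of how such an estimator is typically structured is shown here; the sampling timepoints and regression coefficients are placeholders for illustration only and are not the coefficients validated in that study.

```python
# Hypothetical 3-sample limited sampling strategy (LSS) for tacrolimus AUC0-24.
# LSS models are usually of the form AUC = b0 + b1*C(t1) + b2*C(t2) + b3*C(t3),
# with coefficients fitted against full-profile AUCs in a reference population.
# The timepoints (0, 2, 4 h) and coefficients below are illustrative placeholders.

def estimate_auc_lss(c0: float, c2: float, c4: float) -> float:
    """Estimate tacrolimus AUC0-24 (ng*h/mL) from trough (C0), 2-h, and 4-h samples."""
    b0, b1, b2, b3 = 10.0, 6.0, 3.0, 4.0  # placeholder regression coefficients
    return b0 + b1 * c0 + b2 * c2 + b3 * c4

if __name__ == "__main__":
    # Example: trough 7 ng/mL, 2-h 20 ng/mL, 4-h 12 ng/mL
    print(f"Predicted AUC0-24: {estimate_auc_lss(7, 20, 12):.1f} ng*h/mL")
```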
On multivariable adjustment, stage 1 donor AKI remained protective against both BKV and the composite outcome (figure). Conclusions: In this large and carefully phenotyped cohort, deceased-donor AKI was common but did not associate with increased risk for BKV or a composite outcome of BKV, graft failure, or poor allograft function at 1 year. In fact, mild donor AKI was independently protective against this complication. While there is the potential for selection bias, in that centers likely preferentially selected better-quality AKI kidneys for transplant, these findings indicate the possibility that donor AKI induces morphological or immunological changes to the allograft that protect against BKV post-transplant except when donor AKI is severe. More research is needed to better understand these unexpected results. CITATION INFORMATION: Hall I.
Abstract# 532 Donor-Derived Cell-Free DNA in Urine Identifies BKVN with Graft Injury in BK Viruria Kidney Transplant Recipients. Suzhou, China.
Treatment with IVIg was initiated in conjunction with or after failure of reduced immunosuppression and/or leflunomide therapy if plasma BK PCR was >10,000 copies/mL. IVIg was given at a dose of 2 g/kg monthly until response (reduction in plasma BK PCR to <1,000 copies/mL) or until the patient was deemed a nonresponder. Results: Seventy-one patients were treated with IVIg, of whom 56 (78.9%) had a response. Table 1 shows the baseline characteristics of responders and nonresponders. Median time from BK diagnosis to IVIg initiation was 35 days and median plasma BK titer at IVIg initiation was 25,750 copies/mL. Figure 1 shows time from IVIg initiation to response. Most of the responders (49/56; 88%) responded within 6 months of IVIg initiation. Using forward stepwise logistic regression (p<0.10 for inclusion), increasing time from transplant to BK diagnosis and increasing time from BK diagnosis to IVIg initiation were negatively associated with response to IVIg therapy.
Positron Emission Tomography (PET-CT) Global Coronary Flow Reserve (CFR) Stratifies Cardiovascular Risk among Renal Transplant (RTx) Candidates
Short-term and long-term patient survival were significantly decreased in recipients with AIC (1-year patient survival: RR 2.19, 95% confidence interval (CI) 1.39-3.44, p<0.001; 5-year patient survival: RR 2.47, 95% CI 1.75-3.48, p<0.001). One-year uncensored graft survival was inferior in recipients with AIC (RR 3.15, 95% CI 1.30-7.64, p=0.01).
However, 1-year and 3-year death-censored graft survival was similar in recipients.
Aorto-Iliac Calcification is a Risk Factor for Inferior Patient and Graft Survival in Kidney Transplant Recipients: A Systematic Review and Meta-Analysis
Abstract# 544 Care of International Living Kidney Donor Candidates in the U.S.: A Survey of Contemporary Experience
Assessing Pressure to Donate among Potential Living Kidney Donors
Living Kidney Donors: Survey of US Transplant Centers
Estimated Glomerular Filtration Rate at Donor Screening to Predict Measured Glomerular Filtration Rate after Living Kidney Donation
A Randomized Controlled Clinical Trial of Thymoglobulin® and Extended Delay of Calcineurin Inhibitor Therapy for Renal Protection after Liver Transplantation: A Multicenter Study
Methods: This was a retrospective cohort study of first-time, liver-only recipients transplanted between 2005 and 2015 without exception points, using data from the United Network for Organ Sharing (UNOS). The exposure of interest was the proportion of the MELD-sodium (MELD-Na) score attributable to creatinine (Cr), described as "KidneyMELD" and defined as (9.57 × ln Cr) / (MELD-Na − 6.43). Cox proportional hazards analysis adjusted for clinical and basic donor factors evaluated the association of KidneyMELD, both as continuous and categorical, with post-LT mortality. Results: Unadjusted post-LT survival was lower in recipients with increased KidneyMELD (Figure, Panel A; log-rank p<0.001). Increasing KidneyMELD was associated with increased post-LT mortality independent of MELD-Na score (Figure, Panel B; p=0.001). The risk of post-LT mortality. Conclusions: In conclusion, recipients prioritized for LT primarily on the basis of renal dysfunction have significantly increased post-LT mortality independent of MELD-Na score. SLKT after implementation of the new policy was not associated with 90d-RF/mortality as compared to Era.
Early Impact of the New United Network for Organ Sharing Simultaneous Liver-Kidney Transplantation Allocation Policy: Improved Standardization but Similar Utilization
Abstract# 553 Hephaistos Study Outcome on Renal Function after 12 Month Everolimus Plus Reduced Tacrolimus in De Novo Liver Transplant Recipients versus Standard Tacrolimus
(TAC-C) in de novo liver transplant [LTx] recipients and to demonstrate the impact of CNI minimization on renal function. Methods: In this 12-month [M] prospective, open-label, randomized study with 15 German sites, 333 patients [pts] were randomized; renal benefits
Chronic Kidney Disease among Liver Transplant Candidates: A Rising Burden and Its Impact on Post-Liver Transplant Outcomes
Conclusions: rATG significantly induces functional M-MDSCs in kidney transplantation recipients in an IFN-γ/IL-6-dependent manner. Further investigation with humanized animal models is needed to verify the clinical observation.
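The "KidneyMELD" proportion defined in the liver-only recipient abstract above can be computed directly from its stated formula; a minimal sketch follows, assuming creatinine in mg/dL and the natural logarithm, as in the MELD formula.

```python
import math

def kidney_meld(creatinine_mg_dl: float, meld_na: float) -> float:
    """Proportion of the MELD-Na score attributable to creatinine,
    as defined in the abstract: (9.57 * ln Cr) / (MELD-Na - 6.43)."""
    return (9.57 * math.log(creatinine_mg_dl)) / (meld_na - 6.43)

# Example: Cr 2.0 mg/dL with MELD-Na 25 -> about 0.36 of the score from creatinine
print(round(kidney_meld(2.0, 25), 2))
```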
Recipient age, race (White, Black, and others), sex, PRA, previous transplants, BMI, diabetes, hypertension, cause of ESRD, preemptive transplants, time on dialysis, serum albumin, PVD, malignancy, HCV, insurance. Donor age, race (White, Black, and others), sex, type (deceased vs living), BMI, HCV. For deceased donors: donation after cardiac death, diabetes, hypertension, cause of death, machine perfusion.
To Each Their Own: Center-Level Variation in Tailoring Induction Immunosuppression to Kidney Transplant Recipients
Association Of Lymphocyte-Depleting Antibody And Steroids Induction With The Risk And Outcomes Of Post-Transplant Diabetes: Analysis Of Deceased-Donor Kidney Transplants
Methods: From 12/5/2007-7/16/2015 OPTN data, we used logistic regression to analyze the risk factors for PTDM at one year post-kidney transplant, with LDA+steroids induction regimens (ATG+steroids and ALM+steroids) as the main explanatory variables and interleukin-2 receptor antagonist+steroids (IL2-RA+steroids) as the reference. We used Cox regression models to determine the 5-year risks of overall graft loss (OAGL), death-censored graft loss (DCGL), and patient death (mortality) associated with LDA+steroids regimens in KTR with and without PTDM. Results: ATG+steroids and ALM+steroids were associated with 26% and 29% higher odds of PTDM, respectively, than IL2-RA+steroids at 1 year post-KT. However, the risk of 5-year DCGL was higher in KTR who received ATG+steroids induction and developed PTDM at 1 year post-transplant compared with KTR who received ATG+steroids induction but were PTDM-free at 1 year post-transplant. LDA+steroids induction is a significant risk factor for death-censored graft loss in KTR with PTDM.
After adjusting for MELD in separate bivariate models, both early (HR 6.36, p=0.001) and later (HR 2.27, p=0.09) alcohol use were associated with increased risk of post-LT death, though the association with later alcohol use did not reach statistical significance. Conclusions: Pre-LT factors associated with early versus later post-LT alcohol use are different, which may inform surveillance strategies for post-LT alcohol use. Early (vs. later) post-LT alcohol use appears to be more harmful, as evidenced by the higher mortality risk. This highlights the first year post-LT as an especially important period to target interventions to prevent and treat alcohol use.
Abstract# 565 Liver Transplant Candidates: 9-Center Functional Assessment in Liver Transplantation Study
Purpose: Obesity and frailty are associated with an increased risk of waitlist mortality (WLM) in liver transplant (LT) candidates. However, body mass index (BMI) may not identify candidates at high risk for waitlist mortality, given ascites and sarcopenia seen across all ranges of BMI. We investigated the relationship between frailty, BMI, and WLM. Methods: We studied adult LT candidates without HCC at 9 LT centers. The Liver Frailty Index (LFI; grip strength, chair stands, balance) was assessed at an outpatient visit; frail was defined as LFI≥4.5. We estimated the prevalence of frailty in non-obese (BMI 18.5-29), class I obese (BMI 30-34), and class II and higher obese (BMI≥35) candidates.
We estimated risk of WLM (death/delisting for sickness) using competing risks regression by frailty status. Non-obese/frail and class I obese/frail candidates had a higher risk of WLM compared to nonfrail counterparts (non-obese aSHR 1.54, 95% CI 1.02-2.33, p=0.04; class I aSHR 1.72, 95% CI 0.99-2.99, p=0.05; p-interaction=0.8). Yet, class II and higher obese/frail candidates had a 3.19-fold increased risk of WLM compared to nonfrail class II and higher obese candidates (aSHR 3.19, 95% CI 1.75-5.82, p<0.001; p-interaction=0.047). Conclusions: In this 9-center study of 1108 LT candidates, the prevalence of frailty was similar across all BMI categories.
Impact of Recipient Age in Combined Liver-Kidney Transplantation (CLKT): Caution is Needed for Elderly Patients ≥70 Years (Division of Liver Diseases and Transplantation)
Elevated VSR was defined as >1.14. Frailty was defined as a Fried Frailty Index ≥3. VAT and VSR were evaluated as predictors of post-LT mortality with Kaplan-Meier and regression analyses. Results: 91 patients were analyzed; mean age 56 years, 75% male, 53% had HCV. Median MELD-Na at listing was 14, 71% underwent LT, but not in CTP B (rho=0.39, p<0.01) or CTP C (rho=0.35, p=0.14) cirrhosis. However, VSR poorly correlated with BMI in CTP A. Conclusions: BMI correlates poorly with VAT and AT redistribution in ESLD and is a poor surrogate for metabolic risk. Elevated VSR, but not BMI, is more common in frail patients and significantly associated with post-LT mortality, independent of liver disease severity and frailty. More accurate measurements of AT depots are required to understand the impact of visceral adiposity on ESLD outcomes.
Up-to-Seven 0.502, and AFP score 0.526. Conclusions: HALT-HCC at the time of listing demonstrated higher prognostic value compared to other proposed allocation metrics for predicting waitlist dropout. Due to the superiority of HALT-HCC, it is not unreasonable to consider HALT-HCC as a new liver allocation metric.
Abstract# 570 TIPS Effectively Treats Refractory Hepatic Hydrothorax: A Multi-Center U.S. Retrospective Study of 1,260 Patients
Concurrent Session: Lymphocyte Biology: Signaling, Co-Stimulation, Regulation
phospho-STAT3 in conditioned cells resulted in the inhibition of the upregulation of mRNAs for LIGHT, Sphk1 and Tarm-1, three genes we have discovered are induced by IL-10 in T cells. Encouragingly, this effect was slowly reversible with removal of TI-IFN, suggesting a "druggable" pathway. Microarray and flow cytometry data indicated that this IL-10-specific unresponsiveness was not associated with any reduction of IL-10R expression or an increase in SOCS (Suppressor of Cytokine Signaling) 1 and 3, nor with reduced STAT3 cytoplasmic availability. Instead, this analysis suggested a novel role for the transcription factor STAT1 in dampening IL-10 signaling. Using T cells from STAT1-KO mice, we show that the absence of STAT1 prevented IL-10 inhibition induced by IFN-β in Treg and Tmem, supporting our new model. Conclusions: Overall, these results highlight the importance of IL-10 in the therapeutic effect of CoB regimens and reveal a new molecular mechanism whereby TI-IFN interferes with IL-10 signaling, and ultimately with regulation of alloreactivity.
and RStudio (RStudio Team). Cox regression models were built in a stepwise fashion to avoid over-fitting. Proportionality of hazards for covariates was investigated by inspecting the Schoenfeld residuals.
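As a worked illustration of the strata used in the 9-center frailty study above (frail defined as LFI ≥ 4.5; BMI groups of 18.5-29, 30-34, and ≥35), a minimal sketch follows; the handling of the 29-30 boundary and of BMI below 18.5 is an assumption, since the abstract does not specify it.

```python
def classify_candidate(lfi: float, bmi: float) -> str:
    """Combine the Liver Frailty Index and BMI strata described in the abstract.
    Cut-offs come from the abstract; boundary handling is assumed."""
    frailty = "frail" if lfi >= 4.5 else "nonfrail"
    if 18.5 <= bmi < 30:
        obesity = "non-obese"
    elif 30 <= bmi < 35:
        obesity = "class I obese"
    elif bmi >= 35:
        obesity = "class II+ obese"
    else:
        obesity = "underweight (not categorized in the abstract)"
    return f"{obesity}/{frailty}"

# Example: LFI 4.7 and BMI 36 falls in the stratum reported with aSHR 3.19 for WLM
print(classify_candidate(4.7, 36.0))  # class II+ obese/frail
```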
The study protocol was approved by the UMCG institutional review board (METc 2008/186) and adhered to the Declarations of Helsinki and Istanbul. Results: 666 renal transplant patients had a mean TTV of 2.54 log copies/mL and 188 potential donors had a mean TTV of 1.14 log copies/mL, a significant reduction compared to patients. High levels of TTV were also predictive of death due to infection (HR 1.26, 95% CI 1.07-1.48, p=0.001). Conclusions: This study shows that levels of TTV, tested at least 12 months after transplantation, may be useful in predicting outcome after renal transplantation.
Hepatitis C Positive Donor to Negative Recipient Kidney Transplantation: A Single Center Real World Experience
Rapid Growth of HCV-Infected Donors for HCV-Uninfected Kidney and Liver Transplant Recipients in the United States
Several single-center reports of using HCV-viremic organs for HCV-uninfected recipients were recently published in the US. In light of recent DAA introduction, we sought to characterize temporal changes in national utilization of HCV-exposed donors for HCV-uninfected recipients (HCV D+/R-) in kidney transplantation (KT) and liver transplantation (LT). HCV D+/R- recipients of viremic and aviremic livers had median (IQR) MELD scores of 24 (19-31). Conclusions: While few programs have adopted these strategies, the striking increases in HCV-infected donors for HCV-uninfected recipients indicate clear dedication to expansion of the donor pool despite infectious risk.
Renal Outcomes from Expander-1: Pilot Study of HCV+ Donor Kidneys for HCV- Kidney Recipients
Results from the LIVE-C Free Trial
Human Urine-Derived Stem Cells Protect against Renal Ischemia Reperfusion Injury via Releasing Exosomal miR-146a-5p Which Targets IRAK1 in a Rat Model
Differences of Gut Microbiota Composition between Donor and Recipient May Predict 1-Year Graft Function
Abstract# 595 Engineering Regulatory T Cells With TCR-Signaling-Responsive Interleukin 2 Nanoparticles Provides In Situ Suppression Of Alloimmunity
To test Treg homeostasis in vivo, we transplanted BALB/c skin onto B6.RAG1−/− mice and injected an equal ratio of B6 Tregs to CD8 T cells. Comparing CT Tregs, CT Tregs plus systemic IL-2, and NG Tregs 7 days post adoptive transfer, we found that only CT+IL-2 treatment improved peripheral Treg homeostasis. These mice, however, showed an unrepressed expansion of graft-resident CD8+CD69+ T cells. NG-treated mice, on the contrary, exhibited gross suppression of CD8 cells in the draining lymph nodes and skin graft. We later confirmed that only NG Treg therapy could significantly prolong mean skin graft survival (6 vs. 18 days). We hypothesized that the NG platform was selectively benefitting antigen-specific Tregs, and to test this, we performed OVA to B6.RAG1−/− skin transplants and injected an equal ratio of OT-II Tregs to OT-I CD8 T cells. For the NG Treg-treated mice we observed a resounding suppression of OT-I CD8 T cells in the skin graft and a 30-fold increase in the Treg:CD8 ratio.
Finally, to assess if this platform could be translated to humans, we transplanted NSG mice with skin grafts from a healthy patient and transferred PBMCs and Tregs from a healthy donor 7 days later. We found that 21 days post adoptive transfer, the human NG Treg treatment had efficiently attenuated CD8 T cell proliferation in the skin grafts, and while 5/6 (83%) of NG grafts remained healthy, only 3/6 (50%) of CT grafts and 1/4 (25%) of CT+IL-2 grafts did. Conclusions: Here, we describe a novel method for enhancing Treg transfer therapy through spatiotemporal provision of IL-2 to antigen-specific Tregs. Importantly, our platform can be harnessed to complement CAR-Treg therapies.
Abstract# 596 Differentiation and Characterization of Human Endodermal Precursors for Liver Cell Therapies (Ophthalmology and Visual Sciences)
MELD (35+) and high-risk (3-5 risk score) Group 3 patients, with the highest 1-year mortality and longest median LOS, observed 98% increased costs. Program providing patient-centric care with improved quality and outcomes, at lower costs.
Results: From Time 1 to Time 3, total prescriptions filled by the Transplant Pharmacy increased from 13,523 to 45,320 (Figure 1), and physician and nurse refill workload for prescription generation decreased from 42.7% and 57.3% to 8.7% and 5.9%, respectively. No safety concerns were reported in any quarterly review. Conclusions: After implementing a CPPA, prescription generation volume increased and physician and nurse burden decreased. This streamlined process from prescribing to dispensing prescriptions ensures close monitoring of post-transplant patients, while allowing physicians and nurses to dedicate more time to patient care.
The burden of health care resource utilization by women compared with men in the peri-transplant period may require interventions to prevent hospitalizations pre-LT among women on the LT waitlist.
Abstract# 618 Methods: We analyzed time to DDKT and LDKT among 827 KT registrants from 2012-2018 at a large transplant center in Brazil, using multivariable Cox regression adjusting for age, sex, race (White/Asian, Black, or pardo (mixed race)), and income quartile (Q1 (lowest) to Q4 (highest)). We studied LDKT with DDKT as a competing risk and vice versa, and performed a mediation analysis for race, income, and access to transplantation.
Abstract# 619 Racial Disparity in Pre-Kidney Transplant Work-Up
Three-Year Outcomes of Highly-Sensitized Kidney Transplant Recipients Desensitized with Imlifidase (IdeS)
The HLA-incompatible (HLAi) barrier has been traversed using desensitization therapies with IVIg, rituximab and/or plasmapheresis, although antibody-mediated rejection (ABMR) continues to be observed in 25-42% of cases and constitutes the primary cause of early graft loss. Imlifidase (IdeS) is an IgG endopeptidase which cleaves human IgG into Fc and F(ab')2 fragments. Previous data (N Engl J Med 2017) demonstrated efficacy in rapidly desensitizing HLA-sensitized patients, allowing successful kidney transplantation. Here, we report follow-up data up to 3 years on patients transplanted after IdeS desensitization.
ABO Incompatibility in Kidney Transplantation Can Increase a Safety Margin of Epitope Mismatch Levels against De Novo DSA Production
Alloreactive B Cells in Transplant Tolerant Recipients Cannot Differentiate into Antibody-Secreting Cells but Can Suppress Donor-Specific IgG Production By Naïve B Cells
showed that tolerant B cells did not recover their ability to differentiate into antibody-secreting cells. In fact, congenic hosts of tolerant B cells produced significantly reduced anti-B/c IgG compared to congenic hosts of naïve B cells. Finally, the ability of tolerant B cells to suppress host IgG production was donor-specific, as congenic hosts receiving tolerant B cells produced IgG comparable to third-party (C3H) splenocyte immunization. Conclusions: Taken together, our data demonstrate that tolerant donor-specific B cells are profoundly altered compared to naïve B cells: they have significantly diminished ability to differentiate into germinal center and antibody-secreting cells, but have acquired the ability to suppress donor-specific, but not third-party, antibody responses by naïve B cells.
Abstract# 624 Long Term Renal Function Following Islet Transplantation in the Clinical Islet Transplantation (CIT) Consortium
Purpose: Interventions on deceased donors or their organs hold promise for increasing the quantity and quality of organs for transplantation by minimizing organ injury and optimizing functionality. However, the transplant field is in debate over whether and how to obtain informed consent from transplant candidates offered intervention organs, given time constraints intrinsic to allocation. Donor intervention research remains at a standstill until ethical, regulatory, and legal issues are resolved. This multi-center study aimed to assess waitlisted candidates' preferences for informed consent if offered an organ from a donor after organ intervention.
Y. Jiang, X. Zhang, Beijing Chao-yang Hospital, Beijing, China. Purpose: Myeloid-derived suppressor cells (MDSCs) consist of heterogeneous myeloid progenitors and precursors that were first defined in cancer patients and are characterized by an immunoregulatory ability to inhibit T cell responses. A growing body of knowledge indicates the potential involvement of monocytic myeloid-derived suppressor cells (M-MDSCs) in transplantation. As the most commonly used induction regimen, rabbit anti-thymocyte globulin (rATG) exerts a great influence on T cells and NK cells; however, its effect on M-MDSCs remains ambiguous. Methods: We compared the number and immunosuppressive function of circulating M-MDSCs in kidney transplantation recipients with or without rATG over a 90-day follow-up period. Cytometric bead array of cytokines in the plasma of kidney transplantation recipients was also performed. An established system with the associated cytokines to induce M-MDSCs in vitro was applied to demonstrate the relevant expansion and activation mechanisms.
Results: We herein report that, compared with patients without rATG induction, circulating CD11b+CD33highHLA-DR−CD14+CD15− M-MDSCs in the rATG group increased significantly within the first month after transplantation, then returned to preoperative levels (Figure 1).
2 NW, Chicago, IL, 3 MtSinai, NYC, NY, 4 Wisconsin, Madison, WI, 5 Swedish, Seattle, WA, 6 Ochsner, Jefferson, LA, 7 USC, LA, CA, 8 Houston, Houston, TX, 9 Penn, Philadelphia, PA, 10 Georgetown, DC, WA, 11 Iowa, Iowa City, IA, 12 Rush, Chicago, IL. Purpose: ACCELERATE-AH is a multicenter consortium studying early liver transplant (LT) for alcoholic hepatitis (AH). To inform surveillance and intervention strategies for post-LT alcohol use, we sought to identify pre-LT factors that predict early vs. later post-LT alcohol use, and whether these patterns were associated with post-LT survival. Methods: In this multi-center longitudinal analysis, 11 US sites provided detailed pre-LT psychosocial, clinical, post-LT alcohol use, and survival data. Consecutive patients with clinically-diagnosed severe AH, no prior diagnosis of liver disease or AH, who received LT from 2006-2018, were included. Post-LT alcohol use was defined as any evidence of alcohol use post-LT by clinical interview or biochemical testing, including ethyl glucuronide (ETG) or phosphatidylethanol (PEth). Alcohol use was categorized by date of first drink post-LT: none, early (≤1 year post-LT), or later (>1 year post-LT). To evaluate factors predicting early vs. later post-LT alcohol use, Cox regression was performed, with LT recipients with no post-LT alcohol use as the reference group, and center clustering. Results: 140 LT recipients for AH survived to home discharge (69% male, median pre-LT abstinence 55 days, MELD-Na 39, Lille 0.79, 49% overt encephalopathy), with median post-LT follow-up of 2.5 years. Post-LT alcohol use was as follows: 91 (65%) none, 32 (23%) early use, 17 (12%) later use. The proportion with sustained alcohol use among early (11/32; 34%) vs. later (4/17; 24%) alcohol use was similar (p=0.43). Probability of any alcohol use post-LT at 1, 3, and 5 years was 24% (95%CI: 18-32), 37% (95%CI: 29-46), and 42% (95%CI: 33-53). In MV analysis,
Gastroenterology and Hepatology, Henry Ford Hospital, Detroit, MI, 2 Internal Medicine, Henry Ford Hospital, Detroit, MI, 3 Transplant Institute, Henry Ford Hospital, Detroit, MI. Purpose: Biliary complications, including anastomotic biliary strictures (ABS), remain the most common cause of morbidity in liver transplant (LT) patients. Endoscopic therapy has become the first-line treatment for these patients. The use of multiple plastic stents (MPS) in parallel is the current standard of care, but fully-covered self-expandable metal stents (cSEMS) are increasingly being utilized. Here we present one of the largest retrospective reviews regarding the management of ABSs after LT. Methods: We reviewed all endoscopic retrograde cholangiopancreatographies (ERCP) performed between January 2011 and August 2017 in post-LT patients at a single tertiary care center. No patients who underwent initial ERCP after this date were reviewed, to allow for adequate follow-up. A total of 151 patients underwent ERCP. ABSs were found in 115 patients. ERCP with initial stent placement was successfully performed in 112/115 (97.4%) patients. Four additional patients were excluded from analysis because of death prior to undergoing repeat ERCP. The remaining 108 patients were eligible for analysis. Results: Plastic stents were the index stent in all 108 patients.
Serial ERCPs were performed until ABS resolution or treatment failure. A mean of 2.74 ERCPs were performed, with a mean indwelling stent duration of 102.9 days. Stricture resolution was achieved with plastic stents in 95/108 (88.0%) patients. Mean follow-up was 1226.4 days. Of the 13 patients with initial treatment failure, 10 patients received cSEMS placement for salvage therapy and 9 of these patients ultimately achieved ABS resolution (90%). The other three patients required surgery. Twelve of 95 (12.6%) patients with initial stricture resolution had ABS recurrence during the follow-up period. Five of these patients ultimately received cSEMS placement and all 5 showed resolution of ABS. Of the seven other patients, five received repeat MPS placement with resolution and two required surgical revision. Migration of cSEMS occurred in 4/15 (26.7%) patients. All four patients received repeat cSEMS placement with resolution of ABS. There were no deaths or cases of severe pancreatitis as a result of treatment for ABS. Conclusions: Our retrospective review supports the ongoing use of endoscopic biliary stent placement for the management of ABS in LT patients. Resolution can be attained with either MPS or cSEMS placement. cSEMS are an excellent option for refractory or recurrent ABS, with success rates of 90% and 100%, respectively. We did observe a high stent migration risk, and vigilance for this complication is warranted. Purpose: For decades, donor-specific antibodies (DSA) have been thought not to be clinically relevant in orthotopic liver transplantation (OLT). However, recent studies have shown a negative impact of DSA class 2 (DSA2) on short-term outcome of OLT recipients. The aim of this study was therefore to assess long-term graft outcome with respect to the presence of DSA class 1 and 2. Methods: OLT recipients presenting to the outpatient clinic of the University Medical Center Hamburg Eppendorf with a 10-20 year post-op follow-up were included in the study. Patients with HCV were excluded. Liver function tests, liver elastography and HLA antibodies were determined, and liver histologies were reviewed. DSA with an MFI > 1500 were regarded as positive. Results: Altogether 132 patients with a mean follow-up of 5198 days post-OLT were analysed. At last follow-up, DSA2 were positive (DSA2 pos) in 55/132 (41.7%) patients. There was no significant difference between DSA2 pos and DSA2 neg patients in terms of sex, length of follow-up post-OLT, indication for OLT (re-OLT, autoimmune, viral, alcoholic disease or other), CIT, WIT, or type of transplantation (split vs full organ). However, DSA2 pos patients were significantly younger (42 y vs 59 y; p < 0.001). Most importantly, the median liver stiffness on elastography of DSA2 pos patients was significantly higher than that of DSA2 neg patients (9.4 ± 9.7 vs 6.9 ± 7.1 kPa; p < 0.001). In agreement, on liver histology (n=53) more DSA2 pos patients were found to have fibrosis > stage I (69.7% vs 40%, p = 0.03). A significantly higher incidence of chronic rejection (12.7% vs 2.6%; p=0.02) and graft loss (5.5% vs 0%; p=0.04) was also found in DSA2 pos as compared to DSA2 neg patients. However, there was no significant difference between DSA2 pos and DSA2 neg patients with regard to early or late acute rejection episodes. DSA class 1 (DSA1) were positive in only 22/137 (15.3%) of patients at last follow-up. However, there was no difference between DSA1-positive and -negative patients in terms of any patient characteristics or graft outcome.
Conclusions: Presence of DSA class 2 was associated with increased fibrosis grade, higher incidence of chronic rejection and graft loss in long-term liver transplant recipients. CITATION INFORMATION: Sterneck M., Sultani B., Rodriguez Lago M., Grabhorn E., Briem-Richter A., Herden U., Fischer L. Presence of DSA Class 2 is Associated with Fibrosis Progression after OLT Am J Transplant. 2019; 19(suppl 3 Purpose: Recent reports indicated important while inconsistent roles of preformed donor specific antibody (DSA) to HLA in liver transplantation. Re-transplant liver patients were understudied due to worse outcome than 1 st graft. Auto antibodies to Angiotensin II Type I Receptor (AT1R) have been reported to be detrimental for survival of kidney transplant but weren't studied in liver transplantation. In this study, we try to determine the roles of preformed DSA and anti-AT1R auto antibodies in long-term survival of 2 nd liver transplant. Methods: We retrospectively reviewed patients undergoing a 2 nd liver transplantation in our centre with available pre-operative serum and donor HLA typing. Patients who received ABO incompatible, multiple organ transplants or incomplete demographics were excluded. Banked sera were tested for anti-HLA antibodies with Luminex-based solid phase assays. Anti-AT1R antibodies were tested with ELISA kit. All statistical analyses were done with IBM SPSS version 25. Results: We included 94 patients in this study. With long follow-up up to 25 years, 48 (51.1%) patients had lost 2 nd liver transplant. Preformed DSA to 2nd liver transplant were found in 34 (36.2%) patients. Surprisingly, 48 (51.1%) patients had positive anti-AT1R antibody >17U/ml; 22 (23.4%) had anti-AT1R antibody>40U/ml so extra dilutions were performed for accurate measurement. When backward stepwise methods were used to select covariates in multivariate cox regression, only three variables were left in model: MELD (Model for End-Stage Liver Disease) Score VCU Medical Center, Richmond, VA, 2 Alberta Transplant Genomics Applied Centre, Edmonton, AB, Canada Purpose: Circulating donor-derived cell free DNA (cfDNA; Allosure, CareDx, USA), a non-invasive test that could detect rejection in kidney allografts was validated using histologic diagnoses(HDx). The interpretation of these findings could be difficult as the inter and intra-observer agreement with regards to histologic diagnoses of rejection has been no higher than 60% in published studies. The Molecular microscope (MMDx, Edmonton, CA) might be more accurate than histology. In this prospective study, we compared the detection of rejection from cfDNA to both histology and MMDx reads. Methods: 50 indication kidney transplant biopsies were analyzed using HDx and MMDx. cfDNA was measured prior to biopsy. Gold standard diagnosis was based upon a clinical consensus(CDx) by 5 transplant nephrologists after assessing all clinical factors and results. Sensitivity and specificity were calculated using cfDNA cut-offs of <0.21% to rule-out rejection and >1% to rule-in rejection. Spearman's correlation between cf-DNA and MMDx was calculated. Results: Based upon CDx, 17 (34%) had antibody-mediated rejection [AbMR; median cfDNA=1.2 (range: 0.23-6.9)], 3 (6%) had mixed rejection (median cfDNA=0.67 (range: 0.23-1.3)], 2 (4%) had T-cell mediated rejection [TCMR; median cfDNA=1.7 (range:1.2-2.2)], 29 (58%) had acute kidney injury (AKI; tubular injury, median cfDNA=0.23, range=<0.19-2.2)] and 1(2%) had BKV nephropathy (cfDNA=1.6). 
7 cases (14%) had discrepant rejection diagnoses between MMDx and HDx. MMDx had a 100% agreement with CDx, compared with HDx (sensitivity: 86%; specificity: 93%). cfDNA>1% had a higher sensitivity and specificity for prediction of rejection when compared with MMDx than with histology. cfDNA<0.21% had a negative predictive value (NPV) of 93% to rule out rejection when compared to either HDx or MMDx. cfDNA scores showed a continuous positive correlation with histologic microvascular inflammation (G+PTC) scores (ρ=0.48; p<0.001), MMDx rejection score (ρ=0.64; p<0.0001), ABMR score (ρ=0.62; p<0.0001) and global disturbance score (ρ=0.54; p<0.0001). No such correlation was seen with GFR at biopsy or change in GFR from baseline. Conclusions: In this first report, we find that cfDNA>1% was better correlated with the Molecular microscope than it was with histology with excellent specificity for the prediction of rejection but poor sensitivity. cfDNA can be used as a screening test to rule out rejection (NPV~93% for cfDNA<0.21%) but cannot replace kidney biopsy to determine the type of rejection. Purpose: Liver steatosis is a known risk factor for exacerbated ischemia-reperfusion injury (IRI) and poor outcomes after liver transplantation (LT). RNA profiling was performed to gain hepatic expression profiles of mRNAs and miRNAs in fatty (F) and non-fatty (NF) liver donors in paired pre-implantation (PI) and post-Reperfusion (PR) biopsies to identify those pathways associated with worse injury during IRI. Methods: A total of 88 samples from 44 LT patients were evaluated. Liver steatosis was defined by histology with macro steatosis >20%. Gene and miRNA expression was done using microarrays and thresholds of a two-fold change and a false discovery rate (FDR) less than 0.05 were used. Ingenuity Pathway Analyses and Cytoscape were used for data analyses, integration and network identification. Results: 237 mRNAs and 17 miRNAs (most of them targeting genes associated with increased inflammatory disease and response) were identified as differentially expressed when comparing PI biopsies between F vs. NF liver donors (FC >2, FDR <0.05). When comparing PI vs. PR samples, 53 and 24 miRNAs and 2,730 and 1,338 probe sets were significant between F vs non-F livers, respectively. Using IPA filters, 18 miRNAs (including miR-486-5p, miR-297, miR-6780b-5p with FC>6) targeting 331 mRNAs with appropriated directionality in expression were identified from integrative analyses for F livers. Cell-to cell signaling interaction, cell cycle, cell death and lipid metabolism were the top biological functions associated with the differentially expressed molecular features. Adipogenesis pathway was identified as the top canonical function. Then, 10 miRNA -mRNA regulatory pathways were obtained, and networks were constructed for the F livers. The top regulatory network included as disease and molecular functions accumulation of lipids, apoptosis cell lines, quantity of antigen presenting cells, and quantity of macrophages with high positive consistency score: 89.3 (a high consistency score associates with increased consistent paths-those that connect an upstream regulator to a target and then to a disease or function). This regulatory network included 73 nodes and 42 regulators (i.e., Ca2, CAMP, CD40, EBK1/2, miR-155-5p, palmate acid, among others) targeting 23 differentially expressed molecular features in our data set (i.e., ABCA1, ATF3, CCL20, CCL3, CDKNJA1, CXL1, KRAS, NF2, SERPINE 1). 
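To make the operating characteristics quoted for the cfDNA thresholds above concrete (>1% to rule in rejection, <0.21% to rule it out), here is a minimal sketch of the standard 2x2 diagnostic-test metrics; the counts in the example are illustrative only and are not the cohort data from the abstract.

```python
def test_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2 diagnostic-test metrics (sensitivity, specificity, PPV, NPV)
    for a binary classifier such as a cfDNA cut-off; counts supplied by the caller."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only (not the study's data)
print(test_metrics(tp=18, fp=2, fn=4, tn=26))
```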
Representative miRNAs and mRNAs were validated by real-time qPCR.Conclusions: The analyses of regulators effects using integrative approaches allow to identify how the fatty liver is affected during IRI by activated or inhibited upstream regulators. Consequently, integrated analysis of mRNA and miRNA profiles in liver tissue represents a powerful tool to identify novel targets for assessing with accuracy organ injury associated with graft steatosis. The risk-benefit framework currently used in living organ donation presumes all benefit is afforded to the recipient, whereas the donor only assumes risks. However, additional measurable benefits may be granted to the donor that are neglected in current paradigms of risk-benefit analyses. Methods: We conducted in-depth interviews with 56 living kidney donors regarding their decision to donate and any benefits they experienced from donation (Table 1) . Interviews were conducted and recorded over a 5 week period. Using grounded theory, qualitative themes were derived from interview transcripts by two independent coders; differences in coding were reconciled until reaching consensus. These themes were then analyzed for common patterns. Results: Twenty-five participants were in interdependent relationships with their recipients (i.e. partner or parent to child) meaning they had a shared household and/or significant caregiving responsibilities. Participants reported they were motivated to donate a kidney based on a more nuanced understanding of the benefits of donation than accounted for by the current paradigm. Some additional benefits included improvements in caregiving burden, wage earnings, donor independence, and the donor's ability to have children with the recipient ( Table 2) . Conclusions: Participants' evaluation of the benefits of organ donation included tangible benefits currently overlooked in live donor transplantation decisions. These additional benefits may alter present risk-benefit calculations so as to allow a greater threshold of acceptable donor risk. This in turn may expand the pool of potential donors which has significant implications for transplant medicine. Univ, St. Louis, MO, 4 SRTR, Minneapolis, MN, 5 Visient, Inc., Minneapolis, IA Purpose: Despite innovations in clinical care, liver transplant (LT) remains highly resource intensive due to medical complexity and organ quality. National allocation systems prioritize access to the deceased donor organ supply based on medical urgency, resulting in increasing numbers of LTs performed in patients with high model for end-stage liver disease (MELD) scores, and increasing prevalence of organs imported from distant allocation regions. Despite the shift in transplant demographics, reimbursement is not directly tied to illness severity or to organ origin Methods: LT registry data from the Scientific Registry of Transplant Recipients (SRTR) were linked to the Vizient clinical database to assess the total cost and total direct cost of 4856 LTs in 51 transplant programs performed from 2006-18. Reimbursement data were derived from a dataset linking SRTR to Medicare Part A (facilities) payments for 7432 LTs. LT recipients were excluded if the pretransplant period exceeded 7 days. Medicare payments via the cost report were not included in this analysis. Laboratory MELD score at LT was used as a marker of illness severity. Results: The average cost of LT increased by 29% over the past decade ($148,721 to $191,927) . 
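The steatosis profiling abstract above filters differentially expressed mRNAs and miRNAs at a two-fold change and FDR < 0.05; a minimal pandas sketch of that filter follows, assuming a results table with hypothetical log2 fold-change and FDR column names.

```python
import pandas as pd

def filter_differential(results: pd.DataFrame,
                        lfc_col: str = "log2FC",
                        fdr_col: str = "fdr") -> pd.DataFrame:
    """Keep features with |fold change| > 2 (i.e., |log2FC| > 1) and FDR < 0.05,
    the thresholds stated in the abstract. Column names are assumptions."""
    mask = (results[lfc_col].abs() > 1.0) & (results[fdr_col] < 0.05)
    return results.loc[mask]

# Toy table for illustration only
toy = pd.DataFrame({"feature": ["miR-A", "gene-B"],
                    "log2FC": [2.7, 0.4],
                    "fdr": [0.001, 0.30]})
print(filter_differential(toy))
```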
The cost of LT varied significantly by MELD score (Figure) and donor location (local: $179,215; shared: $223,545). While mean Medicare payments increased by 34% for high-MELD patients ($88,429 to $134,867, P<.0001) and 31% ($93,417 to $136,070, P<.0001) for imported organs, this was significantly less than the reported increase in cost per case. Conclusions: LT care has become more financially challenging as the population of patients undergoing LT evolves. Recent shifts in allocation, in addition to higher transportation costs and organ import fees, are likely to intensify financial pressure on program finances. Novel strategies to lower costs and increase reimbursement are needed to preserve the financial viability of LT programs. The vs. 15 (IQR 9-33), p<0.001. Allosensitized LTA and SLK recipients also had higher rates of readmission in the first 6 months after LTA (14% vs. 10%, p<0.001) and SLKT (12% vs. 9%, p<0.001, respectively). Rates of rejection at discharge were higher for allosensitized recipients after LTA (6.7% vs. 5.8%, p=0.03) as well as SLKT (3.9% vs. 1.8%, p=0.01). Rates of liver graft failure were higher for sensitized patients both after LTA and SLKT (Figure 1a and b). Further, rates of kidney graft failure were higher for sensitized patients after SLKT (Figure 1c). After adjusting for donor, recipient, and transplant-related characteristics, allosensitization was associated with increased mortality after LTA (AHR=1.06, 95% CI 1.00-1.12) as well as SLKT (AHR=1.07, 95% CI 1.01-1.13). Conclusions: This analysis shows that LTA and SLKT in allosensitized recipients are associated with significantly longer hospital stay and more readmissions, with modest albeit statistically significant higher rates of rejection, graft loss, and patient death. This economic impact of allosensitization may be applicable to other organ transplants and warrants further investigation. Purpose: Self-molecules containing damage-associated molecular patterns stimulate the pro-inflammatory response of infiltrating innate immune cells after their release during tissue injury. Yet, whether local endogenous negative regulators are also present at the site of injury to control early innate immune responses is poorly understood. Methods: Wild-type BALB/c, il33+/+ Bm12, and il33−/− Bm12 HTx were transplanted heterotopically into WT or il33−/− B6N recipient mice. In some groups, endogenous interleukin-33 (IL-33) was restored locally using WT or il33−/− extracellular matrix (ECM)-bound nanovesicles (MBV) in a hydrogel at the time of transplantation. MBV were characterized by cytokine array and western blot. At days 3-7, or 100-120 post-HTx, isolated grafts were stained with H&E or trichrome or subjected to qRT-PCR, western blot, and IF staining to define differences in cytokine expression. Leukocytes isolated from HTx and recipient spleens were compared by flow cytometry. Mitochondrial bioenergetics/function of bone marrow-derived dendritic cells (DC) was assessed using a Seahorse XF96e Analyzer. DC mitochondrial mass, potential, and uptake of glucose and fatty acids were measured by flow cytometry. Results: We show that IL-33, a stromal cell-derived cytokine with emerging immunoregulatory properties, is rapidly upregulated after heart transplantation and during graft rejection. We also establish that, in addition to its expected presence in the nucleus, IL-33 is also contained in MBV of the ECM.
In a mouse heart transplant model, we show that transplants unable to express IL-33 displayed dramatically accelerated chronic rejection-associated fibrosis and vascular occlusion. This pathology was associated with augmented early graft infiltration by proinflammatory macrophages and monocyte-derived DC. Local delivery of IL-33 + MBV in an ECM-based hydrogel immediately after transplantation profoundly reduced the frequency of pro-inflammatory myeloid cells in IL-33 deficient-grafts. This treatment also reduced the subsequent development of chronic rejection. Related mechanistic studies demonstrate that IL-33 increases DC oxidative phosphorylation and fatty acid uptake. In total, our data establish that IL-33 is a powerful local regulatory factor that limits chronic rejection after heart transplantation by restraining the differentiation of infiltrating myeloid cells and causing a metabolic reprogramming that consistent with immunosuppressive DC and reparative macrophages. Our data also suggest the local delivery of IL-33 in ECM-based materials is a promising biologic for chronic rejection prophylaxis. The aim of this study was to assess improvements in long-term survival of pediatric patients after liver transplantation, by analyzing outcomes in transplant recipients who survived beyond 1 year after transplantation. Liver transplantation remains the only definitive therapy for children with end-stage liver disease, and has been the standard of care for over three decades. There has been a marked increase in the one-year survival of pediatric patients, from 78% in transplant recipients between 1987 and 1990 to 95% in transplant recipients between 2011 and 2017. However, the long-term outcomes have not seen a similar trend, and it is therefore crucial to shift priorities towards analyzing why this is the case. Methods: We analyzed 13,753 pediatric patients who survived for 1-year after receiving orthotopic liver transplantation between 1987 to 2017. The study period was divided into eras: 1987-1990, 1991-1995, 1996-2000, 2001-2005, 2006-2010, and 2011 onwards. Outcomes were analyzed using the Kaplan-Meier method for time-to-event analysis, and multivariable Cox regression. Results: There were no significant gains in unadjusted long-term outcomes among 1-year survivors over the past three decades ( Figure) . The log-rank tests for equality of survivor functions between each era and 1987-1990 were not statistically significant. Cause of death analysis revealed that although infections caused 20.6% of deaths in patients transplanted between 1987-1990, this number dropped to 5.6% in those transplanted between 2011-2017 (p=0.01). Furthermore, malignancy caused 10.6% of deaths in 1987-1990 but caused 22.2% of the deaths in 2011-2017 (p=0.04). Conclusions: Despite the gratifying gains in short-term survival of pediatric patients, 1-year survivors have no significant improvements in long-term survival after undergoing a liver transplantation. Long-term sequelae of immunosuppression, such as malignancy and infection, continue to be the most common causes of death. This study highlights the importance of better long-term management of immunosuppression. We identified a cohort of ITx patients with a history of rejection along with uncomplicated controls from our IRB-approved Immunomonitoring and Tissue Bank Study. 
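For the era-to-era comparisons of long-term survival among pediatric 1-year survivors described above, a minimal sketch using the lifelines package is shown here; the DataFrame layout is an assumption, and lifelines is used only as one common choice, not necessarily the software used by the authors.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Assumed columns: years_since_1yr (follow-up time among 1-year survivors),
# died (1 = death event), era (e.g., "1987-1990", "2011-2017")
def compare_eras(df: pd.DataFrame, era_a: str, era_b: str):
    a = df[df["era"] == era_a]
    b = df[df["era"] == era_b]
    km = KaplanMeierFitter()
    km.fit(a["years_since_1yr"], a["died"], label=era_a)  # unadjusted survival curve
    result = logrank_test(a["years_since_1yr"], b["years_since_1yr"],
                          event_observed_A=a["died"], event_observed_B=b["died"])
    return km, result.p_value  # log-rank p-value for equality of survivor functions
```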
A polychromatic flow cytometry (PFC) panel, with and without PMA stimulation and culture, was used to analyze peripheral blood and intestinal allograft samples to characterize surface receptor phenotype, transcription factor expression, and cytokine production. Results: PFC of biopsies showed a significant increase in CCR6+ IL-17-producing Th17 effector cells in rejection patients as compared to controls. To our surprise, there was an overall increase in the proportion of CD25+ FoxP3+ Tregs in rejection patients; however, the majority of these were induced (Helios-) Tregs, with fewer natural thymic-derived Tregs (Helios+), a shift that was not appreciated in blood PFC. Further Treg subset analysis revealed a significantly higher proportion of Th17-like CCR6-expressing and Th1/Th17-like dual CXCR3/CCR6-expressing memory (CCR4+CD45RA-) Tregs in rejection patients. We therefore hypothesized that, given Treg plasticity in a pro-inflammatory environment, these Th17-like Tregs in ITx rejection assume an effector-like phenotype. To test this hypothesis, we performed an ex vivo restimulation assay, which demonstrated that both natural and induced Tregs produce more IL-17 in rejection than in control patients, further corroborating their pro-inflammatory phenotype and an imbalance in the Th17/Treg axis. Conclusions: Our study characterizes ITx rejection as driven by a severely altered Treg/Th17 axis, with IL-17-producing CCR6+ Th17 effector cells and potentially proinflammatory Treg cells, which may have strong implications for future clinical therapies. 2 Univ Miami, Miami, FL, 3 NWU, Chicago, IL, 4 UCLA, Los Angeles, CA, 5 Seattle Child, Seattle, WA, 6 Mt Sinai, NY, NY, 7 Henry Ford, Detroit, MI, 8 UNMC, Omaha, NE, 9 Georgetown, Washington, DC, 10 UPMC, Pittsburgh, PA, 11 Dallas VAMC, Dallas, TX. Purpose: To review the management of immunosuppression (IS) for intestinal transplantation (IT) in the USA. Methods: A survey was created and sent via email to surgical directors of centers performing at least 10 IT in total over the last 3 years. We asked about human leukocyte antigen (HLA) testing, desensitization, IS and antibody-mediated rejection (AMR). Results: 8/10 centers responded. All perform routine HLA donor-specific antibody (DSA) testing pre-IT. 37.5% check DSA titers only after infections or transfusions. 62.5% of centers transplant through a positive type I DSA crossmatch (some, regardless of MFI titers) while only 37.5% do so with a type II DSA crossmatch. In patients with pre-IT DSA, all centers perform follow-up testing post-IT, usually every 1-2 weeks. 87.5% do this for those without pre-IT DSA. 50% of centers perform pre-IT desensitization for isolated IT and 25% for multivisceral transplants, with combinations of intravenous immunoglobulin (IVIg), rituximab, bortezomib or plasmapheresis. 87.5% of centers use induction with antithymocyte globulin (ATG). Post-IT, the standard maintenance IS regimen is tacrolimus (FK) and steroids, with 25% also using mycophenolate mofetil and 37.5% using an mTOR inhibitor. The goal FK level is 10-15 ng/mL in the first 3 months and <10 ng/mL beyond 1 year. If a desired level is not achieved, 50% of centers use a sublingual (SL) formulation; 12.5% use neither an SL nor an intravenous formulation. 75% of centers run IS lower with a liver-containing graft. 75% of centers perform protocol intestinal biopsies in the absence of symptoms, mostly weekly for the first 3 months post-IT.
All centers diagnose AMR with one or more of the following criteria: refractory rejection, increase in DSA titers, C4d staining in tissue, or histologic findings. Therapy is performed with plasmapheresis, IVIg, rituximab or steroids. Only 1 center uses bortezomib and none use ATG. When treating moderate/severe acute cellular rejection, the most commonly used agent is ATG (87.5%). Conclusions: All centers perform routine HLA DSA testing before IT; the majority check titers pre-IT every 3-6 months. Most centers transplant through a positive type I but not type II DSA crossmatch. Desensitization is mostly performed in isolated IT and when the panel-reactive antibody (PRA) is >70%. While most centers have similar practices for pre-IT DSA testing, transplanting through a positive crossmatch, induction and post-IT IS, there are several different strategies for desensitization and for the diagnosis/therapy of AMR. Formal protocols for desensitization and diagnosis/management of presumed AMR should thus be pursued across centers. Purpose: Transplantation tolerance is a highly sought and rarely achieved state in clinical medicine. Defined as long-term stable and acceptable graft function without immunosuppression and with an otherwise competent immune system, tolerance maximizes graft survival and function and decreases organ demand without the negative side effects of immunosuppression. While there have been prior reports of tolerance after liver and kidney transplantation, we herein report and mechanistically characterize the first case of tolerance after intestinal transplantation. Methods: Intestinal transplantation was performed using our standard method. Immunosuppression for induction was thymoglobulin and steroids, and for maintenance was tacrolimus. Graft and blood lymphocytes from the patient, 8 stable intestinal recipients, and 8 healthy controls (blood only) were analyzed using flow cytometry. Recipient T cell responses to donor and third-party antigens were assessed by Pleximmune (proprietary). Results: We performed intestinal transplantation on a 14-year-old male for severe pseudo-obstruction. The course was notable for grade 3 graft-versus-host disease (GVHD) at 7 months, treated with thymoglobulin and steroids. Subsequently lost to follow-up, he presented 7 years post-transplant and 3.5 years after stopping all immunosuppression with no graft problems. Biopsy showed pristine allograft histology. Given his history of GVHD, we hypothesized that central tolerance was mediated by chimerism. However, serial peripheral chimerism studies did not show donor chimerism. Thus, we speculated that he had peripheral tolerance mediated by regulatory T cells (Treg). This was confirmed by flow cytometric analysis showing that, compared to both sets of controls, the patient's blood had similar CD4/CD8 proportions but higher levels of naïve and lower levels of effector memory CD4/CD8 cells as well as higher Treg percentages. Within the graft, we similarly showed lower levels of CD4/CD8 effector memory cells and higher levels of Foxp3 expression when compared to controls. Demonstrating immunocompetence, recipient T cells robustly produced IL-17, TNF-α, and IFN-γ when stimulated by PMA/IO and highly upregulated antigen-specific CD154 expression in response to CMV and EBV antigens. Patients with PMT were identified who underwent VAE before or during transplant. Clinical characteristics, intraoperative variables and outcomes were reviewed. Results: Median recipient age was 44 (40-55) years; 60% were men.
The indication for MVT was cirrhosis with PMT precluding isolated liver transplantation. Median MELD was 26 (21-37). Interventional radiologists performed VAE immediately prior to MVT in 4 patients and intraoperatively (after initial attempts at explantation of the native viscera) in 1, using interlock coils, Amplatzer plugs and Gelfoam, with or without Trufill glue and Lipiodol; the splenic and superior mesenteric arteries were included in 5, while the common hepatic, celiac axis and gastroduodenal arteries were embolized in 3, 3 and 1 patients, respectively. Median blood loss was 6000 (800-7000) ml. Transfusion requirements were: red blood cells 16 (2-47) units, fresh frozen plasma 14 (0-29) units, cryoprecipitate 2 (1-14) units, platelets 4 (2-10) units. The advantage of VAE was clear in 2 cases: in the patient undergoing intraoperative VAE, there was markedly improved hemodynamic stability and decreased vasopressor requirement and blood loss after embolization. In a second patient, incomplete occlusion of arterial flow due to challenging vascular anatomy was associated with large-volume blood loss (47 units of red cell transfusion). Time from incision to explant was 420 (198-490) min. Median lactate values pre-VAE, post-VAE and at the end of surgery were 4.1 (2.1-10.1), 5.8 (3.1-10.6) and 4.2 (2.4-6.3), respectively. No correlation was identified between lactate level and length of the explant, transfusion volume, vasopressor use or postoperative infections. In 2 of the 5 patients, infection with multiple atypical organisms, including non-tuberculous mycobacteria and various Candida species, was identified; 1 of the 2 died. Both patients had prolonged cold ischemia times (557 and 474 min) compared to recipients who did not develop infections (average 347 min). Conclusions: MVT in candidates with PMT is associated with large-volume blood loss and hemodynamic instability, which can be improved with VAE. Further studies are needed, however, to determine the contribution of native gut necrosis to the occurrence of potentially life-threatening atypical infections. Purpose: 37 subjects have been transplanted in a phase 2 protocol to induce tolerance in recipients of living donor renal allografts (KTx) based upon tolerogenic CD8+/TCR- facilitating cells (FCRx) and nonmyeloablative conditioning. Methods: Recipients were conditioned with fludarabine (30 mg/m2/dose, days -5, -4, -3), cyclophosphamide (50 mg/kg/dose, days -3 and +3), and 200 cGy TBI, followed by KTx (day 0). A G-CSF-mobilized product was apheresed from the donor pre-KTx, processed to remove graft-versus-host disease (GVHD)-producing cells yet retain CD34+ cells and FC, and infused on day +1 post-KTx. Results: All subjects have reached at least 1 year of follow-up (range 23-117 months). Patients ranged in age from 18-65 years and ranged from 6/6 HLA-matched related (n=2) to 0/6-matched unrelated. 17 subjects had unrelated and 20 had related donors. Two subjects were re-transplants. MMF and tacrolimus immunosuppression (IS) was weaned and discontinued at 1 year if chimerism, normal renal function and a normal KTx biopsy were noted. 35 of 37 subjects exhibited peripheral blood donor chimerism at one month. Durable chimerism allowing full IS withdrawal developed in 26 subjects (time off IS ranging from 12-100 months); 23/26 of these patients showed >95% donor whole blood and T cell chimerism. All stable chimeric subjects retained chimerism after removal of IS and remain rejection-free.
Long-term chimeric subjects off IS have no evidence of immune defect: there have been no late opportunistic infections, they show robust T, B, and NK cell reconstitution, and they can be safely and effectively vaccinated and develop protective immunity. Transiently chimeric subjects resumed host-derived hematopoiesis and were maintained on low-dose IS with stable renal function. Late (>4 years post-Tx) acute rejection occurred in two subjects with transient chimerism who became noncompliant. There have been two cases of GVHD. One subject exhibited grade 1-2 acute GI GVHD that responded to corticosteroids; this subject has gone on to develop mild chronic GVHD. The second subject presented late following development of symptoms and manifested treatment-resistant GI GVHD with associated CMV that proved fatal at 11 months post-Tx. There have been two additional kidney graft losses, both previously reported and related to infections. A second subject death occurred in a >100 pack-year smoker who developed advanced-stage lung cancer 4.5 years after KTx. Overall patient survival is 94.4% and death-censored graft survival 94.1%. Tolerant FCRx subjects off IS had significantly better renal function than comparable KTx recipients on standard-of-care (SOC) IS. Medical therapy for hypertension and hyperlipidemia was more common in SOC than in tolerant patients. In summary, high levels of durable chimerism and tolerance with a low (5.5%) incidence of GVHD have been achieved in mismatched related/unrelated recipients of living donor KTx. We conclude that there are significant long-term medical benefits to establishing tolerance in KTx recipients using the FCRx approach. Purpose: Tacrolimus-based regimens remain the backbone of maintenance immunosuppressive therapy in kidney transplantation, and therapeutic drug monitoring of tacrolimus is necessary due to its narrow therapeutic window. Time-in-therapeutic range (TTR) is important in the use of medications with a narrow therapeutic window. We investigated the effects of tacrolimus TTR on acute rejection, infection control, and patient and graft survival. Methods: In this retrospective cohort study, we collected clinical data from a total of 1421 living related kidney transplants from August 2007 to April 2017. All included patients received tacrolimus-based triple regimens and were followed up for at least 1 year. TTR at 1 year was calculated using the target tacrolimus trough level (5-12 ng/ml for the first year), and the optimal cut-off point of TTR was calculated using the receiver operating characteristic (ROC) curve based on the acute rejection (AR) rate within the first year. The primary endpoints were the rate of AR and graft loss. The secondary endpoints were the rate of infection and patient survival. The optimal cut-off point for TTR was 59%. Methods: Two 12-hour Tac pharmacokinetic investigations (13 samples each), separated by at least one week, were performed in 27 renal transplant recipients with a target Tac trough of 4-7 µg/L. Patients were included 3 ± 1 weeks after transplantation and received mycophenolate mofetil and prednisolone in addition to twice-daily tacrolimus as maintenance immunosuppressive therapy. At each sampling point, 2 venous and 2 capillary samples (10 µL) were obtained; one pair (venous/capillary) went directly to the lab while the second pair was sent via ordinary mail service.
Tacrolimus was analyzed using the standard mass spectrometry assay at the hospital, and the microsampling method was validated according to the European Medicines Agency (EMA) guideline on bioanalytical method validation. Results: A total of 682 pairs of venous and microsampled specimens from patients were assayed for Tac concentrations. Mean Tac dose and trough concentration were 3.5 ± 1.5 mg and 6.6 ± 1.5 µg/L, respectively. The microsample concentrations were on average 4.2% (95% CI: -5.1% to -3.4%) lower than the venous samples. All results of the method validation were within the criteria of the EMA guideline. The range of Tac concentrations tested was 0.7 to 57 µg/L; accuracy ranged from 88% to 98% and imprecision was below 5% (except at 0.7 µg/L, where the CV was 11%). The Mitra® tip microsamples were stable for at least 30 days at room temperature and were not influenced by postal service shipment. Conclusions: Measurement of whole blood tacrolimus concentrations in 10 µL capillary blood from renal transplant recipients can be reliably and safely performed using the VAMS™ technique. Following a minimum of training, this also provides an option for self-collection by patients. No significant difference was observed when the relative dd-cfDNA quantification method was applied. There was no difference in urinary BKV loads among the 3 groups (median 2.9×10^8 vs 3.5×10^7 vs 1.4×10^8 copies/ml, P=0.284). Plasma BKV load in the definitive BKVN group (median, 3000 copies/ml) was significantly higher than in the resolving BKVN group (median, 0 copies/ml, P=0.015) and the non-BKVN group (median, 0 copies/ml, P=0.003). The mean serum creatinine level in the definitive BKVN group (223.5±114.9 μmol/L) was slightly higher than in the resolving BKVN group (182.9±70.9 μmol/L, P=0.186) and significantly higher than in the non-BKVN group (130.7±39.4 μmol/L, P=0.014). Receiver operating characteristic analysis combining urinary dd-cfDNA and BK viruria (>10^7 copies/ml) showed that the optimal cut-off to detect BKVN is 10.2 ng/ml, with a sensitivity of 55.6%, specificity of 92.3%, and area under the ROC curve of 0.759. The combined use of absolute quantification of dd-cfDNA and BK viruria (>10^7 copies/ml) testing may improve the noninvasive diagnosis of definitive BKVN in kidney transplant recipients. Patients with dd-cfDNA+/BK viruria (>10^7 copies/ml) have a high probability of definitive BKVN. Methods: Affymetrix microarray chips were used to build a molecular classifier to distinguish biopsies with BK virus from all others (N=1679, 55 with BK and 195 with histologic TCMR). The same analysis was repeated in all biopsies with molecular TCMR (N=192, 30 with BK). Molecular diagnoses were assigned with 10-fold cross-validation using the glmnet package in R. All steps, including probe set selection, were carried out from scratch within each cross-validation training set. These classifiers were used in conjunction with published classifiers for TCMR. Results: The top differentially expressed genes between TCMR and BK were expressed at higher levels in BK and included PIF1 (5'-to-3' DNA helicase, which suppresses genome instability), CCDC74A (Coiled-Coil Domain Containing 74A), and immunoglobulin genes (IGHM and IGLL5) representing a plasma cell infiltrate.
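Both cut-off analyses above (the 59% TTR threshold and the 10.2 ng/ml dd-cfDNA threshold) come down to picking an operating point on a ROC curve. A minimal sketch of that step, on a hypothetical marker/outcome data set and using scikit-learn with Youden's index; this is illustrative only and not either study's actual pipeline or data:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical data: continuous marker values and binary outcomes (1 = event).
marker = np.array([3.1, 8.4, 12.0, 5.6, 15.2, 2.2, 9.9, 11.3, 4.8, 14.1])
event  = np.array([0,   0,   1,    0,   1,    0,   1,   0,    0,   1])

# ROC curve and area under the curve.
fpr, tpr, thresholds = roc_curve(event, marker)
auc = roc_auc_score(event, marker)

# Youden's J (sensitivity + specificity - 1) is one common way to choose
# an "optimal" cut-off like those reported above.
j = tpr - fpr
best = np.argmax(j)
print(f"AUC = {auc:.2f}")
print(f"optimal cut-off = {thresholds[best]:.1f} "
      f"(sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%})")
```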
While most biopsies with high molecular BK scores had histologic BK, there was considerable overlap with molecular TCMR; some BK biopsies had low molecular BK scores (Figure 1A). The area under the curve (AUC) for BK vs all other diagnoses was 0.82 in the entire population (not shown) and 0.81 in the molecular TCMR population (Figure 1B). At the optimal cutoff (0.15 on the x-axis of Figure 1C), the balanced accuracy (mean of sensitivity and specificity) within the molecular TCMR population was ~0.77. The diagnostic accuracy of the classifier using human mRNAs on the current microarray is not sufficiently high to allow reliable separation between kidneys with and without BK (although measurement of viral mRNAs will probably resolve this; research in progress). However, given the relatively high AUC, the classifier can be used to flag biopsies with molecular TCMR that have a high probability of BK coexisting with TCMR. Purpose: BK viremia is associated with inferior kidney allograft outcomes. Typical treatment involves reduction in immunosuppression. However, reduction in immunosuppression is also likely to lead to an increased immune response against the allograft. We tested the hypothesis that increasing net immunosuppression among those with resolved or low-grade BK viremia leads to improved graft outcomes. Methods: We conducted a retrospective analysis of patients transplanted at a large midwestern academic medical center from July 2011-June 2013. All patients who were positive for BK viremia (termed the primary event) with a peak level above 1000 copies/ml (Viracor laboratory) were selected for inclusion in the study. Changes in immunosuppression were noted from nursing notes and confirmed from the clinic visits/medication list in the EMR. Our outcome of interest was time to a composite of recurrent BK viremia, biopsy-proven acute rejection (BPAR) and de novo DSA (together termed the secondary event), as well as all-cause graft failure. Recurrent BK viremia was defined as re-emergence of BK >1000 copies/ml if it had become undetectable, a doubling from the prior level, and/or a level that prompted another decrease in IS when it had never become undetectable. Changes in immunosuppression between the primary and secondary event were classified as follows. Group 0: after the initial reduction due to the primary event, net immunosuppression was not increased. Group 1: net immunosuppression was increased. Patients entered the analysis at the time of the primary event. Results: 88 patients were available for the study, with 44 in Group 0 and 44 in Group 1. There was no statistical difference between the groups with respect to gender, pretransplant sensitization and peak BK viremia levels. A similar proportion of patients in each group had undetectable BK viral levels. The figure below highlights that rechallenging with increased net immunosuppression (Group 1) led to a lower rate of the composite of recurrent BK viremia, de novo DSA and BPAR. The lower composite rate was driven by lower de novo DSA formation and BPAR. There was an increased rate of recurrent BK viremia with a net increase in immunosuppression (Group 1). However, all-cause graft loss was significantly lower in the group that was rechallenged with immunosuppression. Rechallenging patients with increasing immunosuppression after BK viremia leads to a reduced risk of overall graft failure. In the future, randomized controlled trials focused specifically on increasing immunosuppression in those with undetectable or low-grade BK viremia to reduce the risk of graft failure need to be conducted.
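The composite-endpoint comparison in the BK rechallenge study above is a standard two-group time-to-event analysis. A minimal sketch on a hypothetical patient-level table, using the lifelines package; the column names and values are assumptions for illustration, not the study's actual data or code:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical layout: one row per patient entering at the primary BK event,
# with time (months) to the composite endpoint (recurrent BK viremia, BPAR,
# or de novo DSA) and an indicator for whether the composite occurred.
df = pd.DataFrame({
    "group":     ["0", "0", "0", "1", "1", "1"],
    "months":    [4.0, 12.5, 18.0, 9.0, 20.0, 24.0],
    "composite": [1,   1,    0,    0,   1,    0],
})

kmf = KaplanMeierFitter()
for label, sub in df.groupby("group"):
    kmf.fit(sub["months"], sub["composite"], label=f"Group {label}")
    print(kmf.survival_function_.tail(1))  # freedom from the composite endpoint

g0, g1 = df[df.group == "0"], df[df.group == "1"]
res = logrank_test(g0["months"], g1["months"], g0["composite"], g1["composite"])
print(f"log-rank p = {res.p_value:.3f}")
```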
(23), total cholesterol >200 (40), hypertension (183) and diabetes on insulin (79). The hypotheses were tested using t-tests, with a p-value <0.05 for significance. Results: Eighteen deaths and 43 CVE were observed. FRS and RCRI (utilizing pre-transplant creatinine) were unable to predict CVE or mortality at 1 or 3 years post-transplant. RCRI (utilizing post-transplant creatinine) was not statistically significant for CVE or death at one year. At 3 years, RCRI did not significantly predict death but did show a significant prediction of CVE (p<0.05). Conclusions: These data identify the potential for RCRI to measure CVE risk outside of the perioperative timeframe at 1 and 3 years, with predictive value at the 3-year time point. FRS was unable to significantly predict CVE at 1 and 3 years in this sample. Longer follow-up is necessary to accurately assess both mortality and CVE. Studies using these and other cardiovascular risk tools prospectively could yield a credible non-invasive approach to assessing this at-risk population. Among respondents, 46% identify one staff member to serve as a "point person", most commonly a coordinator (83.8%). Communication with candidates is predominantly conducted by email (38.7%), telephone (35.5%) and mail (19.5%). 17 reported written evaluation protocols and 25 reported educational materials for candidates. The most common reasons for closing evaluations in 2017 included visa barriers (24.2%), inability to complete the evaluation (13.95%), and program concerns regarding follow-up (11.3%), access to other healthcare (9.8%), the donor-recipient relationship (9.8%), financial impacts (8.8%) and motivation (7.7%). Programs that do not evaluate international LKD endorse similar lead concerns. Compared to staff time required for evaluation of a domestic candidate, time for an international candidate evaluation was estimated as >3-fold higher by 29.6%, 1.5- to 3-fold higher by 46.9%, equivalent by 21%, and less by 2.4%. The evaluation of international LKD is a resource-intensive process that raises key considerations for communication, travel, assurance of motivation, evaluation completion, financial impacts, and access to follow-up care. The lack of written protocols at most programs and the time-consuming nature of these evaluations highlight the need for guidance to support efficient, appropriate care of this unique candidate group. In an effort to standardize SLKT utilization, UNOS/OPTN implemented a new SLKT policy on August 10, 2017. We aimed to characterize the early impact of this policy change on listing for SLKT, SLKT transplantation, and early post-SLKT renal/overall survival. Methods: UNOS data for all non-status 1 adult (≥18 y) patients listed for LT from 8/10/16-11/10/17 were analyzed. To determine the impact of the SLKT policy change, we compared LT candidates in the following periods based on either their listing date (to compare listing characteristics) or transplant date (to compare transplant recipient characteristics): Era 1 (8/10/16-11/10/16); Era 2 (5/10/17-8/10/17); Era 3 (8/10/17-11/10/17). We then determined the primary outcome, a composite of either 90-day renal graft failure (RF) (as reported to UNOS, at 90 days post-DDRT, not including mortality) or 90-day mortality, by era of SLKT. Cox regression clustered on center then determined the association between the period of SLKT and the primary outcome. Results: There were 1621 patients listed in Era 1, 1672 in Era 2, and 1623 in Era 3.
There were no significant differences in the percentage of patients listed for SLKT between the listing eras: Era 1: 13% v. Era 2: 12% v. Era 3: 12% (p=0.56) (Figure). There were 969 LT recipients in Era 1, 1018 in Era 2, and 997 in Era 3. There were no significant differences in the percentage of patients who underwent SLKT between the transplant eras: Era 1: 14% v. Era 2: 12% v. Era 3: 11% (p=0.17) (Figure). As compared to SLKT recipients before the policy change, SLKT recipients who underwent transplant after the policy change were more likely to meet the UNOS SLKT criteria (Era 3: 62% v. …). Purpose: Acute and chronic kidney injury are common occurrences in patients with end-stage liver disease and are associated with significant morbidity and mortality. We aimed to determine the utility of native kidney biopsy to predict renal dysfunction following liver transplantation, as well as to better characterize pathologic findings in patients with hepatorenal syndrome. Methods: We performed a prospective observational trial to identify potential predictors of severe renal dysfunction following liver transplantation. A native kidney biopsy was performed at the time of liver transplantation and reviewed by a single pathologist. The primary outcome of interest was the development of chronic kidney disease (CKD) stage 4 or 5. The secondary outcome measured was death. Immunofluorescence, light microscopy, and electron microscopy findings on the native kidney biopsy were compared between patients with the primary and secondary outcomes. Results: A total of 89 patients underwent native kidney biopsy at the time of liver transplantation. Recipient, transplant, and donor characteristics of the study population are shown in Table 1. Sixteen patients went on to develop CKD stage 4 or 5 and 14 patients died during the follow-up period. A larger proportion of patients who either developed CKD stage 4 or 5 or died had >20% interstitial fibrosis and tubular atrophy (IFTA) as compared to those who did not experience either endpoint (p = 0.03). There were no significant differences in glomerulosclerosis (GS) or arterial sclerosis between the two groups (p = 0.18 and 0.38, respectively). Conclusions: Cirrhosis of the liver can be accompanied by a variety of microscopic glomerular disorders. Advanced interstitial fibrosis on native kidney biopsy is associated with poor renal outcomes following liver transplantation. [Table]. For instance, across the entire population, older recipient age was associated with lower odds of ATG use [OR = 0.86 (95% CI 0.81-0.91) per 10.9 y (1 SD) increase after 40 y]. However, when estimated specifically for each center, this odds ratio was greater than 1.0 in 72 (26.0%) centers, indicating that older recipient age was associated with greater odds of ATG use in these centers [Figure]. There is substantial center-level variation in personalizing induction immunosuppression, and practices at different centers are contradictory. Evidence-based guidance on this practice is warranted. Purpose: Alemtuzumab induction leads to profound T cell depletion followed by a prolonged period of repopulation. Repopulation is thought to be influenced both by thymic output and by homeostatic expansion of mature peripheral cells, though the relative contribution of each has not been well studied. We have longitudinally investigated the kinetics of repopulating T cells in 40 patients (median 47 years, range 20-69) following alemtuzumab induction over 36 months post-transplantation.
Maintenance immunosuppression was belatacept-based. In vitro, we studied purified CD57+ and CD57- cells with or without IL-7 or autologous mature dendritic cells (mDCs). Results: Substantial homeostatic T cell proliferation was seen in all patients and was characterized by increased intracellular Ki67 expression (p<0.05) that returned to baseline over a period of 6-18 months. This was not observed in a control cohort of non-depleted patients. Following repopulation, alemtuzumab-treated patients were significantly enriched for naïve T cells (TN: CCR7+, CD45RA+, CD57-, PD1-, CD28+) when compared with baseline (p<0.05). Use of a generalized estimating equation (GEE) linear model revealed a strong negative linear association between frequencies of reconstituting CD4+ and CD8+ TN cells and advancing patient age (Bonferroni-adjusted p<0.0036) that was evident within 18 months post-depletion. A direct relationship between age and persistence of CD4+ and CD8+ effector memory T cells (TEM), including CD57+ memory subtypes, was also shown (p<0.0036). To assess the role of thymic output in T cell expansion, we analyzed 28/40 patients for repopulating CD31+CD4+ cells within 12 months post-transplantation and found a significant increase in the frequency of CD4+CD31+ cells when compared with baseline (p<0.002 at 6 months, p<0.001 at 12 months). Patients under 30 years old showed significantly more CD4+CD31+ cells (p<0.0009), while the frequency of CD4+CD31+ cells in patients over 55 years old was similar to baseline (p<0.6) and was significantly lower than in young patients (p<0.0087) before and after depletional induction. CD4+CD57- cells proliferated in the presence of IL-7 (78.7±15.4%) or mDCs (71.3±28.1%). In contrast, CD4+CD57+ cells did not proliferate in the presence of IL-7 (8.2±5.3%), but autologous mDCs induced CD57+ cell proliferation (37.6±5.6%). In summary, our data provide insight into the importance of age-related thymic output and homeostatic expansion of T cells after alemtuzumab induction. We also find a subordinate role for autologous mDCs in the expansion of residual memory T cells. Overall rates of acute rejection and major infection within the first year were 25% and 15%, respectively, with no significant differences identified between groups. Conclusions: Alemtuzumab induction appears to be safe and effective regardless of recipient age. Use of alemtuzumab in elderly recipients may mitigate any potential negative immunologic effects of receiving ECD kidneys. In the present study, recipient and donor characteristics were comparable between all age groups, except recipient age as expected (p<0.01). Although patient survival at 1 year (80%, 88%, 92%) and 3 years (80%, 83%, 92%) was similar in age groups 18-45, 46-59, and 60-69, respectively, it was significantly lower in recipients ≥70 years at 1 year (60%) and 3 years (40%) (Figure 1A). Recent SRTR data comparing recipient age < and >65 years in CLKT showed similar outcomes. To replicate the SRTR survival stratification, we ran an analysis using 18-45, 46-64, and ≥65 years age groups as controls (Figure 1B). The control analysis showed that patient survival was similar among the different age groups, as shown by the SRTR analysis. Conclusions: Although LTA can be safely performed in selected elderly recipients, extreme caution is needed in recipients ≥70 years undergoing CLKT due to the magnitude of the operation and expected poor outcomes.
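The age association reported above for reconstituting naïve T cells was tested with a generalized estimating equation (GEE) linear model for repeated measures. A minimal sketch of such a model on hypothetical long-format data, using statsmodels; the variable names and values are assumptions and this is not the authors' actual analysis code:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: repeated naive T cell frequencies (% of CD4)
# per patient across post-depletion time points, plus patient age.
df = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "month":   [6, 12, 18, 6, 12, 18, 6, 12, 18],
    "age":     [28, 28, 28, 47, 47, 47, 63, 63, 63],
    "tn_freq": [55.0, 62.0, 66.0, 40.0, 44.0, 47.0, 22.0, 25.0, 27.0],
})

# GEE linear model with an exchangeable working correlation within patient,
# analogous in spirit to testing the age association described above.
model = smf.gee("tn_freq ~ age + month", groups="patient", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
result = model.fit()
print(result.summary())
```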
Purpose: Alcoholic liver disease (ALD) is a common indication for liver transplantation. Patients with ALD are often required to be abstinent for a period of time prior to being listed. This may reduce disease progression while on the waitlist and therefore alter their waitlist outcomes compared to those with other disease etiologies. This study aimed to evaluate the discrepancy in waitlist outcomes between those with ALD and non-alcoholic liver disease (NALD). Methods: Data for adult patients listed for liver or liver-kidney transplant after the introduction of MELD-Na score-based liver allocation (Jan 2016-June 2018) were obtained from the OPTN/UNOS. The following were selected as major liver disease etiologies: ALD cirrhosis, hepatitis C cirrhosis, non-alcoholic steatohepatitis cirrhosis, primary biliary cholangitis, and primary sclerosing cholangitis. Patients with overlapping diseases and those with an exception score were excluded. Patients were categorized into different listing MELD-Na score groups (6-14, 15-20, 21-25, 26-29, 30-34 and >34) to identify variations in outcomes among the different score ranges within each etiology. Waitlist outcomes were studied with risk adjustment for recipient characteristics including age, gender, race, region, functional status, dialysis use, presence of diabetes, ascites and encephalopathy. Results: Patients with ALD showed the lowest waitlist mortality rate and highest recovery rate among disease groups (Figures 1A-F). Overall waitlist mortality stratified by listing MELD-Na score was lower in ALD patients compared to NALD patients in the mid-score groups (scores of 15-29) (p<0.05), whereas there was no difference in the lowest (6-14) and higher-score groups (30 or higher, Figure 2). Overall transplant probability was similar in ALD and NALD patients in the score groups of 26 or higher, whereas ALD had lower probability in the lower score groups (<26). ALD also showed a higher chance of recovery in the lower to mid-score groups (6-29). Conclusions: ALD cirrhosis patients have lower waitlist mortality and better recovery while on the waitlist compared to patients with other etiologies. This was prominent in the mid-score groups. These results suggest that risk stratification and prioritization of liver allocation may need to be altered according to liver disease etiology. Purpose: We previously identified donor lymphocytes in the peripheral blood of recipients directly after transplantation, persisting for at least three weeks and characterized by the tissue-retention marker CD69. In order to determine their origin, we hypothesized that donor T and NK cells have a distinct tissue-resident memory phenotype that is shared between cells derived from lung perfusates, trachea, parenchyma and lymph nodes (LN). Methods: Donor lymphocytes in recipient blood were determined in 27 lung recipients at T0, T24 and 3 weeks by staining of donor HLA class I molecules in combination with lineage- and tissue-specific markers using FACS. The phenotype of T and NK cells in perfusates (n=30), donor trachea (n=5), LN (n=10) and recipient explanted parenchyma (n=15) was compared to that of circulating cells using the same markers. In the peripheral blood of all lung recipients, donor T and NK cells were detected at T0, T24 and 3 weeks; they had higher CD69 expression compared to recipient cells (p=0.001 to 0.03) and were mostly CD25-.
This phenotype was similar to that of T and NK cells in corresponding perfusates, with significantly increased CD69 expression compared to circulating PBMCs (all p<0.005) and without CD25 expression. NK cells from donor perfusates and tracheas were CD56dim CD16+, in contrast to lung-draining LN, which only possess CD56bright CD16- NK cells. T cells from trachea and LN also showed high CD69 expression and a significant enrichment of effector memory (CCR7-CD45RO+) T cells (all p<0.03). In donor trachea and recipient parenchyma, CD69+ T cells showed co-expression of other tissue-residency markers such as CD103, CD49a and PD-1, while these were not found in perfusates, highlighting differences between these compartments. and evaluated the relationship with EAR and graft loss. Results: Pre-transplant blood RNA sequencing profiles revealed down-regulation of NK and CD8+ T cell signatures associated with EAR post-transplant. We further identified a set of 23 genes that predicted EAR in the discovery set (AUC = 0.80) and in the validation set (AUC = 0.74). When we excluded the recipients of allografts with >4 HLA-allele mismatches, the AUC increased to 0.89. The risk score derived from the gene set was also significantly associated with AR after 6 months post-transplant (p = 0.041), antibody-mediated rejection (ABMR) or de novo donor-specific antibodies (DSA; p = 5.17e-4), and long-term graft loss (p = 0.043) in the two validation sets, especially for recipients of allografts with ≤4 HLA mismatches (p = 0.005 for AR, p = 3.07e-8 for ABMR or de novo DSA and p = 3.09e-4 for graft loss). Lastly, the gene set appears to be associated with autoimmune diseases (such as lupus, inflammatory bowel disease, and diabetes) in publicly available blood expression datasets of autoimmune disease cohorts. We identified a pre-transplant blood 23-gene set that predicts EAR and is associated with ABMR, de novo DSA, and allograft loss. This gene set is an important new tool to risk-stratify recipients before kidney transplantation and to help titrate immunosuppression to individual patient needs. This assay could also be applied to immune monitoring of individuals with autoimmune diseases. Here we analyzed the phenotypic polarization, activation and functional status of CXCR5+CD45RO+CD4+ circulating T follicular helper cells (cTFH), which are known to be critical for antibody production by B cells. We consented 11 healthy controls (HC) and 47 Thymoglobulin-induced KTx patients participating in an ongoing observational cross-sectional study. Twenty-four patients were identified as DSA+ (MFI above 1000), including 13 asymptomatic DSA patients without proven antibody-mediated rejection (ABMR) and 11 undergoing biopsy-proven ABMR. In addition, 23 DSA-negative patients comprised 9 patients with acute cellular rejection (TCMR) only and 14 stable, rejection-free patients. Whole blood was obtained and analyzed by flow cytometry to identify cTFH subset polarization (Th1, Th2, Th17) and activation (ICOS/PD1) status. For functional assays, FACS-sorted cTFH were incubated with memory (m) B cells ± Staphylococcus aureus Enterotoxin B (SEB) for 6 days in in vitro co-cultures and analyzed for their ability to trigger plasmablast formation and IgG production. Results: All DSA+ KTx patients displayed increased levels (%) of Th1/Th17-cTFH cells with activated phenotypes as compared to HC (p=0.007) or stable patients (p=0.05).
Interestingly, only ABMR patients also presented significantly higher levels of activated (CD62Llow ICOShi PD1hi) Th2-cTFH subsets (the most functionally potent) compared to all other groups (p=0.0004 vs HC, p=0.003 vs stable, p=0.01 vs TCMR, and p=0.03 vs asymptomatic DSA+ patients). Principal component analysis (PCA) incorporating all 47 patients and repeat count values from 9 phenotypic markers independently confirmed that ABMR KTx recipients cluster distinctly from asymptomatic DSA+, stable and TCMR KTx recipients. In addition, while cTFH from all KTx patients are functional and support in vitro plasmablast formation and IgG production, cTFH from ABMR patients triggered significantly higher pathogenic IgG1/IgG3 production from mB cells compared to asymptomatic DSA+ patients. Conclusions: Our data underscore the value of cTFH cell monitoring, in addition to the currently available DSA measurements, for a comprehensive mechanistic characterization of DSA responses, and support immunotherapeutic strategies to target TFH cells in DSA+ KTx patients at risk of ABMR. Purpose: Treg-rich organized lymphoid structures (TOLS) have been identified in recipients with renal allograft tolerance induced by transient mixed chimerism in non-human primates (NHPs) and humans. To further clarify their relevance to tolerance, detailed immunological profiles in renal allografts were compared with those in chronic antibody-mediated rejection (CAMR) and T cell-mediated rejection (TCMR) in NHPs. Methods: Using the NanoString nCounter platform, we retrospectively studied 53 mRNAs in 256 kidney allograft formalin-fixed paraffin-embedded serial samples taken from NHP recipients of combined kidney and bone marrow transplantation that achieved tolerance (TOL), developed CAMR, or developed TCMR. Results: TOL recipients (n=14) survived for >1736 ± 454 days with normal kidney function, while recipients with CAMR (n=13) survived for 899 ± 152 days with compromised graft function and recipients with TCMR (n=15) survived only briefly (130 ± 17 days) (Fig. 1A). The most prominent difference observed among the three groups was in FOXP3, which was significantly higher in TOL than in both CAMR and TCMR, both early (within one year) and later after transplantation. Other mRNAs potentially related to Tregs, such as IL10, TGFβ and GATA3, were also high in TOL. In contrast, transcripts of inflammatory cytokines (IFNG, CXCL11, FCGR3A, GNLY, GZMB, IL4 and IL1RL1) were higher in TCMR, while activated endothelium-associated transcripts, such as CAV1, MALL, VWF, TEK, ROBO4, SOX7 and PECAM1, were higher in CAMR (Fig. 1B). Purpose: Donor-specific anti-HLA antibodies (DSA) are a major risk factor associated with renal allograft outcome. As a trigger of B cell antibody production, T follicular helper cells (Tfh) promote DSA appearance. Recent data suggest that cTfh may be present long before DSA appearance. Methods: We measured circulating Tfh (cTfh) levels on the day of transplantation and one year later in blood from a prospective cohort of 237 renal transplant patients without DSA during the first year post-transplantation. Total cTfh were characterized as CD4+CD45RA-CXCR5+, and three subsets of activated cTfh were analyzed: CXCR5+PD1+, CXCR5+PD1+ICOS+ and CXCR5+PD1+CXCR3-. Results: Immunizing events (previous blood transfusion and/or pregnancy) and the presence of class II anti-HLA antibodies were associated with increased frequencies of the activated CXCR5+PD1+, CXCR5+PD1+ICOS+ and CXCR5+PD1+CXCR3- cTfh subsets.
Whereas ATG-depleting induction and calcineurin inhibitor treatment decreased the total level of cTfh, activated cTfh subsets were increased at one year post-transplantation. In multivariate survival analysis, a decrease in activated CXCR5+PD1+ICOS+ cTfh at one year after transplantation in the blood of DSA-free patients was significantly associated with the risk of developing dnDSA after the first year (p=0.018, HR=0.39), independently of HLA mismatches (p=0.003, HR=3.79). These results highlight the importance of monitoring activated Tfh in patients early after transplantation. They show that current treatments are not able to prevent the activation and migration of such Tfh early or efficiently enough, pointing to the need to develop innovative treatments to specifically target them in order to prevent DSA appearance in renal transplantation. Purpose: kSORT, a 17-gene blood biomarker, has been retrospectively validated for detection of biopsy-proven acute rejection (AR) and immune quiescence in kidney transplantation (tx). This is the first assessment of the predictive accuracy of pre- and serial post-tx kSORT in a prospective clinical trial of high-immunologic-risk tx patients. Methods: 113 kidney tx recipients with a cPRA of >50% (median 97%) were enrolled pre-tx and followed for 12 months post-tx in the PRISM (Prediction of Rejection In Sensitized saMples) trial. A protocol bx was done at 6 months and/or at graft dysfunction. kSORT was run pre-tx and 5 times post-tx (0.5, 1, 2, 3, 6 months post-tx). The customized software kSAS generated actionable immune risk scores as High- (HR) or Low- (LR) risk for AR. All patients received induction with Thymoglobulin and were maintained on TAC, MMF and prednisone. Statistical analysis used R and Fisher's exact test, and kSORT reads were correlated with biopsy histological results. Results: 98 evaluable patients had kSORT analyzed across 560 serial blood samples, and results were correlated with 93 bx (34 for cause/indication), with 12 AR and 20 borderline AR (bAR) cases. The cumulative incidence of biopsy-confirmed rejection was 25% at one year (25/98 patients). 15.7% of blood samples processed for kSORT had a null call of Indeterminate-Risk (IR). The overall predictive accuracy of the pre-tx kSORT (available in 54 patients) was 90.3% for no-rejection post-tx. 18% of pre-tx samples were HR and did not correlate with the cause of sensitization. For post-tx blood samples paired with bx, 51 patients had no rejection and 47 were correctly classified as LR (specificity 92.2%). Among the 20 bAR cases, kSORT was HR in 93% of those with definitive kSORT scores (HR in 14; LR in 1; IR in 5). Among the 12 AR cases, 78% were HR. Combining the AR and bAR groups, kSORT had 84% accuracy for matching HR results with a biopsy rejection call. The diagnostic odds ratio of a kSORT+ (HR) call in the AR and bAR group, compared to the odds of an LR call in the same group, was >27-fold (dOR=27.4), confirming that the kSORT test can discriminate between BPAR outcomes (p<0.0001). After treatment of AR, 42% of patients remained kSORT positive. Conclusions: kSORT can be applied both pre-tx and at serial times post-tx to monitor patients at risk of rejection among highly sensitized patients. A LR kSORT score has 90% accuracy for predicting freedom from rejection, either before or after tx. Pre- and post-tx kSORT assessment is an important adjunct measure for instituting precision medicine in the management of sensitized renal tx recipients and optimizing their outcomes.
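The kSORT performance figures above (specificity, accuracy, diagnostic odds ratio) derive from a 2x2 table of test calls versus biopsy outcome. A minimal worked sketch follows; only the 47/51 no-rejection counts are taken from the abstract (specificity 92.2%), while the rejection-side cell counts are assumed for illustration and do not reproduce the trial's exact numbers:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, accuracy and diagnostic odds ratio
    from a 2x2 table of test calls versus biopsy outcome."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + fp + tn)
    dor = (tp * tn) / (fp * fn) if fp and fn else float("inf")
    return sens, spec, acc, dor

# tn=47, fp=4 reflect the reported 47/51 correctly low-risk no-rejection cases;
# tp and fn are assumed for the example.
sens, spec, acc, dor = diagnostic_metrics(tp=18, fn=7, fp=4, tn=47)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, "
      f"accuracy {acc:.1%}, diagnostic OR {dor:.1f}")
```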
The field of organ transplantation continuously struggles with inadequate numbers of donor organs. Therefore, efforts remain focused on increasing organ utilization and expanding the donor pool. Transplant centers, UNOS, and the SRTR have continued to explore increased organ utilization in the areas of PHS increased-risk donors, high KDPI, and most recently the use of organs exposed to or infected with viral hepatitis, organs with historically higher discard rates. Organ acceptance starts at listing, when designating whether a patient will accept an HCV Ab+ and/or NAT+ organ. There is ongoing ambiguity surrounding the clinical impact on recipients when considering HCV Ab+/NAT- and HCV NAT+ donors; this ambiguity extends to calculating KDPI. Aims: (1) To determine if HCV Ab+/NAT- donors result in recipient infection with HCV. (2) To determine if the risk of recipient infection varies across organ types in our multi-organ experience. Methods: A retrospective review was conducted of all transplant recipients from 2013-2018 at Vanderbilt Transplant Center who received HCV Ab+/NAT- grafts. No recipient had previously been exposed to HCV at the time of listing. A total of 27 patients received an HCV Ab+/NAT- graft: 9 heart recipients, 3 kidney recipients, and 15 liver recipients. Results: Of the 27 transplant recipients, no recipient developed HCV infection post-transplant, as evidenced by serial undetectable PCR testing. Table 1 outlines the number of post-transplant recipients who received PCR testing at the following intervals: first 3 months, 6 months, and 12 months. Conclusions: Our experience expands on the existing literature, adding multi-organ data on the use of HCV Ab+/NAT- grafts. The results of this study suggest that the use of HCV Ab+/NAT- grafts carries no perceived additional risk of HCV infection to recipients. This underscores the importance of educating transplant candidates and the community on acceptance of HCV Ab+/NAT- organ offers to encourage utilization of these often-overlooked organs. Table 2. Average time from viremia to start of DAA was 15.8 ± 10.4 days. The most common genotype was 1a (60%), followed by 3a (28%). The most commonly prescribed DAA was ledipasvir and sofosbuvir (56%), followed by velpatasvir and sofosbuvir (32%), then glecaprevir and pibrentasvir (12%). Eleven patients had a detectable HCV RNA at SVR4, with a median of 109 copies/mL (IQR 7, 251). Three patients had a detectable HCV RNA at SVR12, which was detected at <12 copies/mL, the lower limit of quantification of our lab. Eight patients achieved clearance at SVR24, while one patient had a detectable level at SVR24 that was a new genotype from when therapy was started. Patient and graft survival was 100% and 96%, respectively. No patients developed clinical liver disease. Conclusions: HCV-negative recipients can be safely and successfully transplanted with HCV-positive donor kidneys outside of a research protocol. HCV organs can expand the organ pool and should no longer be considered experimental. We believe the use of these organs in HCV-negative recipients decreases waiting time, has excellent outcomes, and should be considered standard of care. Purpose: Ledipasvir/sofosbuvir (LDV/SOF) in combination with ribavirin (RBV) has shown excellent results post-liver transplant. However, due to the poor tolerability of RBV, it is commonly questioned whether LDV/SOF monotherapy would be equally efficacious.
Additionally, a short course of LDV/SOF has shown preliminary feasibility immediately post-transplant; however, this may be challenging to replicate. We aimed to compare the efficacy and safety of LDV/SOF monotherapy with LDV/SOF+RBV in patients post-liver transplant, as well as to determine the ability of LDV/SOF monotherapy for 8 weeks to prevent recurrence if given within 90 days post-transplant. Methods: This was a multicenter, randomized, open-label, phase IV study in patients post-liver transplant with genotype 1 or 4 HCV. Patients within 90 days of transplant (Early Cohort) were randomized to receive either 8 or 12 weeks of LDV/SOF monotherapy (target N = 60), and patients more than 90 days out from transplant (Late Cohort) were randomized to LDV/SOF monotherapy or LDV/SOF+RBV for 12 weeks (target N = 170). Results: The Early Cohort was discontinued after 3 patients had been enrolled, and the study was halted after 32 patients had been enrolled in the Late Cohort. One patient in the Early Cohort received 12 weeks of LDV/SOF and was included in the Late Cohort data. Two patients in the Early Cohort received 8 weeks of LDV/SOF, one of whom had virologic relapse with a newly detected NS5A mutation and one of whom died from a cause unrelated to treatment. Within the Late Cohort, there were no differences in baseline demographics between groups, although 2 patients in the LDV/SOF group had a mutation that predicted NS5A resistance but was determined clinically insignificant by investigators (Table 1). Treatment outcomes were similar between groups, with 88% of the LDV/SOF cohort and 75% of the LDV/SOF+RBV cohort achieving SVR24 (Table 2). There was one death, unrelated to treatment, in the LDV/SOF cohort and one relapse. The relapse occurred in one of the two patients with a detected NS5A resistance mutation. In the LDV/SOF+RBV cohort, there was one patient who withdrew consent, one non-response, and two relapses. No patient who had virologic failure had a newly detected mutation predicting resistance. Treatment tolerability appeared to favor the LDV/SOF cohort over the LDV/SOF+RBV group (Tables 3 and 4). Conclusions: Due to changes in the national treatment atmosphere in the time it took to begin enrollment, there was a paucity of untreated HCV patients being transplanted. At any time post-liver transplant, ribavirin inclusion as part of the antiviral regimen leads to more overall and clinically significant adverse effects and does not appear to improve treatment efficacy. Purpose: Severe ischemia-reperfusion injury (IRI) causes delayed graft function and impairs renal allograft survival. Human urine-derived stem cells (hUSCs) are a new source of stem cells shed from the kidney and may possess a tissue-specific privilege in the treatment of kidney injury. Transplantation of hUSCs may be a potential therapeutic strategy for the treatment of renal IRI. This study investigated the protective effect of hUSCs on renal IRI in a rat model and explored the underlying mechanism involving exosomal microRNAs. Methods: hUSCs were harvested from fresh spot urine. The phenotype of cell surface markers and differentiation ability were analyzed to determine the characteristics of hUSCs. Adult male SD rats were used to establish a lethal renal IRI model. The left renal pedicle was occluded with a vascular clamp for 45 minutes. The contralateral kidney was nephrectomized after reperfusion of the left kidney.
One dose of hUSCs (2×10^6 cells) in the experimental group or saline in the control group was intravenously administered immediately after blood reperfusion. Blood was drawn every other day to measure sCr and BUN. Results: We characterized hUSCs by morphology, growth curves, CD markers (e.g., CD29, CD73, CD44) and renal markers (e.g., NEPHRIN, WT-1) (Fig 1). Compared to the IRI control group, hUSCs significantly increased the survival rate and decreased sCr and BUN after reperfusion. The acute tubular injury score was significantly decreased in the hUSC group (Fig 2). TUNEL analysis showed fewer apoptotic cells in the hUSC group. Cleaved caspase-3 and the apoptosis-related protein Bax were decreased, while the anti-apoptotic protein Bcl-2 was increased, in the hUSC group. The MPO staining score was significantly lower in the hUSC group. The MDA level was lower and the SOD level was significantly higher in the hUSC group (Fig 3). We separated exosomes from hUSC-conditioned medium and identified them by TEM imaging, western blot and size analysis. miR-146a-5p was the most abundant miRNA in the exosomal miRNA pool by microRNA sequencing. Through bioinformatics analysis, we found that miR-146a-5p could target the IRAK1 mRNA 3'UTR; we confirmed this interaction and showed that it subsequently inhibits activation of NF-κB signaling (Fig 4). Overexpression of miR-146a-5p using a mimic reduced oxidative stress, downregulated IRAK1, and inhibited nuclear translocation of NF-κB p65 in HK-2 cells after H/R injury (Fig 5). We systematically analyzed the protective function of hUSCs in a rat IRI model and demonstrated the underlying molecular mechanisms. hUSCs protected renal function after I/R injury, and hUSC-derived exosomes contained miR-146a-5p, which targets the IRAK1 3'UTR, subsequently inhibiting activation of NF-κB signaling and infiltration of inflammatory cells and protecting renal function. As a novel, non-invasively obtainable cell source, hUSCs represent an excellent therapeutic approach for kidney tissue regeneration, specifically for I/R injury. We recently reported that c-Kit cells isolated from developing kidneys exhibit regenerative potential in mouse models of acute nephrotic syndrome and ischemia-reperfusion injury. These cells have already been isolated from human deceased donors. We therefore hypothesize that c-Kit cells represent a kidney-specific stem population that is involved in kidney development and is maintained throughout adult life, as verified by lineage-tracing studies in vivo. We crossed inducible c-Kit reporter (c-KitCreERT2/+) mice with IRG, mT;mG, LacZ, multicolored Confetti, and BTBR ob/ob mice. By varying the time of tamoxifen treatment, c-Kit cells and their descendants were specifically labelled with enhanced green fluorescent protein (EGFP), LacZ, or multicolour fluorescence. Their spatiotemporal distribution was then followed during kidney development and during acute (ischemia-reperfusion and rhabdomyolysis) and chronic (diabetic nephropathy) kidney injury. Results: c-Kit expression was more abundant in the early postnatal (P) period (7.91 at P0.5-3.5 and 10.6 at P7-14 vs 3.13 at embryonic [E] day 17.5-18.5; P<0.0001) and was maintained throughout adult life, yet at lower levels (5.7 at P30 and 2.2 at P90-180). When tamoxifen was injected during E7.5-9.5, a few EGFP/LacZ+ cells were observed in tubular segments from cortex to medulla, and at E10.5-12.5, when metanephros development initiates, ribbons of c-Kit-EGFP/LacZ+ cells expanded to form tubular structures that resembled S-shaped bodies.
In the postnatal period, the number of c-Kit-EGFP/LacZ/clonal multicolour cells increased in the cortex, medulla, and papilla, where they were found in proximal and distal tubules, collecting ducts, and podocytes. In adult mice, these cells were found in distinct segments, including co-localization with calbindin-D28K (distal tubules) and AQP2 (collecting ducts). After acute injury, the number of c-Kit clones increased from 10±3 to 36.5±8 (P<0.0001) in the outer medulla. In the c-Kit;mTmG;BTBR ob/ob mice, a robust model of diabetic nephropathy, we found on average 21±3 glomeruli exhibiting c-Kit-EGFP cells (versus 12.2±2.2 glomeruli in heterozygous mice; P=0.035) and also 2. Purpose: Signals derived from gut microbiota are critical for the development and maintenance of the human immune system. In kidney transplantation (KT), the allograft shifts from the donor's microbiome environment to the recipient's, although the significance of this has been unclear. Methods: We prospectively enrolled living donor KT cases at two centers. Stool samples were obtained from donors and recipients before transplantation. Microbiota composition was analyzed from metagenomic DNA extracted from feces using the Illumina MiSeq system. The gut microbiome difference between donor and recipient was determined via the Euclidean distance between microbiome compositions. Results: The microbial distance was estimated from 55 donor-recipient pairs. The recipients were 47.7 ± 13.0 years old; donors, 47.5 ± 11.2 years old. There were 26 relatives, 25 married couples and 4 non-relatives. 17 pairs (30.9%) had ABO blood group incompatibility and 24 pairs (43.6%) had more than 3 HLA mismatches. The mean 1-year eGFR was 65.0 ± 16.6 mL/min/1.73 m2. Interestingly, the shorter the Euclidean distance, the lower the eGFR value at 1 year after transplantation (regression beta -0.503, P<0.001). Moreover, the lower tertile group of Euclidean distances had a significantly lower 1-year eGFR compared with the higher or middle tertile distance groups (P=0.020). In this study, we demonstrated that differences in gut microbiome composition between the recipient and donor were negatively associated with 1-year graft function. Moreover, they predicted lower 1-year graft function with eGFR <45 ml/min/1.73 m2 better than other basic immunologic parameters. Purpose: Cardiac progenitor cells (CPCs, c-kit+) are a well-characterized stem cell type shown to potentiate cardiac regeneration in myocardial infarction (MI) models. However, the immune response to transplanted adult CPCs (aCPCs) reduces cell retention and MI recovery. We aimed to increase the regenerative potential of aCPCs by reducing the immune response directed against the transplanted aCPCs using cyclosporine A (CSA). Methods: Adult CPCs isolated from Wistar Kyoto rats were transplanted (1×10^6) into Brown Norway MI rats (aCPC group). The immunosuppressant CSA was administered orally to MI rats receiving rCPC transplantation, grouped as aCPCs+CSA. CSA alone and Iscove's Modified Dulbecco's Medium (IMDM) served as control groups. Echocardiography was performed on days 1, 7 and 28 to determine MI recovery. Retention of the transplanted rCPCs was detected in heart tissue using GFP-expressing rCPCs by immunohistochemistry (IHC). The rate of apoptotic cell death in the injured myocardium was measured using the TUNEL assay. Inflammatory CD68+ cells were measured by IHC.
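The microbiome analysis above reduces each donor-recipient pair to a single Euclidean distance between compositional profiles and then relates that distance to 1-year eGFR. A minimal sketch on hypothetical relative-abundance vectors, using numpy and scipy; the values and taxa dimensions are assumptions, not the study's pipeline or data:

```python
import numpy as np
from scipy import stats

# Hypothetical relative-abundance vectors (taxa fractions summing to 1) for
# each donor-recipient pair; real profiles would have hundreds of taxa.
donor     = np.array([[0.30, 0.25, 0.20, 0.15, 0.10],
                      [0.50, 0.20, 0.15, 0.10, 0.05],
                      [0.10, 0.40, 0.25, 0.15, 0.10]])
recipient = np.array([[0.28, 0.27, 0.18, 0.17, 0.10],
                      [0.20, 0.35, 0.25, 0.10, 0.10],
                      [0.12, 0.38, 0.24, 0.16, 0.10]])

# Per-pair Euclidean distance between donor and recipient composition.
distance = np.linalg.norm(donor - recipient, axis=1)

# Association of donor-recipient distance with 1-year eGFR (hypothetical values).
egfr_1yr = np.array([48.0, 72.0, 55.0])
slope, intercept, r, p, se = stats.linregress(distance, egfr_1yr)
print(f"distances = {np.round(distance, 3)}")
print(f"beta = {slope:.2f}, r = {r:.2f}, p = {p:.3f}")
```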
Sera collected on days 2, 7, 14, 21 and 28 were used to measure inflammatory cytokines (IL-2, IL-17, IFN-γ, TGF-β), the anti-inflammatory cytokine IL-10, and antibodies against the cardiac self-antigens (SAgs) troponin-T and myosin. Results: MI rats transplanted with aCPCs+CSA demonstrated a significant increase in MI recovery and cell retention compared with MI rats receiving aCPCs, CSA or IMDM controls (p<0.05). MI rats transplanted with aCPCs showed increased infiltration of CD68+ cells and apoptotic cells in the infarcted myocardium compared with the aCPCs+CSA group (p<0.05). Sera collected on days 2 and 7 in the aCPC group showed increased inflammatory cytokines and antibodies to cardiac SAgs and reduced anti-inflammatory cytokines. In contrast, MI rats transplanted with aCPCs+CSA showed reduced inflammatory cytokines and antibodies to SAgs and increased levels of the anti-inflammatory cytokine IL-10 (p<0.05). In conclusion, MI rats transplanted with allogeneic aCPCs+CSA showed increased cell retention and reduced inflammatory cells and cytokines compared with the aCPCs, CSA and IMDM controls. Therefore, CSA reduces inflammation and increases the retention of transplanted aCPCs, which results in improved myocardial recovery. We previously demonstrated that donor-derived regulatory dendritic cells (DCreg), in contrast to recipient-derived DCreg, promote antigen-specific allograft protection. As myeloid-derived suppressor cells (MDSC) show an enhanced immune suppressive capability, we hypothesize that donor-type MDSC suppress alloimmunity by regulating donor-specific T cell clones. Methods: MDSC were generated from BALB/c bone marrow cells. Allogeneic mixed lymphocyte reaction (alloMLR) and regulatory T cell (Treg) induction assays were performed in vitro. B6 recipients were pre-treated with BALB/c MDSCs or conventional DC (cDC) controls. Subsequently, these mice were transplanted with either donor-type (BALB/c) or third-party (C3H) cardiac grafts. We then examined graft survival, allo-rejection/tolerance-associated immunophenotype and graft mRNA expression. Purpose: Contemporary clinical ex vivo lung perfusion (EVLP) protocols generally limit metabolic supplementation to insulin and glucose. Furthermore, conventional perfusates are devoid of many ingredients considered essential for cellular viability. We sought to determine whether improved metabolic supplementation with total parenteral nutrition (TPN) would improve lung function and graft quality in prolonged EVLP. Methods: Ten porcine lungs were perfused using normothermic EVLP for 24 hours. Lungs in the control group (n=5) were perfused with a cellular perfusate and supplemented with insulin and glucose. In the treatment group, the perfusate was additionally supplemented with conventional TPN, delivered as a continuous infusion of lipids and amino acids together with essential vitamins and cofactors added to the original perfusate. Functional parameters including oxygenation, pulmonary vascular resistance, and dynamic compliance were evaluated. Edema formation (as indicated by percent weight gain) was used as a surrogate for lung injury. Electrolyte and lactate profiles were monitored. Perfusate lipid concentrations were also followed to assess whether free fatty acids were being utilized over time. Results: Lungs in both the treatment and control groups demonstrated stable and acceptable oxygenation (partial pressure of arterial oxygen/fraction of inspired oxygen ratio >350 mmHg) up to 24 hours; however, oxygenation was significantly higher in the TPN group (p=0.01).
Other physiologic parameters, including compliance and pulmonary vascular resistance, were not different between groups (p>0.05). Lactate was not different between groups (p>0.05). Electrolyte profiles, including sodium, were more stable in the TPN group (p=0.04). Lipid analysis of the control group demonstrated a continuous depletion of free fatty acids over time. After initial stabilization, perfusate free fatty acid concentrations in the TPN group were significantly higher than in controls (p<0.001) and resembled in vivo levels. TNF-α was also significantly lower in the TPN group than in the control group (p=0.02). Conclusions: The addition of total parenteral nutrition in NPV-EVLP allows for better electrolyte composition, decreased inflammation, and improved graft performance in ex vivo lung perfusion up to 24 hours. 74, 9.28, 9.38 mg/mL vs 5.72, 6.25, 4.75 mg/mL, respectively). In the HMP group, 4 of the 7 grafts were poor-performing, with two of these developing renal vein thrombosis. Conversely, only 1 of the 7 grafts was poor-performing in the NEVKP group, and there was no evidence of renal vein thrombosis (Figure 1). The consistent improvement with NEVKP versus the heterogeneity with HMP was also observed in the variation in creatinine clearance (POD7: 26.31±11.54 mL/min vs 16.8±18.9 mL/min) and on histological analysis of tubular injury. Marginal kidney grafts subjected to 120 min of WI before retrieval showed reliable improvement in function following 8 hours of continuous NEVKP, compared with HMP, where improvement was inconsistent. This suggests NEVKP would be a preferable storage strategy for DCD-procured grafts with extended WI times. (Figure 1). Histone H3.3, which accumulates at sites of DNA injury, was increased in LDGF and its levels correlated with DGF duration (Pearson r 0.7224). Conclusions: DCD kidneys with a short duration of DGF present acute cellular injury at the time of donation, alongside upregulation of repair pathways (e.g. chaperone-mediated autophagy, eIF2-dependent protein translation). In contrast, DCD kidneys with prolonged DGF present widespread translational, metabolic and antioxidant deficiencies. These pathways could be targeted therapeutically to reduce DGF incidence and duration. Purpose: We previously found that miR-20a is up-regulated after kidney ischemia-reperfusion (I/R) injury and may serve as a protector; however, its specific mechanism remains unclear. Autophagy is also a key regulator of kidney I/R injury. Thus, we investigated whether miR-20a can regulate autophagy in renal I/R injury. Methods: A mouse I/R injury model and a hypoxia-induced HK-2 cell model were used. Serum creatinine and blood urea nitrogen concentrations were measured to assess renal function. A miR-20a agomir or siRNA was used to up- and down-regulate miR-20a expression. Cell apoptosis was evaluated by flow cytometry, and the autophagy level was assessed by autophagy markers and autophagy flux. Results: MiR-20a was up-regulated in vivo but down-regulated in vitro. In vitro, the miR-20a agomir, a miR-20a mimic, greatly reduced apoptosis by decreasing the autophagy level. miR-20a directly inhibited the expression of PTEN and BIM and attenuated apoptosis and autophagy, similar to the effect of chloroquine, a pharmacological inhibitor of autophagy. Mechanistically, the anti-autophagic and anti-apoptotic effects of miR-20a could be reversed by siRNA against AKT, a downstream effector of PTEN and BIM. Further, we found that Akt-phosphorylated Beclin1 was up-regulated and may mediate the direct anti-autophagic effect.
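For reference, returning to the NEVKP versus HMP comparison above: the creatinine clearance values reported there are conventionally derived from a timed urine collection. A standard formulation (quoted for context, not taken from the abstract) is

```latex
\mathrm{CrCl} \;=\; \frac{U_{\mathrm{Cr}} \times V}{P_{\mathrm{Cr}} \times t}
```

where U_Cr is the urine creatinine concentration, V the urine volume collected over time t, and P_Cr the plasma (serum) creatinine concentration; with V in mL and t in minutes, CrCl is expressed in mL/min as in the abstract.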
In addition, administration of the miR-20a agomir greatly inhibited autophagy and apoptosis, leading to significantly improved recovery of renal function and reduced tissue injury in the mouse IRI model. Purpose: The purpose of this study was to: 1) determine the cause of microvascular obstructions that arise during ex vivo normothermic perfusion (NMP) of human kidneys; and 2) develop a therapeutic strategy to clear these obstructions to restore perfusion in marginal human kidney allografts. Methods: A series of 18 discarded donor human kidneys was employed. Three organs were used to assess renal production of fibrinogen during cold storage. Fifteen additional organs underwent NMP following established protocols. Organs were biopsied and assessed for renal fibrinogen with histological and immunofluorescence staining. Treatment with tissue plasminogen activator (tPA) and plasminogen was used to resolve microvascular obstructions during NMP. Biopsies collected before and after treatment were sectioned and stained for microvascular obstructions for analysis by quantitative microscopy. TEM was used to confirm the composition of the microvascular obstructions. Perfusate samples were tested to confirm fibrinolysis, and physiological assessment of ex vivo kidney function was performed. Results: In the absence of traditional clotting factors (using serum-free washed red blood cells), microvascular obstructions formed during NMP and were found to be red cell plugs rich in fibrinogen with a rouleaux-like presentation. Fibrinogen was not present in the washed red blood cells prior to NMP but rapidly appeared in the perfusate after 15 min of normothermic perfusion. Histological and immunofluorescent staining combined with protein ELISA confirmed that tubular epithelial cells produce fibrinogen locally in response to cold storage. NMP elicits release of this epithelial fibrinogen into the renal vasculature, leading to the red cell plugs. Treatment with tissue plasminogen activator (tPA) in combination with plasminogen efficiently cleared these obstructions in all kidneys and restored perfusion in even highly obstructed organs. This treatment reduced markers of inflammation (e.g. IL6) and improved physiological parameters (e.g. urine production). Conclusions: These results reveal a new mechanism of local renal production of fibrinogen in response to cold storage. Normothermic perfusion conditions induce release of fibrinogen into the vasculature, which can cause microvascular obstructions by cross-linking red cells. NMP allows this process to happen in a controlled ex vivo environment where therapeutics (i.e. tPA+plasminogen) can be used to clear these obstructions and restore perfusion. This approach also avoids release of the fibrinogen in the organ recipient, where a thrombotic response could elicit delayed graft function and myriad other complications. Collectively, our results demonstrate the power of NMP as: 1) a preclinical research tool to investigate the pathophysiology of ischemic injury; and 2) a platform for therapeutic repair of marginal human organs. Purpose: Despite a threefold higher incidence of DGF in DCD kidney grafts compared with DBD kidney grafts, large studies show equivalent long-term graft survival with DBD and DCD grafts. This implies differential impacts of DGF on DCD and DBD graft survival. Possible explanations are more severe DGF in DBD grafts and/or greater resilience of DCD grafts compared with DBD grafts.
The aim of this study was to assess the biological basis of the differential impact of DGF on long-term outcome of DBD and DCD grafts. The impact of DGF on long-term graft survival was analysed for 3744 DBD and 2891 DCD kidney transplants performed in The Netherlands between 2000 and 2018. The severity of DGF was estimated for 640 DBD and DCD kidneys transplanted at the LUMC by evaluating the number of posttransplant dialyses and postoperative functional recovery (eGFR). In parallel to findings in tumour biology, where p53, phospho-EGFR, IGF-1R, phospho-mTOR, phospho-MAPK14, PCNA, BCL2 and PPARγ are associated with tumour resilience, we determined expression of these factors by immunohistochemistry in pre-reperfusion kidney biopsies (DBD n=40; DCD n=40). Gene expression profiling (array analysis) followed by Ingenuity Pathway Analysis was performed to identify pathways differentially activated in 8 DBD and 7 DCD grafts. Results: Our data confirmed a higher incidence of DGF in DCD grafts (DCD 42% vs. DBD 18%). This higher incidence of DGF did not impact long-term graft survival. Multivariate analysis showed that this was mainly due to a differential impact of DGF on long-term outcomes, with a major impact in DBD grafts (RR: 1.62, 95%CI: 1.24-2.11) but no significant impact in DCD grafts (RR: 1.29, ). This was not caused by a more severe form of DGF in DBD grafts, a conclusion based on equal numbers of DGF-associated dialyses and superior posttransplant eGFRs in DBD grafts. Immunohistochemistry showed expression of all components of the resilience network in the biopsies. Pathway analysis identified 24 differentially expressed pathways, with the resilience-associated pathways EGF signalling (p=0.003), BRCA1 (p=0.005) and p38-MAPK signalling (p=0.009) in the top 6. Conclusions: The absent impact of DGF on long-term graft survival in DCD kidneys is paralleled by activation of dedicated resilience pathways. Targeting of these pathways may provide a major opportunity to modulate organ resilience in kidney transplantation. Purpose: Donation after circulatory death (DCD) donors have the potential to help match the increasing need for transplantable organs; however, DCD kidneys have a higher risk of primary graft dysfunction or delayed graft function. Although static cold storage (SCS) has been the gold standard for organ preservation to reduce metabolism and oxygen demand, DCD kidneys are more susceptible to cold ischemic injury. Recent murine studies have shown that simple static subnormothermic storage (SSS) maintains the physiologic and metabolic function of DCD organs without increasing oxygen demand. In this translational study, we evaluated the efficacy of SSS of porcine kidneys with extended warm ischemia, focusing on 1) preoperative assessment by machine perfusion (MP), and 2) longitudinal assessment in a kidney transplant (KTx) model using miniature swine. Methods: Kidneys of twelve MHC-inbred CLAWN miniature swine were subjected to 120 min of warm ischemia (WI) followed by either 60 min of SCS (4°C group) or SSS (22°C group) with an extracellular-type solution. In experiment 1, the preserved 120-min WI kidneys were perfused with normothermic, MHC-matched, oxygenated red blood cells in Ringer's solution for 120 min at a mean arterial pressure (MAP) of 85 mmHg using cardiopulmonary bypass. As physiologic parameters, renal blood flow (RBF), MAP and urine output were monitored continuously and intrarenal resistance (IRR) was calculated.
As a metabolic parameter, blood gas analysis was performed every 15 min to calculate oxygen consumption. In experiment 2, the preserved 120-min WI kidneys were transplanted into MHC-matched recipients with 12 days of continuous tacrolimus (blood level: 20-25 ng/mL). Renal function was monitored by serum creatinine (sCr) and renal biopsies. Results: Experiment 1: All of the 120-min WI kidneys preserved for 60 min at either 4°C or 22°C were successfully perfused for 120 minutes using our established method of normothermic MP. Although all kidneys showed the same macroscopic appearance during MP in both groups, physiologic and metabolic parameters in the 22°C group revealed a better condition of the kidneys compared with the 4°C group (22°C vs 4°C: mean RBF 26.7 ± 4.7 vs 10.0 ± 0.0 mL/min, mean IRR 3.5 ± 0.5 vs 8.7 ± 0.1 mmHg/mL/min, total urine output 14.3 ± 5.3 vs 5.4 ± 3.6 mL, oxygen consumption 223.5 ± 38.6 vs 92.5 ± 15.6 mL/min/g), suggesting that 22°C-preserved WI kidneys would show better outcomes following KTx. Experiment 2: In the KTx model, the peak sCr of the animals in the 22°C group was lower than that of the 4°C group (22°C vs 4°C: peak sCr 3.9 ± 0.6 vs 8.9 ± 0.3 mg/dL, p=0.0015). Kidney biopsies taken 4 days after transplantation revealed renal tubular necrosis over widespread areas in the 4°C group, while renal tubular necrosis was limited in the 22°C group. Moreover, prompt regeneration of the tubular epithelium was observed in the 22°C group, as indicated by PCNA-positive cells. Conclusions: We demonstrated that subnormothermic (22°C) organ preservation of kidney grafts exposed to a 120-min warm ischemic time was more effective than hypothermic (4°C) preservation in miniature swine. To our knowledge, this is the first demonstration of the applicability of static subnormothermic storage in preclinical large animals. Purpose: Opioid use after kidney transplant has been shown to be a risk factor for chronic opioid use, which leads to an increased risk of mortality. The purpose of this study was to evaluate the early impact of a multimodal pain regimen and education quality improvement program on opioid use after kidney transplant two months after implementation. Methods: This was a retrospective, single-center analysis of post-operative opioid use, comparing the average daily morphine milligram equivalents (MME) of patients who received education on opioids and a multimodal pain regimen (preoperative TAP/QL block, scheduled APAP and gabapentin) with a historical control group. Results: Despite no differences in pre-transplant opioid exposure (Table 1), daily and overall inpatient opioid utilization was significantly reduced in the multimodal pain protocol cohort (38.6 vs 8.0 MME/day; p<0.001); 5% of patients in the multimodal pain protocol cohort were discharged with an opioid prescription, compared with 96% of controls (p<0.001). 11 patients refused follow-up. Reasons included recipient complication, post-op pain, and psychosocial issues. TIEDI forms were submitted as "patient not seen" approximately 8 weeks post-policy change for these patients. Pre-policy, 31% of forms were submitted on time. Since initiation of the project, for donors from 1/2017-4/2018, 58 forms were due, of which 98% were submitted on time (Figure 2). The eICU nurses are critical care-trained nurses who have been trained to respond to data-driven alerts.
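A minimal sketch of how the average daily morphine milligram equivalents (MME) used in the opioid abstract above could be tabulated, assuming inpatient administration records of drug name and dose in mg; the conversion factors shown are illustrative of commonly published values and should be replaced by whatever table the study actually used.

```python
# Illustrative MME conversion factors (per mg of drug); verify against the
# conversion table in local use before relying on these numbers.
MME_FACTOR = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "hydromorphone": 4.0,
    "tramadol": 0.1,
}

def average_daily_mme(administrations, days_of_stay):
    """administrations: iterable of (drug_name, dose_mg) tuples."""
    total = sum(MME_FACTOR[drug] * dose_mg for drug, dose_mg in administrations)
    return total / days_of_stay

# Hypothetical 3-day inpatient course, not study data
doses = [("oxycodone", 5), ("oxycodone", 5), ("hydromorphone", 1), ("tramadol", 50)]
print(average_daily_mme(doses, days_of_stay=3))  # (7.5 + 7.5 + 4 + 5) / 3 = 8.0 MME/day
```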
In collaboration with our local OPO, the eICU nursing staff conducted a 2-month blinded study in which eICU nurses were trained to identify patients who met clinical triggers while the bedside team continued to make referrals as they normally would. Results: During the initial blinded study, the eICU nurses identified all of the patients for whom actual referrals were made by the hospital staff. In addition, the eICU nurses appropriately identified 5.5 times more referrals than were called in to the OPO by the hospital care team. Of the actual referrals made by the hospital staff, 76.5% were identified earlier by the eICU nurses than by the bedside team. Based on these compelling data, the organ donation referral process was officially implemented for real-time use at 5 hospitals. In an expanded study to review the efficacy of the intervention, we compared referral data 24 months pre- and post-implementation. The number of organ donor referrals increased by 112.1% in ICUs with eICU capability, and the percentage of timely referrals rose from 95.1% to 97.0%. In comparison, the ICUs without the eICU function increased the number of organ referrals by 26.8%; however, the percentage of timely referrals fell from 95.1% to 93.4%. By evaluating our current workflows in the ICU and Tele-ICU, we were able to enhance the already existing eICU Monitoring Program to include the organ donation referral process. In doing so, we were able to significantly improve the number and the timeliness of organ referrals to our OPO. Accurate and timely referrals help maintain donation as an option for all potential patients. We have previously shown that women on the liver transplant (LT) waitlist are at greater risk of hospitalization compared with men, but whether this impacts length of stay (LOS) in the peri-transplant setting is not known. We aimed to evaluate gender differences in post-LT LOS, an important surrogate of health care resource utilization post-LT. Methods: Using the UNOS/OPTN registry, we analyzed all US non-Status 1 adult recipients of deceased donor (DD) LT from 2008-2017, excluding those with MELD exception points. Poisson regression associated female gender with post-LT LOS, and logistic regression associated female gender with prolonged LOS (defined as ≥20 days, the 75th percentile LOS cut-off in our cohort) and with 3- and 12-month post-LT death. Results: Included were 28,281 DDLT recipients: 36% were female and 73% were non-Hispanic white. Median MELD at transplant was 28 [interquartile range (IQR) 22-36], and 41% were hospitalized for ≥1 day pre-LT. Women were more likely to be hospitalized pre-LT than men (45% vs 40%, p<0.01) and were more likely to have a prolonged (>10 days) pre-LT LOS (25% vs 20%, p<0.01). Post-LT, median LOS was 11 days (IQR 7-19); 24% had a prolonged post-LT LOS (≥20 days). Women were significantly more likely than men to have a prolonged post-LT LOS (26% vs 23%, p<0.01). In univariable analysis, female gender was associated with longer post-LT LOS [incidence rate ratio (IRR) 1.09, 95% CI 1.09-1.10, p<0.01]. Prolonged pre-LT LOS (>10 days) and pre-LT ICU admission were also independently associated with increased post-LT LOS (IRR 1.82, 95% CI 1.81-1.83, p<0.01, and IRR 1.91, 95% CI 1.90-1.93, p<0.01, respectively). In multivariable analysis, after adjustment for age, ethnicity, UNOS region, MELD, hepatic encephalopathy, and donor risk index, female gender remained independently associated with longer post-LT LOS (adjusted IRR 1.05, 95% CI 1.05-1.06, p<0.01).
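A minimal sketch of the type of Poisson regression used above to relate female gender to post-LT length of stay, where the exponentiated coefficient is reported as an incidence rate ratio (IRR); the data frame, column names and the abbreviated adjustment set are hypothetical stand-ins for the UNOS/OPTN variables, not the study's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Hypothetical recipient-level table; in the study this came from the registry extract.
df = pd.DataFrame({
    "los_days": rng.poisson(11, size=500),        # post-LT length of stay (count of days)
    "female":   rng.binomial(1, 0.36, size=500),  # indicator for female gender
    "meld":     rng.integers(15, 40, size=500),   # MELD at transplant
})

# Poisson regression of LOS on gender, adjusted here only for MELD;
# exp(coefficient) for 'female' is the incidence rate ratio (IRR).
fit = smf.glm("los_days ~ female + meld", data=df, family=sm.families.Poisson()).fit()
print(np.exp(fit.params["female"]))             # point estimate of the IRR
print(np.exp(fit.conf_int().loc["female"]))     # 95% CI for the IRR
```

The logistic model for prolonged LOS (≥20 days) mentioned in the same abstract would follow the same pattern with a binary outcome and a Binomial family.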
Women also had 12% increased adjusted odds of prolonged post-LT LOS (95% CI 1.05-1.20, p<0.01). Female gender was not associated with increased odds of death at 3 or 12 months. However, women with a prolonged post-LT LOS had 2.1-fold adjusted odds of death at 1 year compared with women with a shorter post-LT LOS (95% CI 1.84-2.50, p<0.01). Conclusions: Women who undergo DDLT in the US are more likely to have a prolonged post-LT LOS compared with men. Pre-LT LOS and pre-LT ICU admissions are strongly associated with post-LT LOS, and interestingly, women are also more likely to be hospitalized immediately pre-LT than men. Reducing the disproportionate Purpose: Predialysis education is associated with improved patient survival and improved access to kidney transplantation (KT); however, many of the findings are based on the dialysis facility nephrologist's report at the start of dialysis. We aimed to use chronic kidney disease (CKD) clinics' electronic medical record data to objectively determine whether predialysis education was positively associated with receipt of KT in a Southeastern, privately insured population. Methods: We identified 568 Black or White adults (age ≥18) with late-stage CKD (stage 4-5) referred to a free, one-time CKD treatment education class from January 1, 2010 to October 31, 2017, to allow for a full year of follow-up. We used a virtual data warehouse to identify their KT status. Crude and multivariable Cox proportional hazard models determined the association between class attendance and KT. Time-to-event was defined as the time from CKD class to KT, censored at death or end of follow-up. Results: CKD patients referred for the class (n=568) were 55.6% male and 76.6% Black, with a mean age of 58.9 years; only 12.3% (n=70) received a KT, and the median follow-up time from class to KT was 4.4 (IQR 2.8, 5.8) years. Among the 304 (53.5%) who attended the CKD class, 53.3% were male, 78.6% were Black, 75.0% were diagnosed with diabetes, and 14.1% (n=43) received a KT. Class attendance was associated with a two-fold increase in receipt of KT (crude HR=2.18; 95% CI 1.32, 3.60 and multivariable HR=2.17; 95% CI 1.30, 3.62). In the multivariable analysis, there was no difference in access to KT by race (HR=1.00; 95% CI 0.52, 1.92) or gender (HR=1.28; 95% CI 0.79, 2.08). Patients aged ≥60 and those diagnosed with heart disease were less likely to receive a KT. Conclusions: CKD treatment education was associated with improved access to KT while controlling for patient characteristics. Efforts to improve CKD education need to continue while public health researchers and community members continue to strive to reach patients who do not, or cannot, attend in-person treatment education classes. Purpose: The purpose of this study was to evaluate the disparity, by race, in pre-kidney transplant work-up completion rates of referred patients. We conducted a retrospective cohort study of consecutive adult patient records referred for first-time pre-kidney transplant evaluation from January 1, 2015 through December 31, 2016. Patients were grouped based on racial identity as indicated in the medical record as either African American or Caucasian. The reasons for dropout, i.e. failure to complete the pre-kidney transplant evaluation process or medical review board (MRB) denial, were classified into five categories (medical, workup incomplete, financial/social, noncompliance, and other). The "other" category included patient death, relocation, transplant elsewhere, etc.
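A minimal sketch of the crude Cox proportional-hazards analysis described in the predialysis-education abstract above, with time measured from the CKD class to kidney transplant and censoring at death or end of follow-up; the lifelines package, the column names and the toy rows are assumptions for illustration, not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy analysis table: one row per referred patient (column names hypothetical).
df = pd.DataFrame({
    "years_from_class": [4.4, 2.8, 5.8, 1.2, 3.0, 6.1, 0.9, 2.2],  # time to KT or censoring
    "received_kt":      [1,   0,   1,   0,   0,   1,   0,   1],    # 1 = transplanted, 0 = censored
    "attended_class":   [1,   0,   1,   0,   1,   0,   1,   1],    # exposure of interest
})

# Crude Cox model; exp(coef) for attended_class is the hazard ratio for receipt of KT.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_from_class", event_col="received_kt")
cph.print_summary()
```

The multivariable model in the abstract would simply add further covariate columns (e.g., race, gender, age group, heart disease) to the same data frame before fitting.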
Differences in continuous variables were assessed using one-way analysis of variance, and binary outcomes were compared using the chi-square test. Multivariate analysis was conducted when indicated. All p-values were 2-sided, and <0.05 was considered statistically significant. Results: There were 4,939 patient referral entries. We excluded 2,589 patients due to duplications, age <18 years, prior transplant, Hispanic ethnicity, and/or race categorized as Asian or Other, leaving N=2,350. Males significantly outnumbered females in each racial group (p<0.001). African American patients were younger (p<0.001), lived closer to the transplant center (p<0.001), had a larger BMI (p=0.007), more dialysis days (p<0.001) and more hypertension (p=0.001) than the Caucasian patients. There was no significant difference by race in the number of referrals that completed the evaluation process at one year (p=0.702). However, when examining dropout rates by race, significantly more African American patients dropped out during the referral process due to incomplete workups, financial/social problems, and noncompliance as compared with Caucasian patients (p<0.001). Caucasian patients were approved for listing by the MRB at a higher rate than African American patients (p=0.003). Caucasian patients were transplanted at greater rates than African American patients (p=0.003). The explanations associated with these inequalities are multifaceted and need to be further explored to facilitate the development of solutions that advance the system while improving patient outcomes by race. Purpose: Graft-versus-host disease (GVHD) is a relatively common and highly morbid complication after intestinal transplantation. Its pathophysiology remains poorly understood. Resident memory T cells (TRM) are a newly described T cell subset with a memory phenotype localizing to peripheral tissue rather than blood/lymphatics. We hypothesized that the pathophysiology of GVHD might be related to increased donor TRM in the graft subsequently migrating into host blood and tissue. Methods: Intestinal transplantation from deceased donors was performed using our standard method. Graft and blood lymphocytes from 10 patients with GVHD and 34 without were longitudinally analyzed using flow cytometry. Results: In the grafts of GVHD vs. stable patients, there were approximately 20% more CD4 and CD8 TRM cells prior to implantation and significantly higher percentages of CD4 and CD8 TRM cells at the time of GVHD (p = 0.02 and 0.04). There was also a mean 60.3% higher level of CD8 TRM cells in the native bowel of GVHD patients compared with controls, and 20-30% higher levels of IFN-γ- and TNF-α-expressing CD4 cells and TNF-α-expressing CD8 cells in both graft and native bowel of GVHD patients. Importantly, the percentages of CD4 and CD8 TRM in the blood of GVHD vs. stable patients, initially similar, significantly increased during GVHD (p = 0.005). Peripheral blood T cells in GVHD patients also had higher levels of maturation, antigen-experience, and/or exhaustion markers than those in stable patients. There were significant differences in CD4/CD57 (p = 0.01), CD4/HLA-DR (p = 0.044), CD8/HLA-DR (p = 0.007), CD8/CD28 (p = 0.039), CD4/PD-1 (p = 0.041), and CD8/PD-1 (p = 0.038). There were also significant increases in CD8 effector memory cells (p = 0.0056) and decreases in naïve cells (p = 0.0034) not seen in stable patients. Notably, CD8/PD-1 was also significantly elevated prior to transplantation in patients who later developed GVHD (p = 0.025).
In the largest longitudinal analysis to date, we demonstrate that increased TRM percentages and inflammatory cytokine expression in graft bowel correspond with increased TRM in blood and native bowel and increased cytokine expression in native bowel at the time of GVHD. Thus, GVHD pathogenesis may depend on donor TRM in graft bowel migrating to the blood and native tissue of recipients. Recipients with higher PD-1 expression, indicating T cell exhaustion, might be more vulnerable, providing a possible biomarker for GVHD risk. Purpose: Chronic antibody-mediated rejection (cABMR) caused by de novo donor-specific HLA antibodies (DSA) is one of the major obstacles to long-term graft outcome. ABO-incompatible kidney transplantation (ABO-I) has shown favorable graft survival, probably owing to accommodation. In vitro studies demonstrated that anti-A/B antibody binding to endothelial cells could upregulate complement regulatory proteins (CD55, CD59) and downregulate HLA-DR expression through inactivation of the ERK and mTOR pathways, respectively, exhibiting a protective effect against antibody-mediated injury. We have recently reported lower production of de novo DR-associated DSA in ABO-I, possibly resulting in a reduced incidence of cABMR. The purpose of this study was to clarify the significance of HLA epitope mismatch analysis and investigate the beneficial effect of ABO incompatibility on de novo DSA production. Methods: This retrospective observational cohort study included 691 living donor kidney transplantations (454 ABO-Id/C and 237 ABO-I) with a mean observation period of 67 months. Patients in the ABO-I group were pretreated with rituximab (RIT, n=168) or splenectomy (SPX, n=10), or with neither (none, n=59) owing to low anti-A/B antibody titers. De novo DSA production was examined annually by LABScreen SAB, and the prevalence was analyzed by HLA-DRB1/3/4/5 and DQB epitope mismatch (MM) levels. Results: De novo DSA were detected in 101 recipients (14.6%), including class I (n=12), class I+DQ/DR (n=3), DR (n=17), DQ (n=60) and DR+DQ (n=9). Logistic regression analysis revealed that significant factors for de novo DSA were ABO-I (OR=0.589 (0.362-0.957), p=0.326) and epitope MM (DRB and DQB) (OR=1.034 (1.018-1.051), p<0.0001). Epitope MM levels of DRB and DQB were significantly correlated with production of DR and DQ DSA, respectively. The 5-year cumulative incidences of de novo DSA at low (0-5), moderate (6-15) and high (16-44) DRB epitope MM levels were 0%, 5.3% and 8.5% in ABO-Id/C, whereas they were 0%, 0% and 4.8% in ABO-I (Table 1). Similarly, ABO-I showed a beneficial effect at the moderate (1-5) DQB epitope MM level. ABO-I without rituximab pretreatment or splenectomy also exhibited favorable results for de novo DSA production: 5-year cumulative incidences of de novo DSA were 6.1% in the none group, 8.9% in RIT and 10.0% in SPX. Conclusions: DRB and DQB epitope MM levels could predict the risk of de novo DSA production. ABO incompatibility, which exhibits a protective effect against de novo DSA production, could increase the safety margin of epitope MM levels to 0-15 (DRB) and 0-5 (DQB), respectively. Potential suppression of de novo DSA-induced cABMR in ABO-I may contribute to favorable graft survival. Purpose: Donor-specific transplantation tolerance has long been the goal of clinical transplantation.
Clinical observations suggest that donor-specific antibodies (DSA) are a major cause of graft rejection despite ongoing immunosuppression, leading us to hypothesize that stable transplantation tolerance requires donor-specific B cell responses to also be profoundly suppressed. We hypothesize that this can be achieved by control of T cell help to B cells and also through the induction of B cell-intrinsic tolerance. The objective of this study was to define how donor-MHC-specific B cells are constrained in an experimental model of transplantation tolerance. Methods: We used a well-established experimental model of transplantation tolerance to allogeneic B/c hearts induced with anti-CD154 plus donor spleen cell transfusion. In addition, to test whether tolerant B cells could be rescued by ongoing germinal centre (GC) responses, we adoptively transferred tolerant B cells (CD45.2, IgHa) into congenic CD45.1/IgHb hosts, followed by B/c or C3H spleen cell immunization, and then measured alloantibodies at day 21 post-adoptive transfer. Results: We observed that donor-specific B cells are intrinsically tolerant. B cells from tolerant recipients did not produce anti-B/c IgG when adoptively transferred into naïve MD4 hosts (~95% of B cells are specific for HEL) and then challenged with B/c spleen cells. We showed that the tolerant B cells were defective in their ability to differentiate into germinal center (GL7+Fas+) B cells. In addition, we also Purpose: Islet transplantation (IsletTp) is a safe and effective treatment for Type 1 diabetes complicated by severe hypoglycemia (Diabetes Care 2016, 39:1230). But the requirement for calcineurin inhibitor (CNI)-based immunosuppression raises concern about possible long-term adverse effects of CNI on recipient renal function. We report an analysis of GFR and eGFR in 128 subjects in 8 CIT protocols. Methods: Glomerular filtration rate (GFR) and estimated GFR (eGFR) were assessed during 7 CIT 1-3 year studies; eGFR was also measured for up to 8 years in a study (CIT-08) of subjects completing the main studies. GFR was measured by plasma clearance of iohexol. eGFR was estimated from serum creatinine (Scr) values using the CKD-EPI formula. GFR and eGFR measured on the same day agreed closely. Average and individual GFR and eGFR intercepts and slopes following IsletTp were assessed using Bayesian hierarchical repeated measures analysis. The impact of IsletTp on renal function was analyzed as the step drop from baseline to the time-0 intercept and the subsequent long-term slope. Results: The figure shows in grey the GFR and eGFR of each CIT subject over follow-up. The red curves show a smoothed average of GFR and eGFR over follow-up time. Average GFR and eGFR dropped for the first 1-1.5 years following IsletTp but then stabilized, seen especially in the longer-term follow-up of eGFR. The table shows the estimated average post-transplant drops in GFR and eGFR and their subsequent slopes. The median drops in GFR and eGFR from pre- to post-IsletTp were similar. The median post-IsletTp GFR slope, based on the first two years post-IsletTp, was -3.95 (MAD 0.37), but the eGFR slope based on up to 8 years of follow-up was -0.42 (MAD 1.99), not different from that in the general population. Estimated eGFR slopes of 6 of the 128 CIT subjects (~5%) were < -5 mL/min/year. Conclusions: There is little long-term risk of an adverse impact of IsletTp and CNI immunosuppression on renal function in most CIT subjects. Purpose: Blocking lymphangiogenesis is thought to diminish immune responses to donor tissues.
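For context on the islet-transplant renal-function analysis above, the eGFR values were derived from serum creatinine using the CKD-EPI equation; the commonly cited 2009 CKD-EPI creatinine form (quoted here for reference, not taken from the abstract) is

```latex
\mathrm{eGFR} = 141 \times \min\!\left(\tfrac{S_{\mathrm{cr}}}{\kappa},\,1\right)^{\alpha}
\times \max\!\left(\tfrac{S_{\mathrm{cr}}}{\kappa},\,1\right)^{-1.209}
\times 0.993^{\mathrm{Age}} \times 1.018\,[\text{if female}] \times 1.159\,[\text{if Black}]
```

with S_cr in mg/dL, κ = 0.7 (female) or 0.9 (male), α = −0.329 (female) or −0.411 (male), and eGFR expressed in mL/min/1.73 m².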
However, lung transplantation (Tx), which involves airway and vascular but not lymphatic anastomoses, is plagued by primary graft dysfunction and acute and chronic allograft rejection, leaving unanswered the importance of lymphangiogenesis in graft outcomes. We have taken a reductionist approach to investigate this issue, undertaking lymphatic disruption in healed murine lung isografts in which lymphatic reconstitution has already occurred. Methods: Using fully MHC-disparate BALB/c and C57BL/6 mice, left lungs from triply transgenic donors (mice with a floxed stop codon associated with the diphtheria toxin receptor and VEGFR3, plus tamoxifen-regulated CreERT2) were engrafted into WT recipients treated with CD154 mAb/DST in conjunction with 28 days of rapamycin (2 mg/kg/d via Alzet pumps). In addition, RAG-/- lung allograft recipients were adoptively transferred with conventional T cells (1×10^6) alone or together with Foxp3+ Treg cells (2×10^6). Results: Mice treated with costimulation blockade plus RPM showed long-term allograft survival (>150 days). Likewise, Treg cell therapy induced long-term allograft survival (>100 days), in contrast to mice receiving conventional T cells alone (rejection by day 14). Instillation of fluorescent microparticles at 3 weeks post-Tx and their subsequent detection in draining pulmonary lymph nodes demonstrated restoration of lymphatic drainage within the lung transplant. After 3 weeks, we deleted intrapulmonary lymphatics by administration of tamoxifen followed by diphtheria toxin. We confirmed lymphatic deletion by the lack of fluorescent bead uptake by draining lymphatics, and by staining for lung lymphatic endothelial cells, which were identified in the lungs of control, non-diphtheria toxin-treated mice as Lyve-1- and Prox1-positive cells. Analysis at 30 days of mice with depletion of intrapulmonary lymphatics showed dense bronchovascular mononuclear cell infiltrates (A3, B1R) despite costimulation blockade and RPM. Conclusions: These studies show that pulmonary lymphatics are essential for costimulation blockade-induced allograft survival. Ongoing studies are directed towards assessing whether Tregs can still induce allograft survival in the absence of pulmonary lymphatics. Our work highlights that lymphatics are more than simply drainage tubes and suggests that they play key roles in leukocyte recirculation, immune surveillance and allograft acceptance.