Abstracts 544-1089. Am J Transplant, 2004-04-02. DOI: 10.1111/j.1600-6135.2004.0480b.x

At MELD <9, recipient risk of death during the first posttransplant year was more than 3-fold higher than for candidates (p<0.0001). At MELD 9-11, the 3-fold higher risk of death persisted (p<0.0001); a >2-fold difference was observed for MELD 12-14 (p=0.001). Significantly lower risk of death among recipients was observed at MELD >17. Survival benefit increased with higher MELD scores, up to and including the maximum score of 40. Conclusions: Liver transplant survival benefit is concentrated among patients at higher risk of pretransplant death. On average, transplants among severely ill patients are not futile under current practice. With one-year posttransplant follow-up, patients at lower risk of pretransplant death have no demonstrable survival benefit from liver transplant.

We studied relationships of histology to transcriptional changes during rejection of mouse kidney transplants from CBA donors into B6 recipients. Interstitial infiltration and edema developed rapidly by day 5 (D5) and evolved slowly thereafter; tubulitis and arteritis were absent at D5 but extensive by D21. All early changes (D5, D7) were T cell dependent, i.e., unchanged in mice lacking immunoglobulins (AJT 3:1501, 2003). We studied gene expression using Affymetrix MOE430A microarrays. Of 22,690 genes on this array, 15% were highly expressed in normal kidneys (signal >1000). By biological function, three main groups accounted for these genes: metabolism, cell maintenance, and cell communication. Subgroups strongly represented among highly expressed genes included mitochondrial transport, energy pathways, and response to oxidative stress. Following transplantation, there was selective downregulation of certain highly expressed kidney genes, with a subset of 118 genes decreased more than 3-fold at D5. Most were involved in metabolism, energy, and transport. Table 1 shows the 5 genes with the greatest changes. Downregulation occurred early (D5), with gene expression either stabilizing at a low level or continuing to decrease further. No groups of downregulated genes recovered by D21. Some genes mediating epithelial function (chloride/potassium transport, ATPases, water channels) were among the affected genes, showing early decreased expression followed by continuing slow decline. Mixing kidney RNA with RNA from lymphocytes showed that the decreased gene expression in rejection was not due to dilution by RNA from infiltrating cells. Thus T cell mediated kidney rejection causes rapid, selective decreases in expression of some renal genes before tubulitis develops, suggesting that early functional changes in rejection are instructive rather than destructive.
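The gene selection step described in the abstract above (highly expressed genes with signal >1000, then a >3-fold decrease at D5) reduces to a simple filtering operation on the expression matrix. A minimal sketch, assuming a pandas DataFrame of per-gene signal values; the probe names and numbers are invented for illustration and are not the study's data:

```python
# Illustrative sketch only (not the authors' pipeline): select highly
# expressed kidney genes (signal > 1000) and flag those decreased more
# than 3-fold at day 5. Probe IDs and signal values are made up.
import pandas as pd

expr = pd.DataFrame(
    {"normal": [2500.0, 1800.0, 450.0], "d5": [600.0, 1900.0, 100.0]},
    index=["probe_1", "probe_2", "probe_3"],  # hypothetical probe IDs
)

high = expr[expr["normal"] > 1000]         # "highly expressed" in normal kidney
fold_change = high["normal"] / high["d5"]  # fold decrease at D5
down_3x = high[fold_change > 3]            # the >3-fold downregulated subset
print(down_3x)                             # here: probe_1 only
```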
There is substantial evidence to suggest that the development of anti-donor HLA class I antibodies (Ab) is involved in the process of transplant arteriosclerosis, the hallmark of chronic rejection. In previous studies we have shown that anti-HLA Ab may contribute to the process of chronic rejection by binding to endothelial cells (EC) and smooth muscle cells and stimulating fibroblast growth factor expression, increased ligand binding, and cell proliferation. Investigation of the intracellular signaling cascade showed that ligation of class I molecules stimulated Src phosphorylation and complex formation between Src and the focal adhesion proteins FAK and paxillin. In the present study, we have characterized the role of Src kinase activity in the regulation of FAK activation and the triggering of molecular complexes with other downstream signaling proteins. For these studies, EC were treated with anti-HLA class I monoclonal Ab (mAb) W6/32 for various time points, and cell lysates were immunoprecipitated and immunoblotted with anti-phosphotyrosine Ab. Treatment of EC with anti-HLA class I Ab stimulated a time-dependent increase in phosphorylation of FAK at residues Tyr-576, Tyr-577 and Tyr-925. Treatment with the pharmacological inhibitor PP2 completely inhibited class I mediated FAK phosphorylation at these residues, indicating a critical role for Src kinase activity in this process. Ligation of class I molecules also triggered a Src-dependent assembly of molecular complexes of FAK/Shc/Grb2 and FAK/Grb2/Sos, providing a link between class I signaling and the MAP kinase pathway. Elucidation of the molecular basis of antibody-mediated alterations in graft cells may permit the development of immunotherapeutics designed to regulate the response to injury.

Organ-specific injury following transplantation presents with a variety of clinical and pathologic phenotypes, yet the factors influencing development of each outcome are poorly understood. As primed T lymphocytes must re-encounter their antigen within the target organ to engage effector functions, we postulated that the cellular location of antigen within that organ could significantly impact the induced pathology. To assess this issue, we used female Mar CD4 T cell receptor transgenic mice (n=4-6 per group), in which all T cells are specific for the male minor transplantation antigen plus I-A^b, as recipients of heterotopic heart transplants. Notably, these recipients were bred onto a RAG-/- background and therefore have no other T cells and no B cells. The Mar recipients were given male heart grafts expressing the relevant peptide:MHC complex either on graft parenchymal/vascular cells or, alternatively, only on graft-infiltrating mononuclear cells. Transplantation of the two different graft types led to equivalent activation of recipient T cells as assessed by frequency, cell surface marker expression by flow cytometry, cytokine production by ELISPOT, and the ability to traffic to the graft as demonstrated by histologic and immunohistochemical examination of graft tissue. Nonetheless, at the effector stage, if the target antigen was expressed on graft vascular and/or parenchymal cells, the outcome was acute graft destruction. In contrast, if the antigen was expressed only on graft-infiltrating mononuclear cells, the same primed T cell repertoire caused chronic rejection and vasculopathy. The data demonstrate for the first time that target antigen location in the graft, independent of T cell specificity, frequency and effector function, can markedly influence pathologic outcome. The results have significant implications for understanding the pathogenesis of acute and chronic transplant rejection in humans.

FOXP3 is a forkhead transcription factor encoded on the X chromosome. Recently it has been described as the most specific marker for T regulatory cells (Treg). FOXP3 seems to control both the development and at least some aspects of the suppressor functions of Treg cells.
We report analyses of FOXP3 in the blood of several informative groups of renal transplant patients to determine the implication of this transcription factor in graft outcome. Methods: Four groups of individuals were studied: a) 4 kidney recipients spontaneously tolerating their graft 8±3 years after complete interruption of immunosuppressive drugs (2 for PTLD and 2 for noncompliance), b) recipients with similarly stable long-term kidney graft function but under classical immunosuppression, c) patients with histologically documented chronic rejection and deteriorating graft function, and d) normal healthy individuals. We used real-time PCR to quantify FOXP3 in PBL and in separated CD4+ and CD8+ subsets purified from patients' blood. Data were normalized to HPRT transcripts. Results: Spontaneously tolerant patients presented more circulating CD4+CD25+ cells (20.5±7.3% versus 10.6±5.2%; p<0.05) than patients with chronic rejection. We then analysed the level of FOXP3 transcripts in PBL and found no difference between patients tolerating their graft and patients with chronic rejection. However, decreased accumulation of FOXP3 transcripts was observed in CD4+ cells of patients with chronic rejection compared both to drug-free tolerant patients and to stable patients. In contrast, FOXP3 transcripts were almost undetectable in CD8+ cells, whether total purified CD8+ cells or CD8+ cells from specific Vβ families harboring a highly selected Vβ species (CDR3 length distribution) were tested. FOXP3 transcripts are now being analysed in CD4+CD25+ and CD25- subsets. Our data suggest a decrease of FOXP3+ regulatory cells in patients with chronic rejection rather than an increase in tolerant and stable patients.

Following transplantation, recipient T cells can recognize and respond to donor antigens expressed directly on donor cells and can respond to donor-derived peptides that have been processed and presented in the context of recipient MHC through the indirect pathway. Indirectly primed CD4 T cells have been well studied in transplantation, but little information is available regarding whether indirectly primed CD8 T cells participate in rejection at the effector stage. To address this issue, we placed MHC class I deficient D^bK^b double knockout (KO) skin grafts onto allogeneic H-2^k SCID recipients, followed by adoptive transfer of purified H-2^k CD8 T cells. In this situation, the responding CD8 T cells cannot interact with any antigen on the graft cells because the graft cells lack MHC class I molecules. As a negative control, skin grafts were placed without adoptive transfers. D^bK^b KO skin grafts placed onto wild type H-2^k recipients were included as positive controls; in this latter situation recipient CD4 T cells were anticipated to reject the grafts via direct interactions with donor MHC class II. The MHC class I deficient grafts were rejected by day 35 in the SCID recipients (n=5) only if given adoptive transfers of purified CD8 T cells. CD8 T cells, but not CD4 T cells, were detectable in the recipient lymphoid organs. The CD8 T cells in these mice did not respond to any antigen directly expressed on D^bK^b KO stimulator cells as assessed by IFN-γ ELISPOT assays, but responded to control mitogen stimulation. Immunohistochemical analysis showed that only CD8 T cells were found in the D^bK^b KO grafts undergoing rejection and, furthermore, were detected in close proximity to vascular endothelial cells and to recipient infiltrating macrophages, suggesting specific interactions.
Control D^bK^b KO grafts placed onto WT recipients were rejected by day 14 (n=4). Immune cells from these control animals directly responded to D^bK^b KO stimulator cells in recall assays, and the grafts were diffusely infiltrated with CD4 T cells. These data definitively demonstrate that cross-primed polyclonal CD8 T cells can function as active participants in the effector phase of rejection and suggest that they mediate graft injury via specific interactions with the recipient endothelial cells feeding the grafts. The findings confirm and extend previous studies using monoclonal TCR transgenic T cells and shed light on mechanisms of acute and chronic graft injury that are potentially relevant to human transplant recipients.

The importance of self-peptide/MHC molecule ligands (self-ligands) for the selection of the T-cell repertoire in the thymus is well established. However, the effects of long-term deprivation of self-ligand stimulation on T cell behavior and function have not been studied in detail. In order to test the role of self-ligand stimulation in T cell responses to cognate antigens in vivo, we used a TCR transgenic CD4 T cell adoptive transfer system with MHC class II deficient and sufficient recipients. We demonstrate that naïve CD4 OT-II RAG-deficient T cells (OT-II cells) developed a progressive defect in their ability to proliferate following injection of antigen-bearing dendritic cells (DCs). The CFSE content of OT-II cells co-injected with antigen-bearing DCs at the same time was the same in MHC class II deficient and sufficient recipients. However, a defect in proliferation was noted in OT-II cells deprived of self-ligand contact for as little as 24 hrs, and the defect became increasingly profound with additional duration of deprivation. A similar progressive defect was noted in CD69 expression. Further in vivo studies revealed that OT-II cells had a marked defect in their ability to physically interact with cognate antigen/MHC-bearing DCs, which progressively worsened over time. These results suggest a new mechanism by which self-ligand signals are critical for normal function of CD4 T cells. Our results should also be taken into account in the interpretation of transplantation experiments that rely on MHC class II-deficient hosts. The conventional interpretation of such data suggests a critical role of indirect antigen presentation in transplant rejection. However, it is noteworthy that the direct pathway is dominant in our system. Indeed, we find no rejection of Act-mOVA full thickness skin grafts in MHC class II-deficient recipients of OT-II cells, as compared with rejection in CD4-deficient, MHC class II-sufficient hosts and wild-type B6 hosts. In fact, injection of OVA peptide-pulsed DCs at the time of transplantation also failed to induce graft rejection. Defective initiation of naïve T cell responses and effector function following self-ligand deprivation offers a potentially important alternative explanation for rejection failure. These considerations offer alternative mechanisms for how major immunosuppressant therapies that block TCR signaling (e.g., calcineurin inhibitors, anti-CD3 blocking antibodies) may work, and may help develop future therapeutic strategies that selectively augment these effects.

Recently published data have demonstrated a significant association between the production of donor HLA specific antibodies (dspAbs) and subsequent renal transplant (tpx) failure.
However, not all patients whose grafts failed produced dspAbs, and antibodies were not produced to all tpx HLA mismatches. HLA Matchmaker is a computer algorithm with which donor-recipient HLA compatibility is assessed at the structural level. Intralocus and interlocus comparisons are made of polymorphic amino acid triplet sequences; the software then determines which triplets on mismatched HLA molecules are different or shared between donor and recipient. This study aimed to use HLA Matchmaker to see whether it was predictive of post-tpx sensitization. Methods: The study group comprised 35 adult recipients of primary renal tpx (transplanted between 1981 and 1998) whose grafts had failed. Sera taken pre-tpx; at 1, 3 and 6 months post-tpx; and annually thereafter until graft failure were tested by ELISA for the presence of HLA class I and class II specific antibodies. Antibody specificity was defined by a combination of cytotoxicity, ELISA and flow cytometry techniques. Antigen mismatches (AgMM) were analysed for each locus individually for immunogenic triplets according to the HLA Matchmaker software. Results: All recipients were negative for dspAbs pre-tpx; post-tpx, 20 were positive and 15 were negative. The table below summarizes the correlation between triplet mismatches (TrpMM) and dspAb production. Summary: This study has illustrated that donor-recipient matching by conventional criteria can be further differentiated by analysing at the structural level, as a conventional single AgMM corresponds to a wide range of TrpMM. We have also shown that with an increasing number of TrpMM there is a corresponding increase in the percentage of patients who are antibody positive. This correlation is most evident at the HLA-DR and -DQ loci. Conclusion: For each locus, when there is a single AgMM, knowledge of the level of triplet mismatching indicates the likelihood of post-tpx sensitisation. As post-tpx sensitisation is associated with graft failure, this study suggests that HLA Matchmaker might therefore be used for donor selection.
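The triplet-counting idea behind HLA Matchmaker, as described in the abstract above, can be sketched roughly as follows. This is not the actual software: real Matchmaker derives triplets from allele sequences, and the allele-to-triplet mapping below is entirely hypothetical.

```python
# Rough sketch of triplet mismatch counting: a donor triplet counts as
# mismatched only if it is absent from every recipient allele, so triplets
# shared with any recipient HLA molecule (intra- or interlocus) are ignored.
def triplet_mismatches(donor_alleles, recipient_alleles):
    """Both arguments: dict mapping allele name -> set of amino-acid triplets."""
    recipient_triplets = set().union(*recipient_alleles.values())
    mismatched = set()
    for triplets in donor_alleles.values():
        mismatched |= triplets - recipient_triplets
    return mismatched

# Hypothetical triplet assignments, for illustration only.
donor = {"A*02": {"62G", "65R", "76V"}, "B*44": {"41T", "45K"}}
recipient = {"A*01": {"62Q", "65R", "76A"}, "B*08": {"41A", "45E"}}
print(len(triplet_mismatches(donor, recipient)))  # 4 mismatched triplets
```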
Immune competence is generally thought to be a function of TCR diversity, but the immune response to transplantation may not be. Because of the high frequency of responder cells, MHC-incompatible grafts in non-immunosuppressed individuals are rapidly rejected regardless of diversity; however, with immunosuppression and in adults (with a large memory population), indirect responses to minor antigens might instead determine graft outcome. In this setting, repertoire contractions may impair responses to minor antigens but also cause expansion of broader cross-reactive memory cells. To what extent repertoire contractions, per se, modify immune responses to allotransplantation is not known. To answer this question we analyzed survival of H-Y incompatible skin transplants in mice with contracted T cell repertoires (see table). We used skin transplants to avoid the impact of humoral rejection. JH-/- mice lack B cells and have a 1000-fold contracted TCR Vβ diversity, while µMT mice, which produce few B cells, have a 10-fold contracted TCR Vβ diversity compared to the wild type. Our results indicate that the 10-fold contraction of TCR diversity in the µMT mice did not impair H-Y incompatible skin graft rejection. On the other hand, slower rejection by JH-/- mice could be due to factors other than the contracted TCR repertoire. For example, while JH-/- mice lack the follicular dendritic cell (FDC) network in the lymph nodes, our studies revealed that µMT mice have an FDC network like wild type mice. To determine whether decreased TCR diversity or lack of a wild type FDC network in the lymph nodes delays skin graft rejection in JH-/- mice, we increased TCR diversity in these mice to levels comparable to µMT mice by administering polyclonal immunoglobulin (Ig), which did not reconstitute the FDC network. The Ig-reconstituted mice still showed delayed rejection of H-Y incompatible skin compared with µMT mice. These results show that rejection of grafts across minor antigen barriers is driven by the FDC network and not by TCR diversity.

TCR Vβ diversity and survival of H-Y incompatible skin grafts in JH-/-, µMT and C57BL/6 mice:

| Strain     | TCR Vβ diversity        | Time to rejection (days) | p value (time to rejection) |
|------------|-------------------------|--------------------------|-----------------------------|
| C57BL/6    | 6 x 10^5 ± 3.3 x 10^5   | 15 ± 1.7                 | -                           |
| µMT        | 3.5 x 10^4 ± 4.7 x 10^4 | 17 ± 3.3                 | p > 0.05                    |
| JH-/-      | 4.8 x 10^2 ± 7.4 x 10^2 | 26 ± 5.4                 | p = 0.0015                  |
| JH-/- + Ig | 8.8 x 10^4 ± 1.5 x 10^5 | 34 ± 18                  | p = 0.016                   |

We described previously the metalloproteinase (MPase) pathway of soluble HLA (sHLA) release, which may have a role in indirect allorecognition and cross-presentation of antigens. The mechanism involved in the processing of surface HLA to soluble forms is poorly understood. We identified a novel protein apparently involved in this mechanism which is recognized by our sHLA-release blocking mAb 7F11. 7F11 was obtained by immunizing mice with human T-cell membranes. Expression of 7F11 was determined in various cultured cell lines and cells from different normal tissues. Since the epitope is not accessible on intact cells, acetone-fixed cells were immunostained with 7F11 and FITC-conjugated goat anti-mouse IgM. Cell lysates were resolved on reducing SDS-PAGE followed by probing with 7F11 in Western blots (WB). Blocking activity of 7F11 was assayed in biotin-labeled and digitonin-permeabilized cells expressing MHC class I and tumor necrosis factor receptor 2 (TNFR2). Labeled cell supernatants were tested in ELISA for the presence of soluble β2-microglobulin (β2m)-free heavy chains (HC) and TNFR2, captured by mAbs HC10 and M180R, respectively. The gene encoding the protein recognized by 7F11 was identified by screening an expression library with this mAb and cDNA cloning. 7F11 inhibited the release of soluble β2m-free HC as effectively as the MPase inhibitor BB-2116 (a hydroxamate derivative). The blocking of sHLA release was specific, since release of TNFR2 by ADAM17 from the same cells was not affected. WB and immunostaining with 7F11 revealed that lymphoid, endothelial and epithelial cells and fibroblasts express the 65 kilodalton (kD) protein with a predominantly membrane localization. The screen yielded three similar cDNA clones identified by database analysis as BC036469. This gene was originally found in hippocampus, but its protein product was not characterized. In brain, hippocampus expressed the 43 kD protein while cortex was negative in WB. Thus, alternatively spliced and/or differentially glycosylated forms may exist in some tissues. The BC036469 protein identified here is not an MPase. This protein contains a RING zinc-finger motif in its C-terminus, which suggests involvement in protein turnover, chaperoning, and intracellular trafficking and secretion functions. Thus, BC036469 may mediate interactions between the HLA-releasing MPase and its substrate. Its precise role in the MPase pathway of sHLA release is currently under investigation.
Allorecognition via the direct and indirect pathways initiates immune reactions after transplantation; however, the relative contribution of each pathway remains unclear. We analyzed the degree of host T cell activation after stimulation with different types of alloantigens in recipients deprived of the indirect pathway (IDP) with liposomal clodronate (LC). Methods: CFSE-labeled LEW rat unfractionated, purified T, or non-T cells (2 × 10^8 cells) were infused into unmodified or LC (200 mg/kg) treated BN rats. The real-time migration pattern of infused cells was studied by Optiscan confocal endoscopy (OCE). Host T cell activation was studied by Th1 cytokine mRNA levels and expansion of the T cell area. LEW-to-BN heart transplantation (HTx) was performed with or without LC to determine the ultimate roles of the IDP. Results: LC eliminated phagocytic cells (macrophages and immature DC), resulting in <25% ED1+ and <5% ED2+ cells in the spleen. Real-time tracing of CFSE+ cells in the spleen with OCE revealed that allogeneic cells localized at permanent positions within 24 hrs without relocalization. Infused T cells were found in the T cell area, while the non-T population was found in B cell follicles and red pulp, regardless of LC treatment. Splenic IFN-γ mRNA was significantly more upregulated with allogeneic non-T than T cells. Recipients deprived of the IDP had minimal IFN-γ upregulation after both T and non-T cell injection. Similarly, expansion of splenic T cell areas was less prominent with non-T than T cell infusion under LC. Heart graft survival without immunosuppression in LC-treated recipients was slightly prolonged vs. unmodified recipients. Under a short course of FK506 (0.5 mg/kg, d0-6), lack of the IDP resulted in early lower Th1 cytokine mRNA upregulation, higher CD4/CD8 ratios, and splenic CD25 (IL-2R) reduction after HTx. Longer follow-up is underway. Conclusion: Infused allogeneic cells migrate to predetermined locations in the host spleen within 24 hrs without repositioning. Non-T cells in B cell follicles and red pulp are more efficient at T cell activation, probably via the IDP. Altogether, the IDP of allorecognition appears to play an important role in host T cell activation.

Trust in the medical system and future plans for donation were significantly affected by culturally appropriate health education programs designed for multicultural high-school youth. Education programs focusing specifically on behavior change and increasing trust in the medical system were sufficient to significantly increase the number of youth willing to become donors and to hold family discussions.

Introduction: Immediate renal allograft function following kidney transplantation is most desirable. Cold storage is associated with a significant amount of early posttransplant renal dysfunction. While pulsatile perfusion is more costly and technically demanding for preservation of renal allografts, a significant reduction in the need for post-transplant dialysis, with concomitant shorter hospital stays for transplanted patients, offsets the moderately increased costs of preservation. Aim: To evaluate the benefits and outcome of perfusing a kidney for transplant. Methods: 112 harvested kidneys were pumped from June 2001 to Aug. 2003, of which 59 were transplanted at our hospital, 30 were discarded, and the remainder were distributed to other transplant centers. No kidneys in this study were discarded due to lack of availability of a suitable recipient. UW perfusate solution was used.
The data for the pumped group were provided by One Legacy, an organ procurement organization in Southern California. The control group consisted of 261 non-pumped cadaveric kidneys transplanted during the same period. Conclusion: Cadaver renal preservation using pulsatile perfusion with UW has lowered the incidence of post-transplant DGF due to preservation injury from 29.1% to 15.3%. The rate of primary non-function in the pumped group was 11.9% vs. 2.7% in the control group. Although pump preservation was used primarily for marginal donor kidneys, the outcome was comparable to the control group. Pulsatile preservation helped to identify kidneys at high risk of primary non-function by using pump pressure and resistance index criteria.

Allocation of deceased donor kidneys in kidney transplantation (tx) encompasses innumerable practical, ethical, and medical issues. With the knowledge that the quality of the donated organ is an important factor in patient outcomes, we undertook an analysis of registry data to examine the distribution of the quality of donated kidneys by recipient ethnicity and gender. All first solitary kidney txs (n=57,182) from 1995-2002 were utilized in our analysis. A donor risk score was generated from the parameter estimates of significant risk factors for graft loss, including donor age, race, BMI, CMV status, gender, history of hypertension, and cause of death, in a multivariate Cox model. This score was then delineated into groupings via cluster analysis, and distributions were examined by recipient race and gender. Results indicated significant differences in risk groupings (chi-square p-value <0.0001) and in overall risk score, tested with an ANOVA model, by recipient race. Caucasian recipients received a low risk donation in 55.5% of txs, African American recipients in 51.8% of txs, and other race recipients in 50.4%. The relative frequency of high risk donations by recipient ethnicity was African Americans 9.2%, other 8.2%, and Caucasian 7.5%. Recipient gender did not differ significantly among the risk levels. Overall graft survival was significantly associated with risk group (p<0.0001), adjusted for recipient characteristics; relative risks for graft loss attributed to a mid risk level donation and a high risk level donation were 1.5 (1.4, 1.5) and 2.2 (2.0, 2.3), respectively. There exists a significant disparity in the quality of deceased donor kidneys by recipient ethnicity. Observed decreased graft survival in African American recipients may in part be attributed to receiving lower quality donations. Whether this disparity is systemic, a function of recipient geography, other recipient characteristics, or more subtle causes requires further diligent scrutiny.
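The score construction described above (a donor risk score from Cox-model parameter estimates, then grouped into levels) can be sketched as a linear predictor cut into bins. The coefficients, covariate coding and cut-points below are hypothetical placeholders covering a subset of the study's covariates, not its actual estimates:

```python
# Hedged sketch: donor risk score = sum of Cox coefficient x covariate,
# then binned into low/mid/high groups. All numbers here are invented.
BETAS = {
    "age_years": 0.02,     # per year of donor age
    "hypertension": 0.35,  # 1 if history of hypertension
    "bmi": 0.01,           # per BMI unit
    "cmv_positive": 0.10,  # 1 if CMV seropositive
    "female": 0.05,        # 1 if female donor
    "death_cva": 0.25,     # 1 if cause of death was CVA
}

def donor_risk_score(donor):
    return sum(BETAS[k] * donor[k] for k in BETAS)

def risk_group(score, low_cut=1.0, high_cut=2.0):
    return "low" if score < low_cut else "mid" if score < high_cut else "high"

donor = {"age_years": 55, "hypertension": 1, "bmi": 28,
         "cmv_positive": 1, "female": 0, "death_cva": 1}
score = donor_risk_score(donor)
print(f"score = {score:.2f}, group = {risk_group(score)}")  # score = 2.08, group = high
```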
The acceptance of NHBD kidneys is limited by the effects of prolonged warm ischemia on short- and long-term renal function. The aim of the present study was to analyze patient and allograft survival, and renal function, of NHBD kidney recipients in our hospital over the last decade. Methods: 1667 kidney allografts have been performed since 1981. We studied 97 patients (62 male), median age 47±14 years, who received NHBD kidneys (77 male donors, median age 38±14 years). Primary immunosuppressive agents were cyclosporine in 84%, azathioprine in 22% and tacrolimus in 4%. Secondary agents were azathioprine in 73% and mycophenolate mofetil in 27%. In 12 patients, monoclonal or polyclonal antibody agents were used. Factors analysed: warm and cold ischemia time, primary non-function, delayed graft function, acute rejection in the first three months, duration of delayed graft function, allograft and patient survival, and renal function at years 1, 5 and 10. Results: Follow-up time was 51±51 months (range, 1-170). Warm ischemia time was 16±12 minutes and cold ischemia time 23±5 hours. 80 patients (82%) showed delayed graft function and 18 (18.6%) primary non-function. Duration of delayed graft function was 23±17 days, and time of hospitalization was 39±24 days. Acute rejection occurred in 14 patients (14.5%). Sixteen patients died, mainly of infectious causes. Patient survival was 89%, 80% and 79% at years 1, 5 and 10. Allograft survival was 74%, 62% and 62% at years 1, 5 and 10. Causes of graft loss were thrombosis (13), acute rejection (9), chronic allograft nephropathy (5) and other (7). Creatinine was 2.1±0.9 mg/dl at year 1, 1.7±0.7 mg/dl at year 5 and 2±1.2 mg/dl at year 10. Conclusions: Although the incidence of primary non-function and delayed graft function with NHBD kidneys was higher, and allograft survival at year 1 was lower, than with heart-beating donors, allograft survival at years 5 and 10 and renal function were similar in both groups.

The function of a kidney harvested from a cadaveric donor depends on a large group of factors: donor-dependent factors, those accompanying brain death, ischemia, preservation and reperfusion. The aim of the study was to examine the quantity, viability and phenotype of cells flushed from the organ during preservation, and kidney function after transplantation. The preservation solution was flushed from the kidneys after preservation by continuous hypothermic pulsatile perfusion (CHPP) or after static perfusion at the end of preservation in simple hypothermia (SH/SP). 40 samples of the preservation solution were examined, 23 from CHPP and 17 from SH/SP. The number of cells, their viability and their type (CD3, CD4, CD8, CD14, CD16, CD15, CD69) were analyzed. Mean cell count in the preservation solution was 10.84 × 10^6 for SH/SP and 31.52 × 10^6 for CHPP. Viability of the cells was 98-99%. The cells flushed from the kidney proved to be lymphocytes and monocytes, and there were more of these cells in the solution than in peripheral blood. In the CD3 population, the mean percentage of CD4 cells was 40.1% (standard peripheral blood 31%) and the mean percentage of CD8 cells was 60.1% (peripheral blood 41%). We then examined activated T and NK cells by measuring the expression of CD69. In both T cells and NK cells, the percentage of CD69+ cells was higher than in peripheral blood, suggesting that both lymphocytes and NK cells can become activated in the kidney. The percentage of CD69+ T cells was similar in the solution after CHPP and after SH/SP; however, the percentage of CD69+ NK cells was higher in the SH/SP solution. T cell subpopulation examination showed that there were more CD8+CD69+ cells in the solution. Furthermore, it seems that the CD69+ cell concentration is higher in solution obtained from kidneys that show delayed graft function after transplantation. Analysis of the cells flushed from the kidney shows that activation of lymphocytes and NK cells starts in the organ (in the donor or during preservation). Delayed graft function (DGF) after transplantation was observed in 10 of the kidneys stored by CHPP (43.4%) and in 7 of the kidneys stored by SH (41.1%); no statistically significant difference in delayed graft function between the two preservation methods was observed.
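The abstract does not name the test behind the "no significant difference" statement above; for a 2x2 table of this size, a Fisher exact test is one standard choice. A minimal sketch using the reported DGF counts (CHPP 10/23, SH/SP 7/17):

```python
# Hedged sketch: comparing DGF rates between the two preservation methods.
# The choice of Fisher's exact test is an assumption, not stated in the text.
from scipy.stats import fisher_exact

table = [[10, 23 - 10],   # CHPP:  DGF, no DGF
         [7, 17 - 7]]     # SH/SP: DGF, no DGF
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")  # p is ~1.0: not significant
```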
Poster Board #-Session: P54-II
OUTCOME OF SOLITARY PEDIATRIC (13 MONTH-5 YEAR) DONOR KIDNEY TRANSPLANT IN ADULT RECIPIENTS. Shafiq Cheema,1 Rafik El-Sabrout,1 Kahlid Butt,1 Patricia Hanson,1 Veronica Delaney.1 1Nephrology and Transplant Surgery, Westchester Med Ctr, NY Med College, Valhalla, NY.
The transplantation of kidneys from cadaveric donors ≤5 years of age into adult recipients is controversial. The large disparity between donor renal mass and recipient body mass is feared to be problematic. The UNOS database showed that en bloc kidney survival was better than that of a single kidney from donors 0-5 years. We performed a retrospective analysis of thirty consecutive solitary pediatric kidney transplants into adults. Groups (gp) were based on donor age: ≤2 years (gp I, n=12) and 2-5 years (gp II, n=18). All recipients received steroids, tacrolimus, and Rapamune (n=20) or MMF (n=10), with induction by Thymoglobulin (n=28) or daclizumab (n=2). The follow-up period was 3 to 84 months (mean=14.6 m) and included serial sonograms. Results: 1- Mean donor age was 34.8 months (range [r]=13-60), mean donor serum creatinine 0.35 mg/dl (r=0.2-0.8), mean recipient age 42.6 years (r=19-73), mean recipient weight 62.2 kg (r=51-78) and mean cold ischemia time 23 hrs (r=15-29). 2- At the end of follow-up, 24/30 (80%) allografts were functioning. Mean creatinine at 3 months and at the end of follow-up (mean=16 m, range=3-78 m) was 1.9 and 1.8 mg/dl, respectively. 3- The rate of acute rejection was 3/30 (10%). Five (17%) patients developed DGF and 5 (17%) had renal artery stenosis (RAS); only two patients with RAS required intervention (angioplasty and reimplantation of the renal artery). 4- Mean increase in kidney size at 3 months was 1.26 cm (7.8 to 9.15 cm, 16%). Graft losses were due to … (1), primary non-function secondary to renal vein thrombosis (1), chronic allograft nephropathy (2), severe untreated rejection (1) and immunosuppression withdrawal (1). Conclusion: 1- Solitary kidneys from cadaveric donors of age one year and older are suitable for transplantation into adults and provide excellent graft function. 2- Significant renal mass increase occurs in the absence of acute rejection or severe renal hypoperfusion. 3- Given the current shortage of donors and the availability of more potent immunosuppressive drugs, larger trials are indicated to confirm our findings and prevent underutilization of small solitary pediatric kidney transplantation into adults. 4- We recommend selecting recipients of small body mass index, induction with Thymoglobulin, and use of Rapamune for transplantation of small solitary pediatric kidneys into adults.

A donor scoring system was developed as a quantitative approach to identify kidneys, prior to transplantation, that are at increased risk for delayed graft function and/or failure (Am J Transplant 2003; 3:715-21). The scoring system was developed from retrospective data but had not previously been validated in a prospective trial. PURPOSE: To ascertain through a prospective analysis whether a donor scoring system was useful in predicting outcomes after deceased donor renal transplantation. METHODS: 188 recipients of a kidney transplant from a deceased donor were prospectively followed for 30 days after transplantation from 2001 to 2003.
The donor score was determined at the time of procurement from the following five donor variables: age, 0-25 points; history of hypertension, 0-4 points; final creatinine clearance before procurement, 0-4 points; cause of death, 0-3 points; and HLA mismatch, 0-3 points. Donor kidneys were stratified by cumulative score: grade A, 0-9 points; grade B, 10-19; grade C, 20-29; and grade D, 30-39. Primary study endpoints included graft loss, utilization of hemodialysis, creatinine clearance and serum creatinine 30 days post renal transplant. RESULTS: The majority of kidneys were grade A (47%), with grades B, C, and D at 31%, 20% and 2%, respectively. Graft loss was 29.2-fold greater for grade D (1 of 3) vs. grade A (1 of 88) kidneys. The need for hemodialysis was 100% for grade D kidneys, compared with 19%, 35%, and 39% for grade A, B and C kidneys, respectively. Creatinine clearance measured 30 days posttransplant was 2-fold greater for grade A kidneys (72 mL/min) compared to grade D kidneys (38 mL/min) (p=0.038). Similarly, serum creatinine assessed 30 days posttransplant was higher with grade D kidneys (3.2 mg/dL) compared to grade A kidneys (1.5 mg/dL) (p=0.053). CONCLUSION: The donor score of a kidney from a deceased donor showed a close correlation with its likelihood of posttransplantation hemodialysis, renal function, and graft failure. The donor score appears to be a useful tool for the allocation of deceased donor kidneys for transplantation.
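The grading arithmetic above is fully specified by the abstract and can be transcribed directly; only the mapping from raw donor values to component points is not given here, so the components are taken as inputs in this sketch:

```python
# Sketch of the cumulative donor score and A-D grading described above.
# Component points: age 0-25, hypertension 0-4, creatinine clearance 0-4,
# cause of death 0-3, HLA mismatch 0-3 (maximum total 39).
def donor_grade(age_pts, htn_pts, crcl_pts, cod_pts, hla_pts):
    total = age_pts + htn_pts + crcl_pts + cod_pts + hla_pts
    if total <= 9:
        grade = "A"
    elif total <= 19:
        grade = "B"
    elif total <= 29:
        grade = "C"
    else:
        grade = "D"
    return grade, total

print(donor_grade(age_pts=12, htn_pts=2, crcl_pts=1, cod_pts=1, hla_pts=2))  # ('B', 18)
```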
Background: Effective resuscitation of deceased donors is critical to the success of organ recovery. Accurate determination of fluid status in brain dead donors is usually confounded by iatrogenic dehydration, hemorrhage, use of vasopressors, and coexisting diabetes insipidus. Maintenance of blood pressure and urine output alone as resuscitation endpoints is inadequate. We hypothesized that correcting base deficit, a measure of tissue perfusion, would facilitate fluid management in these donors and thereby improve renal allograft function. Methods: Consecutive donors over a 12-month period were prospectively studied. Group I consisted of 12 donors whose resuscitation was based on maintaining SBP greater than 100 mm Hg and urine output greater than 1 cc/kg/hr. Group II consisted of 15 donors whose resuscitation was based, in addition, on correcting their base deficits to the normal range (<2 mmol/L). Ongoing monitoring was achieved by q2h measurements of base deficit obtained from ABGs. Immediate outcome of kidney allografts was determined by the presence of delayed graft function (DGF, dialysis requirement during week 1) and calculated creatinine clearance at day 7. Results: Age and cause of death in the two donor groups were similar. Base deficits were corrected to the normal range in all donors in Group II by the time of surgical recovery. Outcome data were available from 21 patients who received renal allografts from Group I and 27 from Group II. Recipient demographics and cold ischemic times were similar in the two groups. DGF occurred in 10/21 in Group I (48%) and 5/27 in Group II (19%; p<0.05, Fisher's exact test). Creatinine clearance at day 7, calculated by the Cockcroft-Gault formula, was 29±6 ml/min in Group I and 41±8 ml/min in Group II (p<0.05, t-test). Discussion: Brain dead donors have a unique physiology in which blood pressure and urine output alone do not adequately reflect organ perfusion. Base deficit is an easily obtained measurement to guide fluid resuscitation in this situation. Moreover, in our group of donors, this approach improved immediate renal allograft function. We recommend using base deficit routinely to expedite organ recovery and potentially improve function of transplanted organs.

1Sunil Geevarghese, 1Philippe Gauthier. 1Center for Abdominal Transplantation, Tulane University, New Orleans, LA.
Background: The transplantation of single kidneys from cadaveric pediatric donors into adult recipients is not routine, most being used en bloc. We reviewed our experience with the transplantation of single kidneys from pediatric donors weighing ≤25 kg. Methods: 32 adults were transplanted with single renal allografts from pediatric donors weighing ≤25 kg (study group) between April 1994 and October 2003. They were compared with 30 matched adult recipients of kidneys from adult donors (control group). Results: There was no difference between the groups in regard to sex, age, recipient weight, HLA mismatch, PRA, type of immunosuppression or duration of follow-up. In the study group the recipients were 37.6±14.3 years old, with a sex ratio of 15:14 (male:female), a median A-B-DR mismatch of 2-1-1, and a median follow-up of 23 months (range, 3 to 96). Mean donor age in the study group was 4.4±2.1 years, with a weight of 15.9±5.3 kg and a donor/recipient weight ratio of 0.22±0.09. Arterial anastomosis was done with a patch in all except one, with a main arterial lumen size of 4.8±2.3 mm. A ureteral stent was used in 62.5% of the study group versus 25.0% of controls (p<0.02). In the study group, surgical complications occurred in 4/24 patients (hydronephrosis 1, hematoma 1, ureteric stenosis 2), needing surgical intervention in two of them. In the control group, 4/29 developed surgical complications (fluid collection 1, hydronephrosis 1, ureteric stenosis 1, uretero-vesical leak 1), needing surgical intervention in two. Serum creatinine reached nadir in 51 days in the study group versus 30 days in controls (p<0.01). Serum creatinine at 1 and 3 years was comparable in both groups. In the study group, 38.9% had proteinuria at 1 year compared to 22.7% of controls (p=0.36). 1-year graft survival was 91.7% versus 92.8% for controls. Conclusions: Though proteinuria was more frequent, the surgical complications, 1- and 3-year serum creatinine, and graft survival in adult recipients of single renal grafts transplanted from small paediatric donors are comparable to those of controls. Transplantation of single rather than en-bloc pediatric donor kidneys yields comparable results and has the additional benefit of expanding the donor pool.

Poster Board #-Session: P58-II
Background: Approximately 16% of deceased donor kidneys each year are transplanted to zero HLA-mismatched recipients in the US. However, the offer of a zero HLA-mismatched kidney for any waiting patient is unpredictable and represents an important challenge in managing the growing lists of waiting patients at many transplant centers. Our goal was to examine the dynamics of zero HLA-mismatched kidney transplantation with regard to time after listing. Methods: We analyzed waiting times for recipients of zero HLA-mismatched deceased donor kidneys shared under the OPTN/UNOS national sharing program between 1996 and 1999.
We also assessed the changing distribution of HLA phenotypes over time among patients who entered the waiting list in 1997 by determining the number of patients who had zero HLA mismatches with at least one donor during 1997, then repeating the analysis for those patients listed in 1997 who were still waiting in 1998 against 1998 donors, and so on for up to 5 years. Results: Nearly 75% of the HLA-matched transplants performed between 1996 and 1999 occurred within 18 months of listing, as shown in the figure. Among 18,847 patients first listed in 1997, 37% had no HLA mismatches with at least one donor in 1997. Of the 2,852 who still remained on the list in 2002, only 19% had no HLA mismatches with at least one donor in 2002. Thus, over a period of 5 years, common donor HLA phenotypes were depleted from the waiting pool by approximately 50%. Conclusions: We demonstrated that nearly 75% of zero HLA-mismatched transplants occur within 18 months after listing and that patients with common HLA phenotypes are rapidly depleted from the waiting pool, leaving patients who are unlikely to be offered a zero HLA-mismatched graft. This information may facilitate ensuring patients are medically ready when a zero HLA-mismatched kidney is offered.

BACKGROUND: Organ donation following cardiac death (DCD) continues to grow as an alternative method to bridge the gap between organ supply and demand. However, most centers are still reluctant to use such organs due to the high incidence of delayed or primary non-function (PNF) inherent to organ ischemia. We report on the use of extracorporeal membrane oxygenation (ECMO) to limit prolonged circulatory arrest prior to cold preservation. METHODS: Since 1999 we have utilized ECMO in controlled DCD to support organ perfusion in the absence of cardiac activity after declaration of death. ECMO flow rates are maintained at ≥2 L/min, pH at 7.1-7.4, arterial saturation >80% and ACT >500 sec. RESULTS: Between October 2000 and August 2003, a total of 17 patients underwent pre-mortem cannulation and withdrawal of ventilatory support. Mean donor age was 29 years (9-53). Cardiopulmonary death occurred at a mean of 17.9 mins (5-50) after withdrawal, and ECMO support was used for an average of 103 mins (73-121). Six of 26 kidneys were not subsequently transplanted due to serology results and poor perfusion parameters. One of 5 livers procured was not used due to recipient complications. Kidney recipients' age averaged 43.7 years (6-67); delayed graft function (DGF) occurred in 11% (2/19); 1 kidney was lost early due to technical complications, with all remaining kidneys functioning normally. Liver recipients' age averaged 52.7 years (41-59); all grafts functioned immediately; one patient died 125 days post-op due to sepsis. All 3 remaining patients are alive and well, 2 with normal liver function and 1 with resolving cholestasis at a mean follow-up of 14 months. CONCLUSIONS: ECMO-supported DCD permits controlled withdrawal of support in the intensive care unit without compromising organ viability. The low rate of DGF and the absence of PNF suggest that ECMO prevents organ damage prior to cold preservation. The positive experience has prompted us to use ECMO in other areas of organ donation, including the support of unstable donors that are brain dead. In addition, we are developing protocols to initiate ECMO-supported DCD programs at other institutions in our region. Although the size of the study is limited, these data suggest the use of ECMO will significantly decrease the rates of DGF/PNF when compared to other results reported in the literature.
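Taking the ECMO Methods targets above at face value (flow ≥2 L/min, pH 7.1-7.4, arterial saturation >80%, ACT >500 sec), the per-timepoint monitoring amounts to a simple range check. A hypothetical sketch for illustration only, not part of any clinical protocol:

```python
# Hedged sketch: checking ECMO support parameters against the stated targets.
# The target values are taken from the Methods text above; the function and
# its parameter names are invented for illustration.
def ecmo_targets_met(flow_l_min, ph, sao2_pct, act_sec):
    return (flow_l_min >= 2.0
            and 7.1 <= ph <= 7.4
            and sao2_pct > 80.0
            and act_sec > 500.0)

print(ecmo_targets_met(flow_l_min=2.4, ph=7.25, sao2_pct=92.0, act_sec=560.0))  # True
print(ecmo_targets_met(flow_l_min=1.6, ph=7.25, sao2_pct=92.0, act_sec=560.0))  # False
```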
As the number of patients far exceeds the number of organs, it becomes imperative to use kidneys with DIC. Methods: Since January 2000, four cadaver donors with DIC gave 8 kidneys. They averaged 31.5 years of age, with admission serum creatinine 1.0 mg/dl and last 24-hour urine output 1050 cc. The causes of death were CVA (3) and GSW to the head (1). Wedge biopsy performed on all kidneys showed extensive glomerular thrombi but no cortical necrosis. Kidneys were then flushed with 20 mg of Activase®, a recombinant tissue plasminogen activator (tPA), and stored in slushed ice for 40 minutes. They were then reflushed with Viaspan®. All but one were rebiopsied prior to transplantation. CIT averaged 26.5 hours. Simultaneous CsA, MMF and prednisone therapy was used. Methylprednisolone was given for rejection. Results: All repeated biopsies showed complete resolution of fibrin thrombi. No recipient developed postoperative bleeding or primary non-function. One kidney, which did not undergo post-tPA biopsy, showed no fibrin thrombi on a biopsy performed 20 days later for rejection. One kidney clotted on the 10th day due to rejection. Three patients had ATN, for an incidence of 37.5%. Current serum creatinine averages 1.8 mg after 14-38 months of follow-up. Graft survival is 87.5%. Conclusion: Back-table intra-arterial tPA flush rapidly lyses microthrombi in kidneys with DIC and makes them successfully transplantable. Pretreatment with tPA should be added to the therapeutic armamentarium for kidneys with DIC.

Purpose: Renal transplant waiting lists continue to grow. The use of expanded criteria cadaveric donors may increase the donor pool. Since 1997 we have liberalized our cadaveric donor criteria and assessed our results. Methods: Between Jan 1, 1997 and Sept 30, 2003 we performed 172 cadaveric renal transplants using induction with Thymoglobulin (Thy), mycophenolate mofetil (MMF) and prednisone. Tacrolimus (TAC) (80%) or cyclosporine Neoral (CsA) (20%) was introduced only when serum creatinine was <3.0 mg/dl. We divided the patients into 2 groups according to the scoring system described by Nyberg et al. (Am J Transplant 2003; 3:715), which was based on donor age, history of hypertension, donor creatinine clearance, cause of death, and HLA mismatch. Group 1 (optimal donors) had a score of 0-19 and group 2 (expanded criteria donors) had a score of 20-39. We compared the 2 groups for graft survival, acute rejection rate (ACR), creatinine at 1 yr, and delayed graft function (DGF, hemodialysis or creatinine >3.0 mg/dl on day 5). Results: Overall graft survival for all 172 patients was 92.2% and 81.8% at 1 and 5 yrs. Patients in group 2 had a higher rate of DGF and a higher creatinine at 1 yr; however, graft survival at 5 yrs was still relatively good (78 vs. 83%, p=ns). Patients with DGF (n=61) had lower 1- and 5-yr graft survival compared to patients without DGF (86 vs. 98% and 72 vs. 88%, p<0.004), irrespective of the group they belonged to. DGF had an equally bad impact on "ideal" donors and "extended criteria" donors. All patients with DGF had a renal biopsy in the 1st …

Aim: Outcome of renal transplants from donors <5 years old has traditionally been inferior to that from older donors. Inadequate renal mass, vascular catastrophes, and acute rejection have been the main obstacles to wider application.
We retrospectively studied our overall experience with patients who received renal transplants from such young donors to determine the utility of these organs. Patients: 105 patients received transplants from donors <5 years of age between September 1991 and July 2003, and were followed up for a mean of 35 months (range, 5-145). Patients were divided into 4 groups based on whether they received single or en-bloc kidneys, and whether donors were …

… 100 days) lasted for only 16 days. In the autologous vIL-10-HSC administered group, histopathology demonstrated mild arteritis/venulitis (grade 0.7) and mild acute cellular rejection (grade 1.00). Intragraft expression of co-stimulatory molecules (CD80, CD86), cytokines (IL-2, IL-4, mIL-10, IFN-γ), and iNOS was markedly lower in tolerant grafts (vIL-10-HSC treated) compared to rejected grafts (vector-DNA-HSC or UE-HSC treated). Further, T lymphocytes derived from vIL-10-HSC treated, graft-tolerant recipients demonstrated hyporeactivity to donor and 3rd party antigens in MLR cultures. Conclusion: Administration of autologous vIL-10 engineered HSC prior to organ Tx prolonged cardiac allograft survival significantly.

Poster Board #-Session: P69-II
Nicolas Degauque,1 David Lair,1 Cecile Braudeau,1 Alexandre Dupont,1 Fabienne Haspot,1 Fabien Sebille,1 Sophie Brouard,1 Jean-Paul Soulillou.1 1U.437, INSERM, Nantes, France.
Background: Induction of specific tolerance to donor antigens is one of the most actively explored fields in transplantation immunology. In adult rats, long-term survival (LTS) of MHC-incompatible hearts can be obtained by priming the recipients with donor-specific transfusion (DST) 14 and 7 days before transplantation. In this study, we analyzed the mechanisms of maintenance of tolerance in LTS rats by focusing on the in vitro regulatory patterns of splenic T cells harvested from LTS rats. Direct-pathway mixed lymphocyte reactions (MLR) were performed by culturing irradiated, enriched allogeneic LEW.1W (RT1^u) dendritic cells with purified T lymphocytes from naive or LTS LEW.1A (RT1^a) rats. Inhibition assays were performed by adding to this readout system purified CD25-, purified CD4+CD25-, or purified CD8+CD25- T cells from LTS rats. Transwell cultures were used to investigate the need for cell-cell contact. In addition, the effect of IL-2 addition on these MLR, as well as of anti-IL-10 and anti-TGF-β antibodies and IDO and iNOS inhibitors, was tested. Finally, Complementarity-Determining Region 3 length distribution (CDR3-LD) was analyzed using Immunoscope. Results: CD25- T cells from LTS rats exhibited reduced proliferation following direct allostimulation compared to naïve CD25- T cells. This hyporesponsiveness could be bypassed by the addition of IL-2. In MLR, CD25- T cells from LTS rats were able to inhibit the proliferative response of naive alloreactive CD25- T cells; addition of IL-2 to this coculture also restored the proliferative response. In contrast, when CD4+CD25- or CD8+CD25- T cells from LTS rats were independently tested, no inhibition of the MLR was detected. Thus, both subsets are needed for optimal proliferation inhibition. The inhibitory property of CD25- T cells from LTS rats did not involve soluble factors (IL-10, TGF-β, IDO and iNOS) but required cell-cell contact, as shown by the transwell culture system. Finally, analysis of the regulatory CD25- T cell repertoire of LTS rats did not reveal alteration of the CDR3-LD; thus we could not associate the appearance of clonal selection with the regulatory properties of the differentiated CD25- subset in LTS rats. Conclusion:
This study provides the first evidence of a dominant regulatory function in the CD25- T cell subset of long-term survivor rats following DST. These functional in vitro regulatory properties may be partly responsible for the maintenance of heart graft acceptance in this model.

Poster Board #-Session: P70-II
Background: We have demonstrated previously that tolerance to cardiac allografts can be induced by recipient treatment with anti-CD45RB antibody and that this tolerance is uniquely dependent on the presence of the host thymus and B-lymphocytes. In this study, we evaluated the role of B-lymphocytes in two separate models of tolerance. Methods: In our first model, B cell deficient B6-µMT recipients were reconstituted with purified B cells from normal C57BL/6 mice. This model was used to track B cells, to monitor their proliferation and to determine their role in graft survival. B6-HEL mice (in which >90% of B cells are HEL-specific) were employed as recipients to assess the functional requirement of B cells for tolerance induction. To further understand the role of B cells, we used a B cell deficient TCR transgenic system (TS1/JH-/-) in which we can study the response to a specific antigen (HA). Transplanted mice were treated with 5 doses of 100 µg anti-CD45RB. For trafficking studies, injected B cells were labeled with CFSE. Results: B6-µMT mice rejected C3H hearts without B cell reconstitution (MST = 9 days; n=6). Anti-CD45RB treatment alone slightly prolonged survival but did not induce tolerance (MST = 14 days; n=10). Indefinite survival was obtained when normal B6 B cells were supplied (MST >100 days; n=10). In tolerant animals, infused B cells were detected at all time points (to >100 days), were found in all lymphoid organs including the thymus, and revealed evidence of homeostatic proliferation. That the surviving B cells were of B cell donor origin and not derived from graft passenger cells was confirmed by class I labeling. C3H hearts in HEL transgenic mice were promptly rejected (MST = 12 days; n=2). TS1/JH+/+ mice treated with anti-CD45RB accepted HA+ grafts indefinitely (MST >100 days; n=18). In contrast, TS1/JH-/- mice acutely rejected HA+ grafts (MST = 12 days; n=16), and anti-CD45RB therapy did not prolong survival beyond MST = 30 days (n=6). Conclusion: Our results demonstrate conclusively that B cells are necessary for anti-CD45RB induced transplantation tolerance. Also, transferred B cells provided stable reconstitution and were functionally active. Tolerance was not induced in B6-HEL transgenic mice, indicating a possible role for cognate antigen presentation or protective donor-specific antibodies in this system. These findings define a unique role for B cells in anti-CD45RB induced tolerance.

Costimulatory blockade is a promising strategy for the induction of transplantation tolerance. Previously, we observed that costimulation blockade induced functional dominance of regulatory CD4+CD25+ T cells by reduction of the alloreactive effector pool in vitro. It can therefore be hypothesized that regulatory CD4+CD25+ T cells play an active role in the induction of tolerance by costimulation blockade in vivo. Regulatory CD4+CD25+ T cells (Treg) constitutively express CTLA-4, but considerable controversy still exists about the importance of engagement of CTLA-4 in the activation of CD4+CD25+ regulatory T cells.
In vivo, we found that an additional anti-CTLA-4 antibody (4F10) abrogated the prolonged cardiac allograft survival produced by anti-CD40L and anti-CD86 across a full MHC mismatch (MST 36 days vs. MST >90 days, respectively). In in vitro MLR, we examined the role of CTLA-4 engagement and Treg in this phenomenon. It was observed that anti-CD40L and anti-CD86 resulted in profound inhibition of the alloresponse in MLR (>80% inhibition), whereas additional anti-CTLA-4 partially abrogated this inhibition (<50% inhibition). Subsequently, responder cells were depleted of Treg at the start of MLR. Depletion of Treg impaired the inhibition by anti-CD40L and anti-CD86 to a similar degree as was observed for the addition of anti-CTLA-4. After depletion of Treg, additional treatment with anti-CTLA-4 had no further effect, which suggests an effect of anti-CTLA-4 primarily on Treg. We propose that anti-CD40L and anti-CD86 facilitate immunoregulation by regulatory CD4+CD25+ T cells in vivo and that engagement of CTLA-4 on these cells is pivotal for their activation. Strategies aimed at enhancing CTLA-4 signalling of Treg may be useful for immunoregulation in transplantation and autoimmune diseases.

Background: Recently, we were able to replicate the clinical NIMA effect in a mouse model (JI 2003; 171:5554). Offspring (bxb) from breedings of a B6 (bxb) male and a B6D2F1 (bxd) female are exposed to the mother's H-2^d antigens. When transplanted with fully allogeneic hearts from DBA/2 (dxd), a >50% tolerance rate is observed, while control offspring from NIPA breedings of a B6D2F1 father and a B6 mother uniformly reject (MST = 10 d). Hypothesis: The NIMA effect is dependent on: 1) the degree of CD4+ and CD8+ T cell anergy induced via the direct pathway, and 2) the development of CD4+ T regulatory cells specific for NIMA induced via the indirect pathway. The strain background is likely to influence these 2 parameters of the offspring's allo-immune response. Methods: Four NIMA and NIPA control breedings were initiated (Table). Only homozygous bxb or kxk offspring were transplanted heterotopically with fully allogeneic hearts from DBA/2, C3H or B6 to test the impact of NIMA-d, -k, or -b exposure; GST >100 d was considered tolerant. T-cell function was analyzed by in vivo proliferation of CFSE-stained cells in semi-allogeneic hosts and by measurement of cytokine-producing T cells in MLC responses using ELISpot. Results: An in vivo NIMA effect was seen only in the two NIMA-d models (Table). CFSE analysis of CD4 and CD8 T cells from NIMA-d1 (bxb) mice showed significantly reduced proliferation compared with NIPA. A decrease of IFN-γ and IL-2 producing T cells and an increase of IL-10 producing T cells in response to fully- and semi-allogeneic stimulators was observed. In contrast, NIMA-k exposed T cell responses were only slightly reduced in bxb offspring, and NIMA-b mice were sensitized to NIMA, as indicated by significantly higher pretransplant IFN-γ alloresponses. Conclusion: Only exposure to NIMA-d was able to induce tolerance. The impact of the NIMA effect seems to depend on the balance between sensitization of NIMA-specific effector T cells, on the one hand, and induction of NIMA-specific anergic and regulatory T cells, on the other.

1Chaorui Tian, 1John Iacomini. 1Transplantation Biology Research Center, Massachusetts General Hospital, Boston, MA.
Background: Reconstitution of lethally irradiated B10.AKM (H-2K k ) mice with syngeneic bone marrow cells infected with retroviruses carrying the allogeneic MHC class I gene H-2K b resulted in stable, lifelong expression of K b on bone marrow-derived cells. More importantly, these mice were specifically tolerant to H-2K b skin grafts while still retaining the ability to reject third-party skin grafts. Using CD8 transgenic mice, we found that alloreactive CD8 T cells underwent negative selection in the thymus, leading to the absence of these potentially alloreactive T cells in the periphery. Methods: Tg361 CD4 transgenic mice expressing a TCR specific for the MHC class I gene H-2K b were used to determine the mechanism by which genetic engineering of bone marrow induces donor-specific CD4 T cell tolerance after bone marrow transplantation. Lethally irradiated B10.AKM mice were reconstituted with a 6:1 ratio of mock- or K b -transduced B10.AKM bone marrow to Tg361 bone marrow. Results: Lethally irradiated recipients that received CD4 transgenic bone marrow in combination with autologous bone marrow transduced with virus encoding H-2K b displayed stable, long-term expression of K b on bone marrow-derived cells. Alloreactive CD4 transgenic T cells were readily detectable in the peripheral blood, spleen and thymus of chimeric recipients, although their levels were 2-fold lower than in mock-transduced bone marrow recipients. Despite the presence of potentially alloreactive CD4 T cells in the periphery, chimeric recipients were specifically tolerant to H-2K b -expressing skin grafts. Interestingly, chimeric mice that received K b -transduced and CD4 transgenic bone marrow expressed CD25 on 20-30% of Tg361-derived CD4 T cells, compared to only 6% of Tg361-derived CD4 T cells in mock-transduced controls. We conclude that genetic engineering of bone marrow induces donor-specific CD4 T cell tolerance through deletional and non-deletional mechanisms. We further hypothesize that expression of K b on bone marrow-derived cells induces CD4 T cells capable of inhibiting alloreactive T cells. Background: Accumulating evidence that dendritic cells (DC) are important regulators of peripheral immune tolerance has led to the concept that DC may be useful therapeutically. Engineered expression of immunomodulatory genes to create tolerogenic DC has led to promising results in vitro; however, in vivo results have been disappointing. A major obstacle in previous attempts to induce tolerance using genetically modified DC has been the relative inefficiency of earlier gene transfer methods. In addition, the gene transfer methods themselves appear to trigger the maturation of DC, antagonizing the effects of immunomodulatory genes. Another problem in the field of DC immunotherapy is the relative paucity of information on how migration patterns of infused DC affect the subsequent immunologic outcome. We have sought to address these issues by 1) developing a robust method for producing pure populations of genetically modified, immature DC and 2) studying how enhanced lymphoid migration of infused DC affects the subsequent immune response. Methods/Results: To circumvent DC maturation, cycling stem cells were transduced prior to differentiation into DC using an MLV-based retroviral vector, sorted, and placed into DC medium. Remarkably, up to 1 x 10 9 transduced DC can be generated from a single experiment, with 90-95% purity. Transduced DC were homogeneously immature by both functional and phenotypic analysis.
Using this technique, we have expressed immunomodulatory genes such as vIL-10, IDO and PD-1L in DC and confirmed their ability to down-regulate antigen-specific immune responses in vitro (MLR, cytokine production) and in vivo (DTH model). IV infusion of DC is known to result in poor migration to lymphoid compartments. We therefore tested whether engineered expression of the lymphoid homing receptor CCR7 could improve the migration of immature DC to lymph nodes and spleen. Using both histologic and FACS analysis, we found a >50-fold increase in homing by DC transduced with CCR7 compared to controls. Analysis of CD4 T cell responses using an adoptive transfer model demonstrated enhanced interaction of CCR7-expressing DC with antigen-specific T cells. We are currently studying the role of immunomodulatory genes and the potential synergy of targeted DC delivery in various transplant models. We have developed a method for the systematic manipulation of DC gene expression and homing, and we hope to elucidate the determinants of immunity and tolerance in response to DC infusion in order to develop a rational approach to DC-based immunotherapy. Liver transplant tolerance in mice is associated with activated T cell apoptosis in liver grafts, suggesting unusual interactions between the hepatic milieu and immune cells. Hepatic stellate cells (HSC) are known to participate actively in fibrogenesis in liver disease, but little is known of their role in regulating immune responses. In this study, HSC were isolated from B10 (H2 b ) livers and cultured in plates for 2d (quiescent) or 7-10d (activated). We have demonstrated that activated HSC fail to stimulate allogeneic T cell (C3H, H2 k ) proliferation, but the addition of activated HSC to a DC/T allogeneic MLR culture significantly inhibited T cell proliferative responses in an HSC dose-dependent manner; this appeared to be related to HSC activation, as quiescent HSC had no inhibitory activity. The inhibition was so powerful that 50% suppression was achieved at a T:activated HSC ratio of 1:40, and 90% inhibition at a ratio of 1:20. This inhibition was unlikely to be MHC restricted, because HSC from C3H (syngeneic to the T cells) also markedly suppressed T cell proliferative responses. The underlying mechanisms remain unclear. Quiescent HSC expressed very few surface molecules, while activation by culture on uncoated plastic induced strong expression of PDL-1, a ligand for PD-1. Expression of PDL-1 on activated HSC was enhanced by further stimulation, including IFN-γ and activated allogeneic T cells. To determine whether expression of PDL-1 on HSC plays a role in mediating the inhibition of T cell responses, we established a system in which proliferation of C3H T cells was triggered by anti-CD3ε mAb. T cell proliferative responses were noticeably inhibited by the addition of activated HSC in a dose-dependent fashion. Addition of a blocking anti-PDL-1 mAb (BioScience) to the culture reversed the HSC-induced inhibition of T cell proliferation when the Ab concentration was higher than 10 µg/ml. PDL-1 is a ligand of PD-1, a member of the CD28/CTLA4 family expressed on activated lymphoid cells. PD-1 contains an immunoreceptor tyrosine-based inhibitory motif, and mice deficient in PD-1 develop autoimmune disorders, suggesting a defect in peripheral tolerance. Our data suggest that activated HSC are capable of suppressing T cell responses, at least partially through PD-1 ligation between activated HSC and T cells.
This may be an important mechanism involved in hepatic tolerance in mice. Poster Board #-Session: P77-II MECHANISTIC STUDIES ON T CELL APOPTOSIS TRIGGERED BY ALLOGENEIC LIVER B220 + DENDRITIC CELLS. Xiaoyan Liang, 1 Lina Lu, 1 Lianfu Wang, 1 John J. Fung, 1 Shiguang Qian. 1 1 Thomas E. Starzl Transplantation Institute, Surgery, University of Pittsburgh, Pittsburgh, PA. Mouse liver allografts are spontaneously accepted, which is associated with activated T cell apoptosis; the underlying mechanisms are unclear. We have identified novel B220 + CD205 + CD11c - CD11b - dendritic cells (DC) in mouse livers that were phenotypically mature but stimulated poor [³H]TdR incorporation in allogeneic T cells. However, cell cycle analysis indicated that proliferation of T cells stimulated by liver B220 + DC was similar to that induced by BM-derived myeloid DC. Low thymidine uptake was a result of extensive activated T cell apoptosis: ∼30% of activated T cells were TUNEL positive in the liver B220 + DC group (evenly distributed between CD4 + and CD8 + populations), versus <10% in the myeloid DC group. In contrast to myeloid DC, which exacerbated allograft rejection, administration of B10 (H2 b ) liver B220 + DC (2 x 10 6 ) dramatically prolonged survival of B10, but not third-party (BALB/c; H2 d ), heart allografts in C3H (H2 k ) recipients (MST 37 days vs. 10 days in the control and third-party groups). This was associated with a higher incidence of apoptotic cells in draining lymph nodes and spleen. T cell apoptosis induced by gld (FasL-deficient) B220 + DC was reduced by about 30%, indicating a partial role for Fas ligation. Liver B220 + DC induced similar apoptosis in TNFR (either p55 or p75) deficient T cells, suggesting that TNF and lymphotoxin (LT)α may not be important ligands. We examined the role of LTβ, encouraged by an RNase protection assay result showing that expression of LTβ in liver B220 + DC was extraordinarily high, and found that liver B220 + DC from LTβ-/- mice were poor apoptosis inducers and stimulated profound T cell proliferation, suggesting a critical role for LTβ in mediating the apoptosis. T cell thymidine uptake in a liver B220 + DC/T culture was markedly restored by addition of z-VAD-fmk, a pan-caspase inhibitor peptide, suggesting involvement of caspase cascades. To determine the pathways involved, proteins were isolated from sorted T cells at different time points of culture and incubated with substrates (Ac-DEVD-AFC for caspase 3, Ac-IETD-AFC for caspase 8 and Ac-LEHD-AFC for caspase 9). Caspase activity determined by OD values showed that liver B220 + DC, but not myeloid DC, activated caspase 3 and caspase 8, but not caspase 9, in T cells. RNase protection assay data showed that B220 + DC, but not myeloid DC, inhibited T cell mRNA expression of Bcl-w and Bfl-1 (anti-apoptotic) and enhanced Bak and Bad (pro-apoptotic) expression. The data suggest involvement of multiple apoptosis pathways. Thomas Fehr, 1 Yasuo Takeuchi, 1 Josef Kurtz, 1 Megan Sykes. 1 1 Transplantation Biology Research Center, Massachusetts General Hospital, Harvard Medical School, Boston, MA. Aim. To investigate mechanisms of CD8 + T cell tolerance in a model for induction of mixed chimerism and tolerance with bone marrow transplantation (BMT) after nonmyeloablative conditioning involving anti-CD40 ligand antibody and low dose total body irradiation (TBI). Methods.
Recipient C57BL/6 mice were treated with TBI (3 Gy, d-1), one injection of anti-CD40 ligand antibody (MR1, d0) and BMT from a fully MHC-mismatched donor (B10.A, d0) to induce mixed chimerism, which was followed by FACS analysis of peripheral blood. Immunologic tolerance was assessed by donor and third-party skin grafts followed over 100 days. To assess the roles of interferon-γ and Fas, mice deficient for these molecules were used as recipients and, in the case of interferon-γ, also as donors. The role of CTLA4 was explored by anti-CTLA4 antibody treatment. To follow the fate of specific alloreactive cells, syngeneic chimeras expressing the transgenic alloreactive 2C T cell receptor on about 10% of peripheral CD8 + T cells were prepared and subsequently transplanted as described above. Deletion of specific alloreactive cells was followed by FACS analysis of peripheral blood using an anti-clonotypic antibody. Results. Long-lasting mixed chimerism and permanent donor-specific skin graft acceptance were achieved with BMT after conditioning with TBI and anti-CD40 ligand, without T cell depletion. The same regimen with TBI on d0 instead of d-1 did not reliably induce mixed chimerism unless CD8 + T cell depletion was added to the regimen. Thus, CD4 + T cells are readily tolerized by the d0 TBI regimen, but CD8 + T cell-mediated alloreactivity is critical to overcome. When alloreactive 2C CD8 + T cells of recipients conditioned with d-1 TBI, anti-CD40 ligand antibody and BMT were followed in peripheral blood, they showed rapid and complete deletion by d7. CD8 + T cell tolerance was dependent on the presence of CD4 + T cells, since CD4 + T cell depletion abolished the achievement of chimerism and tolerance. In contrast, neither interferon-γ nor Fas/Fas ligand interactions were necessary. CD8 + T cell tolerance could be blocked by one injection of anti-CTLA4 antibody or by cyclosporin A treatment for the first 14 days. Conclusion. CD8 + T cell tolerance was achieved by BMT after conditioning with anti-CD40 ligand and low dose TBI on day -1. It involved rapid peripheral deletion and was dependent upon CD4 + T cells, the calcineurin pathway and CTLA4, but not upon Fas/Fas ligand interactions or interferon-γ. Background. We previously reported that intratracheal delivery of alloantigen induced regulatory cells in a mouse heart grafting model. Immunosuppressive cytokines such as interleukin-10 (IL-10) and transforming growth factor (TGF)-β are down-regulators of immune responses, and in several studies of the induction or function of regulatory cells, IL-10 and/or TGF-β are thought to play critical roles. Here, we investigated the roles of IL-10 and TGF-β in the induction and effector phases of these regulatory cells. Methods. CBA (H-2 k ) mice were pretreated with intratracheal delivery of C57BL/10 (H-2 b ) splenocytes and administration of neutralizing anti-IL-10 or anti-TGF-β monoclonal antibody (mAb). Seven days after the pretreatment, naive CBA mice were given adoptive transfer of splenocytes from the pretreated mice and underwent heart grafting from C57BL/10 mice on the same day as the adoptive transfer. To determine the role of these cytokines in the effector phase of the regulatory cells, anti-IL-10 or anti-TGF-β mAb was administered weekly to the secondary recipients along with the adoptive transfer. Results. Untreated CBA mice rejected C57BL/10 cardiac grafts acutely (median survival time [MST]: 7 days).
Pretreatment with intratracheal delivery of C57BL/10 splenocytes prolonged graft survival significantly (MST: 65 days). In the induction phase, administration of anti-IL-10 mAb abrogated the prolonged survival induced by adoptive transfer from mice pretreated with intratracheal delivery of alloantigen (MST: 20 days), whereas concurrent administration of anti-TGF-β mAb did not abrogate this effect (MST: 88 days). In the effector phase, secondary recipients that received adoptive transfer plus anti-IL-10 mAb did not show prolonged survival of C57BL/10 cardiac grafts (MST: 27 days), whereas anti-TGF-β mAb did not abrogate the function of the regulatory cells (MST: 53 days). Conclusion. IL-10, but not TGF-β, was required in both the induction and effector phases of the regulatory cells induced by intratracheal delivery of alloantigen. Introduction. Central transplantation (Tx) tolerance through hematopoietic chimerism initially requires peripheral immunomodulation to prevent rejection of allogeneic stem cells or bone marrow (Bm). We investigated the role of the T cell adhesion and costimulatory molecule LFA-1 as a target for immunomodulation at the time of fully MHC-mismatched Bm transfer. Methods. After mild cytoreduction with busulfan, C57BL/6 (B6) recipient mice received 2 x 10 7 BALB/c Bm cells and a one-week treatment with single agents or combinations of antibodies against LFA-1 and CD40L, and everolimus [RAD, a rapamycin derivative]. The kinetics of hematopoietic chimerism formation were monitored by flow cytometric analysis, as was the allo-Bm-dependent depletion of host allo-specific Vβ11 + T cells. Central transplantation tolerance was tested with both full-thickness skin grafts and heterotopic heart grafts 3 and 5 months after Bm transfer, respectively. Results. Under these conditions no chimerism was observed with any of the reagents anti-CD40L, anti-LFA-1 or everolimus given alone. In marked contrast, combinations of anti-LFA-1 + anti-CD40L or everolimus + anti-CD40L consistently induced high levels of stable multi-lineage chimerism. The combination of anti-LFA-1 + RAD resulted in declining chimerism in about 50% of Bm recipients, with either complete loss or stabilization of chimerism at a lower level. Furthermore, these outcomes appeared predictable from an earlier significant loss of Vβ11 + T cells in those Bm recipients that would stabilize, but not in those that would eventually lose their chimerism. Skin and heart Tx 3 and 5 months after BmTx, in the absence of any further treatment, revealed effective central transplantation tolerance. Strikingly, even very low levels (<1%, without busulfan conditioning) of stable T cell chimerism were sufficient to prevent skin graft and chronic heart graft rejection. Combinations of any 2 of the 3 reagents anti-LFA-1, anti-CD40L and everolimus induced hematopoietic chimerism in fully allo-MHC-mismatched Bm recipients. Stable chimerism alone, even at very low levels, prevented skin and heart graft rejection, including protection from vascular intimal thickening ('chronic rejection'). Onoe, 1 Hideki Ohdan, 1 Daisuke Tokita, 1 Hidetaka Hara, 1 Yuka Tanaka, 1 Wendy Zhou, 1 Kohei Ishiyama, 1 Hiroshi Mitsuta, 1 Kentaro Ide, 1 Toshimasa Asahara. 1 Although livers transplanted across MHC barriers in mice are normally accepted without recipient immune suppression, the underlying mechanisms remain to be clarified in detail.
To identify the cell type and mechanism that contribute to induction of this tolerant state, we established an allogeneic mixed hepatic constituent cell-lymphocyte reaction (MHLR) assay. Hepatic constituent cells (HCs) were isolated from B6 and Balb/c mice as stimulators, and splenocytes were isolated from B6 mice as responders. Irradiated HCs were co-cultured with fluorescent dye (CFSE)-labeled B6 splenocytes. In the allogeneic MHLR, whole HCs did not promote proliferation of allo-reactive T cells. The MHLR resulted in marked proliferation of both allo-reactive CD4 + and CD8 + T cells only when CD105 + cells, which are exclusively liver sinusoidal endothelial cells (LSECs), were depleted from whole HCs by magnetic cell sorting. Such proliferation of allo-reactive T cells was inhibited by returning LSECs to the MHLR. Physical separation of LSECs from the responder-stimulator cells using a dual-chamber transwell culture system in the MHLR eliminated the LSEC-induced inhibitory effects on allo-reactive T cell proliferation. Addition of third-party LSECs did not affect allospecific T cell proliferation. To test the tolerizing capacity of LSECs toward alloreactive T cells, B6 splenocytes that had transmigrated through a monolayer of either B6 or Balb/c LSECs were restimulated with irradiated Balb/c splenocytes. After restimulation, nonresponsiveness was observed in T cells that had transmigrated through allogeneic Balb/c LSECs, and marked proliferation in T cells that had transmigrated through syngeneic B6 LSECs. These findings indicate that allogeneic LSECs have the capacity to induce allo-reactive T cell tolerance in a manner requiring direct cell contact. To address how LSECs tolerize allo-reactive T cells, we analyzed the LSEC phenotype. Naïve LSECs constitutively expressed FasL, and FasL blocking by mAbs eliminated the tolerizing capacity of the LSECs. Consistently, in allogeneic MHLR using whole HCs (including LSECs) as stimulators, T cells in the early period of cell division expressed phosphatidylserine, which is displayed on the surface of apoptotic cells. Thus, FasL-induced apoptosis participates in the tolerization of allo-reactive T cells by LSECs of liver allografts. NF-κB is a key regulator of transcription following TCR and co-stimulatory receptor ligation. To determine the role of T cell-intrinsic NF-κB activation in acute allograft rejection, we have used IκBαDN-Tg mice (H-2b) that express an inhibitor of NF-κB restricted to the T cell compartment. We have previously shown that these mice permanently accept fully allogeneic cardiac grafts and secondary donor skin grafts. Our current study investigates the mechanisms of tolerance in this setting. Mixed lymphocyte reactions and ELISpot assays performed using splenocytes from tolerant animals showed reduced T cell responses, suggesting that alloreactive T cells are either hyporesponsive or deleted. We hypothesized that reduced NF-κB activation in T cells during development might result in increased generation or function of CD4+CD25+ regulatory T cells (Treg). However, IκBαDN-Tg mice did not have increased numbers or function of Treg either before or after cardiac transplantation. Similarly, transfer of wildtype splenocytes into tolerant IκBαDN-Tg mice freshly transplanted with a second heart of donor origin led to cardiac allograft rejection. Taken together, these experiments suggest that regulation is not a major mechanism by which IκBαDN-Tg mice achieve tolerance.
NF-κB activation has been linked to survival of many cell types in vitro. Therefore, to determine whether deletion of alloreactive T cells was one of the mechanisms of tolerance operating in transplanted IκBαDN-Tg mice, these animals were crossed with mice (H-2b) expressing the anti-apoptotic Bcl-xL protein as a transgene in T cells. To determine the impact of Bcl-xL transgenic expression in vivo, allogeneic hearts were transplanted into wildtype, IκBαDN-Tg, Bcl-xL-Tg and IκBαDN/Bcl-xL-Tg littermates. In contrast to IκBαDN-Tg mice, which accepted allogeneic cardiac grafts indefinitely, IκBαDN/Bcl-xL-Tg mice effectively rejected their allografts. Thus, overexpression of Bcl-xL in IκBαDN-Tg T cells was sufficient to promote acute allograft rejection. Together, our results suggest that apoptosis of alloreactive T cells plays an important role in, and is necessary for, the transplantation tolerance observed in mice with defective T cell-intrinsic NF-κB activation. Therefore, reduced NF-κB activation in T cells favors transplantation tolerance at least in part by limiting T cell survival rather than by inducing T cell regulation. Liver allografts in mice are accepted spontaneously in all MHC strain combinations without the requirement for immunosuppression. The mechanisms underlying this phenomenon remain largely undefined. Recently, CD4 + CD25 + cells have been shown to represent a unique population of immunoregulatory cells and to play an important role in the downregulation of T cell activation and the maintenance of transplant tolerance. In this study, we examined the role of CD25 + regulatory T cells in liver transplant tolerance induction by in vivo administration of anti-CD25 monoclonal antibody. Methods: Mouse MHC-mismatched orthotopic liver transplantation was performed from B10 (H2 b ) donors to C3H (H2 k ) recipients. The rat anti-mouse CD25 mAb (PC61) was given at 250 µg/d by intraperitoneal injection, either to the donors or recipients on pre-transplant days -6, -4 and -2, or to the recipients on post-transplant days 0, 2 and 4. Liver graft rejection was determined by recipient survival. The apoptotic activities of liver graft-infiltrating cells (GIC) and recipient spleen cells (SC) were examined by in situ TUNEL staining. Results: Anti-CD25 mAb pre-treatment of the donor did not significantly affect liver allograft survival, whereas liver grafts in recipients treated with anti-CD25 mAb either pre- or post-transplant were rejected acutely, in comparison to the indefinite graft survival (>100 days) of the controls (Table). Histological evaluation of liver grafts from anti-CD25 mAb-treated recipients showed markedly increased graft-infiltrating cells in both portal triad and parenchymal areas. The frequency of apoptotic cells was lower in treated mice than in control mice by TUNEL staining. Conclusions: The recipient's CD4 + CD25 + regulatory cells play a very important role in spontaneous liver allograft acceptance. Depletion of recipient, but not donor, CD4 + CD25 + regulatory cells results in acute liver allograft rejection and is associated with reduced apoptosis of liver GIC and SC. We and others have demonstrated that immune damage by alloreactive (CD4-independent) CD8 + T cells is difficult to suppress by strategies that readily regulate alloreactive CD4 + T cells. Recently, we have discovered that targeting LFA-1 alone preferentially suppresses CD8-dependent versus CD4-dependent rejection of allogeneic hepatocytes.
In addition, short-term immunotherapy targeting both LFA-1 and CD40/CD40L costimulation produced synergistic effects and suppressed CD8-dependent rejection such that long-term survival (LTS) of hepatocytes up to 90 days was achieved in the majority of recipient mice. The purpose of this study was to determine whether recipient mice with LTS induced by short-term immunotherapy targeting LFA-1 and CD40/CD40L costimulation were resistant to immune damage when challenged with naïve CD8 + T cells. Methods: Two million FVB/N (hA1AT transgenic, H-2 q ) hepatocytes were transplanted into CD4 KO (H-2 b ) mice, which were treated with anti-LFA-1 mAb (0.3 mg, d0-6) and anti-CD40L mAb (1 mg, d0, 2, 4, 7) by ip injection. Hepatocyte survival was monitored by ELISA detection of the serum reporter product, hA1AT. Hepatocyte recipients with long-term survival (>60 days) were challenged by adoptive transfer of 2 x 10 6 naive CD8 + T cells. Results: Combined treatment with anti-LFA-1 and anti-CD40L mAbs significantly prolonged hepatocyte allograft survival in CD4 KO mice (N=19), such that 95% of recipients achieved hepatocyte survival >60 days. A subgroup of recipient mice induced to achieve hepatocyte survival >60 days by this combined treatment strategy was subsequently challenged with 2 x 10 6 naïve CD8 + T cells. Continued survival of long-term hepatocellular allografts (>30 days) despite adoptive transfer of CD8 + T cells was observed in 4 of 5 CD4 KO recipients. Control SCID mice with functioning hepatocellular allografts that were adoptively transferred with 2 x 10 6 naive CD8 + T cells rapidly rejected hepatocytes, with an MST of 17 days (N=5). Conclusion: Targeting both CD40/CD40L and LFA-1 not only suppresses (CD4-independent) CD8-dependent hepatocyte rejection but also appears to induce immunoregulation resistant to challenge with naïve alloreactive CD8 + T cells. Purpose: A previously reported nonmyeloablative conditioning protocol (Standard Regimen) that can induce mixed chimerism and renal allograft tolerance in primates requires a 6-day preparative period before transplant and is consequently limited to recipients of living donor allografts. In this study, we aimed to modify the regimen for cadaveric donor application. Method: The standard regimen consists of total body irradiation (TBI), thymic irradiation (TI), antibody treatments and donor bone marrow (DBM), followed by a one-month course of cyclosporine. Various timings and dosages of TBI, TI and antibodies were studied in six groups. Results: Compression of 3 Gy of TBI and 7 Gy of TI into a 24-hour period led to unacceptable toxicity (Regimens A and C). Delayed irradiation and DBM failed to induce chimerism even with additional aCD154 (Regimen B). A single 3 Gy dose of TBI induced chimerism in 2/3 recipients without TI when combined with aCD154 and ATG (Regimen D). Nevertheless, all recipients treated with Regimen D eventually rejected their allografts. Addition of TI (4 Gy) to Regimen D again resulted in toxicity (Regimen E). An antibody combination of ATG, aCD154 and aCD8 in conjunction with reduced-dose TBI consistently induced mixed chimerism, fewer infectious complications, and prolonged graft survival (Regimen F). A single dose of TBI appears to be more effective for inducing chimerism than fractionated dosages.
An antibody combination of ATG, aCD154 and aCD8 combined with 2.5 Gy TBI improved the consistency of chimerism induction with a lower risk of infectious complications, and appears promising for induction of tolerance with a pre-transplant conditioning period of less than 24 hours. Table: conditioning regimens and mixed chimerism (columns: Regimen; TBI (Gy); TI (Gy); DBM; antibody; chimerism; graft survival (days)); standard regimen: TBI D-6,-5; TI D-1 (7 Gy). Interruption (for PTLD or noncompliance) of immunosuppression (IS) in long-term graft recipients usually results in rejection. However, some recipients who stopped IS maintain good graft function ("operationally tolerant"). We previously showed that kidney recipients without IS and minimally immunosuppressed patients displayed more CD25 + CD4 + T cells than patients with chronic rejection. In this study, we performed a more exhaustive analysis of potentially relevant T cell phenotypes in these patients versus recipients with chronic rejection. Methods: Four groups were studied: 1) "operationally tolerant" recipients with a functional graft, drug-free (DF) for more than 3 years (n=4); 2) patients under low-dose steroid monotherapy (<10 mg) (Ster, n=7); 3) patients with chronic rejection (CR, n=6); and 4) normal individuals (NL, n=5). Using four-color flow cytometry, we analyzed CD25 + CD4 + T cells for major regulatory-associated molecules, and CD4 + and CD4 - T cells for some chemokine receptors. Results: GITR, TLR4 and CD103 were not overexpressed in CD25 + CD4 + T cells in any of the groups. The same conclusion was reached when CD25 hi CD4 + T cells were analyzed. In half of the DF patients, 30% of CD25 + CD4 + T cells stained positive for intracellular CTLA4, compared to 8.1% and 11% for CR and Ster recipients. In 3 of 4 DF patients, CCR7 + CD25 + CD4 + T cells were low (12 ± 2% versus 57 ± 19% for the CR group, p<0.05). In one DF patient (#60), 70% of CD25 + CD4 + T cells were CCR7 + , mimicking the profile observed in CR patients. Ster patients split into two groups displaying 50% and 18% CCR7 + cells. Other Th1/Th2 chemokine receptors (CCR9, CCR5, CXCR3, CCR4) showed no differences in expression. Finally, when CD4 + and CD4 - T cells were examined, no significant difference was observed. However, again, patient #60 had a distinct profile, with increased CCR9 + CD4 + T cells (17%) and CD40L + CD4 + T cells (36.8%) versus all other patients (within 4-5% and 5-6%, respectively), a profile that has been associated with Tr1 cells. Conclusions: Blood from "operationally tolerant" patients was characterized by an increase in CD25 + CD4 + T cells. CCR7 + CD25 + CD4 + T cells were usually low. Some patients exhibited an increase in intracellular CTLA4 in CD25 + CD4 + T cells. In addition, one DF recipient had a unique blood phenotype with increased CCR7 + CD25 + CD4 + , CCR9 + CD4 + and CD40L + CD4 + T cells. Thus, "operationally tolerant" patients may develop distinct patterns possibly related to mechanisms of unresponsiveness. We have previously reported that sustained elevated levels of systemic IL-10 are associated with the generation and maintenance of specific allograft tolerance in nonhuman primates (NHP) induced by a combination of anti-CD3 immunotoxin (IT) with Deoxyspergualin (DSG). This study was performed to determine the phenotype of the cells responsible for the early IL-10-dominated cytokine milieu. Peripheral blood samples were collected from NHP within 3 months of tolerance induction with IT and DSG and compared to samples from normal control animals.
Serum cytokines were quantified with multiplex kits on a Luminex®, and cells were stained with rhesus-reactive fluorescent antibodies for flow analysis. RT-PCR was used to examine mRNA expression in peripheral lymphocytes. Compared to normal animals, the IT- and DSG-treated allograft recipients had significantly elevated levels of serum IL-10 (p<0.03). In addition, significantly more B cells from NHP treated with IT and DSG expressed intracellular IL-10 (15.7% ± 4 vs. 7.8% ± 1, p<0.004). There were similar increases in IL-10-expressing monocytes (72.6% ± 8 vs. 26.4% ± 7, p<0.004) and NK cells (43.9% ± 12 vs. 19.6% ± 6, p<0.008). Of note, only the percentage of individual recipients' IL-10-positive B cells correlated with their serum IL-10 levels (r=0.90, p<0.04). TRAF3 expression was decreased in the mononuclear cells of treated NHP recipients compared to normal controls. Since TRAF3 inhibits the IL-10 promoter, we postulate that downregulation of TRAF3 by DSG unleashes early IL-10 expression to foster a milieu favorable for tolerance development. Moreover, the prominence of IL-10-producing non-T cells suggests that preservation of B cells, monocytes and NK cells is crucial for the synergy between DSG and T cell depletion therapy in promoting tolerance. 1, 2, 3, 4 A. House, 3, 4 A. M. Jevnikar. 1, 2, 3, 4 1 RRI; 2 LHRI; 3 Med, Surg, Micro&Imm, Univ. West. Ont; 4 MOTS, LHSC, London, ON, Canada. Extracorporeal photopheresis (ECP) involves the ex vivo treatment of peripheral blood leukocytes with a photosensitizing agent (8-methoxypsoralen, MOP) and UVA before re-infusion into patients. ECP has been used to treat refractory acute rejection in cardiac transplants, suggesting a profound effect on circulating immune cells. Although ECP induces apoptosis of lymphocytes, and apoptotic bodies can exert immunosuppressive effects, the immunomodulatory mechanism(s) of ECP in solid organ transplantation remains unknown. We therefore tested the ability of ECP-treated splenocytes undergoing apoptosis to inhibit allogeneic responses. We first confirmed, using FACS-Annexin V labelling, that ECP induced apoptosis in 70% of CD3+ T-cells in spleen cell cultures by 24 hours. Although 30% of T-cells remained viable, they were unresponsive to both ConA- and alloantigen-stimulated proliferation. In contrast, bone marrow-derived dendritic cells (DC) were resistant to apoptosis. We then tested the ability of ECP-treated splenocytes (ECP-S) to inhibit allogeneic responses in MLC. Addition of ECP-S from B6 mice (B6-ECP-S) suppressed allo-specific responses of non-treated B6 responders to BALB/c stimulators (SI=0.8) as compared to control MLC (SI=22, p<0.05). Similar treatment using ECP-S from BALB/c stimulators had no effect. Table 2 details the results of this study. Data were evaluated by nonparametric tests (p<0.05 was considered significant). Spleen Tx has a synergistic effect with immunosuppressive treatment. In this design, the spleen cells were administered through the portal vein (isolated cells) or a donor spleen graft was anastomosed to the recipient portal system. Both techniques showed prolonged graft survival when compared to immunosuppression alone (p<0.05) and to controls (p<0.05). Groups 4 and 6 had significantly better histology compared to controls. Further studies will help to elucidate a possible role of spleen cells in the induction and stability of tolerance to unrelated renal grafts. The choice of optimal immunosuppressive drug combinations following human renal allotransplantation remains challenging.
Regimens incorporating Alemtuzumab induction with Tacrolimus (TAC) or Sirolimus (SLR) maintenance monotherapy have recently gained prominence, significantly increasing patient and allograft survival. Therefore, we examined functional and transcriptional differences in protocol biopsies and PBMC from renal transplant patients given Alemtuzumab induction with either SLR or TAC monotherapy. Biopsies were processed for cell phenotyping, and RNA was extracted from biopsies and PBMC for real-time PCR. Patients treated with Alemtuzumab alone displayed a single depletion-resistant effector memory phenotype (CD3+CD4+CD45RA-CD62L-) that was uniquely prevalent within the first month post-transplant and during rejection. These cells were resistant to steroids, Deoxyspergualin and SLR in vitro, but were TAC-responsive. Transcripts for inflammatory cytokines (IL-2, TNF-α) and TH1/cytotoxic T-cell markers (T-bet, FasL, RANTES) were significantly upregulated in SLR biopsies despite lymphopenic conditions. However, patients placed on TAC monotherapy had significantly decreased transcripts for costimulatory markers (CD80, CD86, CD154, ICOS), cytotoxic T-cell transcripts (IFN-γ, FasL, granzyme B) and also chemokines (RANTES, MIP-1α, MIG, IP-10), without evidence of acute rejection thus far (>6 months). In contrast, biopsies from Alemtuzumab-depleted patients on SLR displayed significantly decreased transcripts for Smad-3, TGF-β, VEGF and type I collagen at 6 and 12 months post-transplant compared to biopsies from TAC patients. Transcriptional profiling of patient biopsies has shown that TAC decreases leukocyte chemotaxis and activation early post-transplant, while SLR has beneficial effects on long-term outcomes by reducing early expression of fibrotic mediators of chronic rejection. Indeed, such molecular analyses and data now advocate for a combined approach to maintenance immunotherapy following pronounced depletion with Alemtuzumab and warrant significant study as the potential new standard of care. Graft survival (days) by regimen: DST groups: 168, 185, 233, 274, >290, >724, >743; 168, 185, 210, 231; 20, 29, 86, >519; 90, 90, 14, 44, 14, 469; sirolimus alone: 9, 11 and 7, 7. Conclusions. IDEC-131 in combination with DST and sirolimus prolongs renal and skin allograft survival. In addition, alloantibody formation is delayed until the time of allograft rejection. In monkeys that have not rejected their grafts, alloantibody has not developed even in response to a secondary immunologic challenge. There is no evidence of durable tolerance, humoral tolerance, or split tolerance. LABScreen results in combination with MESF quantitation were useful in explaining unexpected crossmatch findings, e.g., negative results in highly sensitized patients, or positive results, by uncovering veiled specificities. In serum from six patients undergoing late humoral rejection (1-8 years post-transplant) of their kidney transplants, LABScreen PRA beads consistently identified antibodies to mismatched, donor-specific HLA antigens, whereas the recipients' own HLA were nonreactive. Moreover, differential levels of donor-specific anti-HLA antibodies (DSA) and their responses to anti-rejection therapy were observed, and no DSA was detected in six other cases where the biopsies were either negative (N=4) or rejection was read as strictly cellular (N=2). Four patients with low levels of DSA (<40,000 MESF) and a positive FCXM were deemed suitable for transplant in a "high risk" immunosuppression protocol of pretransplant plasmapheresis plus FK506, MMF and Thymoglobulin.
Post-transplant monitoring showed an increase in DSA of 3- to 7-fold above pretransplant levels within two weeks, followed by a rapid decline and disappearance of DSA. In conclusion, the combination of MESF quantitation with single HLA antigen beads provides a useful method for assessing immunologic risk due to anti-HLA antibodies in allosensitized patients and for post-transplantation monitoring. Background: Prior to their application in humans, tolerance induction protocols have to be successfully vetted in nonhuman primate (NHP) models. One particularly promising approach for tolerance induction involves the generation of a state of mixed donor/recipient chimerism. Application of this approach in a cynomolgus macaque model would be facilitated by the establishment of a reliable technique for the quantitative assessment of peripheral blood mononuclear cell chimerism. Methods: A short tandem repeat (STR) kit (GenePrint® Fluorescent Monoplex STR Systems, Promega Corp, Madison, WI) developed for use in humans was tested to determine whether it could be used to detect chimerism in NHPs. Peripheral blood samples were obtained from cynomolgus macaques prior to their use in experimental tolerance induction protocols. Genomic DNA was isolated from the samples and its concentration was determined. A total of 1 nanogram of genomic DNA was amplified with the respective primers. The end product was mixed with allelic ladders and the results were read on an ABI PRISM® 3100-Avant Genetic Analyzer. Results: Since it is based on detection of non-expressed genomic sequences, the technique was found to be simple and rapid. Preliminary assays were first performed to define working discriminating alleles in the macaque genome. These pilot assays revealed that the TPOX and CSF1PO loci exist in both rhesus and cynomolgus species. After definition of the working loci, we sought to determine whether these STR loci were sufficiently informative to permit discrimination between potential donor/recipient pairs. Of 40 cynomolgus macaques investigated thus far, 2 pairs demonstrated close identity at both of these loci and could not be discriminated by the technique. This rate implies that the technique should interfere minimally with the assignment of donor-recipient pairs based upon molecular tissue typing or highly reactive mixed lymphocyte cultures. In primates demonstrating evidence of chimerism following treatment, a quantitative assessment of the level of chimerism can be made by comparison of the peaks obtained prior to and following treatment (a worked sketch follows below). Conclusion: Assessment of chimerism is feasible in cynomolgus monkeys using human STR kits specific for the TPOX and CSF1PO loci. Introduction: Since T cells are central to the immune response that leads to the rejection of transplanted organs, we have established a novel, sensitive pharmacodynamic (PD) assay. This assay monitors the effect of drugs, applied both in vitro and in vivo, on T-cell function in whole blood. Thus, by measuring the direct effect of a new potential immunosuppressant drug on T-cell function, this novel PD assay may offer a first-line approach for correlating drug effects and observations in experimental animal models with clinical outcome. The goal of this study was to test the PD assay by monitoring the immunosuppressive effects of NIBR-0071, a novel kinase inhibitor, in rat, monkey and human.
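The peak-comparison step described in the STR chimerism study above reduces to simple arithmetic on electropherogram peak areas. Below is a minimal sketch of that calculation in Python; the function name and the peak-area values are illustrative assumptions, not part of the original study.

```python
def percent_donor_chimerism(donor_peak_area, recipient_peak_area):
    """Estimate % donor chimerism at one informative STR locus
    (e.g., TPOX or CSF1PO) from the donor- and recipient-specific
    allele peak areas, assuming comparable amplification efficiency."""
    total = donor_peak_area + recipient_peak_area
    if total <= 0:
        raise ValueError("no informative peaks detected")
    return 100.0 * donor_peak_area / total

# Hypothetical peak areas from a post-treatment electropherogram:
print(percent_donor_chimerism(1520.0, 6080.0))  # -> 20.0 (% donor)
```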
Methods: Whole blood was stimulated with polyclonal stimuli and then assessed for T-cell activation markers and intracellular effector cytokines by 4-colour flow cytometry. Systematic exploration of stimulation protocols enabled the definition of an optimized PD assay, which was then applied in rats and monkeys to analyze the effects of NIBR-0071 on T-cells in vivo. As an alternative readout to flow cytometry, quantitative PCR after in vitro stimulation was employed. Results: In vitro stimulation with PMA/anti-CD28 mAb resulted in fast (<24 hrs) and reliable up-regulation of CD25 and CD69 and increased the expression of intracellular IL-2 and TNF-α in T-cells. This activation was sensitive to inhibition by NIBR-0071, with IC50 values in the range of 0.1 mM for rat and 1 mM for monkey and human blood (a curve-fitting sketch follows below). In rats, the effective exposure level of NIBR-0071 at a dose inducing prolongation of graft survival could be correlated with the inhibition of the T-cell activity markers TNF-α and CD25. In cynomolgus monkeys, inhibitory effects at a therapeutic drug dose of NVP-2 were observed in 3 of 4 monkeys for CD25, TNF-α and IL-2, and in 2 of 4 monkeys for CD69. Using RT-PCR as a readout, the human T-cell activation markers IL-2, TNF-α, CD154, OX40 and ICOS showed suitable stimulation indices and sensitivity to standard immunosuppressants. Our studies demonstrate that PD assays can predict drug efficacy after administration in vivo in rats and monkeys. Thus, PD studies using whole blood assays may be useful for selecting the dose and type of immunosuppressants in animals and patients. In addition, this novel PD assay may accelerate the selection of lead compounds as well as foster the discovery of new mechanisms of action for compounds. Background: Immune monitoring performed on peripheral blood from transplant recipients may use flow cytometry or molecular biology techniques. Flow cytometry assays cells that are phenotypically characterized, whereas gene monitoring techniques predominantly start with RNA extraction from unfractionated cell populations. We therefore investigated how the effects of immunosuppressive drugs on cytokine production in stimulated whole blood, as determined by flow cytometry, would correlate with those measured by quantitative real-time PCR (TaqMan® RT-PCR). Methods: Blood drawn from cynomolgus monkeys was exposed to incremental amounts of cyclosporine (CsA; 300, 600, 900 and 1200 ng/ml) or tacrolimus (TRL; 8, 20, 40 and 80 ng/ml) before lectin stimulation in vitro. Blood was in parallel either stained for CD3, IFN-γ, IL-2, IL-4, and TNF-α and analyzed on a flow cytometer with various gating strategies (lymphocyte gate, CD3 + gate, or an extended gate including all distinct populations) or submitted to RNA extraction for quantitation of the above-mentioned cytokine mRNA transcripts using TaqMan® RT-PCR. Results: Both methods revealed a parallel dose-dependent inhibition of cytokine production in stimulated and treated blood. The 50% inhibitory concentrations (IC50's) for T-helper 1 cytokines ranged from 511-771 ng/ml (CsA) and 15-29 ng/ml (TRL) with flow cytometry, and from 275-529 ng/ml (CsA) and 11-48 ng/ml (TRL) with TaqMan® RT-PCR. Both assays correlated well (r=0.76, Table). Background: Pharmacokinetic (PK) monitoring and targeted exposure of CsA substantially reduce the risk of acute rejection and minimize the risk of drug toxicity and renal injury.
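The IC50 values reported in the two PD abstracts above are typically derived by fitting a sigmoidal dose-response curve to the inhibition data. The sketch below shows one conventional way to do this (a four-parameter logistic fit); the concentrations and responses are invented for illustration, and the choice of fitting method is an assumption, not the authors' stated procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Hypothetical whole-blood PD data: drug concentration (ng/ml) vs.
# % of T cells positive for an activation marker after stimulation.
conc = np.array([100.0, 300.0, 600.0, 900.0, 1200.0, 2400.0])
resp = np.array([92.0, 71.0, 48.0, 33.0, 24.0, 11.0])

# Initial guesses: bottom 0%, top 100%, IC50 near mid-range, slope 1.
params, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 600.0, 1.0])
print(f"estimated IC50: {params[2]:.0f} ng/ml")
```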
We postulate that pharmacodynamic (PD) monitoring will add critical information about biological effect that may facilitate the individualization of immunosuppression during the process of host-graft accommodation. Methods: A total of 15 renal transplant patients were studied with combined PK and PD profiling. Those receiving PK-modeled i.v. CsA (1.5 mg/kg b.i.d. over 2 hours, days 1-5) were studied on days 3, 7, 14 and 28, and stable patients receiving oral CsA (mean 2 mg/kg b.i.d.) were studied at months 3-12 and >12 post-transplant. PD analysis was performed by measurement of intracellular IL-2 production in a single-step FACS assay, while CsA concentrations were measured using a specific mAb assay. Results: In patients receiving i.v. CsA, intracellular IL-2 expression declined rapidly during the first 2 hr of the dosing interval, reaching a nadir at the time of maximum CsA concentration (2 hr; 1193 ± 377 µg/L). On day 3, IL-2 + CD3 + cells fell from 19 ± 8% predose to 1.1 ± 0.2% by 2 hr and 0.9 ± 0.4% by 3 hr before returning towards baseline. Predose IL-2 expression declined slowly throughout the first month (day 28: 12 ± 10% IL-2 + CD3 + cells) and was uniformly highly suppressed at 2 hr (0.8 ± 0.6% IL-2 + CD3 + cells). There was a marked reduction in pre-dose IL-2 expression in stable patients, resulting in an almost complete disappearance of IL-2 + CD3 + cells from the peripheral circulation throughout the dosing interval. Values at 3-12 months (pre-dose: 4 ± 3%; 2 hr: 3 ± 3%) and >12 months (pre-dose: 4 ± 3%; 2 hr: 0.8 ± 0.1%) were similar, suggesting that immunological accommodation occurs by 3 months in quiescent patients and is reflected by sustained inhibition of IL-2 production. Molecular expression of IL-2 per cell also declined markedly throughout the dosing interval, although it was not extinguished (day 3: 3906 ± 1538 pre, 1462 ± 115 at 2 hr; day 28: 4144 ± 1216 pre, 1581 ± 763 at 2 hr; >12 months: 2633 ± 1299 pre, 2037 ± 859 at 2 hr). Conclusion: PD measurement of IL-2 production (a) offers a simple, rapid and reproducible measure of biological effect, (b) corresponds closely to the PK concentration throughout the dosing interval in early-phase treatment, (c) remains markedly and continuously suppressed in stable patients, possibly reflecting immune accommodation; and (d) a subset of CD3 + cells remains refractory to inhibition by CsA; their function remains uncertain. HCV-associated liver failure is a common indication for liver transplantation, and infection often recurs after transplantation. Histologic evidence of recurrence is apparent in approximately 50% of HCV-infected recipients in the first postoperative year. Exposure to corticosteroids is associated with higher mortality and increased HCV viremia. By contrast, it is not clear whether the calcineurin inhibitors cyclosporin A (CsA) and FK506, or azathioprine, affect the histologic recurrence of HCV. The development of anti-HCV agents has been accelerated by the establishment of cell lines in which HCV genome RNA self-replicates efficiently (referred to as HCV replicon cells). We examined the effects of various compounds on the replication of the HCV genome using HCV replicon cell lines and observed a suppressive effect of CsA on HCV genome replication. Treatment with 1 µg/ml CsA for 7 days decreased the amount of HCV NS5A and NS5B proteins to undetectable levels. A similar reduction of HCV protein and RNA synthesis was not observed after treatment with FK506 (1 µg/ml).
CsA appeared to exert its anti-HCV activity via a pathway independent of calcineurin-inducible signaling. This was supported by an analysis of HCV replication using the CsA derivatives NIM811 and PSC833. NIM811 binds cyclophilins but does not inhibit calcineurin or the downstream NF-AT pathway, whereas PSC833 associates with neither cyclophilins nor the CN/NF-AT pathway. NIM811 inhibited replication of the HCV genome in replicon cells, whereas PSC833 did not affect the efficiency of HCV replication. These data indicate that the inhibitory function of CsA is mediated not by its immunosuppressive function but rather by impairment of cyclophilin function. Cyclophilins are evolutionarily conserved proteins with peptidyl-prolyl isomerase (PPIase) activity, which is essential for protein folding and which has recently been suggested to have a role in the regulation of transcription and differentiation. Mammalian cells contain more than 10 species of cyclophilins, and not all cyclophilin functions are explained by PPIase activity. Thus, identification of a cyclophilin that is essential for replication of the HCV genome sheds further light on the mechanism of HCV replication. Furthermore, our data showing an inhibitory effect of CsA on HCV genome replication highlight the importance of selecting an appropriate immunosuppressive agent in liver transplant recipients at increased risk of recurrent HCV infection. Introduction: Immunosuppressive drugs are associated with increased risk of viral infections. Immunosuppression inhibits virus-specific immunity, but this is difficult to quantify. Calcineurin inhibitors such as CsA primarily suppress T cells but have also been shown to inhibit B cells, macrophages and dendritic cells, which can affect antiviral immunity. In this study, we examined the effects of CsA and FK506 on anti-CMV and anti-EBV T cell responses using cytokine flow cytometry (CFC). Methods: Blood samples from 3 normal healthy subjects with demonstrable antiviral T cell responses were used for the study. Inhibition of antiviral activity by CsA and FK506 was examined at trough and C2 concentrations (CsA 250 and 1000 ng/ml; FK506 10 and 40 ng/ml, respectively). Mycophenolic acid (MPA) and methylprednisolone were used at 20 µg/ml and 100 µg/ml, respectively. For CFC assays, whole blood was incubated for 6 hours with CMV or EBV lysates and brefeldin A, which inhibits Golgi secretion of cytokines and allows intracellular staining with anti-IFN-γ antibody. All measurements are described as mean ± SEM values. Percentage inhibition was calculated as the decrease in the IFN-γ-positive population (see the sketch below). Results: At trough levels, CsA decreased cytokine-positive cells by 2.3 ± 2.3% in the CD8 T cell population and 7 ± 4.3% in the CD4 T cell population. Similar levels of inhibition were seen with FK506 (8.3 ± 4.6% of CD8 T cells and 10 ± 5.29% of CD4 T cells). The two remaining animals were sacrificed because of graft rejection (PTD 47, 53), and one was found to have severe systemic CMV disease at postmortem examination. CMV infections were confirmed by PCR analysis for BCMV DNA on tissues from lung, liver and kidney. Since then, we have instituted a BCMV prophylaxis protocol covering the period during which animals receive immunosuppressive medications. Four NHPs received oral valganciclovir (15-20 mg/kg per day) with occasional intravenous GCV (5-7.5 mg/kg per day). Only one animal developed CMV disease and was sacrificed because of xenograft rejection (PTD 57).
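For the CFC readout above, "percentage inhibition as the decrease in the IFN-γ-positive population" reduces to a one-line calculation. The sketch below shows one plausible reading (the relative decrease); the function name and the example frequencies are illustrative assumptions, not values from the study.

```python
def percent_inhibition(pct_pos_untreated, pct_pos_treated):
    """Percent inhibition as the relative decrease in the IFN-gamma-
    positive T-cell fraction between drug-free and drug-treated
    antigen-stimulated whole blood."""
    if pct_pos_untreated <= 0:
        raise ValueError("untreated response must be positive")
    return 100.0 * (pct_pos_untreated - pct_pos_treated) / pct_pos_untreated

# Hypothetical CMV-lysate CFC readout: 2.0% IFN-g+ CD8 T cells without
# drug vs. 1.8% with CsA at trough concentration.
print(f"{percent_inhibition(2.0, 1.8):.1f}% inhibition")  # -> 10.0% inhibition
```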
Since the bioavailability of oral valganciclovir was uncertain in NHP, we modified the protocol (as suggested by M. Jonker) to intramuscular injection of GCV (2.5 mg/kg per day), covering only the peri-induction period (PTD -7 to +21). Six NHPs that received this prophylaxis protocol remained free of BCMV infection (no sign of CMV disease and negative PCR analysis for BCMV on these animals' PBMC). Functional porcine islet xenografts were documented beyond 51 days. In addition, no side effects were seen in animals receiving valganciclovir or GCV. GCV is a safe and effective prophylactic agent against BCMV infection in NHP xenograft recipients, and PCR analysis for BCMV is a valuable diagnostic test for BCMV disease in NHPs. Poster Board #-Session: P108-II Cold ischemia time (CIT) and its main consequence, delayed graft function, have been recognised as significant contributing factors for chronic rejection and its hallmark feature, transplant arteriosclerosis (TA). Rapamycin (RPM) has been shown to inhibit growth factor action on immune and non-immune cells. The aim of this study was to evaluate the impact of CIT and RPM on graft aortic arteriosclerosis in a syngeneic rat model devoid of immunologic effects. Male Lewis rats (weight 250-300 g) served as donors and recipients of syngeneic aortic interposition grafts. Segments of thoracic aorta (1-1.5 cm) were transplanted end-to-end into infrarenal recipient aortas. Grafts were preserved in cold (4°C) Eurocollins solution for 0 or 24 hrs. RPM (2 mg/kg), in the form of Rapamune, was administered daily by gavage; controls received only the Rapamune base (Phosal). After 8 weeks, animals were sacrificed and computer-assisted morphometric studies were performed. The areas of intimal thickening (IT) and of the media, and their relation to the total vessel, were calculated. Results are expressed as percent intima (or media) area / (media + intima) area (mean ± SD); a worked sketch of this index follows below. At least 3 different segments of aorta were assessed in each animal. RPM levels were measured at 12 weeks. Four experimental groups were formed: Grp A: CIT 0 hr (n=4); Grp B: CIT 24 hr (n=8); Grp C: CIT 0 hr + RPM (n=3); Grp D: CIT 24 hr + RPM (n=5). The RPM level at 2 months was 4.02 ± 2.84 ng/L. Addition of RPM significantly decreased IT and medial atrophy and was significantly different from controls. Massive hepatectomy and small-for-size liver transplantation carry a high risk of liver failure and mortality. The clinical picture is that of worsening liver function, indicating a progressive injury to the liver. Histologically, this injury is characterized by sinusoidal congestion, terminal portal venous dilatation, and fatty degeneration of hepatocytes. It has been hypothesized that the increased portal venous flow per unit liver mass leads to a "hyperperfusion injury" (HI). Prevention of HI may have great clinical impact by increasing the potential living donor pool, allowing for the use of smaller liver segments, and improving the feasibility of split liver transplantation. Previous reports have suggested that creation of a portasystemic shunt can prevent HI. However, these studies were limited by the need to use large animals or, in the rat, the use of subcutaneous splenic transposition to model a portasystemic shunt. To develop a useful pre-clinical animal model, we tested whether a direct surgical portacaval shunt (PCS) placed at the time of massive (95%) hepatectomy would prevent HI in rats. Male Wistar rats underwent either hepatectomy alone (n=12) or side-to-side PCS immediately prior to hepatectomy (n=11).
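The morphometric index used in the aortic graft study above is a simple area ratio. A minimal sketch in Python follows; the area values are illustrative assumptions, not measurements from the study.

```python
def percent_intima(intima_area, media_area):
    """Intimal thickening expressed as intima area as a percentage
    of the combined intima + media area."""
    return 100.0 * intima_area / (intima_area + media_area)

def percent_media(intima_area, media_area):
    """Media area as a percentage of the combined intima + media area."""
    return 100.0 * media_area / (intima_area + media_area)

# Hypothetical planimetry results (arbitrary units) for one aortic section:
print(percent_intima(0.12, 0.48))  # -> 20.0 (% intima)
print(percent_media(0.12, 0.48))   # -> 80.0 (% media)
```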
All surviving animals were sacrificed at 7 days. Only 17% of control rats survived to 7 days, compared to 63% in the PCS group (Figure). All livers were examined at death or sacrifice. Remarkably, no evidence of portal venule and sinusoid expansion was found in the PCS group, which had nearly normal liver lobules. Regeneration was maintained in the PCS group, with an approximate 12- to 16-fold increase in liver weight by day 7 in the survivors. We have developed a rat model of HI as well as a pre-clinical model of HI prevention. Hopefully, this model will allow a detailed analysis of the physiological and molecular mechanisms underlying HI and begin to reveal potential methods for its prevention. Poster Board #-Session: P111-II Background: The TAT protein transduction domain (PTD) has been shown to efficiently deliver fused proteins in both in vitro and in vivo models. We have previously shown excellent adenoviral gene transfer to donor liver grafts. Although viral vectors offer effective gene transduction, complications associated with viral vectors remain the major concern. To further expand the transduction method, this study investigated a TAT fusion protein delivery method for liver grafts. Methods: Orthotopic syngeneic LEW rat liver transplantation (OLT) was performed with 18 hrs of cold preservation in UW. TAT-β-gal marker protein (120 kDa, 73 µM/liver) or anti-apoptotic TAT-Bcl-XL protein (32 kDa, 0.08-8.0 µM/liver) was injected into liver grafts with a clamp technique (CT), in which fusion proteins were infused ex vivo into the harvested liver via the portal vein and hepatic artery. By clamping the outflow, infused TAT fusion proteins were trapped in the liver during cold preservation. Non-fused TAT or control β-gal (without TAT) served as controls. Efficacy of transduction was assessed by western blot and immunohistochemistry during the preservation period and after OLT. Serum AST and histopathology were assessed to confirm the non-toxic nature of the fusion protein transduction method. Results: Injection of TAT-β-gal into the excised liver induced rapid β-gal protein transduction by 1 hr after injection. X-gal staining was maintained over the 18-hr preservation period (18 ± 2.5% at 18 hrs); however, positive staining decreased to 6.8 ± 1.5% at 6 hrs after OLT, and no X-gal staining was observed at 48 hrs. Administration of TAT-Bcl-XL also induced marked transduction of Bcl-XL protein in the donor liver, which was retained throughout the preservation period. The quantity of protein transduced in the liver appeared to be dose-dependent. Immunohistochemistry for X-gal and Bcl-XL revealed that proteins were transduced into both sinusoidal endothelial cells and hepatocytes. There was no positive staining in control TAT- or β-gal-injected liver grafts. Fusion protein injections did not exacerbate cold hepatic I/R injury, as assessed by serum AST levels and histopathology. Conclusion: These results demonstrate that ex vivo TAT fusion protein delivery to the donor liver with CT leads to successful transduction not only of the β-gal control protein but also of functional anti-apoptotic Bcl-XL protein, without hepatic toxicity. Although the transduction period is brief, this ex vivo fusion protein delivery system could be a safe therapeutic method for transplantation. Background: It is unclear whether pediatric solid organ transplant recipients maintain serologic immunity to pre-transplant immunizations.
Methods: Children undergoing orthotopic heart transplantation (OHT) or renal transplantation (RT) at our institution between 11/97 and 6/02 underwent serologic testing for diphtheria, tetanus, and measles. Pre-transplant and 1- and 2-year post-transplant titers were measured. Results: Thirty-seven children underwent OHT or RT during this time period (27 OHT and 10 RT). The mean age was 8.9 y (range 1 month-20 y). 34/37 patients had completed their primary immunization series for diphtheria and tetanus, and 32/37 for measles, prior to transplantation. Pre-transplant titers to diphtheria and tetanus were available for 29 patients; 28 demonstrated immunity pre-transplant and all 28 maintained serologic evidence of immunity at 1 year post-transplant. At two years, 20/21 and 18/19 maintained immunity to diphtheria and tetanus, respectively. Pre-transplant titers to diphtheria and tetanus were not available for 8 patients, but all had post-transplant serologic evidence of immunity at one or two years post-transplant. Pre-transplant titers to measles were available for 23 patients; 18 demonstrated immunity pre-transplant. Immunity persisted in 17 of 18 and 11 of 11 patients available at one and two years, respectively. Pre-transplant titers to measles were not available for 13 patients. Eight of these were subsequently shown to have immunity at one or two years post-transplant. The 5 patients who did not have post-transplant immunity to measles had never been vaccinated with measles vaccine. Pre- and post-transplant geometric mean titers (IU/ml) were similar for diphtheria (0.78 vs 0.41), tetanus (2.47 vs 3.7), and measles (2.6 vs 2.3). These titers were not significantly different for the OHT vs RT groups. Conclusions: Pediatric patients undergoing solid organ transplantation maintain serologic evidence of immunity post-transplantation. This underscores the importance of pre-transplant vaccination in patients being evaluated for cardiac and renal transplantation.

Poster Board #-Session: P113-II

All patients who developed antigenemia and/or symptoms did so within 6 months of transplant. However, 3 kidney and 2 liver transplant recipients had evidence of CMV while on low-dose valganciclovir. All 4 liver transplant recipients had positive antigenemia only, whereas 7/12 kidney transplant recipients had both positive antigenemia and symptoms. All 4 liver transplant recipients with CMV had received thymoglobulin, as had 10/12 kidney transplant recipients. Overall adverse effects included leukopenia (4 patients), pancytopenia (1 patient) and drug fever (1 patient). Conclusion: Low-dose valganciclovir was effective in the majority of patients. However, patients who were D+/R- and who received thymoglobulin were more likely to develop CMV, in some cases while on therapy. In these patients a higher-dose regimen may be more appropriate.

BACKGROUND: CMV D+/R- SOT patients are at high risk of CMV, EBV, and other viral infections. The resulting CMV and EBV interaction increases the risk of EBV-PTLD. Oral ganciclovir (OGCV) and valganciclovir (VGCV) prophylaxis were shown in a randomized trial to be effective in reducing CMV replication and disease in CMV D+/R- SOT patients. We performed this follow-up study to evaluate the impact of anti-CMV prophylaxis on EBV replication. We also assessed whether OGCV and VGCV prophylaxis modified the natural history of clinical and subclinical human herpesvirus (HHV)-8 and varicella zoster virus (VZV) reactivation, both of which occur in up to 20% of predisposed SOT patients.
METHODS: 263 liver, heart, kidney, and kidney-pancreas SOT patients who received OGCV (n=95) or VGCV (n=168) for 100 days were enrolled. Peripheral blood samples were collected from all patients prior to (baseline) and during (days 14, 42, 70, and 100) antiviral prophylaxis, and at months 4, 4.5, 6, 8, and 12 post-SOT, and were quantified for EBV, HHV-8, and VZV DNA using a LightCycler-based PCR assay (lowest limit of detection, 1 copy). RESULTS: HHV-8 and VZV DNAemia were not detected in any of the 2,232 blood samples tested. In contrast, EBV DNAemia was common; it was detected in 56% of patients. The overall prevalence of EBV DNAemia was comparable between patients who received OGCV and VGCV, and it was highest (23.6%) at day 14 and lowest (9.5%) at the end of prophylaxis. The degree of EBV DNAemia was low-level in the majority of patients; EBV DNA ≥10³/ml was detected in only 7.6%. Patients who received OGCV were more likely than those who received VGCV to have higher levels of EBV DNAemia (EBV DNA ≥2x10³/ml: 6.3% OGCV vs. 2.4% VGCV; EBV DNA ≥5x10³/ml: 5.3% OGCV vs. 0.6% VGCV). CONCLUSIONS: This study highlights the absence of HHV-8 and VZV viremia during the first year post-SOT in a large international cohort of patients who received OGCV or VGCV prophylaxis. It is possible that anti-CMV prophylaxis also prevented the reactivation of HHV-8 and VZV. In contrast, EBV DNAemia was common, although higher levels of EBV replication were observed only in a minority of CMV D+/R- SOT patients. Indeed, VGCV may have prevented higher levels of EBV replication. The correlation between EBV replication and other clinical outcomes is being investigated.

Poster Board #-Session: P115-II

VAMC and University of Pittsburgh, Pittsburgh, PA. Background: The detection of GM by the Platelia Aspergillus EIA has proven useful for the diagnosis of invasive aspergillosis and has recently received FDA clearance. Cross-reactivity with GM of Penicillium spp. has been noted, and EIA reactivity with drugs of fungal origin is potentially possible (Mycoses 97). We assessed whether commonly used antibiotics (of fungal, non-fungal or synthetic origin) tested positive for GM (Index >0.50) and determined whether achievable serum concentrations of these antibiotics, based on a normal dosing regimen, could potentially result in positive tests. Methods: Antibiotics tested included amoxicillin, ampicillin-sulbactam, nafcillin, piperacillin, piperacillin-tazobactam, cefazolin, ceftazidime, gentamicin, erythromycin and levofloxacin. Drugs that tested positive for GM as undiluted samples were further tested at achievable serum concentrations and minimal serum concentrations. For the latter experiment, the drug was diluted in serum pre-tested and shown to be negative for GM (Index <0.20). Antibiotic dilutions tested were based on the peak achievable serum level for each drug as the target concentration, with one 2-fold dilution above and 2 serial dilutions below the target level. Results: Undiluted samples of piperacillin-tazobactam and piperacillin tested positive, whereas those of amoxicillin, ampicillin-sulbactam, nafcillin, cefazolin, ceftazidime, erythromycin, gentamicin and levofloxacin tested GM negative. However, all 3 lots of piperacillin-tazobactam, and all bags within each lot, tested GM positive with an Index value of >5.168 (corresponding to an optical density beyond the maximal absorbance range of the plate reader).
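The dilution scheme described in the Methods can be written out explicitly. The following is a minimal sketch that builds the panel from a peak achievable serum level (the 300 µg/mL figure for piperacillin-tazobactam appears in the Results that follow); the GM index readings in the sketch are hypothetical placeholders, not measured data:

```python
# Sketch of the dilution panel from the Methods: one 2-fold step above
# the peak achievable serum level and two 2-fold steps below it.
# The 300 ug/mL peak is from the Results; index readings are hypothetical.

GM_CUTOFF = 0.50  # Platelia EIA positivity threshold (Index > 0.50)

def dilution_panel(peak_ug_per_ml: float) -> list[float]:
    """Return [2x peak, peak, peak/2, peak/4] as described in the Methods."""
    return [2 * peak_ug_per_ml, peak_ug_per_ml,
            peak_ug_per_ml / 2, peak_ug_per_ml / 4]

panel = dilution_panel(300.0)  # -> [600.0, 300.0, 150.0, 75.0]

# Hypothetical index readings for one lot, highest concentration first.
readings = dict(zip(panel, [2.1, 1.4, 0.9, 0.6]))

for conc, idx in readings.items():
    flag = "positive" if idx > GM_CUTOFF else "negative"
    print(f"{conc:6.1f} ug/mL -> index {idx:.2f} ({flag})")
```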
At achievable serum concentrations (levels up to 300 µg/mL), however, only 1 of 3 lots of piperacillin-tazobactam yielded a positive GM test; concentrations of 75 µg/mL, 150 µg/mL, and 300 µg/mL tested positive, whereas lower concentrations mimicking trough levels (10 and 5 µg/mL) tested GM negative. Piperacillin at achievable serum concentrations tested GM negative. Conclusions: Clinicians evaluating the results of the GM test should be aware that achievable serum concentrations of piperacillin-tazobactam could potentially result in a positive GM test in patients receiving this antibiotic. The timing of collection of serum samples may influence the test results, with reactivity being less likely in samples collected at trough levels or prior to administration of the dose.

Poster Board #-Session: P118-II

The natural history and spectrum of clinical manifestations of the polyomaviruses BK (BKV) and JC (JCV) in SOT patients are not fully defined. While BKV is a cause of allograft dysfunction in renal SOT patients, its impact on the outcome of non-renal SOT is not known. Likewise, while JCV causes the rare progressive multifocal leukoencephalopathy, the incidence and impact of subclinical JCV reactivation have not been investigated. In this study, we assessed the incidence and relevance of clinical and subclinical BKV and JCV reactivation in renal and non-renal SOT recipients. METHODS: 263 CMV D+/R- kidney, liver, heart, and kidney-pancreas SOT patients who received oral ganciclovir (OGCV; n=95) or valganciclovir (VGCV; n=168) prophylaxis consented to participate in this longitudinal study, which was designed to quantify BKV and JCV DNA in blood samples collected prior to (baseline) and during (days 14, 42, 70 and 100) anti-CMV prophylaxis, and at months 4, 4.5, 6, 8 and 12 post-SOT. BKV and JCV DNA were quantified in 2,232 blood samples using a LightCycler-based PCR assay (lowest limit of detection, 1 copy).

The median initial CMV QnPCR was 103,000 (range 3,400-250,000). Patients with an initial QnPCR ≥10,000 had a longer 'diagnosis to no viremia' interval (median 23.5 days; 10-165) than patients with an initial QnPCR ≤10,000 (13.5 days; 7-25) (p=0.08). All patients have been followed for a median of 165.5 days (45-452) after an undetectable QnPCR. CONCLUSION: A high initial CMV QnPCR level was associated with a significantly higher rate of end-organ disease. In addition, there were potential associations between a high initial QnPCR and prolonged clearance of viremia, liver transplantation, prophylaxis other than valganciclovir, and a prior history of rejection. CMV QnPCR may provide useful prognostic information for patients with CMV infection after SOT.

Methods: Pediatric SOT recipients were enrolled in this study if they were <18 years of age and ≥4 months post-transplantation. The vaccination schedule involved 3 doses of Prevnar™, followed by the 23-valent polysaccharide vaccine. Safety data for the first week after vaccination were categorized as 1) local reactions, 2) systemic reactions, and 3) effects on allograft function. Data were summarized using descriptive statistics and addressed outcomes related to the 7-valent vaccine. Results: Fifty-six transplant recipients (35 males and 21 females) received 116 doses of the 7-valent vaccine (median 3, range 1-3 doses). The median age of subjects was 5.1 years (range 0.63-17.9 years). There were 25 heart (45%), 16 liver (29%), 14 renal (25%) and 1 lung (2%) transplant recipients.
Vaccination was initiated at a median of 19.6 months post-transplantation (range 6-51.6). Follow-up of the 116 administered doses revealed 1 episode of rejection that was temporally associated with vaccination but was felt not to be attributable to it. There were no side effects with 70.7% of the 116 doses. The most common adverse events were mild local reactions in 12.9%, fever in 7.8%, and irritability and crying in 1.7%. Summary: Data from this pilot study indicated that the vaccine was well tolerated by pediatric transplant recipients. The observed adverse events were generally mild and self-limited. These data, along with emerging immunogenicity profiles, will inform decision-making regarding the optimal use of this vaccine in pediatric transplant recipients.

Untreated dental disease represents a potential risk for infection in transplant patients, but the vast transplantation literature contains few references to this complication. There is also little information with regard to dental care protocols for patients before and after organ transplantation. To obtain more definitive documentation about the policies that deal with dental care and experience with dental infections, we conducted a survey of US transplant centers. The instrument consisted of eight questions that addressed pre-transplant dental evaluation procedures, the incidence of pre- and posttransplant dental infections, and recommendations for antibiotic prophylaxis with dental treatment after transplantation. Questionnaires were sent to 768 medical and/or surgical directors at all U.S. transplant centers. Responses were received from 294 (38%). Among the respondents, 80% routinely requested a pre-transplant dental evaluation, but 49% of these did so only for specific organs. The occurrence of a dental infection prior to transplantation that resulted in a postponement or cancellation was reported by 38% of the respondents. Posttransplantation sepsis from a suspected dental source was acknowledged in 27% of the surveys. Prophylaxis with antibiotics prior to dental care was recommended by 83%; 77% indicated that it be used for all dental procedures, whether invasive or not. Most respondents (96%) recommended the 1997 American Heart Association endocarditis prevention regimen. This survey of organ transplant centers has provided some information with regard to pre-transplantation dental screening, dental infections and the use of prophylactic antibiotics. Additional studies are needed in order to accrue more definitive data that will assist with the development of standardized and appropriate pre- and posttransplant dental care protocols.

Background/Purpose: Pretransplant lymphocyte depletion decreases the need for maintenance immunosuppression. However, the timing of drug minimization cannot be determined with current tools for immune monitoring. Methods: Mechanistic monitoring was initiated prospectively in 77 consecutive pediatric recipients of liver (LTx, n=41) and intestine (SBTx, n=36) transplants, median age 4 years (0.45-19), follow-up 10 months, transplanted with steroid-free rATG preconditioning and tacrolimus/sirolimus (TAC/SRL) monotherapy. Patient subsets within this population in whom longitudinal data could be analyzed were grouped as rejectors (REJ) and those with reduced immunosuppression, i.e., once-daily/alternate-day TAC/SRL (IMred).
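The analysis described in the next sentence models drug effect:concentration relationships with Hill equations. As a minimal illustrative sketch of such a fit (using SciPy; all drug levels, response values and starting parameters here are hypothetical, not study data):

```python
# Minimal sketch of fitting a Hill-type effect:concentration curve,
# E(C) = Emax * C^n / (EC50^n + C^n), to paired observations of drug
# level and an immunophenotypic readout. All numbers are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, emax, ec50, n):
    """Sigmoid Emax (Hill) model."""
    return emax * conc**n / (ec50**n + conc**n)

# Hypothetical trough levels (ng/mL) and percent suppression of a
# lymphocyte readout measured at those levels.
levels = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 12.0])
effect = np.array([8.0, 18.0, 42.0, 55.0, 66.0, 74.0])

params, _ = curve_fit(hill, levels, effect, p0=[80.0, 5.0, 1.0])
emax, ec50, n = params
print(f"Emax={emax:.1f}%, EC50={ec50:.1f} ng/mL, Hill n={n:.2f}")
```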
Between-group comparisons were made with t-tests, Hill equations were used for effect:concentration relationships between drug levels and immunophenotypic parameters, and support vector machine (SVM) analysis was used to classify IMred and REJ. Lymphocyte subset distribution and function constituted the immunophenotypic endpoints. Results: Compared with REJ, IMred was associated with 1. significant decreases in the frequency of TH1-priming type 1 progenitor dendritic cells (0.32±0.1 vs 0.17±0.17, p=0.05) at 34±22 days after transplantation; 2. significantly decreased mixed lymphocyte responses to donor antigen (stimulation index 16±11 vs 59±42, p=0.05), but not to third-party, HLA-mismatched donors, in the first month after transplantation; and 3. the ability of B-cell expression of CD54, CD95, CD86, CD25, CD69, and CD71, used collectively, to yield a computational algorithm from a test dataset (n=9) that correctly distinguished IMred from REJ in a validation dataset of 6 subjects by SVM analysis. 4. Furthermore, magnetic bead-sorted, pure recipient CD8+CD28- subpopulations (T-suppressor cells) induced downregulation of CD86 on donor antigen-presenting cells stimulated with a T-helper cell line transfected to overexpress CD40 ligand, in a reference non-rATG subpopulation of 4 subjects who were tolerant (off IM for >1 year). In rATG-treated subjects, T-suppressor cells were absent in 3 of 3 subjects with recurrent REJ and present in 4 of 8 subjects with IMred. Conclusions: Reduced IM requirement is associated with significantly decreased donor-specific alloreactivity and TH1-priming pDC1 frequency. Donor-specific T-suppressor cells may identify the timing for safe weaning of immunosuppression in children receiving rATG pretreatment.

Poster Board #-Session: P124-II

Humanized and chimeric anti-lymphocyte antibodies (Ab) are being used as immunosuppressive agents to prevent or treat rejection. The presence of drugs such as rituximab (RIT) (anti-CD20), daclizumab (DAC) (anti-CD25), and alemtuzumab (ALE) (anti-CD52) in serum interferes with standard antibody detection methods such as complement-dependent cytotoxicity (CDC) and the flow cytometric crossmatch (FCXM). These agents are recognized by anti-human antibody reagents and/or fix complement, and cannot be differentiated from alloantibodies. Removal of humanized Ab without removing patient Ig would require beads bound to anti-idiotype antibody. A new ELISA crossmatch technique utilizing class I and class II HLA antigens from lysed donor cells, the Transplant Monitoring System (TMS) (GTI, Inc., Waukesha, WI), potentially precludes interference by eliminating the non-HLA antigens these drugs target. To test this hypothesis, sera from nonsensitized volunteers, alone or supplemented with 0.1 µg/mL or 10 µg/mL of RIT, DAC, or ALE (to mimic trough and peak clinical levels), were tested using three crossmatch techniques. T-cell CDC crossmatches used 2 washes with anti-human globulin (AHG); B-cell CDC crossmatches used 1 wash and no AHG; FCXM used a standard 3-color protocol, with a mean channel shift of 45 indicating a positive T-cell result and a mean channel shift of 150 indicating a positive B-cell result. TMS was performed per the manufacturer's instructions. Equivalent donor cells were used as targets. No reactivity was seen with any of the 3 crossmatch techniques when testing sera without added drug. At both 0.1 µg/mL and 10 µg/mL, RIT interfered with CDC B-cell, but not T-cell, crossmatches. RIT at 10 µg/mL, but not 0.1 µg/mL, interfered with the B-cell FCXM.
No interference was seen with RIT in the T-cell FCXM or TMS. ALE interfered with B-cell and T-cell CDC and FCXM, but with neither class I nor class II TMS. DAC did not interfere with CDC or FCXM at 0.1 µg/mL, but caused a false-positive B-cell result in both FCXM and CDC with some, but not all, samples. No interference by DAC was seen with the TMS. While further testing is needed, TMS may be a useful alternative method to differentiate de novo donor-specific antibodies after treatment with humanized or chimeric immunosuppressive agents.

INTRODUCTION: Small bowel transplantation is a modality used to improve the quality of life of subjects with short gut syndromes of various causes. We have demonstrated that transplanted mucosa shows statistically significant morphologic changes compared with native bowel. These include edema, increased apoptosis, increased goblet cells and villus blunting. Additionally, an earlier study of transplanted bowel demonstrated a gradual increase in villus height after transplant that plateaus after several months. Calculations at the light microscopic level have demonstrated that these changes significantly impact the absorptive surface area. In this study, we performed a morphometric analysis of sequential graft biopsies in a patient with a living-related small bowel transplant. METHODS: Serial mucosal biopsies of the intestinal graft were taken weekly for the first postoperative month and subsequently based on clinical indication (e.g., to rule out rejection or infectious etiologies). We studied the serial small bowel biopsies performed at 1, 2, 5 and 10 weeks after transplant, which showed no evidence of rejection or infection. Relatively preserved villus areas were photographed at a fixed magnification (10x). Villus height and width were measured at five points along the x and y axes. Gaussian-based fitting and height/width-based calculations of villus surface area were performed. CONCLUSIONS: The villus architecture changed significantly over time. Early changes consisted of shorter, broader villi (lower height/width ratio), with greater variation in height-to-width ratios. The mucosa was flat, with widespread villus blunting in the earlier biopsies. Villus height increased over time, achieving heights comparable to age-matched controls. The absorptive surface area was calculated using two different techniques: a) treating the villus as a cylinder, and b) Gaussian fitting. Our findings showed a significant difference in surface area calculations between the two techniques, up to 20%. We conclude that the significant alteration of villus architecture reaches a plateau and that the bowel eventually reacquires its pre-transplant architecture.

Composite tissue allotransplants (CTAs) differ from other allografts in their embryologically diverse tissues. Given the infrequent application of CTA, it has not been established whether rejecting tissues have an immunological hierarchy. We have examined specimens from transplanted human limbs in various stages of rejection and compared them with a controlled evaluation of rejecting non-human primate limb CTAs to determine the temporal progression of CTA rejection and evaluate the relative degree to which CTA elements are targeted for immune destruction. Human tissues were received from international transplant centers for histological evaluation. A primate radial forearm flap model was developed for serial histological and transcript RT-PCR analysis.
Primate autografts (n=5) and allografts with (n=4) and without (n=4) subtherapeutic immunosuppression have been studied with protocol incisional and excisional biopsies. Primate CTA components studied included native and donor skin, skeletal muscle, artery, vein, nerve and tendon. Human CTA rejection under immunosuppression presented as a rash localized to the allograft and was characterized by a prominent dermal infiltrate. Given that biopsies were prompted by skin signs, this could indicate that rejection begins in the dermis, or that rejection becomes clinically evident after the infiltrate reaches the dermis. Primates evaluated without immunosuppression had an infiltrate arising within 3 days in the perivenular tissue, leading to graft congestion and failure without a prominent dermal infiltrate. Dermal findings were consistent with ischemic injury referable to the early vascular injury. Subtherapeutically immunosuppressed animals rejected in 2 weeks with a marked dermal lymphocytic infiltrate similar to the human cases. The inflammatory lesions in the engrafted tissues were primarily perivascular and CD3+. Transcriptionally, most composite tissues showed constitutive expression of the regulatory cytokines IL-10 and TGF-β. These increased in native tissues exposed to trauma and in rejecting tissues, suggestive of local regulation in response to tissue injury. Allografts were unique in their expression of CD80, CD25, and T-bet, consistent with a cytotoxic T-cell-mediated event. This was most evident in transplanted artery and skin. Thus, the clinical appearance of a rash occurs relatively late, as an extension of vascular alloimmune egress. These data suggest that protocol surveillance of CTAs may be of some benefit in detecting occult alloimmune activity.

BACKGROUND: An important patient safety measure at our institution is the liberal set of criteria for initiation of a "condition", or crisis, which is designed to enact a rapid response to patients in a pre-arrest crisis state to avoid progression to a full arrest. These "condition" criteria encompass alterations in vital signs and mental status. "Conditions" also act as an important reporting system to identify potentially elusive system errors that result in adverse events. The purpose of this project was to evaluate the outcomes and system errors of in-hospital crises in transplant patients. METHODS: Patients having a "condition" at our institution from July 1, 2001 until June 30, 2003 were recorded in a database. The electronic charts of these patients were reviewed by one of a team of physicians, and the cases were then presented to the full Condition Review Committee, where contributory factors and errors were identified. The Patient Safety Committee provided institutional approval. RESULTS: 117 patients admitted to the transplant services over the two-year period experienced a "condition" or crisis. The hospital mortality of these patients was high (28.2%). 69 major errors were identified in these patients. Medication errors were common (17 patients, 14.5%) and resulted from the use of narcotics (6), immunosuppressives (6), insulin (4) and benzodiazepines (2). Physician knowledge, judgment or delay in treatment contributed to the events in 19 patients (16.2%). Problems with equipment (13 patients, 11.1%) and problems related to patient transport off the ward (13, 11.1%) occurred with high frequency.
CONCLUSIONS: Despite criteria for the initiation of a rapid response, the hospital mortality of an in-hospital crisis in transplant patients remains high. Our condition database is an effective reporting system for the identification of medical error. Reviewing the course of these patients was an efficient way to identify opportunities for system change to reduce error.

Background: Monoclonal antibodies play an important role in immunosuppressive therapy after intestinal transplantation (ITx). We present our experience with pharmacodynamic monitoring of daclizumab and infliximab after ITx. Material and Methods: Intestinal transplantation was performed in 12 adult patients for irreversible short bowel syndrome. Initial immunosuppression consisted of tacrolimus, rapamycin, prednisolone and daclizumab (n=10) or tacrolimus, alemtuzumab, and steroids (n=2). Daclizumab (1 mg/kg body weight) was administered in an individualized fashion according to pharmacodynamic monitoring using serum sIL-2R levels and CD4+CD25+ T cells. Infliximab was used in two patients as rescue therapy for OKT3-resistant rejection, guided by serum TNF-alpha and serum LPS-binding protein (LBP). Immunological monitoring included sIL-2R, TNF-alpha, LBP, IL-6, IL-8, CRP, and CD4+CD25+ T cells. Results: Six- and 12-month patient and graft survival were 83% (10/12) and 75% (9/12), respectively. Acute rejection (AR) rates were 8% (1/12) within the first 6 months and 17% (2/12) within one year. One late AR occurred after 2 years. All ARs were steroid- and OKT3-resistant. Steroid-resistant ARs were accompanied by a significant increase of serum TNF-alpha and LBP (3/3). In two patients presenting with AR 9 months and 2 years after intestinal transplantation, complete resolution was not achieved despite 5 and 10 days of OKT3 treatment, respectively. They were treated with four individually timed infusions of 3 mg/kg body weight infliximab (a chimeric anti-TNF-alpha mAb) until complete recovery. Onset of therapy, number of infusions and intervals between infusions (patient 1: 4-week interval; patient 2: 2-week interval) were determined according to the course of serum TNF-alpha and LBP levels. The results of daclizumab application according to pharmacodynamic monitoring were as follows: only one patient received the recommended number of five daclizumab applications; the other patients received four (n=1), three (n=3), two (n=3) or a single (n=2) daclizumab infusion according to sIL-2R and CD4+CD25+ T-cell monitoring. Conclusion: Therapy with IL-2R antagonists can be individualized by pharmacodynamic sIL-2R and CD4+CD25+ T-cell monitoring. Infliximab is a valuable therapeutic option in a selected group of patients after intestinal transplantation experiencing steroid- and OKT3-refractory severe rejection. Its administration can be guided by pharmacodynamic monitoring of serum TNF-alpha and LBP.

Transplant-associated thrombotic microangiopathy (TMA) related to calcineurin inhibitors is a serious complication in solid organ transplant (SOT) and allogeneic bone marrow transplant (allo BMT) recipients. Recently, TMA in pts on sirolimus has been described. Little is known about the relative risks and clinical differences of TMA associated with these agents. Methods: Records of the Apheresis Unit at a single center were reviewed to identify transplant recipients who had severe TMA requiring apheresis. A cohort from 1995-2001 was compared with one from 2002-03 (after the introduction of sirolimus).
Results: In the 1995-2001 group, 6 SOT (2 lung, 2 kidney, 1 heart, and 1 liver) and 14 allo BMT recipients required apheresis. In the 2002-03 group, TMA developed in 8 SOT (4 lung, 4 kidney) and 3 allo BMT recipients. TMA in the earlier group developed at a median of 6 mos post-SOT and 1.5 mos post-BMT; in the later group, 8 mos post-SOT and 1 mo post-BMT. One pt had seizures and intracranial hemorrhage during sirolimus-associated TMA after less severe TMA episodes with the other agents. The table shows the drug pts were receiving when TMA developed (some pts had more than 1 episode). Of pts on sirolimus, 3 were also receiving tacrolimus concomitantly; the others were on no calcineurin inhibitors. In the earlier group, 3/6 (50%) of SOT vs. 1/14 (7%) of BMT recipients survived, whereas in the later group, 5/8 (62.5%) of SOT and 0/3 (0%) of BMT recipients survived. Concurrent or recent CMV infection occurred in 5/6 SOT (83%) in the earlier group, but only 1/8 SOT (12.5%) in the later group (p=0.029). All but 2 of the pts with sirolimus-associated TMA had levels at some point >20 ng/ml. Conclusions: In the current era, sirolimus-associated TMA requiring apheresis is at least as common in SOT as that associated with calcineurin inhibitors, and may be severe. In both eras, survival in SOT pts with TMA was better than that of allo BMT recipients. CMV viremia was frequently associated with calcineurin inhibitor-associated TMA in SOT, but not with sirolimus-associated TMA. Further study of risk factors for TMA will be of interest.

Introduction: HLA donor/recipient matching is associated with better graft outcome in both kidney and heart transplants, although the opposite is found in liver transplants. This study analyzes the histocompatibility factors that affect the short- and long-term outcome of intestinal allografts. Previously we reported a trend toward increased frequency and severity of rejection with a positive crossmatch. Again, no significance was achieved, but this may be due to differential clinical and immunosuppressive therapies in this cohort. HLA mismatch had no bearing, except for an association between HLA-B and chronic rejection, although our relatively small numbers may impede analysis. Clinical and therapeutic advances over the time period also confound analysis. However, we do not feel that high PRA, crossmatch status or HLA mismatch should alter donor graft utilization or patient selection.

Human alloimmunity may be influenced by environmental stimuli, including immunizations. Such stimuli could nonspecifically enhance memory anti-donor immunity and/or specifically prime T or B cells with cross-reactivity to alloantigens. To assess the effects of immunization on alloimmunity in humans, we performed serial assessments of peripheral T- and B-cell alloimmune responses in 2 cohorts of patients undergoing 2 different immunization regimens. 14 patients with adenocarcinomas were immunized 3 times over 3 months with an experimental murine anti-idiotypic antibody administered in alum. In a second cohort, 12 normal student volunteers were immunized with the hepatitis B vaccine (3 injections over 6 months). In both cohorts, blood samples were obtained prior to immunization, 1-2 months after the initial immunization and following the last immunization. Peripheral blood lymphocytes (PBLs) were isolated and tested against a panel of allogeneic splenic stimulator cells using an IFN-γ ELISPOT assay to assess the frequency of alloreactive effector/memory T cells in the peripheral blood.
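A minimal sketch of how raw ELISPOT well counts translate into the per-300,000-PBL frequencies reported in the results below (the >15-spot positivity threshold is taken from those results; the well counts themselves are hypothetical):

```python
# Convert raw IFN-gamma ELISPOT counts into a frequency per 300,000
# PBLs and flag memory alloreactivity using the threshold reported in
# the Results (>15 spots / 300,000 PBLs). Well counts are hypothetical.

THRESHOLD_SPOTS = 15       # positivity cutoff per 3e5 PBLs (from Results)
REFERENCE_CELLS = 300_000  # denominator used in the abstract

def spots_per_reference(spot_count: int, cells_plated: int) -> float:
    """Scale a well's spot count to the 300,000-PBL denominator."""
    return spot_count * REFERENCE_CELLS / cells_plated

# Hypothetical wells: (spots counted, PBLs plated per well)
wells = [(12, 200_000), (40, 300_000), (3, 150_000)]

for spots, plated in wells:
    freq = spots_per_reference(spots, plated)
    memory = freq > THRESHOLD_SPOTS
    print(f"{freq:5.1f} spots/3e5 PBLs -> alloreactive: {memory}")
```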
Plasma samples were evaluated for the presence or absence of anti-HLA antibodies (Ab) using flow cytometry screening beads. Baseline studies demonstrated the presence of memory alloreactive T-cell immunity (>15 IFN-γ ELISPOTs per 300,000 PBLs) in 70% of the study patients. In each of the 2 cohorts, PBLs from 60% of individuals with baseline T-cell alloreactivity showed a 1.5-3 fold increase in frequency by 1-3 months post-immunization. The frequency of alloreactive PBLs from concomitant control, nonimmunized volunteers did not change during the study period. Approximately 20% of subjects with no alloreactive PBLs at baseline developed new alloreactive PBLs by 1-3 months post-immunization. In addition, 2 of 14 adenocarcinoma patients had baseline-positive anti-HLA Abs that were significantly increased following immunization. 4 of the remaining 12 patients developed new anti-HLA Abs during the immunization period. These data show that protective immunizations can have complex effects on human alloimmune repertoires. Both nonspecific activation/expansion of alloreactive memory immunity and cross-reactive priming of new alloreactive T and B cells can occur. The data further suggest that immunizations may affect the risk of rejecting a subsequently transplanted organ, and they bolster the argument for monitoring cellular and humoral alloimmunity in this setting.

Regression analyses showed that caregiver burden was a significant predictor of transplant patients' state anxiety (p ≤ .05) and quality of life (p ≤ .05). Patients who had highly stressed caregivers had higher state anxiety and lower quality of life (mental health). Hierarchical regression analyses were conducted to further examine possible mediating effects. Patient coping strategies characterized by denial and behavioral disengagement mediated the relationships between caregiver burden and high anxiety and between caregiver burden and lower quality of life. Patient coping strategies characterized by optimism and acceptance likewise mediated the relationships between caregiver burden and anxiety and between caregiver burden and quality of life. This study suggests that a high level of caregiver strain is associated with higher psychological distress and lower quality of life in transplant patients. Caregivers should also be a focus of evaluation during the pre-transplant period, and interventions should be developed to facilitate their successful adaptation to transplantation.

To date, the primary strategies for avoiding rejection have been to minimize antigenic differences between donor and recipient by matching HLA and to use potent immunosuppression. As an alternative approach, we propose to condition graft cells by delivery of sequences encoding small interfering RNAs (siRNAs) targeted against HLA, thereby decreasing donor immunogenicity and the recipient's immune response. To this end, we have designed several siRNA sequences directed against nonpolymorphic regions of HLA class I, and identified the U6 promoter-driven siRNA expression cassettes exhibiting the most potent inhibitory activity by direct transfection into human cell lines, followed by flow cytometric and Western blot analyses. Next, for long-term suppression of HLA expression, we constructed and tested third-generation self-inactivating HIV-based lentivirus vectors for efficient and long-term gene delivery of these siRNA cassettes.
We have found that these lentivirus vectors, which readily infect quiescent non-dividing cells, are capable of highly efficient gene transfer to a wide variety of primary human cell types, including hematopoietic progenitor cells, differentiated epithelial cells, fibroblasts, myocytes, endothelial cells, and islet cells. Furthermore, permanent integration into the genome of the host cell is achieved. Currently we are conducting studies to determine the extent and durability of siRNA-mediated inhibition. In initial experiments using beta-2 microglobulin siRNA sequences, we obtained complete inhibition of beta-2 microglobulin expression in a human cell line, albeit with less than 100% efficiency, as shown by flow cytometry. We anticipate that lentivirus-mediated gene transfer of siRNA sequences will be useful for reducing graft immunogenicity by treatment of donors, donor organs, and islet cell transplants.

Purpose: Cyclosporine A (CsA) and tacrolimus inhibit lymphocyte proliferation via inhibition of calcineurin. In several systems, these compounds were shown to stimulate the production of profibrotic cytokines such as transforming growth factor-β (TGF-β) and endothelin (ET). Current evidence indicates that both cytokines play a role in the pathogenesis of chronic allograft nephropathy. Here, we investigated the interdependence between drug exposure, lymphocyte proliferation, TGF-β1 expression and ET-1 production in stable renal allograft recipients. Methods: Patients received double immunosuppression consisting of either CsA + mycophenolate mofetil (MMF) (n=34) or tacrolimus + MMF (n=30). CsA (EMIT) and tacrolimus (IMx) trough levels (both groups), as well as C2 levels (CsA group), were measured in whole blood. Simultaneously, lymphocyte proliferation was determined by MTT test following incubation of peripheral blood mononuclear cells (PBMC) with anti-CD3 mAb (10 ng/mL). Additionally, TGF-β1 and ET-1 plasma levels were analyzed by ELISA. Results: The mean drug levels were 98.3 ± 32.8 ng/mL (C0) and 562.2 ± 171.3 ng/mL (C2) in the CsA group, and 9.1 ± 2.3 ng/mL (C0) in the tacrolimus group. Mean lymphocyte proliferation at C0 was significantly lower in the tacrolimus group than in the CsA group (0.26 ± 0.27 vs. 0.32 ± 0.24, P < 0.05). TGF-β1 plasma concentration at C0 was significantly higher in the tacrolimus group (18.8 ± 7.2 ng/mL vs. 11.1 ± 7.6 ng/mL, P < 0.01). In contrast, ET-1 plasma concentration was significantly higher in the CsA group (1.39 ± 4.2 fmol/mL vs. 0.84 ± 2.5 fmol/mL). In the CsA group, a negative correlation between drug levels and lymphocyte proliferation was found (C0: r = -0.439, P < 0.05; C2: r = -0.769, P < 0.001; AUC: r = -0.677, P < 0.001). Conclusions: Our results show that immunosuppressive maintenance regimens differ in their effects on lymphocyte proliferation and the expression of profibrotic cytokines. Lymphocyte proliferation was less reduced in CsA-treated than in tacrolimus-treated patients, and in the CsA group it correlated well with drug levels. Concerning the expression of profibrotic cytokines, our results make clear that investigating a single profibrotic cytokine may not be sufficient to estimate the profibrotic profile of immunosuppressive drugs.

Poster Board #-Session: P137-II

Thirty-nine patients were enrolled in the control arm (17 kidney, 22 liver) while 12 patients received the intervention (10 kidney, 2 liver).
Use of the CD3-guided ATG dosing strategy led to a 57% reduction in the mean total ATG dose per patient (704 mg vs. 302 mg; p=0.001) and a significant cost savings (mean cost per patient CDN$5,531 vs. $2,576; p=0.001, including the cost of CD3 monitoring). Subjects receiving CD3-guided ATG dosing experienced significantly less hematologic toxicity. There was no difference in the incidence of acute rejection, CMV or bacterial infection between the groups. Implementation of a CD3-guided ATG dosing protocol may achieve dose reductions and cost savings compared with traditional dosing. Although this appears to be a promising approach, further analysis is required to fully assess the efficacy and safety of this protocol.

One-year mean serum creatinine was higher in CsA-based than in TAC-based regimens for first and second transplants (data not shown). After correcting for various variables, the relative risk for graft loss was not different in CsA- vs TAC-treated patients. If censored for death with a functioning graft (DWFG), there was a minimally increased risk for graft loss in the TAC-based regimens (p=0.0408). There was a marked improvement in the relative risk for graft failure with the use of MMF-based regimens (p<0.0001). Conclusions: With each transplant year there has been a decreased risk of graft failure in all treatment groups. There is no difference in two-year graft survival between TAC- and CsA-treated patients. Use of MMF markedly decreased the relative risk for graft loss.

Poster Board #-Session: P142-II

The use of tacrolimus compared with cyclosporine has been associated with approximately twice the rate of post-transplant diabetes mellitus (PTDM). Tacrolimus dose reduction has been suspected to lessen the risk of developing PTDM. We sought to evaluate these conjectures in a large immunosuppression prescription database of non-diabetic recipients of renal transplants. METHODS: Data were drawn from the United States Renal Data System (USRDS) database. The diagnosis of PTDM was identified with ICD-9 codes, and immunosuppression regimens were identified from prescription payments in Medicare billing records supplied by the USRDS. Recipients of first, single-organ renal transplants between 1995 and 1998 with no evidence of diabetes mellitus prior to 30 days post-transplantation were included in the analysis if a calcineurin inhibitor prescription payment was recorded by 30 days post-transplant. Multivariate, time-varying techniques were used to estimate the time-dependent risks of PTDM. The study endpoint was the diagnosis of diabetes. Subjects were censored from analysis at the time of their last recorded immunosuppression prescription, graft failure or death. RESULTS: 6,715 patients prescribed cyclosporine and 1,421 patients prescribed tacrolimus by 30 days post-transplant were studied. Tacrolimus, compared with cyclosporine, was associated with a 75% (P=0.0008) increased risk of PTDM. Conversion to tacrolimus in patients initially treated with cyclosporine was associated with a 74% (P<0.0001) increased risk of PTDM. The magnitude of tacrolimus or cyclosporine dose reductions was not associated with the risk of developing PTDM. CONCLUSION: Tacrolimus-based immunosuppression was associated with an increased risk of PTDM relative to cyclosporine, both when used as the initial immunosuppressant and when used in conversion. This risk did not decline with dose adjustments. We cannot support the conjecture that tacrolimus dose reduction lessens the risk of PTDM.
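The "multivariate, time-varying techniques" mentioned in the Methods above can be illustrated with a time-varying Cox model. The sketch below uses the lifelines package; the data frame, its column names, and all values are hypothetical stand-ins, not the USRDS records or the study's actual analysis code:

```python
# Sketch of a time-varying Cox model for PTDM risk, in the spirit of
# the Methods above. Each row is an interval during which a patient's
# covariates were constant; all records here are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

records = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4, 5],
    "start": [30, 180, 30, 30, 365, 30, 30],   # interval start (days)
    "stop":  [180, 400, 500, 365, 600, 250, 700],
    "tacrolimus": [0, 1, 1, 0, 0, 0, 1],       # 1 = tacrolimus, 0 = cyclosporine
    "ptdm":  [0, 1, 0, 0, 0, 1, 0],            # PTDM diagnosed in this interval?
})

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="ptdm",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for the tacrolimus covariate
```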
African American kidney transplant recipients (AAs) are at high risk of developing posttransplant diabetes mellitus (PTDM), especially when treated with steroids and tacrolimus. We compared the incidence and timing of PTDM in 60 consecutive AAs treated with prednisone, tacrolimus (target trough levels 5-8 ng/ml) and sirolimus (target trough levels 10-15 ng/ml) with those of 19 AAs transplanted in an earlier era and treated with comparable doses of prednisone, tacrolimus (target levels 8-12 ng/ml) and mycophenolate mofetil (MMF). The incidence of PTDM, defined as the need for insulin or oral hypoglycemic agents for more than one month, was 32% in each group, despite a trend toward lower trough tacrolimus levels in the sirolimus-treated AAs. The onset of PTDM was 3.0±3.7 months in sirolimus-treated patients and 11.0±8.6 months in MMF-treated patients (p=0.003). Eight patients in the sirolimus-treated group were withdrawn from prednisone between 3 and 5 months after transplantation. With follow-up after cessation of steroids ranging from 12 to 37 months, all 8 patients remained on treatment for PTDM, consisting of insulin (n=2), an oral agent (n=4) or both (n=2). One patient presented with diabetic ketoacidosis prior to steroid withdrawal and had a C-peptide concentration of 0.2 ng/ml (normal range 0.5 to 3.0 ng/ml), suggesting new onset of type 1 diabetes mellitus. C-peptide levels measured in 6 of the other 7 patients were either normal (n=4) or frankly elevated (n=2). We conclude that the use of prednisone, tacrolimus and sirolimus in AAs is associated with a variant of PTDM that occurs relatively early after kidney transplantation and is resistant to withdrawal of steroids. The variable C-peptide levels in our patients suggest heterogeneous pathophysiologic mechanisms, ranging from irreversible islet cell toxicity to a state of insulin resistance.

There were no differences in demographics or incidences of delayed graft function between the two groups. Maintenance immunosuppressive regimens were similar between the two groups, with the majority of patients receiving tacrolimus, mycophenolate mofetil, and steroids. The total cumulative dose of Thymo was 7.0 ± 3.6 mg/kg. There was no difference in patient survival: 92% in the Thymo group and 87% in the No-Thymo group, p=0.16 (log-rank). However, the graft survival rate was significantly higher in the Thymo group (75%) than in the No-Thymo group (62%), p=0.04 (log-rank). Thymo was also associated with a lower incidence of acute rejection (13%) than the No-Thymo group (29%), p<0.01. There were no significant differences in serum creatinine levels at 3 years between the two groups: 1.7 ± 0.8 mg/dL and 1.8 ± 0.9 mg/dL in the Thymo and No-Thymo groups, respectively. There was only one case of PTLD, in the Thymo group. The incidence of malignancies was very low: 0.04% in the Thymo group and 0.03% in the No-Thymo group. The incidences of opportunistic infections were similar between the two groups. Conclusions: Thymo induction is associated with a very low incidence of posttransplant malignancies and opportunistic infections in renal transplant recipients. Moreover, Thymo induction was associated with a lower incidence of acute rejection and improved graft survival.

Poster Board #-Session: P148-II

Campath (alemtuzumab) has recently begun to be utilized for induction therapy in renal transplantation.
Although Campath has been shown to have significant effects on hematologic parameters when used for the treatment of hematologic malignancies, its effects with the dosage regimens used in renal transplantation have been poorly defined. We began utilizing Campath for induction therapy in renal transplantation in May 2003. The dosage of Campath used was 30 mg IV given pre-operatively. The patients subsequently were treated with maintenance therapy of tacrolimus or mycophenolate mofetil. In addition, all patients were treated with a regimen of rapidly tapering prednisone. We found that 18/20 (90%) of the patients experienced a decrease in their platelet count, with the lowest noted platelet count being 61,000 cells/microliter. The average time to development of thrombocytopenia was 1.1 days post-op. The average duration of thrombocytopenia was 14.5 days (range 2-85 days). In addition to the thrombocytopenia, absolute lymphopenia (<1,000 lymphocytes/microliter) was noted in 20/20 (100%). The median time to development of lymphopenia was 1 day post-op. The range of nadir lymphocyte counts was 0-140. To date, no patient has returned to a lymphocyte count greater than 1,000 (longest duration of follow-up, 154 days). In contrast, absolute neutropenia (<1,000 neutrophils/microliter) did not occur in any patient. Furthermore, the hematologic abnormalities encountered were not associated with any of the maintenance immunosuppression used. In summary, Campath induction therapy for renal transplantation can result in significant and prolonged thrombocytopenia and lymphopenia. Such hematologic suppression may necessitate alteration of other medications typically utilized in the post-transplant period.

HLA-identical living-related kidney transplant patients may still receive standard doses of immunosuppression. We wondered why these patients should be exposed to the adverse effects of immunosuppression any longer. We tapered HLA-identical living-related renal transplant patients on azathioprine (AZA) in combination with prednisone to half of their AZA dose and 5-10 mg/day prednisone. We questioned whether the in vivo load of immunosuppression influenced their donor-specific T-cell reactivity, defined as reactivity against minor histocompatibility antigens (mHags). Patients (n=15) who were at least 2 years (median 4.3 years, range 2.3-15.2) after transplantation were reduced from 100% AZA (median 1.7 mg/kg AZA, range 1.0-2.2) in two steps to 50% of their AZA dose (median 0.7 mg/kg AZA, range 0.5-1.1). Reactivity against mHags was measured by IFN-γ ELISPOT assay as published recently (Transplantation, 2003), and the reactivity before and after tapering was compared with the reactivity at 3 months after HLA-identical living-related kidney transplantation (n=16). Three months after transplantation, we found a frequency of donor-specific IFN-γ-producing cells in the range of 5 to 115/10⁶ PBMC (median 30/10⁶ PBMC). At least 2 years after transplantation, before reduction of immunosuppression, the frequency of IFN-γ-producing cells was significantly lower (median 0/10⁶ PBMC, range 0-320) (p=0.04). Tapering of immunosuppression did not affect this frequency (median 5/10⁶ PBMC, range 0-540), which remained lower than in the early period after transplantation (p=0.03). These patients did not suffer from acute rejection after tapering of immunosuppression.
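Comparisons of median spot frequencies between independent patient groups, as reported above, are commonly made with a rank-based test. The abstract does not name the test it used, so the sketch below (with entirely hypothetical spot frequencies) is only illustrative:

```python
# Rank-based comparison of donor-specific IFN-gamma ELISPOT frequencies
# (spots per 10^6 PBMC) between two independent patient groups, as one
# might do for the 3-months-post-transplant vs. pre-tapering comparison
# above. Values are hypothetical; the abstract does not name its test.
from scipy.stats import mannwhitneyu

three_months_posttx = [5, 12, 25, 30, 45, 70, 115]   # hypothetical
pre_tapering        = [0, 0, 0, 5, 10, 20, 320]      # hypothetical

stat, p = mannwhitneyu(three_months_posttx, pre_tapering,
                       alternative="two-sided")
print(f"U={stat}, p={p:.3f}")
```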
From these results we conclude that HLA-identical living-related kidney transplant recipients can safely be reduced to 50% of their AZA dose without affecting the immune response. Our ELISPOT data indicate that these patients remain over-immunosuppressed and that immunosuppression could be reduced further or even stopped.

Dept. of Nephrology and Renal Transplantation, University Hospitals Leuven, Leuven, Belgium. Background: In this prospective study we compared the incidence of late acute rejection (acute rejection more than one year post-transplantation) and changes in serum creatinine over time between compliers and non-compliers with immunosuppressive therapy more than 1 year post-transplantation, and explored the relative contribution of non-compliance and other known risk factors to the occurrence of late acute rejection episodes. Methods: Using a prospective design, 146 adult renal transplant recipients (56% male; median age 47 years, IQR 19) varying in time post-transplantation (median 4 years; range 1-18 years) were followed during a five-year period. Patients were interviewed at inclusion in the study regarding their intake of immunosuppressive medication and categorized as non-compliers if they admitted to having skipped immunosuppressive medication on a regular basis during the previous 12 months. The occurrence of a late acute rejection during the 5-year follow-up period was recorded. Results: The sample consisted of 22.6% non-compliers, of whom 21.2% experienced a late acute rejection, compared with 8% of compliers, at 5 years post-inclusion (p<0.05). Kaplan-Meier survival analysis showed a decreased rejection-free time in non-compliers compared with compliers (p=0.03). Non-compliant patients had a 3.2-fold higher risk of late acute rejection (Cox regression analysis, p=0.005). Non-compliers experienced a greater increase in serum creatinine over time (linear mixed models, p<0.001). Conclusions: Non-compliance in renal transplant patients more than 1 year post-transplantation is associated with an increased risk of late acute rejection and a greater increase in serum creatinine during the following 5 years.

The characteristics that were independently associated with graft outcomes were SRL concentration, African-American race, and PRA >20%. CONCLUSION: Average SRL concentration appears to be an important factor in determining graft outcomes in patients converted to SRL with CI minimization.

Anemia and erythrocytosis are common after kidney transplantation. Antiproliferative agents such as mycophenolate mofetil (MMF) may play a role in the pathogenesis of anemia. The influence of sirolimus on posttransplant erythropoiesis was examined in 214 kidney or kidney-pancreas (KP) recipients transplanted between 1999 and 2002 and treated with either sirolimus-based (N=87) or MMF-based (N=127) therapy. We excluded patients who either 1) lost their grafts or 2) changed their antiproliferative drug during the first 12 months. All patients received either cyclosporine (CsA) or tacrolimus (FK). Anemia was defined as hemoglobin (Hb) <12 g/dL in women and <13 g/dL in men. Posttransplant erythrocytosis (PTE) was defined as a hematocrit >51% at any time. At 6 months, the prevalence of anemia was 42% in MMF patients vs 57% in sirolimus patients (p=0.024); at 12 months, the prevalence was 31% with MMF and 57% with sirolimus (p<0.001). At 12 months, Hb concentration was 13.5±2 g/dL in MMF patients and 12.1±2 g/dL in sirolimus patients (p<0.0001).
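Group comparisons of anemia prevalence like those above are straightforward to check with a chi-square test. A minimal sketch for the 6-month comparison, with counts reconstructed approximately from the reported percentages and group sizes:

```python
# Chi-square check of the 6-month anemia comparison reported above.
# Counts are reconstructed from the reported percentages (42% of 127
# MMF patients, 57% of 87 sirolimus patients), so they are approximate.
from scipy.stats import chi2_contingency

mmf_anemic, mmf_total = 53, 127   # ~42% of 127
srl_anemic, srl_total = 50, 87    # ~57% of 87

table = [
    [mmf_anemic, mmf_total - mmf_anemic],
    [srl_anemic, srl_total - srl_anemic],
]

# correction=False gives a plain Pearson chi-square, which lands close
# to the reported p=0.024; Yates' correction would be slightly larger.
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```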
The side effects of sirolimus, the most recently introduced immunosuppressive drug in organ transplantation, are mainly dyslipidemia, diarrhea, anemia, thrombocytopenia, arthralgia, lymphoceles and wound healing problems. A few cutaneous events have also been mentioned, such as acne, edema, aphthosis and skin infections. However, their frequency and relationship to sirolimus therapy are still unknown. We conducted a phase-IV study to evaluate the frequency and severity of skin, hair, nail, and mucous membrane adverse events in renal transplant recipients (RTR) on sirolimus-based therapy in a single renal transplantation center in France. Eighty consecutive RTR on sirolimus-based therapy (60% male; mean age 48 y; median graft duration 2.2 y) were evaluated. Sirolimus was used as first-line therapy from the time of transplantation in 45% of patients and switched from CNI to sirolimus in 55% of cases. The median duration of sirolimus treatment was 12 months (range 0.75-84 months). Ninety-nine percent of patients complained of cutaneous adverse events, with an average of 7 each. The most frequent cutaneous adverse events were pilosebaceous apparatus involvement, observed mostly in males (acne-like eruption (46%), scalp folliculitis (26%), hidradenitis suppurativa (12%)); edematous phenomena (chronic edema (55%), acute and recurrent edema (15%)); mucous membrane disorders (aphthosis (60%), epistaxis (60%), chronic gingivitis (20%) and chronic fissuring of the lips (11%)); and nail disorders (chronic onychopathy (74%), periungual infections (16%)). Imputability to sirolimus was considered probable for these four groups of symptoms, as they appeared on sirolimus therapy and had either an unusual aspect or a higher incidence than is usually observed on calcineurin inhibitor therapy. In addition, they always disappeared after sirolimus withdrawal. Twenty-five percent of patients complained of serious cutaneous adverse events, leading to sirolimus cessation in 6%. Conclusion: Skin disorders are frequent in RTR. However, it is probable that their frequency has increased since the introduction of sirolimus therapy. They are a frequent reason for withdrawing sirolimus, either because of their severity or, more often, because of their social and functional consequences.

The choice of induction immunosuppression for kidney transplantation in elderly recipients is dictated by consideration of infection risk, as well as efficacy in preventing acute rejection, thus allowing reduction of subsequent maintenance immunosuppression and its attendant long-term side effect profile. We present data on 183 elderly kidney transplant recipients who were older than 60 at the time of transplant (mean 66±5 yrs) over the last 12 years. These patients received induction with anti-lymphocyte globulin (ATGAM), muromonab (OKT-3), or basiliximab (Simulect), the latter followed in one group by steroid-free maintenance immunosuppression. We compared the incidence of delayed graft function (DGF), acute rejection (AR), side effects, patient/graft survival and costs of immunosuppression. Differences between groups were tested for significance using chi-square. Results, as shown below, indicate lower AR and DGF rates in both basiliximab groups vs ATGAM vs muromonab. Complete steroid avoidance did not result in increased AR rates. Basiliximab-based induction was free of the side effects typically encountered when polyclonal or monoclonal antibodies are used, such as the need for a central line, complications thereof, and first-dose reactions.
Survival was 90% for all other groups, without significant differences between groups. Death with a functioning graft was the most common cause of graft loss. Post-transplant hospital stay was 7±3 days for both basiliximab groups, 11±3 days for the muromonab group and 15±6 days for the ATGAM group. The cost of induction therapy was $2,400±100 for basiliximab, $6,000±1,600 for muromonab and $9,000±2,200 for ATGAM. We conclude that basiliximab is the preferable induction agent in elderly kidney transplant recipients. Additionally, basiliximab eliminates the need for steroid use in maintenance immunosuppression.

We have adopted a systematic approach to identify polymorphic variants in genes that may influence individual responses to immunosuppressive therapy. We selected genes encoding products involved in the absorption, action or metabolism of the most commonly used immunosuppressive agents (including cyclosporin, tacrolimus and mycophenolate mofetil). Assays were devised for genotyping polymorphic variants of the selected genes. Priority was given to genotyping genetic variants known to exert a functional effect on the encoded gene product. In addition, all non-synonymous polymorphic variants and single nucleotide polymorphisms within the promoter region of each gene were also tested. Objective: The objective of this study was to assess the population distribution of variant alleles within the following genes: MDR-1, FKBP12 and IMPDH-1. Methods: In our centre, the most prevalent ethnic groups are UK Caucasoid and Asian (Indo-Pakistani). 100 individuals from each of these populations were genotyped in this study. Using SNaPshot, PCR-SSP and PCR-RFLP based methods, we analysed eight polymorphisms in MDR-1, two polymorphisms in FKBP12 and five polymorphisms in IMPDH-1. Haplotype analysis was performed on the data obtained to determine the significance of any linkage across each gene. Results: Two polymorphisms of the MDR-1 gene, in exon 22 and exon 26 (C3435T, silent), were significantly linked in UK subjects. C3435T has been associated with low levels of expression of P-glycoprotein. There was also significant variation in the distribution of C3435T alleles between the Caucasoid and Asian populations, with the T allele at this position having a higher frequency in the Asian population (p=0.05). Alleles of both FKBP12 markers and all five IMPDH-1 markers were present at high frequency in our populations, with no significant variation between Caucasoids and Asians. The lack of variation in genotypes for FKBP12 and IMPDH-1 suggests that, although these variants may influence individual drug responses, they are unlikely to account for adverse effects noted commonly within specific ethnic groups. The ethnic variation in genotypes observed for MDR-1 suggests that the C3435T variant may be a potential candidate for causing adverse reactions observed more commonly in patients of Asian origin.

The long-term efficacy of induction therapy with either a monoclonal anti-IL-2 receptor antibody (Simulect®) or polyclonal anti-lymphocyte antibodies (Thymoglobuline®) is still a matter of debate. In a multicenter study including 99 renal graft recipients at low immunological risk (PRA <30%), 49 patients were randomized to receive Simulect® (Group S) and 50 patients were treated with Thymoglobuline® (Group T). Long-term immunosuppression included ciclosporin (Neoral®), mycophenolate mofetil (CellCept®) and corticosteroids, which were progressively withdrawn between months 6 and 9.
We previously reported comparable one-year patient and graft survival between Groups S and T (98% vs 100% and 94% vs 96%, respectively). The incidence of biopsy-proven rejection was low in both groups (8%). We report here the 3-year patient and graft survival as well as renal function in the study population. No further graft loss occurred between one year and 3 years in either group, but one patient died having lost the graft within the first year (Group T), giving 3-year patient and graft survival in Groups S and T of 98% vs 98% and 94% vs 96%, respectively (p=ns).

Although Rapamune® (sirolimus) has proven to be a potent immunosuppressive agent and is used in many steroid-free or steroid-sparing regimens, reports of impaired wound healing, abscesses, and increased lymphocele rates have dampened the enthusiasm for its use. We hypothesized that induction with Thymoglobulin®, elimination of a sirolimus loading dose, and, in selected cases, a 3-4 week window of sirolimus avoidance would minimize the incidence of these complications without increased risk of early rejection or graft loss. We now report our results using this approach in primary kidney (KTA) and pancreas transplant recipients. In conclusion, sirolimus can be safely used in steroid-sparing regimens with acceptable rates of patient and graft loss, rejection, wound complications, lymphoceles, and ureteral/pancreatic-duodenal anastomotic leaks. Our data compare favorably to reports of others using standard steroid-based immunosuppression with or without lytic induction.

We report our experience with desensitization and renal transplant outcome in seven highly sensitized individuals with a positive T- and B-cell cross-match (XM) (group I) and in three individuals with a positive T-cell XM (group II). Desensitization is defined as abrogation of a positive T-cell IgG XM by complement-dependent cytotoxicity (CDC) and flow cytometry (FC). All transplants were from living donors. The desensitization protocol for group I consisted of: (1)

Historically, ciclosporin monotherapy has produced excellent long-term graft and patient survival rates when used as either initial or maintenance (>1 yr) immunosuppression. Conversely, the addition of steroids results in a dose-related reduction in survival figures. However, amongst patients started on Neoral alone, rejection rates are relatively high and fewer than 50% remain steroid-free in the long term. In order to investigate the utility of a CD25 antibody (anti-IL-2R) as a strategy for avoidance of steroids or other additional immunosuppression, we conducted a prospective, multicentre, randomised, double-blind, placebo-controlled, 12-month study of basiliximab induction in 108 kidney transplant recipients receiving ciclosporin microemulsion (Neoral) monotherapy. Patients were randomised pretransplant to receive a two-dose course of basiliximab (n=52) or placebo (n=56). Requirement for oral steroids at any time in the study was lower in the basiliximab group (33% vs 61%; P=0.004). Maintenance steroid use was lower with basiliximab than placebo at 6 months (26% vs 60%; P<0.001) and at end of study (25% vs 61%; P<0.001). More basiliximab patients than placebo patients were still maintained on Neoral monotherapy at 6 months (52% vs 29%; P=0.018) and at the end of study (46% vs 27%; P=0.046). 73% of the basiliximab group and 61% of the placebo group continued with Neoral as sole agent or with adjuncts until the end of study. The main reasons for changing from Neoral monotherapy were acute rejection and delayed graft function.
Rejection occurred in 29% of basiliximab patients and 43% of placebo patients (P=0.17). One-year graft and patient survival were 88% and 98% for basiliximab and 88% and 96% for placebo. The mean and median values for serum creatinine were consistently lower in the basiliximab group at every timepoint in the study: at 12 months, median creatinines were 141 vs 164 µmol/l for the basiliximab and placebo groups, respectively (P=0.55). Conclusion: This is the first reported study of basiliximab induction with Neoral monotherapy immunosuppression. This strategy of Simulect induction significantly reduced the need for added maintenance immunosuppression, allowing approximately 50% of the patients to be maintained on Neoral monotherapy and 75% to be maintained on steroid-free, tailored immunosuppression at 1 year post transplant.

Early corticosteroid elimination (ECE) in high-immunologic-risk (HIR) renal Tx patients has been avoided due to previously published reports of an increased risk of rejection. However, ECE in HIR patients may be possible with newer IS agents (tacrolimus, sirolimus, MMF), particularly with T-cell-depleting antibody induction. Therefore, we have conducted the first prospective study of ECE in patients at risk for rejection. Methods: 25 pts were enrolled prospectively in an IRB-approved, HIPAA-compliant protocol. IS consisted of a methylprednisolone 7-day taper, tacrolimus (target level 4-8 ng/ml), sirolimus (target level 8-12 ng/ml), and MMF (2 g/day). Induction with daclizumab 2 mg/kg on POD 0 and 14 was administered to the first 10 patients, then changed to thymoglobulin on POD 0 and 2 plus daclizumab on POD 14. Recipient inclusion criteria were repeat Tx or peak PRA≥25%. Data were analyzed for acute rejection (AR), graft loss, and death. Results: 25 pts with a median follow-up of 402 (range 42-720) days were analyzed. Recipient demographic characteristics were mean age 42 years, 40% AA, 28% male, 68% repeat Tx, 68% cadaveric donors. Pretransplant immunologic markers revealed 36% with a peak flow PRA >25%, and a median of 3 HLA-A/B mismatches and 1 HLA-DR mismatch. Median time to therapeutic tacrolimus and sirolimus levels was 4 and 11 days, respectively. 72% of the pts are currently corticosteroid-free; 36% have experienced biopsy-proven calcineurin inhibitor (CI) toxicity, with 1 requiring CI discontinuation due to HUS, and two pts have required sirolimus discontinuation. The median MMF dose is 1.25 (range 0.5-3) g/day. The pts experienced rates of AR, graft survival, and pt survival of 40%, 88%, and 96%, respectively. Graft losses were due to patient death (infectious), chronic allograft nephropathy, and recurrent FSGS. Six of 10 pts (60%) with daclizumab induction alone experienced AR, but AR rates fell to 27% when thymoglobulin was introduced (p=0.1). Mean serum creatinines at 1, 3, 6, and 12 months are 1.5 mg/dL, respectively.

Thrombocytopenia is the most frequent hematologic disorder attributed to sirolimus (SRL). Anemia and leukopenia are also frequently observed after renal transplantation in patients (pts) on SRL-based therapy. However, a causal role for SRL in anemia has never been demonstrated, because it is usually used in combination with other myelotoxic drugs such as mycophenolate mofetil or azathioprine. We report 8 cases of anemia related to SRL therapy in stable renal transplant recipients. 8 renal transplant pts with biopsy-proven chronic allograft nephropathy (CAN) were switched from calcineurin inhibitor to SRL-based therapy to avoid nephrotoxicity. Corticosteroid dosage was not modified.
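The daclizumab-versus-thymoglobulin comparison in the ECE study above (60% vs roughly 27% acute rejection, p=0.1) can be framed as a 2x2 exact test. In this sketch the thymoglobulin-arm counts are reconstructed approximately from the stated rate and group size, and are not the study's raw data.

```python
# Approximate 2x2 exact test for AR by induction regimen (reconstructed counts)
from scipy.stats import fisher_exact

#            AR   no AR
table = [[6, 4],    # daclizumab-only induction (6/10 AR, as reported)
         [4, 11]]   # thymoglobulin + daclizumab (~27% of 15, reconstructed)

odds_ratio, p = fisher_exact(table)
print(f"odds ratio={odds_ratio:.2f}, p={p:.3f}")
```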
Azathioprine was withdrawn at the time of SRL introduction to avoid myelotoxicity in 7/8 pts. The SRL trough level target was 12-25 ng/ml (HPLC). All pts had stable renal function before the switch (mean calculated creatinine clearance 41±12 ml/min), and none of them were given EPO. Before the switch, mean hemoglobin level was 12.2±1.2 g/dl, mean white cell count 5087±2100/mm3, and mean platelet count 225000±83000/mm3. Mean duration of SRL therapy before the hemoglobin nadir was 11.5±3 weeks. The lowest mean hemoglobin level was 8.2±1.34 g/dl. Biological features of the anemia are summarized in table 1. Anemia was associated with SRL-related interstitial pneumonitis in 3/8 patients. Platelet and leucocyte counts were also slightly decreased (mean platelet count 163000±66000/mm3, mean leucocyte count 3800±1200/mm3). Anemia improved in all patients and resolved in 7 patients within 10.5±4.7 weeks after SRL withdrawal. This favorable outcome occurred despite the reintroduction of Aza (1 pt) or MMF (2 pts) and without EPO therapy. Finally, we observed a positive rechallenge in one patient. Conclusion: SRL induces a microcytic, aregenerative anemia without iron deficiency, similar to inflammation-induced anemia. Withdrawal of SRL led to rapid improvement and complete resolution of symptoms.

The rationale for perioperative administration of mono- or polyclonal antibodies used as induction therapy in organ transplantation is based on the fact that activation of the host immune system begins immediately on revascularization. Generally, these immunosuppressive antibodies are given for the first 7-10 days postoperatively. The aim of this study was to assess the safety and efficacy of a high single dose (9 mg/kg) of ATG-Fresenius S given immediately before revascularization to kidney allograft recipients receiving triple-drug immunosuppression (Neoral, steroids, and CellCept, which was substituted by azathioprine 4 months after transplantation). Seventy-nine adult first cadaveric kidney recipients were included in the study. Patients were randomised to receive ATG or not. There were no differences between the two groups regarding donor factors, degree of sensitisation, number of mismatches, or cold ischemia time. No serious side effects or serious adverse events connected to ATG administration were observed. Five-year follow-up results are shown. Induction therapy with a high single dose of ATG appears to be safe and efficacious in kidney transplantation.

The exclusion of corticosteroids from chronic immunosuppressive regimens could avoid their long-term secondary effects of glucose intolerance, dysmorphism, osteoporosis, and alteration of endogenous adrenal hormone production. Nonetheless, the immunosuppressive effect and safety of steroids are well known; they are easy to prescribe and follow, are inexpensive, and are thus widely used by clinicians. In this study we question the long-term need for steroids in recipients of live-donor renal grafts. Patients and methods: In this retrospective analysis we compared a standard regimen (SR) of MMF (2 g/day), tacrolimus (levels of 10-12 ng/ml), and prednisone vs a Rapid Steroid Elimination Protocol (RSEP) consisting of i.v. Solumedrol (days 0, 1, and 2 only), basiliximab (20 mg on days 0 and 4), MMF (2 g/day), and tacrolimus (levels of 10-12 ng/ml). Study group: RSEP, n=69, 2:1 male/female ratio, mean age 49.9 years. Control group: SR, n=72, 1:1 male/female ratio, mean age 49 years.
The groups were statistically indistinguishable regarding HLA and DR mismatches, ESRD diagnosis, comorbidities, related vs unrelated grafts, and PRA%. Follow-up was greater than one year. Results: At one year, 16 patients in the RSEP group had been placed on chronic steroids: 2 patients had received a pancreas-after-kidney transplant, and 14 had biopsy-proven acute cellular rejection. The RSEP group had 2 deaths, both with functioning grafts, from CVA. One graft was lost from recurrent FSGS. The SR group had 2 deaths, both due to sepsis. One graft was lost due to recurrent FSGS. Conclusions: Both groups of patients, on chronic immunosuppressive regimens with or without steroids, exhibited similar graft and patient survival at one year, with comparable rejection rates and serum creatinine levels. This suggests that after brief induction with i.v. steroids and basiliximab, 77% of the recipients of live-donor renal allografts do not require chronic steroids. Further follow-up will show whether the steroid-free group is susceptible to late (>2 years) rejection episodes and will clarify the effect of steroids (or their absence) on chronic allograft nephropathy.

Background: High-dose maintenance prednisone (>9 mg/day) and cumulative prednisone exposure have been identified as risk factors for increased bone loss after renal transplantation (RT). At our program, the maintenance prednisone dose in new RT recipients was reduced from 10 mg to 5 mg a day in all patients transplanted after January 1, 2002. We report the first-year bone mineral density (BMD) results in RT recipients who received maintenance prednisone at 10 mg a day (historical control) and 5 mg a day. Methods: All patients who received RT from April 2000 (start of the program) to November 30, 2002 and had pre-transplant and one-year BMD (measured by DEXA imaging) were retrospectively reviewed. Results: 54 patients were treated with prednisone 10 mg a day and 51 with prednisone 5 mg a day. The majority of these patients also received tacrolimus and mycophenolate mofetil. The table displays the percent change in BMD from pre-transplant to one year post-transplant measured at the femoral neck and the lumbar spine. There was no significant difference in parathyroid hormone levels pre-transplant or at one year post-transplant between the prednisone groups. Conclusions: Reducing the maintenance prednisone dose from 10 mg to 5 mg a day did not significantly reduce bone loss at one year post-transplant. The number of patients with osteoporosis or osteopenia at one year post-transplant was similar between the groups. Bone loss continues to be a problem after RT on newer immunosuppression protocols.

Statins are associated with increased bone mineral density (BMD) in the general population and in transplant recipients. We examined the effect of statin therapy on BMD in transplant recipients treated with a bisphosphonate (BisP) for established osteopenia or osteoporosis. BMD was measured using a Lunar DPX-L bone mineral densitometry unit (Lunar Corp., Madison, WI) (DEXA). Statins were prescribed for hyperlipidemia, and BisP were prescribed for osteoporosis, defined as a T-score ≤ -2.5 (WHO criteria), and for osteopenia (T ≤ -1.5) based on the initial DEXA, at the treating physician's discretion. The standard immunosuppressive regimen included cyclosporine, CellCept, and steroids, with a steroid taper to a 5 mg a day maintenance dose by 3-6 months post transplant. Oral calcium and vitamin D were administered routinely.
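Both bone studies above rest on two DEXA-derived quantities: the WHO T-score used to define osteoporosis (T ≤ -2.5) and osteopenia, and the percent change in BMD over follow-up. A minimal sketch follows, with hypothetical young-adult reference values; real T-scores use scanner-specific reference norms.

```python
# T-score and percent BMD change; reference mean/SD below are hypothetical.
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score: how many SDs the measured BMD lies from the young-adult mean."""
    return (bmd - young_adult_mean) / young_adult_sd

def percent_change(bmd_pre, bmd_post):
    return 100.0 * (bmd_post - bmd_pre) / bmd_pre

# Example: lumbar spine, hypothetical reference mean 1.18 g/cm2, SD 0.12
print(t_score(0.95, 1.18, 0.12))   # about -1.9 -> osteopenic range
print(percent_change(0.95, 0.91))  # about -4.2% bone loss at one year
```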
Baseline DEXAs were analyzed for the effect of six months or more of statin therapy on initial BMD. Follow-up DEXA scans were analyzed for the effect of BisP with or without statin therapy on lumbar spine (LS) and femoral neck (FN) BMD. Means were compared using a t-test or Kruskal-Wallis test as indicated. Categorical variables were compared with the chi-square or Fisher's exact test. A total of 439 DEXAs in 133 patients were analyzed. The demographic characteristics did not differ among the studied groups. The initial mean BMD of the LS was 1.206 g/cm² in the statin (S+) vs 1.205 g/cm² in the no-statin (S-) group (P=NS), with mean T-scores of -0.1 and -0.2, respectively. The initial mean BMD at the FN was 0.865 g/cm² in (S+) and 0.863 g/cm² in (S-) (P=NS), with mean T-scores of -

Osteopenia, expressed as low BMD, is a frequent complication after kidney transplantation and appears early after the procedure. Most available data come from cross-sectional studies or studies with short-term follow-up. The purpose of the present study was to investigate prospectively the evolution of lumbar BMD in a population of renal transplant recipients on low-dose steroids. Methods: In 65 patients with a functioning graft, 15 on treatment with cyclosporin (CsA) and 50 with tacrolimus, serum biochemical markers of bone metabolism and BMD at the lumbar vertebrae L2-L4 and the femoral neck were prospectively evaluated in at least four serial examinations (at transplant and 1, 2, and 3 years thereafter).

At one-year biopsies, only in one pt, who had borderline subclinical rejection on the 3-month biopsy, did the amount of fibrosis increase from mild to moderate. In the rest of the biopsies, the amount of fibrosis remained unchanged. Conclusion: When using thymoglobulin induction in combination with TAC and MMF, with or without steroids, subclinical rejection is very rare (1%). Protocol biopsies in the modern immunosuppressant era may not be necessary to detect subclinical rejection. However, protocol biopsies may provide valuable information for monitoring fibrosis (CAN) and chronic drug toxicity and may prove useful in directing modifications of immunosuppression.

Clinical trials to assess novel therapies are essential to advance organ transplantation (tx). It is unknown whether participation in clinical trials impacts tx outcomes. Methods: Our cohort comprised 373 unsensitized adults undergoing primary living donor kidney transplantation (LDKT) between 1997-2001, divided into study (SPs) and non-study patients (NSPs). Recipient and donor demographics, tx characteristics, and frequency of clinic visits, readmissions, graft ultrasounds, and biopsies during the 1st post-tx year were compared using Fisher's exact, chi-squared, and Mann-Whitney tests. Patient and graft survival and rejection within 1 yr of tx were determined by Kaplan-Meier analysis. One-yr creatinine (Cr), Cr clearance, and delta-Cr (1-yr minus 6-mo Cr) were compared using the Mann-Whitney test. Results: Nearly 1/3 (122 of 373) of all LDKT recipients were enrolled in studies. Compared to NSPs, SPs tended to be male (p=0.001) and less well matched with their donor (p=0.003) (Table 1). During the 1st post-tx year, SPs had more clinic visits (p=0.02) and readmissions (p=0.01), but did not undergo more graft ultrasounds or biopsies. SPs had higher 1-yr Cr compared to NSPs (p=0.002), but there were no significant differences in Cr clearance (p=0.06) or delta-Cr (p=0.38) (Table 2).
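A hedged sketch of the Mann-Whitney comparison of one-year creatinine between SPs and NSPs described above; the sample values are hypothetical, as the abstract reports only summary statistics.

```python
# Mann-Whitney U comparison of 1-yr creatinine; values are hypothetical.
from scipy.stats import mannwhitneyu

cr_sp  = [1.6, 1.8, 1.5, 2.0, 1.7, 1.9]   # 1-yr creatinine, SPs (mg/dL)
cr_nsp = [1.3, 1.4, 1.2, 1.6, 1.5, 1.3]   # 1-yr creatinine, NSPs (mg/dL)

stat, p = mannwhitneyu(cr_sp, cr_nsp, alternative="two-sided")
print(f"U={stat}, p={p:.3f}")
```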
Finally, patient survival, graft survival, and rejection during the 1st post-tx year were entirely comparable for SPs and NSPs (Table 2). Conclusions: A substantial proportion of our adult LDKT recipients participate in clinical trials. SPs, who frequently receive novel therapies, enjoy outcomes comparable to NSPs receiving full-dose standard immunosuppression. Participation resulted in a modest intensification of post-tx follow-up but did not increase biopsy frequency. These results should encourage tx centers to approach tx candidates vigorously about participation in clinical trials.

Background: The purpose of this study is to determine whether daclizumab (Dac) induction in combination with mycophenolate mofetil (MMF) enables a low-dose cyclosporine (CsA) regimen. Safety, efficacy, and impact on early renal function of this regimen are compared to standard immunosuppression (standard-dose CsA and MMF). Methods: This international, multicenter, prospective, randomized study was conducted in 14 centers in 3 European countries. Patients were randomized into two groups. Standard group: CsA trough levels 150-250 ng/ml with MMF (2 g/day). Dac group: 2 mg/kg Dac first dose, followed by 4 additional doses of 1 mg/kg every 2 weeks, MMF (2 g/day), and low-dose CsA, defined as 50% of the standard trough levels in the individual centers, ranging from 75 to 125 ng/ml. Steroids were tapered identically in both groups. A total of 156 patients were enrolled in the study. Only data from the 121 patients with a follow-up of at least 3 months are presented in this interim analysis. Results: 59 patients were analyzed in the Dac group and 62 in the standard group. Baseline characteristics were similar for both groups. CsA levels in the Dac group at days 7, 28, and 95 were reduced to 53%, 65%, and 68%, respectively, of those in the standard group and thus did not exactly reach the intended 50% reduction. Patient survival at 3 months was similar in both groups, at 98% and 100% in the Dac and standard groups, respectively. After three months of follow-up, 5 grafts had been lost in the standard group versus 1 graft in the Dac group. The incidence of biopsy-proven acute rejection episodes was 7% following Dac induction compared to 27% in the standard group (p=0.0035). Mean creatinine values were 1.5±0.5 mg/dl in the Dac group versus 2.0±1.4 mg/dl in the standard group at week 12. There were no differences in the incidence of either serious adverse events or major infections. Conclusion: Daclizumab induction in combination with MMF is a safe and efficacious regimen in kidney transplantation that allows profound primary sparing of CsA. There is a tendency towards better renal function after 3 months in the Dac group. With respect to graft survival and acute rejection episodes, it is superior to standard-dose CsA in combination with MMF.

To improve the use of older donor kidneys, we participate in the Eurotransplant Senior Program (ESP), where kidneys are allocated locally to older recipients with short cold ischaemia time regardless of HLA compatibility. As decreasing immunity in elderly patients has been reported, we initially used a calcineurin-inhibitor-free immunosuppressive protocol. Because of numerous rejections, we switched to a basiliximab/FK506/steroid-based protocol, with good results. In order to learn more about the immune system of elderly recipients, we compared several immune functions pre and post Tx in ESP patients and in younger patients allocated by the regular allocation program (ETKAS).
The results of 11 ESP patients (mean age 67.4 yr; donors 67.6 yr) were compared with those of 17 ETKAS patients (51.1 yr; donors 34.2 yr). The acute rejection rate and 1-year graft and patient survival were comparable in both groups. We determined the numbers of T, B, and NK cells, as well as monocytic HLA-DR/CD86 expression, by flow cytometry pre-Tx and during follow-up post-Tx. The secretion of IFN-g/IL-4 and TNF-a/IL-1b was measured 24 hr after Con A and LPS stimulation, respectively. HCMV-specific and allospecific IFN-g-producing T cell frequencies were determined by flow cytometry and by ELISPOT assay, respectively. CD4+ T cell counts and monocytic HLA-DR expression were significantly higher in the ESP patients in comparison with ETKAS patients. We found no significant differences between the groups regarding CD8+ T, NK, and B cell counts, or LPS-induced TNF-a/IL-1b production, before transplantation. Both groups also showed comparable Con A-induced IFN-g memory T cell responses and comparable frequencies of donor-specific and CMV-specific IFN-g-producing T cells pre-Tx. In the early phase post-Tx, both groups showed a similar T lymphopenia. The recovery of T cells, particularly CD4+ T cells, as well as the recovery of the Con A-induced IFN-g T cell response post-Tx, was reduced in the ESP patients. LPS-induced TNF-a secretion and monocytic HLA-DR/CD86 expression after Tx were comparable between the groups. Elderly graft recipients do not show signs of "relative immunodeficiency" compared with younger graft recipients pre-Tx. A protocol based on basiliximab/FK506/steroids turned out to be successful in preventing early post-Tx complications. Despite powerful prevention of acute rejection and delayed recovery of T cell function, ESP patients on this immunosuppressive protocol had no enhanced risk of infection. Our data support powerful initial immunosuppression in ESP patients.

In non-heart-beating donor (NHBD) kidney transplants, it is desirable to reduce the calcineurin antagonist dose to avoid deleterious effects on the kidney grafts, which show a high incidence of acute tubular necrosis. The present study was designed to examine whether a regimen based on anti-CD25 monoclonal antibody induction plus mycophenolate mofetil allows the use of low-dose tacrolimus in the immediate post-transplant period without increasing the number of acute rejection episodes. Methods: 177 consecutive NHBD renal transplant recipients were treated as follows: Group I (N=21), cyclosporine (8 mg/kg/day) plus azathioprine plus steroids; Group II (N=65), low-dose cyclosporine (
The study interval began with the GI diagnosis and ended at graft failure with censoring at three-years post-transplant, last expected followup record, and last recorded immunosuppression payment. Associations were estimated using multivariate time varying methods. RESULTS: Outcomes of 3,324 renal transplant recipients with a GI diagnosis were examined. Patients prescribed an MMF dose reduction were associated with an increased hazard of graft failure. The relative hazard of graft failure was 2.2 (P = 0.03) during intervals of dose reduction less than 50% and 2.5 (P = 0.003) during intervals of dose reduction greater than 50%, referenced against a relative hazard of 1.0 which would indicate no association between dose reduction and graft failure. The relative hazard of graft failure increased further to 3.9 (P < 0.0001) during intervals of MMF discontinuation. CONCLUSION: MMF dose reduction and withdrawal following the diagnosis of a GI complication are associated with considerably heightened risk of graft failure in renal transplant recipients. Patients prescribed MMF dose modifications following GI complications should be managed with great care. Poster Board #-Session: P176-II Background: The combination of SRL and TRL has proven to be effective therapy in renal transplantation. However, most published series are limited by small sample size and short term follow-up. We report the largest single center series using this combination with a long term follow-up of at least one year. Patients: A retrospective analysis of all transplants performed between January 2000 and December 2002 that were maintained on such combination therapy was done. Two hundred and seventy patients (176 males and 94 females, mean age 46 years) received deceased (DD; 60%) or living donor (LD; 40%) allografts. Sixty patients (22%) were African-American, and 39 (14%) were recipients of a second or subsequent transplants. The average cold ischemia time for DD grafts was 28:36 hours, and of these grafts, 91 (56%) experienced delayed graft function. Induction therapy was used in 96% of DD recipients vs 54% of LD recipients. Results: Patient and graft survival data are shown in (table I). In the DD recipients, 15 grafts (5.5%) were lost due to primary non-function. The one year acute rejection rate was not significantly different in the DD vs LD groups (14.7% vs 12%). Average serum creatinine levels at one and two years were 2 and 1.9 mg/dl, and 1.6 and 1.9 mg/dl, in the DD and LD recipients, respectively. Various complications (with incidence of) occurred: lymphocele (6%), superficial wound infection (7%), wound dehiscence (6%), deep vein thrombosis (7%), ureteric obstruction/damage (5%), de novo transplant diabetes mellitus (17%), and verious post-transplant malignancies (mainly skin cancers, 8%). Other complications included fungal (n=9), and viral (n=13) infections and hemolytic uremic syndrome (HUS) developed in 3 patients. In 29 patients, SRL, and in 11 others, TRL was discontinued. Purpose: Steroid free immunosuppression regimens are currently used in clinical practice. The purpose of this study is to evaluate the effect of glucocorticoids on immediate renal allograft function after kidney transplantation. Methods: From April 2002 to February 2003, ten patients were enrolled in a steroid free immunosuppressive regimen pilot study (tacrolimus, mycophenolate mofetil, and prolonged course of daclizumab). 
The immediate renal allograft function (daily serum creatinine levels from post-operative days 1 to 6) was compared to that of 12 case-matched control patients from the same time interval (treated with tacrolimus, mycophenolate mofetil, and a standard steroid taper, which consisted of methylprednisolone 125 mg in the OR, then 100 mg, 80 mg, 60 mg, 40 mg, 30 mg, and 20 mg daily, followed by a prednisone taper). Statistical analysis was performed using the Mann-Whitney U test. Results: In the study period, there were 10 cadaveric and 12 living-donor kidney transplants performed. In the cadaveric group, 6 were treated with steroids and 4 were not, and there were 6 patients in each treatment stratum in the living-donor group. The combined groups' serum creatinine (mg/dL) with and without steroid treatment can be seen in Table 1. The serum creatinine was generally lower in the steroid treatment group, reaching statistical significance by the fourth post-operative day and remaining so thereafter. There was only one patient with biopsy-proven acute rejection, on post-operative day 8 in the steroid-free living-donor group. Conclusion: In this study, complete steroid avoidance was associated with a slower decline of the serum creatinine after kidney transplantation. The anti-inflammatory effects of steroids may help minimize ischemia-reperfusion injury. Peri-operative steroid utilization may thus be advantageous over steroid avoidance in that regard.

Background: Recurrent hepatitis C (HCV) after liver transplantation may accelerate hepatic fibrosis and lead to graft loss and patient death. This accelerated natural history is likely related to immunosuppression, particularly OKT3 and Solumedrol pulse therapy. Recent studies have suggested faster disease progression and inferior survival in more recent years, partly because of stronger immunosuppression. Since 1988, we have had three distinct eras of immunosuppression, with marked differences in the utilization of OKT3. Specific Aims: 1. Compare patient survival for patients transplanted for HCV during three distinct eras of immunosuppression in our center. 2. Determine whether the prevalence of OKT3 usage during these eras adversely impacts survival. Methods: Using data from our center's Liver Transplant Database, we performed a retrospective cohort study of 261 HCV Ab-positive patients out of a total of 799 transplants between the years 1988-2003. Three eras of immunosuppression were identified: I, cyclosporine (CSA)-based with late steroid withdrawal (SW) (after 1 yr), from 11/1/88 to 8/8/95; II, very early SW (14 d) with sequential assignment to either CSA or tacrolimus (TAC), with or without mycophenolate mofetil (MMF), from 8/9/95 until 1/28/00; and III, very early SW (3 d), sirolimus (RAPA)-based, with sequential assignment to either CSA or TAC, from 1/29/00 to 4/1/03. Patient and graft survival were estimated using the Kaplan-Meier method, with comparison between groups using the log-rank test. Results: Demographics of patients with HCV, including age, donor age, BMI, race, and sex, were similar between eras. Patient survival was similar between eras.

Hepatitis C-associated cirrhosis is the most common indication for liver transplantation. Recurrence following transplantation is universal. Recently published data suggest that HCV recurs earlier and is more severe in LDLT recipients than in cadaveric (CAD) recipients. AIM: To compare HCV recurrence rates in adult right-lobe LDLT vs. CAD recipients.
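The era comparison above uses Kaplan-Meier survival estimates with a log-rank test. A hedged sketch follows, with hypothetical durations and event indicators standing in for two of the eras.

```python
# Log-rank comparison of two survival curves; data are hypothetical.
from lifelines.statistics import logrank_test

# Survival time in months and event indicator (1 = died) for two eras
t_era1 = [12, 30, 45, 60, 72, 90];  e_era1 = [1, 0, 1, 0, 1, 0]
t_era2 = [10, 25, 50, 65, 80, 95];  e_era2 = [0, 1, 0, 1, 0, 0]

res = logrank_test(t_era1, t_era2,
                   event_observed_A=e_era1, event_observed_B=e_era2)
print(f"chi2={res.test_statistic:.2f}, p={res.p_value:.3f}")
```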
METHODS: Retrospective analysis of HCV-infected patients who underwent liver transplantation from 9/99 to 8/03. 71 patients underwent LDLT in this period, 39 with HCV-related cirrhosis. Patients enrolled in HCV-related clinical trials and those who died within 30 days of surgery were excluded. Data from 33 LDLT and 52 CAD recipients transplanted during the same period were analyzed. Groups were matched for age and sex. Immunosuppression consisted of low-dose steroids and tacrolimus. The diagnosis of recurrent HCV was based on histologic findings. Biopsies were performed for clinical indication (increased AST/ALT/GGT) and not on a protocol basis. RESULTS: Average MELD scores were significantly lower in LDLT recipients. Mean follow-up was 20.6 months in the LDLT and 19.3 months in the CAD group. One or more rejection episodes were observed in 10 (30.3%) patients in the LDLT group and 10 (19.2%) in the CAD group (p=0.20). Rejections were treated with steroids as per our protocol. Histologic recurrence of HCV occurred in 19 (57.6%) patients in the LDLT group and 32 (61.5%) of the CAD recipients (p=0.72). Average time to recurrence was 163 days in the LDLT group compared to 186 days in the CAD group (p=0.70). In the LDLT group, biopsies in 8 (24.2%) patients showed recurrence with progression to portal fibrosis, while 10 (19.2%) patients in the CAD group demonstrated fibrosis (p=0.58). Two LDLT recipients developed late fibrosing cholestatic hepatitis, both at 14 months. One patient in the CAD group developed early fibrosing cholestatic hepatitis. Recurrent hepatitis C led to graft loss in 1 (3%) LDLT recipient and 4 (7.7%) CAD recipients (p=0.37). CONCLUSION: In our study, there was no significant difference in the number of HCV recurrences or the time to recurrence between LDLT and CAD recipients. Although it appears that portal fibrosis occurs more often in the LDLT group, the difference was not statistically significant. Graft loss due to hepatitis was more common in cadaveric recipients. While this study demonstrates that HCV recurrence rates are similar in LDLT and CAD recipients, a large prospective trial needs to be undertaken to study this problem.

Orthotopic liver transplantation (OLT) for end-stage hepatitis C virus (HCV) infection is commonly complicated by recurrence of HCV, and a significant number of patients will develop severe graft hepatitis after OLT. However, the relevance of HLA matching to the recurrence of HCV is still under discussion. In this study we investigated the effect of HLA compatibilities on outcome and fibrosis progression in HCV-positive patients after OLT. In a retrospective analysis, 165 liver transplants in HCV-positive patients with complete donor/recipient HLA typing were reviewed for recurrence of HCV and outcome after OLT. Follow-up ranged from 1 to 158 months (median 63.3 months). Immunosuppression consisted of either CsA-based quadruple induction therapy, including ATG or an IL-2-receptor antagonist, or tacrolimus. Protocol liver biopsies were performed after 1, 3, 5, 7, and 10 years and staged according to the METAVIR scoring system. The overall 1-, 5-, and 10-year graft survival figures were 81.8%, 69.1%, and 62%, respectively. There was no correlation between the number of HLA compatibilities and graft survival in the study population. The number of rejection episodes was significantly increased in patients with fewer HLA compatibilities (p<0.05).
In contrast, fibrosis progression was significantly faster in patients with 1 or more HLA compatibilities when compared to patients with no HLA compatibility. In conclusion, HLA matching does not influence graft survival in patients after OLT for end-stage HCV infection. However, despite fewer rejection episodes, fibrosis progression within the first year after OLT was increased in patients with more HLA compatibilities.

Cirrhosis due to hepatitis C virus (HCV) is a leading indication for orthotopic liver transplantation (OLT). Following OLT for HCV, as many as 80 to 90% of all allografts become re-infected. As many transplant programs are utilizing extended criteria donor (ECD) livers, the impact of this practice on recurrent HCV infection remains unknown. PURPOSE: To analyze rates of HCV re-infection in OLT recipients receiving ECD livers compared to a control group receiving standard criteria donor (SCD) liver allografts. METHODS: All OLTs performed between Jan 2002 and Oct 2003 at our center were categorized as having received either an ECD or an SCD liver. ECD grafts were defined as having any one or more of the following: donor age > 60 yrs, biopsy exhibiting >40% macrovesicular and/or >75% microvesicular steatosis, any donor down time (±CPR), requirement of 2 or more pressors within 12 hours of procurement, or liver function studies > 5 x the upper limit of normal. All recipients underwent liver biopsy at 90 days, or earlier if there was an appreciable rise in baseline liver function tests (LFTs). The two groups (ECD and SCD liver recipients) were similar in regards to MELD score (match and lab), gender, age, and graft ischemic (total and warm) times. Ninety OLTs were performed during the study period: 42 ECD livers and 48 SCD livers were transplanted. HCV-related cirrhosis accounted for 24/42 (57.1%) of the OLTs in the ECD group and 20/48 (41.7%) of the OLTs in the SCD group (p=0.14). Biopsy-proven recurrent HCV infection was noted in 21 (23.3%) of all allografts. The ECD group was found to have a 33.3% recurrence rate (8/24), while in the SCD group a recurrence rate of 65% (13/20) was noted (p=0.04). No significant differences were observed between the groups in the LFTs of patients requiring biopsies. CONCLUSION: The aggressive use of extended criteria donor hepatic allografts did not lead to an increased rate of HCV re-infection in the recipient allograft. Indeed, there was a statistically significant increase in HCV re-infection in those recipients receiving standard criteria donor livers. This suggests that perhaps a "stressed donor allograft" is protected from viral re-infection.

Introduction: Hepatitis C viral infection (HCV) is the commonest indication for liver transplantation (LTx). Recurrence of HCV is uniform after transplantation. The combination of pegylated interferon (peg-INF) with ribavirin, which is not without side effects, has been used as anti-HCV treatment. It has been recommended that patients with renal dysfunction receive lower doses of ribavirin in order to reduce the incidence of hemolysis. The aim of the present study was to examine the role of measuring plasma and whole-blood ribavirin concentrations in determining the relationship to hemolysis, renal dysfunction, and treatment response. Thirty-four leftover blood samples drawn for tacrolimus levels from 23 post-LTx patients on ribavirin were used to estimate ribavirin concentrations in plasma (all samples) and whole blood (17 samples) by HPLC.
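A sketch of the plasma versus whole-blood correlation assessment described in the ribavirin study above; the paired concentrations are hypothetical, since the abstract reports only means.

```python
# Pearson correlation of paired ribavirin levels; data are hypothetical.
from scipy.stats import pearsonr

plasma = [12.0, 25.3, 48.1, 60.2, 88.7, 35.4, 70.9]   # per-sample, plasma
blood  = [10.5, 22.8, 44.0, 55.1, 80.2, 33.0, 65.5]   # same units, whole blood

r, p = pearsonr(plasma, blood)
print(f"r={r:.3f}, p={p:.4f}")
```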
The patients' clinical histories and concurrent medications were evaluated. Results: There was wide variation in ribavirin concentration, ranging from 1.8 µ/l to 122.1 µ/l, with a mean plasma concentration of 48.4±10.2 µ/l. There was a close correlation between the plasma and whole-blood ribavirin levels (mean plasma level 42.7±21.8 µ/l; mean blood level 37.5±17.3 µ/l). No correlation was found with response rate (biochemical, histological, or HCV viral load), hemolysis, renal function, or length of ribavirin treatment on subsequent estimations. However, concentrations were higher when subjects were on antiretroviral agents (5 samples, mean 87.3±39.4 µ/l) and lower when patients were on proton pump inhibitors (11 samples, mean 29.8±20.3 µ/l). Despite the fact that ribavirin is highly bound to red blood cells, concentrations in whole blood and plasma were similar. Higher concentrations were observed with concurrent use of antiretroviral agents and lower concentrations with proton pump inhibitors. A relationship between concentration and toxicity could not be established in this pilot study. It appears that ribavirin as the parent compound may not be responsible for the therapeutic effect or hemolysis; it is possible that the phosphorylated metabolite of ribavirin is the determining factor for response rate and toxicity. Further studies are needed to elucidate the role of ribavirin with peg-INF in recurrent HCV after liver transplantation.

Purpose: To assess the incidence of HCVR 5 years (y) post OLT in patients (pt) treated (Rx) with MMF for different periods. Methods: We have previously shown that HCV-OLT patients have a low incidence of HCVR at 1 and 2 years of follow-up (f/up) when Rx with MMF, in a dose- and time-dependent fashion. However, it is not known how long HCV-OLT patients should remain on MMF, since its availability on the market is relatively recent. MMF was used as a rescue agent for steroid-resistant acute rejection (SR-ACR) or as a renal-sparing drug in cases of CsA nephrotoxicity. HCV-OLT 1993-1997. Conclusions: Patients kept on MMF at all times presented less HCVR and less HCVR progression at 5 y, and their graft function (not shown) was similar to that of controls who did not have SR-ACR or nephrotoxicity. Patients on MMF for <3 years had higher incidences of HCVR and HCVR progression. We conclude from this limited initial experience that MMF is an important component of HCV-OLT IS and that patients should remain on it long-term.

Poster Board #-Session: P185-II Introduction: Cholestatic hepatitis C (CHC), a severe variant of recurrent HCV following transplantation, is defined by a serum total bilirubin greater than 10 mg/dl, a liver biopsy consistent with portal expansion and ductular proliferation, and the absence of extra-hepatic biliary tract obstruction. However, HCV-associated cholestasis with similar histologic and laboratory findings may occur in the presence of extra-hepatic biliary obstruction. The incidence and natural history of this phenomenon remain incompletely described, particularly when comparing deceased donor (DD) and living donor (LD) liver transplant recipients. Methods: A retrospective analysis of all 102 (71 DD and 31 LD) HCV-infected patients transplanted at our center between 7/99 and 6/03 was performed. The incidence of extra-hepatic biliary obstruction (EBO), defined as any anastomotic or ampullary stricture requiring an ERCP- or PTC-placed stent or surgical revision, was determined, and the evolution of HCV in these patients was assessed.
HCV-associated cholestasis was defined as a total bilirubin greater than 10 mg/dl with histologic findings similar to CHC. Results: The overall incidence of EBO was 23%: 19% in DD and 32% in LD (p = NS). Among patients with EBO, 42% developed HCV-associated cholestasis and 58% did not (p = ns). There was no difference in the rate of HCV-associated cholestasis with EBO when comparing LD to DD. In patients who developed HCV with cholestasis in the setting of EBO, outcomes were poor: graft failure and death occurred in 60%, 30% had severe cholestasis despite therapy, and 10% recovered following biliary stenting and therapy with interferon and ribavirin. Conclusions: 1) HCV recurrence with cholestasis may occur in patients with extrahepatic biliary obstruction post transplant. 2) This syndrome has identical laboratory findings and similar histopathology to cholestatic hepatitis C, a syndrome which occurs in the absence of biliary obstruction. 3) HCV-associated cholestasis in the setting of extrahepatic biliary obstruction portends a poor prognosis, despite attempts to relieve the biliary obstruction and treat HCV.

Hepatitis C virus (HCV) reinfection is the rule following orthotopic liver transplantation (OLT) for HCV-related cirrhosis. In this retrospective study we compared the efficacy, safety, and overt HCV recurrence rate following induction therapy using either ATG or anti-CD25 monoclonal antibodies (basiliximab or daclizumab) in combination with steroids and tacrolimus. We included 31 consecutive OLT patients; those who died before day 30 were excluded from the study. They were transplanted between 01-99 and 12-02. The patients were followed up until day 180. Overt HCV recurrence was defined as an increase in liver enzymes with histological evidence of HCV-related lesions, in the absence of acute rejection. There were 20 men and 11 women, with a mean age of 57±6 years. All were HCV RNA-positive at the time of OLT; genotype 1 (1b) was found in 74% (64.5%) of patients. The first 16 patients received ATG induction (Group I) (1 mg/kg/d for 3 consecutive days, then adapted to the CD2+ T-lymphocyte count [to be kept below 50/mm3] and to tacrolimus trough levels). The following 15 patients (Group II) received induction therapy with either basiliximab (20 mg on days 1 and 4) or daclizumab (2 mg/kg on day 1 and 1 mg/kg on day 8). Tacrolimus was started on average on day 1. Steroids were started intraoperatively. The rate of acute rejection was similar in Group I (37.5%) and Group II (20%) (p=ns). The rates of bacterial (50% in Group I vs. 40% in Group II, p=ns), viral, i.e. cytomegalovirus (25% in Group I vs. 33% in Group II, p=ns), and fungal (18.75% in Group I vs. 26.6% in Group II, p=ns) infections were similar in both groups. Renal function as well as hematological parameters were similar in both groups during the study period. Liver enzyme levels, i.e. aspartate and alanine aminotransferase, gamma-glutamyl transpeptidase, and bilirubin, were similar in both groups from day 8 until day 180. Overt HCV recurrence occurred in 56.25% in Group I and in 80% in Group II (p = ns). HCV RNA viral load was not statistically different between the 2 groups during follow-up; however, there was an increase from baseline values.
We conclude that induction therapy with ATG in HCV(+)/RNA(+) OLT recipients is effective and safe compared with induction with anti-CD25 antibodies, particularly with respect to HCV recurrence and overall infections.

HCV recurrence after liver transplantation is a serious complication. We investigated whether steroid-free immunosuppression with Thymoglobulin® induction reduces HCV recurrence following liver transplantation (OLT). METHODS: We enrolled twelve patients between 11/02 and 10/03; 36 prior OLT recipients with HCV cirrhosis served as controls. The immunosuppressive protocol consisted of Thymoglobulin® 1.5 mg/kg/d on days 0-3, azathioprine (AZT) 100 mg/d for 3 months, and cyclosporine (CyA) starting on day 4 (to maintain levels of 200-300 ng/ml). RESULTS: After a mean follow-up of 8 months, nine patients are currently on a single-drug regimen (CyA or tacrolimus). Six patients had abnormal liver function tests (LFT) due to: alcoholic hepatitis; HCV recurrence; biliary complications (2 pts); rejection (now on AZT and tacrolimus); and cholestasis of unknown origin (HCV or rejection, now on tacrolimus only). All but the first patient had biopsy-proven diagnoses. The first three patients have normalized their LFTs; the last three are improving. Six patients have undetectable HCV-RNA by PCR after completing 3-6 months of anti-HCV therapy (pegylated interferon and ribavirin), and five have moderate to high HCV-RNA and are not tolerating anti-HCV therapy due to severe side effects. One patient is in the early post-operative period and has not started anti-HCV therapy yet. In the steroid-free group, the incidence of high viral load (HCV RNA > 100,000 IU/mL) was lower than in the control group [42% vs. 85%] (Fig. A), and the incidence of rejection- or HCV-related liver dysfunction episodes [25% vs. 50%] was also lower (Fig. B). Steroid-free immunosuppression with Thymoglobulin® induction in OLT for HCV cirrhosis leads to a low incidence of rejection, lower viral replication, and a decreased HCV recurrence rate in the allografts.

The diagnosis of acute rejection (AR) versus recurrent hepatitis C (R-HCV) after liver transplantation (LTx) is important, since treatment of either may result in reciprocal consequences. We aimed to assess the reliability of liver biopsy (bx) in discriminating AR from R-HCV. Methods: Two LTx pathologists from high-volume centers reviewed 38 blinded bxs obtained from HCV+ recipients between 4 wks and 6 mos after LTx: 20 bxs were from 1993-94 (Era 1) and 18 were from 2000-01 (Era 2). Historical diagnoses were mild or moderate AR, R-HCV, or both. Each pathologist scored their present-day readings for AR and R-HCV: 1=Unlikely; 2=Possible; 3=Probable; 4=Likely. An experienced clinician scored both present-day and historical readings. Using the clinician's scores, interrater agreement and agreement with historical diagnoses were measured using weighted kappa (wK) statistics. Frequencies of exact (ExAg) and relative (RelAg) agreement (difference = 0 or 1) were determined. Results: There were 33 bxs for cause and 5 protocol bxs. The clinician was an excellent interpreter of the present-day readings of both pathologists (wK P1=0.80; wK P2=0.72) (Table 1a). Interrater agreement was modest for AR (wK=0.18) and good for R-HCV (wK=0.33) (Table 1b). Comparison of present-day and Era 1 readings showed poor agreement for both AR (wK=0.02) and R-HCV (wK=-0.05). Comparison of present-day and Era 2 readings showed poor agreement for AR (wK=-0.03) and modest agreement for R-HCV (wK=0.18) (Table 1c).
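The weighted kappa (wK) statistics reported above can be computed directly from paired ordinal readings. A sketch on the abstract's 1-4 scoring scale follows; the paired readings are hypothetical, not the study's.

```python
# Linearly weighted kappa between two raters' ordinal scores (1-4 scale).
from sklearn.metrics import cohen_kappa_score

pathologist_1 = [1, 2, 3, 4, 2, 3, 1, 4, 3, 2]   # AR scores, reader 1
pathologist_2 = [1, 3, 3, 4, 1, 2, 2, 4, 3, 3]   # AR scores, reader 2

wk = cohen_kappa_score(pathologist_1, pathologist_2, weights="linear")
print(f"weighted kappa = {wk:.2f}")
```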
Conclusions: Today, 2 LTx pathologists reading early bxs after LTx for HCV exhibit fair agreement on the diagnoses of AR and R-HCV. However, their present-day readings correlate poorly with historical readings, suggesting that the interpretation of histologic findings diagnostic for AR and R-HCV has evolved over time. Liver bx is not a reliable mechanism to discriminate AR from R-HCV.

George J. Netto, the Hepatitis C Three Trial Group. Pathology, Baylor University Medical Ctr, Dallas, TX. Purpose: The primary objectives of the Hepatitis C Three trial are to compare the rates of hepatitis C (HCV) recurrence, acute cellular rejection (ACR), and treatment failure (death, graft loss, or more than one dose of corticosteroid for presumptive rejection) among three immunosuppressive treatment regimens. Design: The trial is an open-label, prospective, randomized multicenter study involving adult patients receiving OLTX for end-stage HCV liver disease, randomized to three Rx arms. Arm 1: tacrolimus (TAC) and steroids (Pred); Arm 2: TAC, mycophenolate mofetil (MMF), and Pred; Arm 3: TAC, MMF, and daclizumab (steroid-free arm). A total of 312 pts from 17 institutions will be randomized at a ratio of 1:1:2. Per protocol, liver biopsies are obtained at days 1, 90, 365, and 730. All biopsies are reviewed by a central pathologist. HCV recurrence is evaluated according to the Batts and Ludwig schema (Batts et al., Am J Surg Pathol 1995); ACR is evaluated using the 1997 Banff schema (Demetris et al., Hepatology 1997, p. 658). Primary composite endpoints are defined as recurrent HCV of stage ≥2 during the first year or grade ≥3 at any point post transplantation, and/or ACR of Banff global grade ≥2 with an RAI score ≥4. The current abstract summarizes the histologic findings of 150 biopsies obtained from 74 pts with a follow-up range of 6-380 days post OLTX. All hepatitis C recurrences and rejections were biopsy-diagnosed. This submission is based on the November data safety monitoring board data, so as not to compromise the trial. HCV recurrence: Histologic HCV recurrence as defined above was established in 20/74 pts (27%). 13 of the HCV recurrences were diagnosed on or before the day-90 post-OLTX biopsy (18% three-month recurrence rate). The shortest and longest durations to HCV recurrence were 37 and 260 days post OLTX, respectively. ACR: Histologic ACR as defined above was established in 6/74 pts (8%). Only one pt had severe ACR (Banff global grade 3). The RAI score range was 4-7. The earliest and latest ACR occurred on days 3 and 48, respectively. Conclusions: At three months post OLTX, an 18% HCV recurrence rate and an 8% ACR incidence were histologically demonstrated in the 74 pts with complete data and histology reported to the study center in the Hepatitis C Three trial. Accrual is expected to be complete by March 2004. An updated report on histologic findings in all pts completing three months of follow-up will be presented.

Valganciclovir (VGC) has not been approved for CMV prophylaxis in liver transplant recipients. The PV16000 trial compared oral ganciclovir to valganciclovir for primary CMV prophylaxis and showed that liver transplant patients who took valganciclovir had a statistically significantly increased incidence of tissue-invasive CMV disease in comparison to oral ganciclovir. We sought to determine the safety and efficacy of VGC for prophylaxis in our liver transplant patients. METHODS: We performed a chart review of all adult liver transplant recipients from October 2001 to June 2003.
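The Hepatitis C Three trial's primary composite endpoint, as stated above, is mechanical enough to express as a predicate. This sketch simply restates those published criteria in code; the field names are assumptions for illustration and are not the trial's case-report-form fields.

```python
# Composite endpoint per the abstract: recurrent HCV of stage >= 2 in the
# first year or grade >= 3 at any point, and/or ACR of Banff global
# grade >= 2 with RAI score >= 4. Argument names are hypothetical.
def meets_composite_endpoint(hcv_stage, hcv_grade, days_post_oltx,
                             banff_global_grade, rai_score):
    """True if a biopsy meets the trial's primary composite endpoint."""
    hcv_recurrence = (hcv_stage >= 2 and days_post_oltx <= 365) or hcv_grade >= 3
    acr = banff_global_grade >= 2 and rai_score >= 4
    return hcv_recurrence or acr

# Example: stage-2 recurrence at day 90 qualifies even without ACR
print(meets_composite_endpoint(hcv_stage=2, hcv_grade=1, days_post_oltx=90,
                               banff_global_grade=1, rai_score=0))  # True
```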
Patients with D+/R- serostatus, antilymphocyte-antibody-treated acute rejection, plasmapheresis, or other clinical justification were classified as primary prophylaxis. Secondary prophylaxis patients were treated with VGC after a documented CMV infection. CMV infection was defined as CMV pp65 positivity with symptoms of infection or a biopsy positive for CMV. RESULTS: 93 patients were assessed; 24 were identified as receiving primary (n=18) or secondary (n=6) prophylaxis. Primary prophylaxis included patients with D+/R- serostatus (n=11), antilymphocyte-antibody-treated acute rejection (n=3), plasmapheresis (n=2), or clinical judgement (n=2). Three patients had a temporary VGC discontinuation (6±1.5 days) for thrombocytopenia or leukopenia. An allergic reaction, D-/R- serostatus, and physician discretion resulted in the three permanent discontinuations. The only episode of tissue-invasive disease proved to be a ganciclovir-resistant strain, probably due to subtherapeutic dosing. CONCLUSION: VGC appears safe and effective for primary and secondary CMV prophylaxis in liver transplant patients with appropriate dosing.

Background: Cytomegalovirus infection is common after solid organ transplantation, occurring in 25-85% of patients depending on prophylactic strategies and immunosuppressive protocols. Valganciclovir (VG) is a relatively new antiviral medication that can be administered once daily and is indicated in cardiac, pancreas, and renal transplantation. Little data are available regarding its effectiveness after liver transplantation (LT). Methods: We retrospectively reviewed the charts of all patients undergoing LT from 11/2001 to 9/2003. Variables included type of immunosuppression, incidence of rejection, donor and recipient CMV status, time to development of CMV, and diagnosis. Prophylaxis against CMV included oral VG (900 mg once daily) for 3 months, followed by acyclovir (200 mg twice daily). Symptomatic CMV infection was diagnosed by the pp65 antigen assay. Immunosuppression consisted of tacrolimus or cyclosporine, MMF, and steroids. Approximately 50% of the patients also received daclizumab (1-2 mg/kg intraoperatively and on post-operative day 4). Results: A total of 71 LTs were performed. 21 LTs were excluded from analysis because of death or re-LT within 60 days (n=14), non-use of VG (n=4), or lack of known pre-transplant CMV status (n=3). A total of 50 LTs in 49 patients were analyzed. Seventy percent of all patients were CMV+ prior to LT. The overall incidence of CMV infection was 14% (n=7) and is shown by donor/recipient CMV status in the table below.

Group: CMV(+) n (%) / CMV(-) n (%)
D-/R-: 1 (25) / 3 (75)
D+/R-: 3 (27) / 8 (73)
D-/R+: 2 (17) / 10 (83)
D+/R+: 1 (4) / 22 (96)

Median time from LT to development of CMV was 146 days. At the time of diagnosis, only one of the seven patients was on VG; the others were on acyclovir. Treatment of CMV was with VG (900 mg orally, twice daily) (n=4), intravenous ganciclovir (n=2), or a decrease in steroids (n=1). Patients who experienced rejection had greater than twice the incidence of CMV (p=0.22). Daclizumab did not influence CMV infection. Conclusion: Once-daily oral VG is effective in the prevention of CMV infection after LT and leads to rates of post-LT CMV similar to published series. The low rate of CMV infection post-LT in pre-operative CMV+ patients suggests that eliminating prophylaxis in this group may be safe.
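The serostatus table above can be tabulated directly; this sketch uses the abstract's own counts and reproduces the overall 7/50 (14%) incidence.

```python
# Incidence of CMV infection by donor/recipient serostatus (counts from the
# abstract's table).
import pandas as pd

df = pd.DataFrame({
    "group":   ["D-/R-", "D+/R-", "D-/R+", "D+/R+"],
    "cmv_pos": [1, 3, 2, 1],
    "cmv_neg": [3, 8, 10, 22],
})
df["n"] = df["cmv_pos"] + df["cmv_neg"]
df["incidence_%"] = (100 * df["cmv_pos"] / df["n"]).round(0)
print(df)
print(f"overall: {df.cmv_pos.sum()}/{df.n.sum()} "
      f"= {100 * df.cmv_pos.sum() / df.n.sum():.0f}%")
```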
CONCLUSION: Although previous outcomes in VRE-positive liver transplant recipients were devastating, the recent advent of anti-VRE antibiotics has improved mortality remarkably. The use of these new antibiotics will change the future approach to liver transplantation in VRE-positive patients.

Poster Board #-Session: P194-II Postoperative bacterial infections continue to be a significant cause of morbidity and mortality after solid organ transplantation. There have been few studies on surgical prophylaxis in liver transplantation, and no clinical standards have been developed. Herein, we present short-term follow-up for 96 liver recipients transplanted between January 2000 and October 2002. The original disease of the liver transplant recipients was primarily hepatitis C virus (n=56, 58%) and primary sclerosing cholangitis (n=9, 9%). Immunosuppression consisted of steroid induction (n=96, 100%) with calcineurin inhibitor-based (n=89, 93%) or sirolimus-based (n=7, 7%) maintenance immunosuppression. Nineteen (20%) patients received daclizumab for immunosuppression during renal insufficiency. Eighty-one (84%) had cadaveric donors and 15 (15%) had living donors. Seventy-four (77.1%) received at least one dose of cefoxitin, and eight (8.3%) patients received vancomycin alone intraoperatively. The average duration of antibiotic prophylaxis was 4±1.6 days, with a median of 5 days. Results: Overall patient and graft survival were 88 (92%) and 90 (93%), respectively. Bacterial infection was identified in twenty-nine (30%) patients, and six (20%) patients died as a result of infection. Bacterial infections developed within the first 5 postoperative days in 14 (48%) patients, after 5 postoperative days in 19 (66%), and in both time periods in 5 (17.2%). The most common infections were pneumonia (19; 19.8%) and peritonitis (10; 10.4%). The most commonly isolated organisms were Enterobacter (10.7%), Pseudomonas (10.7%), and Enterococcus (10.7%). Overall, eleven (11.5%) patients returned to the hospital with an infection within the first postoperative month. Factors associated with bacterial infection were hospitalization prior to transplant (31%), length of time on the ventilator (225±267 vs. 23±15 hrs), MELD score (21±12 vs. 15±8.3), and ICU stay (224±267 vs. 85±60 hrs). All patients who were retransplanted developed infection. Post-transplant bacterial infection is still a major problem; early infections are related to prior hospitalization, duration of mechanical ventilation, ICU length of stay, and living-related transplantation, while overall, MELD and retransplantation are associated with postoperative infections. Our study suggests that patients with these factors need broader antimicrobial coverage for surgical prophylaxis.

Poster Board #-Session: P195-II This study investigates the impact of lamivudine started before liver transplantation, followed by combined lamivudine and specific immunoglobulins post-transplantation, on the risk of HBV recurrence. Methods: 35 consecutive recipients with HBV-related chronic liver disease and positive serum HBV-DNA were studied. Lamivudine (100 mg/day) was administered for a mean of 12 months (range 1-32) before transplantation. Patients were actively listed once serum HBV-DNA was negative by standard assay. At transplantation, serum HBV-DNA was detected using a highly sensitive in-house nested polymerase chain reaction (PCR). Lamivudine-resistant mutations at the YMDD motif of the HBV P gene were determined by direct sequencing of PCR-positive products.
After transplantation, lamivudine was continued and anti-HBV immunoprophylaxis was added, aiming at anti-HBs trough levels of at least 200 U/l. The mean post-LT follow-up was 37 months (range 6-80). Results: At the time of transplantation, despite DNA negativity by standard assay, 29 patients (82.8%) were found HBV-DNA positive by in-house PCR, and 12 of them (46%) were harbouring YMDD mutations. Notably, none of these 12 patients developed a clinical HBV recurrence. Overall, three patients (two with YMDD mutations at transplantation) died of non-HBV-related complications (8, 25 and 28 months after transplantation), and the 2-year actuarial survival rate was 87%. In conclusion, combination prophylaxis is effective in preventing clinical recurrence in HBV-viremic patients undergoing liver replacement in whom HBV-DNA could be reduced before transplantation by lamivudine, even in the presence of genotypic YMDD mutations pre-transplant.

Introduction: Hepatitis C (HCV)-associated graft injury following liver transplantation involves a complex interaction between donor, recipient, and viral factors. However, the severity of histologic injury in the liver explant has not been adequately assessed as a factor associated with histologic recurrence of HCV following transplant. The innate immunologic response to HCV in the recipient, manifested as pre-transplant histologic injury, may influence post-transplantation histologic recurrence. Methods: All 103 HCV-infected patients (72 deceased donor (DD) and 31 living donor (LD) recipients) transplanted at our institution from 7/99-6/03 were included in this analysis. Histologic injury due to HCV following transplantation, including the incidence of and time to recurrence, was assessed and correlated with the degree (grade) of histologic injury in the liver explanted at the time of transplant. Elevated serum transaminases, positive HCV RNA, and a liver biopsy consistent with histologic evidence of HCV defined histologic recurrence. Grade of inflammation (grades 1-4) and stage of fibrosis (1-4) were assessed using a modification of Scheuer's system. An expert hepatopathologist (JL) reviewed all liver biopsies. Results: The overall rate of histologic recurrence of HCV was 78%; 70% in DD and 87% in LD (p = 0.008). The overall time to histologic recurrence of HCV was 18.4 weeks: 17.26 in DD and 19.17 in LD (p = NS). The incidence of histologic recurrence of HCV stratified by degree of inflammation (grade) in the liver explanted at the time of transplantation was not different when comparing Grade 1 (G1) to G2 (66% vs 70%). However, 93% of G3 developed histologic recurrence of HCV (p = 0.027, G3 vs G1 and G2). When comparing DD to LD, there were no differences in time to recurrence or in incidence of recurrence stratified by grade of inflammation on explant. Conclusions: 1) Histologic evidence of recurrent hepatitis C following transplantation was common in this patient population. 2) The severity of necroinflammatory activity (grade of inflammation in the explant) prior to transplantation was positively correlated with the incidence of post-transplantation histologic recurrence. 3) Factors which may explain this phenomenon, including the innate immunologic response to HCV and virus-specific factors (including quasispecies), require further study.

[...] and fibrosis scores (0.9 vs 0.9) and the proportion with advanced fibrosis (15% vs 18%) were similar.
The rate of fibrosis progression was higher in those following RT compared with those with ESRD pre-RT (0.057 vs 0.055 fibrosis units/yr), with an increased rate of 0.0056 fibrosis units/yr following RT, suggesting that RT increases the rate of fibrosis progression. In support of this, in the 5 patients with paired biopsies (mean duration to LBx after RT was 36 months), total HAI (5 vs 9), liver inflammation (3.8 vs 7.0), and fibrosis scores (0.9 vs 1.2) all increased, with a rate of fibrosis progression of 0.1 units/yr. Conclusions: Worsening inflammation and progressive fibrosis are observed in patients with HCV and ESRD who undergo RT. The effect of this accelerated fibrosis on the natural history and long-term outcome of HCV prior to and following RT requires further study.

Background: Anti-donor alloreactivity, defined as the number and phenotype of alloreactive precursors in the recipient, can be used to monitor rejection or to guide withdrawal of immunosuppression. Mixed lymphocyte reaction using the intracellular fluorescent dye 5,6-carboxyfluorescein diacetate succinimidyl ester (CFSE) labeling technique (CFSE-MLR) enables determination of the number of cells proliferating in response to allogeneic stimulation and simultaneous determination of the phenotype of proliferating cells by multiparameter FCM analysis. We investigated the clinical usefulness of CFSE-MLR as a reliable immunological monitoring tool after living-donor liver transplantation (LDLT). Methods: Eighteen patients undergoing LDLT were enrolled in this study. CFSE-MLR assays were performed at regular intervals as follows: CFSE-labeled peripheral blood mononuclear cells (PBMC) from recipients were used as responders. Irradiated autologous, donor, and third-party PBMC were used as stimulators. After coculture, the responder cells were stained with PE-conjugated CD4 or CD8 mAbs together with APC-conjugated CD25 mAb. To determine the stimulation index, the reactive precursor frequency, and CD25 expression in the CD4+ and CD8+ T cell subsets, the cells were analyzed by FCM. Results: One of the 18 patients had an episode of rejection within three months after LDLT. Before a clinical diagnosis of rejection was made, more vigorous proliferation of CD8+ T cells was observed in the anti-donor MLR compared with the anti-third-party MLR. The CD8+ T cells responding to donor stimulators increased their expression of CD25 in parallel with proliferation. In two patients, a mild increase in transaminases, which hindered the reduction of immunosuppressants, sometimes occurred. Such episodes were always associated with more marked proliferation of CD4+ T cells in the anti-donor MLR than in the anti-third-party MLR, regardless of CD8+ T cell proliferation. One patient, in whom immunosuppressants had been satisfactorily withdrawn, showed a complete lack of proliferation of CD4+ and CD8+ T cells in the anti-donor MLR and normal proliferation of CD4+ T cells in the anti-third-party MLR. Conclusions: The proliferation of CD8+CD25+ T cells in the anti-donor MLR would be a useful index of ongoing acute rejection. In contrast, the proliferation of CD4+ T cells would reflect the state of immunosuppression and be a useful marker for adjusting immunosuppressive therapy.

Intravenous administration of cyclosporine (CsA) avoids the problem of CsA absorption dysfunction that can occur in the first few days after liver transplantation. However, 24-hour infusions using C0 monitoring have proven unfavourable in terms of efficacy and toxicity.
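For the paired-biopsy subgroup in the renal-transplant HCV abstract above, the quoted progression rate follows directly from the reported means; the snippet below only makes that arithmetic explicit.

```python
# Fibrosis progression rate = (follow-up score - baseline score) / years.
# Means reported for the 5 patients with paired biopsies.

baseline_fibrosis = 0.9      # mean fibrosis score before RT
followup_fibrosis = 1.2      # mean fibrosis score ~36 months after RT
years = 36 / 12

rate = (followup_fibrosis - baseline_fibrosis) / years
print(f"{rate:.1f} fibrosis units/yr")   # 0.1, as reported
```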
To our knowledge, this is the first study to assess C2 monitoring of CsA using 4-hour intravenous infusions, or to evaluate the starting dose of intravenous CsA based on graft function. Methods: Four-hour intravenous infusions of CsA (2 mg/kg/day) were given twice daily to 20 consecutive recipients of a liver allograft for a minimum of 4 days. Full CsA pharmacokinetic profiles were undertaken on day 3 post-transplant in all patients and also on day 7 in five patients. Patients received basiliximab and mycophenolate mofetil (n=11) or steroids (n=9). Results. Across all 20 patients, mean AUC0-12 was 4,500 ng·h/mL. Exposure was greatest during the period 2-4 hours after the start of infusion (32% of total AUC0-12). The correlation between C2 and AUC0-12 was 0.91, while the correlation between C0 and AUC0-12 was 0.43. The CsA dose/kg body weight and total CsA dose correlated only poorly with AUC0-12 (r=0.44 and r=0.61, respectively). When patients were stratified according to initial or delayed graft function, there was a good correlation between total CsA dose and AUC0-12 (initial graft function, r=0.84; delayed graft function, r=0.93). In patients with initial graft function, AUC0-12 could be predicted by the formula (24.5 x CsA dose) + 454. For patients with delayed graft function, the formula was (37 x CsA dose) + 1293. Conclusions. C2 monitoring provides a more reliable indication of CsA exposure than C0 monitoring with twice-daily 4-hour intravenous CsA. Calculating the starting dose of intravenous CsA based on the presence or absence of initial graft function is a more precise technique than use of body weight. (A worked sketch of these prediction formulas appears below, after the following abstract.)

Chronic allograft rejection (CR) after liver transplantation (OLT) remains a common indication for retransplantation and cause of graft failure. This may be a growing problem, since long-term complications of immunosuppressive drugs have led to low-maintenance immunosuppressive therapy in OLT recipients. We therefore reviewed factors to identify patients at risk of developing CR after OLT. Data from 1200 liver transplants were prospectively analyzed. Immunosuppression was commenced as either cyclosporine A (CsA)- or tacrolimus (Tac)-based therapy in different protocols. The follow-up period ranged from 24 to 168.8 months (median 78 months). Risk factors were identified using multivariate analysis of graft, recipient, and posttransplant variables. Actuarial graft survival was 88% after one year, 82% after three years, and 78.8% after five years, and was similar in Tac- and CsA-treated patients. The overall incidence of histologically proven CR was 21/1200 (1.75%). In the multivariate analysis, only CsA as primary immunosuppressant (p<0.05; CsA 16/496, 3.2% vs. Tac 3/704, 0.42%), history of an acute rejection episode (p<0.01), and transplantation for primary sclerosing cholangitis (PSC) (5/43, 11%, p<0.05) contributed to an increased risk for the development of CR. In this analysis, neither crossmatch nor HLA compatibilities had a significant influence on the prevalence of CR. However, when analyzing only transplants for PSC, more HLA-DR compatibilities were associated with an increased risk for CR. The incidence of CR has decreased since the introduction of Tac in recent years. In contrast, the morbidity and mortality caused by side effects of immunosuppressive therapy are increasing. This study identified patients with PSC and those with a history of acute rejection as being at risk for the development of CR.
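The dose-to-exposure regressions reported in the C2-monitoring abstract above translate directly into code. This is an illustrative sketch only; units are assumed to be mg for the total IV CsA dose and ng·h/mL for AUC0-12, as in the abstract.

```python
# Reported regressions linking total IV CsA dose to exposure (AUC0-12).
# Illustrative sketch; not a dosing tool.

def predict_auc0_12(csa_dose_mg: float, initial_graft_function: bool) -> float:
    """Predict AUC0-12 (ng*h/mL) from total CsA dose (mg)."""
    if initial_graft_function:
        return 24.5 * csa_dose_mg + 454    # reported r = 0.84
    return 37.0 * csa_dose_mg + 1293       # reported r = 0.93

# Example: a 150 mg dose with initial graft function gives ~4,129 ng*h/mL,
# close to the cohort mean AUC0-12 of 4,500 ng*h/mL.
print(predict_auc0_12(150, initial_graft_function=True))
```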
When minimizing long-term maintenance immunosuppressive regimens in these patients, early diagnosis is mandatory to prevent severe complications from underimmunosuppression and CR.

BACKGROUND: Adherence to antiviral therapy for recurrent hepatitis C (HCV) has been poor due to hematologic toxicity. AIM: To increase adherence to interferon (IFN) and ribavirin (RBV) in patients with recurrent HCV after liver transplantation (LT) by starting at lower drug doses and adding growth factors. METHODS: We enrolled 10 consecutive patients with recurrent HCV after LT. IFN was started at 1 MIU tiw and increased by 0.5 MIU biweekly to a maximum dose of 3 MIU tiw. RBV was started at 200 mg bid and increased by 200 mg biweekly to 1000-1200 mg daily for a total of 48 weeks. Those with an end-of-treatment viral response (ETVR) also received an additional 6 months of RBV monotherapy. Filgrastim was started at 300 ug three times weekly if the absolute neutrophil count decreased to ≤1,500/mm³, and erythropoietin was started at 40,000 U weekly if hemoglobin decreased to ≤10 g/dL. Liver biopsies were performed at baseline and at the end of therapy. RESULTS: Ten consecutive patients (1F/9M, mean age 47, 60% genotype 1, viral load >2 MIU, tacrolimus-based immunosuppression) with recurrent HCV on liver biopsy (grade 2-7, stage 0-6, Ishak score) were enrolled. Seven (70%) tolerated 12 months of combination therapy. Two discontinued for decompensation of liver cirrhosis at weeks 16 and 36, and one other discontinued for depression at week 32. No episodes of acute cellular rejection or discontinuation of therapy for hematological side effects were observed. All patients received >80% of the expected doses of RBV and IFN, but 70% required filgrastim and 80% required erythropoietin. Four had negative HCV-RNA at the end of treatment (40% ETVR, ITT). Two relapsed while on RBV monotherapy. Two continued to be HCV-RNA negative during and 6 months following RBV monotherapy (sustained viral response, SVR: 20%, ITT). Nine paired biopsies showed stable or improved fibrosis scores in 6 patients (2.0 ± 1.5 vs 1.1 ± 0.5) and worsening fibrosis in 3 patients (1.6 ± 0.5 vs 3.5 ± 1.0). There was no correlation between viral response and fibrosis changes.

Background: Hepatitis C viral infection has been shown to suppress the host cellular immune response. A new assay (ImmuKnow™, Cylex Inc., Columbia, MD) quantitatively assesses the cell-mediated immune response by measuring ATP release in CD4+ T cells stimulated by phytohemagglutinin. A pilot study with this assay was undertaken to assess the baseline immune response of patients with end-stage liver disease. The cell-mediated immune response in hepatitis C versus non-hepatitis C patients awaiting liver transplantation was determined. Methods: Blood from hepatitis C and non-hepatitis C patients with end-stage liver disease awaiting liver transplant was obtained and assayed for cell-mediated immune response using the ImmuKnow™ assay. ATP release was classified as: low (<225 ng/ml); moderate (226-524 ng/ml); and strong (>525 ng/ml). Healthy controls without liver disease were included in each assay run.

Introduction: The persistence of donor cells in peripheral blood, lymph nodes, and bone marrow, even for years after transplantation, is called microchimerism (MC).
Based on these observations in long-term survivors, it has been hypothesized that the development and persistence of donor-specific MC may play a role in the induction of allograft tolerance and hence result in a reduction of rejection episodes and immunosuppressive drug therapy. Other studies failed to support this assumption, so the role of MC is still under debate. Patients and methods: We enrolled 50 patients, 25 each with and without MC, and followed all recipients for 2 years after OLTx. Detection of MC was based on an STR amplification technique covering 9 STR regions plus the amelogenin gene in the CD3+ and CD34+ cell populations. Amplification products were quantified by calculating the proportion of the peak areas of donor and recipient signals using an ABI PRISM 310 genetic analyser (5). Patients were 49.3/45.2 years old (16 to 70 years); 27 of 50 were men. We analyzed graft/patient survival, laboratory data such as ALT/AST, AP/GGT, and bilirubin at 0.25, 0.5, 1, 2, 3, 6, 12, and 24 months after OLTx, as well as drug levels of immunosuppressive therapy and the frequency of rejection episodes. Results: There was a lower incidence of acute rejection episodes in the MC group. Based on the maximal donor-specific proportion, we divided the MC group into (1) MC < 10% and (2) MC > 10%, and could show a significant (p = 0.022) difference in the incidence of acute rejection. We could also demonstrate significantly lower tacrolimus levels (p = 0.0245) in the MC group. Conclusion: Cells of donor origin can be found in recipients' peripheral blood. We could also demonstrate MC in the CD34+ population. The clinical relevance of MC is not yet defined. Some authors suppose that MC simply reflects sufficient immunosuppression and is more an effect than a cause. Despite this, our results suggest that recipients derive an advantage from spontaneous, and especially high-proportion, MC: patients without MC had significantly more episodes of acute rejection than patients with MC, and they also needed higher blood levels of immunosuppressive drugs. (A small sketch of the STR-based quantification follows below.)

Background: Late renal dysfunction and eventual renal failure are an increasing concern and problem in liver transplant recipients. Long-term CNI use is believed to play a contributing role in many of these cases. There is increasing interest in CNI withdrawal from maintenance regimens in an attempt to improve renal function. We instituted a protocol of CNI withdrawal in liver transplant recipients with deteriorating renal function. Patients who were at least 6 months posttransplant with a rising serum Cr and no recent acute rejection (AR) episode were converted from a CNI maintenance protocol to a regimen consisting of full-dose MMF (1 g bid) and low-dose prednisone (5 mg/day). Recipients who were off prednisone were restarted on it at the time of the conversion. Results: Since Jan '99, we have converted 13 recipients using this protocol. Mean recipient age was 57.3 yrs. Causes of liver failure were Hep C (3), PBC (2), Hep B (1), ETOH (4), and others (3).

Introduction: The induction of tolerance following liver transplantation in animal models is associated with rapid and exhaustive activation of the recipient immune system. There is evidence that immunosuppression given immediately posttransplantation inhibits this form of tolerance. This single-center, nonrandomized, retrospective study demonstrates the feasibility of delaying immunosuppression for 48 hours postoperatively to allow activation of the recipient immune system in adult cadaveric liver transplantation.
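To make the microchimerism quantification above concrete: the donor-specific proportion at an informative STR locus is the donor peak area divided by the summed donor and recipient peak areas. The peak areas below are hypothetical numbers for illustration, not data from the study.

```python
# Donor-signal proportion from STR electropherogram peak areas.
# Hypothetical values; real analyses average several informative loci.

def microchimerism_percent(donor_peak: float, recipient_peak: float) -> float:
    """Percent donor signal at one informative STR locus."""
    return 100 * donor_peak / (donor_peak + recipient_peak)

# Example: donor peak of 1200 area units vs recipient peak of 9000
pct = microchimerism_percent(1200, 9000)
print(f"{pct:.1f}% donor signal")   # ~11.8%, i.e. the 'MC > 10%' group
```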
Methods: Between July 2001 and October 2003, 226 patients underwent liver transplantation. Exclusion criteria were perioperative death and previous liver allograft, excluding 20 patients. Patients were divided into two groups: no delay (ND), with 106 patients, and delayed immunosuppression (DI), with 100 patients. The delayed-immunosuppression protocol consisted of no immunosuppressive drugs during the perioperative period (48 hours); 3 doses (2 mg/kg/dose) of rabbit anti-thymocyte globulin (RATG, Sangstat) given on postoperative days (POD) 2, 4, and 6; steroids with a rapid taper started on POD 2; and tacrolimus started on POD 3 (trough levels 10-12). ND received an identical regimen except that steroids were initiated immediately postoperatively and RATG was started on POD 1. The chi-square test and Student's t test were used for univariate analysis. Results: There was no difference in patient (ND: 96% vs. DI: 92%; p=ns) or graft survival (ND: 96% vs. DI: 91%; p=ns). Rejection rates were identical in both groups (10%). Two patients from DI were treated with anti-lymphocyte therapy for severe rejection, while the rest responded to steroids and reversal of weaning. There was no difference in serum transaminases, total bilirubin, or alkaline phosphatase between the groups at any interval. No patient has developed post-transplant lymphoproliferative disorder. Recurrence of HCV hepatitis (biopsy proven) was 31% (ND) vs. 43% (DI) (p=ns). CMV infections (ND: 5% vs. DI: 2%; p=ns) were treated with oral valganciclovir. Conclusion: Prospective trials evaluating delayed immunosuppressive therapy in liver transplantation for tolerance induction can be performed without increasing the incidence of rejection, graft loss, or patient death.

Poster Board #-Session: P210-II Introduction: Renal dysfunction is a major risk factor after liver transplantation (OLT) and a predictor of waitlist mortality. Immunosuppressive regimens aimed at sparing renal function might improve patient outcomes post-transplantation. Objective: To evaluate the efficacy of delayed tacrolimus, with or without induction therapy, in patients with renal insufficiency undergoing OLT. Methods: Between January 1998 and December 2003, 60 patients had renal dysfunction (defined as serum creatinine ≥1.6 mg/dl at the time of transplant) and received a renal-sparing immunosuppression protocol, of whom 58 were eligible for this review. Group 1 (n=32) patients received MMF 1.5 gm PO BID and steroids starting on POD #0, and delayed tacrolimus, 0.05 mg/kg PO BID, starting on POD #4-7. Group 2 patients (n=26) received Simulect™ induction (n=8), 20 mg IV on POD #0 and POD #4, or Thymoglobulin™ induction (n=18), 2.0 mg/kg IV intra-operatively and 1.5 mg/kg IV on PODs #1 and #3, along with steroids and delayed tacrolimus. Goal trough levels were similar in each group. 30-day data were collected and analyzed using Student's t-test. Results: There was a higher incidence of post-transplant dialysis in the non-induction group (56% vs 27%, p=0.0265). There was also a higher incidence of opportunistic infections in Group 1, the most common being CMV and fungal infections. Three patients in Grp 1 converted to CyA due to neurotoxicities, whereas 1 patient in Grp 2 converted to FK secondary to neurotoxicity.
Conclusions: Antibody induction therapy with delayed tacrolimus and steroids is a safer immunosuppressive protocol for patients with renal dysfunction undergoing liver transplantation than delayed tacrolimus, MMF, and steroids, and appears to provide clinically apparent renal protection post-transplant.

There were no dose-related or statistically significant differences between treatment groups in the incidence rate of efficacy failure or its individual components. A trend towards a lower incidence of death was observed at the two highest dose levels of E. All graft losses and most deaths were secondary to typical post-transplant complications; none were attributed to the study medication. The incidence of hypercholesterolemia was proportional to the dose of E given and was 3%, 7%, 10% and 10% (PLA, E 1.0, 2.0 and 4.0 mg groups, respectively). Renal dysfunction was more frequent in patients receiving E (7%, 21%, 10% and 16% in the PLA, E 1.0, 2.0 and 4.0 mg groups, respectively). Neoplasms were rare in all groups. Viral infections appeared to be associated with E dose and were highest at the two highest E dose levels (13%, 14%, 23% and 23% in the PLA, E 1.0, 2.0 and 4.0 mg groups, respectively). From M3 onward, mean values of alkaline phosphatase appeared inversely associated with E dose and were 298, 223, 174 and 146 IU/L in the PLA, 1.0, 2.0 and 4.0 mg groups at M12. After an initial increase, mean creatinine values remained relatively stable from M1 in all treatment groups. Creatinine clearance decreased in all groups, with relatively stable values from M1 onward and no treatment-related trend. Mean total cholesterol increased from baseline in all treatment groups, and maximum levels were generally attained at M6. Changes from baseline were generally dose-related but not significant between treatment groups. Mean triglycerides showed dose-related increases, with maximum values also at M6. This study demonstrates that the use of everolimus in liver transplant patients is safe and effective, with dose-associated effects on creatinine and increases in cholesterol and triglycerides. These data provide support for additional studies in this indication.

BACKGROUND: Long-term side effects of immunosuppressive therapy with calcineurin inhibitors (CNI) in liver transplant patients are major causes of morbidity. PATIENTS AND METHODS: We undertook a prospective study to assess the safety and efficacy of CNI withdrawal and replacement by mycophenolate mofetil. 33 patients with a minimum follow-up of 2 years after liver transplantation were included in the study; all were on monotherapy with one of the two CNIs. Of these, 30 had renal dysfunction attributable to suspected CNI toxicity and 3 had hyperlipidaemia; 10 of these patients had both renal dysfunction and hyperlipidaemia, and 20 also had arterial hypertension. Renal function, blood pressure, and lipid profile were measured before and 12 months after study entry. Sequential renal scintigraphy was also performed in every patient before and 12 months after study entry, to appraise renal damage and possible improvement. Side effects of medication and graft function were recorded during the study. RESULTS: At the end of the study there was a significant decrease in serum creatinine (by 28%) and urea levels (by 36%). Blood pressure improved significantly, with a systolic decrease of 20% and a diastolic decrease of 12%. There was also an improvement in cholesterol (decrease of 21%) and triglycerides (decrease of 56%).
None of the patients had to stop the study because of side effects of the new therapy. No episodes of acute graft rejection occurred during the conversion period or thereafter on mycophenolate mofetil monotherapy. CONCLUSIONS: Substitution of CNI by mycophenolate mofetil can improve the renal function, blood pressure, and cholesterol and triglyceride concentrations of liver transplant patients without an increased rejection risk on mycophenolate mofetil monotherapy.

Poster Board #-Session: P214-II Renal failure related to the use of calcineurin inhibitors (CIs) is an increasing problem following orthotopic liver transplantation (OLT). Sirolimus, a potent immunosuppressive agent, has less nephrotoxicity than CIs and is a potential alternative immunosuppressive agent in patients who develop renal insufficiency. Aims: 1) To determine whether switching from CIs to sirolimus results in improved renal function following OLT, and 2) to determine the side-effect profile of sirolimus. Patients and Methods: Between May 1998 and August 2003, 41 liver transplant recipients received sirolimus as part of their immunosuppressive regimen. Sixteen patients received sirolimus immediately after liver transplantation (either alone or in combination with tacrolimus) and 25 patients were switched to sirolimus, either discontinuing or reducing CI doses, at least one month post-OLT. Trough sirolimus levels of 5-10 ng/mL were maintained. Renal function was assessed by serum creatinine levels measured before starting sirolimus and at one month or more after starting sirolimus. Results: The median age at OLT was 57 years (range 13-71; 64% males). The most common etiologies of liver disease were alcohol (24%) and alcohol plus hepatitis C (20%). The median follow-up after starting sirolimus was 102 days (range 38-307). Common side effects were hyperlipidemia 25% (9), infections 22% (8), anemia 16% (6), leucopenia 11% (4), mouth ulcers 8% (3), obesity 5% (2), and thrombocytopenia 5% (2).

Results: Two patients died and another was lost to follow-up before the second randomization at six months. Two additional patients were excluded due to abnormal liver tests and MMF intolerance. Renal dysfunction was less frequent and glomerular filtration rates were higher at six months among patients who received MMF, although the differences were not significant. The incidence of rejection was similar, and there was no statistically significant increase in infection rate either. CI withdrawal was attempted in 13 patients. In three of them, with persistent increases in liver enzymes, Tac was partially weaned (to 50% of the initial dose); liver biopsies were rejection-free but showed steatohepatitis in one and HCV recurrence in another. In 7 patients CIs were discontinued, and all of them had alterations in liver function tests (5 with biopsy-proven acute rejection). All but one responded to CI reintroduction. The patient who did not improve had moderate rejection with progressive cholestasis associated with infection and graft failure causing death. Because of this case, we ended the study before including 4 cases in the second randomization and did not wean Tac in 3 other previously randomized patients. Conclusions: MMF in addition to Tac and steroids as early routine immunosuppression was well tolerated after LTx without increasing side effects, although the acute rejection rate was similar. The efficacy of MMF with respect to renal function and HCV recurrence needs further consideration. In this study, CI withdrawal could not be safely attempted in the first year after LTx.

Introduction.
Cyclosporine (CyA) C2 monitoring has improved the management of posttransplant immunosuppression. Nevertheless, pharmacokinetic (PK) data are lacking in patients receiving marginal livers (age >65 yrs and/or histological macrosteatosis >15%). Methods. We prospectively compared 24 liver transplant patients: 10 received a marginal graft (M group) while 14 received a standard graft (control, C group). All patients received Neoral® as primary immunosuppression, with tapered steroids and azathioprine. On post-operative days 3 (d3) and 10 (d10), 12 blood samples were collected during the 12 h after drug administration (at 0, 15, 30, 45, 60, 75, 90, 120, 240, 360, 480 and 720 minutes). Rejection episodes (follow-up 3 months) and graft and renal function data were recorded. CyA concentrations were measured by fluorescence polarisation immunoassay and non-compartmental PK analysis was applied. Results. Considering samples taken between 60 and 180 min, we found increased CyA concentrations from d3 to d10 (99-204% in the C group and 32-106% in the M group). In this interval, Cmax rose in the C group from 537 to 1016 ng/ml (p<0.001) and AUC0-12 from 3779 to 5307 ng·h/ml (p<0.05), while in the M group Cmax rose from 524 to 749 ng/ml (p<0.05) and AUC0-12 from 3836 to 4049 ng·h/ml (p=ns). Tmax progressively shortened from d3 to d10 in both groups (from 3.5 to 2 h in the M group and from 3 to 2 h in the C group). In line with literature data, on d3 C2 provided the best AUC0-4 estimate in the C group (r²=0.95), while C3 was best for the M group (r²=0.88); at d10, C2 was a good surrogate marker of AUC0-4 (r²=0.89 and 0.88 for the C and M groups, respectively). No correlation was found between clinical data and PK parameters. No difference was noted between groups in graft and renal function, probably due to the small number of subjects. 43% of C group patients had graft rejection, compared with 20% of M group patients; 11 episodes occurred, 9 in the C and 2 in the M group. Conclusions. Patients receiving a marginal graft showed different PK behaviour compared with control patients. As previously reported, the largest PK variations occurred during the absorption phase: compared with the C group, absorption in the M group was poor and delayed at d10. This information could have clinical impact on immunosuppressive management in the early posttransplant period. (A sketch of the non-compartmental AUC calculation follows below.)

Introduction. In renal transplantation, basiliximab is a safe and effective induction agent allowing the delay of tacrolimus in the presence of delayed graft function. Delaying a calcineurin inhibitor in this context following liver transplantation has not been reported. We describe the use of basiliximab induction and tacrolimus delay in liver transplant recipients with renal insufficiency. Methods. Basiliximab was administered to 42 patients with renal insufficiency prior to their transplants. These were compared with 82 patients who received no induction and immediate tacrolimus. In the basiliximab group, tacrolimus was delayed for up to 14 days. Maintenance immunosuppression was based primarily on tacrolimus and prednisone. Results. The mean serum creatinine on POD 0 was higher in the basiliximab group, 1.7 vs. 1.1 mg/dl (p=0.02). The mean number of days to achieve therapeutic tacrolimus levels was 6.2 in the basiliximab group, compared with 1.8 in the control group (p=0.002). Overall 3-year patient and graft survival was 75% and 71%, respectively, similar for both groups. 56% of the patients had hepatitis C (HCV), all genotype 1a or 1b, similar for both groups.
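The marginal-graft PK abstract above computes AUC0-12 by standard non-compartmental analysis over the stated sampling schedule. The sketch below applies the linear trapezoidal rule to that schedule; the concentration values are hypothetical, chosen only to illustrate the calculation.

```python
# Linear trapezoidal AUC0-12 over the 12-point sampling schedule from the
# abstract (0-720 min). Concentrations are made-up illustrative values.

import numpy as np

t_min = np.array([0, 15, 30, 45, 60, 75, 90, 120, 240, 360, 480, 720])
conc = np.array([150, 190, 280, 400, 520, 600, 640, 620, 380, 250, 190, 150])  # ng/mL

dt_h = np.diff(t_min) / 60.0                              # interval widths, hours
auc_0_12 = float(np.sum(dt_h * (conc[:-1] + conc[1:]) / 2))  # ng*h/mL
print(f"AUC0-12 = {auc_0_12:.0f} ng*h/mL")
```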
The cumulative recurrence of HCV was similar for both groups (figure 1). 30-month biopsy-proven HCV recurrence was 55% in the basiliximab group compared with 61% in the control group (p=0.613). The mean grade and stage of recurrent HCV were 2.0 and 0.83, respectively, for the basiliximab group, compared with 2.14 and 1.00 for the control group (p=0.858 and 0.719, respectively). There was no difference in viral load between the groups. 30-month cumulative rejection was lower in the basiliximab group, 5% vs. 11% in the control group (p=0.467). Conclusion. Basiliximab induction in liver transplantation allows the safe delay of tacrolimus in patients with renal insufficiency. This is associated with a low rejection rate, and there is no increase in HCV recurrence at 3 years.

Chronic renal failure has emerged as a significant cause of morbidity and mortality after liver transplantation. Although this has been attributed mostly to long-term exposure to nephrotoxic immunosuppressive therapy, recent evidence indicates that the development of acute renal failure in the immediate post-operative period is a major risk factor. Thymoglobulin (Thymo) induction with delayed introduction of maintenance calcineurin inhibitor (CNI) therapy may reduce the risk of early renal failure and improve long-term renal function. To further define this strategy in adult right-lobe LDLT, we compared the renal function of 32 patients (group 1) who received Thymo with that of 29 patients (group 2) who received no polyclonal antibody induction therapy. Methods. Thymo was initiated within the first 6 hrs after LDLT and continued for 7-10 days; after the initial dose of 1.25 mg/kg body weight, the daily dosage was adjusted to maintain absolute lymphocyte counts of <0.2. All patients received a tapering dosage of methylprednisolone and tacrolimus (Tac; group 1, 92%; group 2, 80%) or Neoral cyclosporine (CyA; group 1, 8%; group 2, 20%; p=NS). Five patients in group 2 received 2 doses of basiliximab. In group 1, initiation of CyA/Tac was routinely delayed to the 3rd-5th post-op day. Target levels for Tac/CyA after the first week were identical in both groups. Pre- and post-operative serum creatinine was measured, and the % increase over preoperative values was calculated. Results. Demographic characteristics of the two groups were similar with respect to age, sex, etiology of liver disease, medical status at transplantation, and preoperative serum creatinine values. Temporary dialysis was required in 1 patient in group 1 and two patients in group 2. Table 1 summarizes the results. Conclusion. Following LDLT, polyclonal antilymphocyte induction with delayed introduction of CNI appears to reduce the severity of postoperative renal dysfunction. This may be particularly valuable in patients with evidence of preoperative renal dysfunction.

Utilization of IL-2Rα-Ab therapy in renal transplantation as a CI-sparing strategy is well established, but data in liver transplant (LT) patients with renal dysfunction (RD) are limited. We evaluated our experience using IL-2Rα-Ab therapy in patients undergoing LT with perioperative RD. Results: From Sept 1, 2002 to Oct 30, 2003, 32 patients (21M/11F) undergoing 36 liver or liver/other-organ transplants with perioperative RD, defined as a need for renal replacement therapy (RRT) or Cr >1.5 mg/dl with oliguria, were treated with prednisone, MMF, and basiliximab (Simulect®, Novartis) at an initial dose of 20 to 40 mg, followed by repeat doses 4 and 14 days later if required.
CI was introduced if RRT was discontinued, if creatinine decreased to <2 mg/dl without oliguria, or if acute cellular rejection (ACR) requiring therapy occurred. Mean patient age was 58 yrs (range 43 to 71). Mean pre-transplant MELD score was 23 (8 to 40). 3 grafts lost within 72 hours, due to primary non-function (2) and early patient death (1) from multiorgan system failure (MOSF), were excluded from the analysis. Patients received a mean of 1.9 (1-3) infusions with a mean basiliximab dose of 40 mg (20-60). Duration of RRT, initial ICU stay, and hospital stay were 5.5 (0-114), 4.6 (0-23), and 25.6 (4-86) days, respectively. Time to therapeutic immunosuppression averaged 12.5 days. Mean day-21 and day-120 creatinines were 1.6 (0.6-2.5) and 1.5 (1-1.9) mg/dl. 5 patients developed ACR responsive to a single Solu-Medrol infusion. 11 patients required sirolimus, but only 5 remained on sirolimus as primary immunosuppression at follow-up. Major infections due to CMV (19%), bacteria (42%), or fungi (6%) were common. 2 graft losses due to arterial complications required retransplantation. 2 patients died of MOSF beyond 21 days post-LT and 1 patient died after being lost to follow-up. 3 patients had residual RD, defined as a need for RRT or creatinine >2 mg/dl, at a mean F/U of 211 days (range 42 to 431). Summary: IL-2Rα-Ab therapy allows delayed introduction of CI with a low incidence of ACR, allowing renal function recovery in LT patients with perioperative RD. Frequent infectious complications need to be further analyzed but may be due to the critically ill status of these patients.

Poster Board #-Session: P224-II Introduction: Induction immunosuppression with Thymoglobulin, a potent polyclonal antithymocyte antibody, might allow a tolerogenic regimen of recipient pretreatment and low-dose immunosuppression. The effect of this novel approach on HCV recurrence after liver transplantation has never been investigated. This analysis aimed to explore the relationship between Thymoglobulin induction and the pattern of HCV recurrence after liver transplantation. Methods: We used Thymoglobulin induction + tacrolimus monotherapy in a group of 22 HCV+ patients receiving liver transplantation (Thymo group); 32 historical HCV+ patients with different tacrolimus-based immunosuppression represented the comparison group (non-Thymo group). Results: Patient survival was equal in both groups, with a 1-year survival rate of 82% (Thymo group) versus 86% (non-Thymo group) (p=ns). Five patients (22.7%) in the Thymo group experienced mild acute rejection versus 10 patients (31.3%) in the non-Thymo group; among Thymo patients, rejection grade was generally mild, requiring steroid recycle in 3 patients (13.6%), whereas 60% of patients in the non-Thymo group experienced moderate or severe acute rejection, requiring steroid recycle in all cases and OKT3 administration in one. The clinical HCV recurrence rate was equal in both groups (57% vs 55% of patients), although the pattern of recurrence was distinct. With respect to mean ALT and AST elevations, patients in the Thymo group reached higher levels than those in the non-Thymo group (mean ALT: Thymo 326±65 IU/mL vs non-Thymo 94±78 IU/mL, p=0.001; mean AST: Thymo 202±71 IU/mL vs non-Thymo 152±81 IU/mL, p=0.03).
Patients in the Thymo group showed an earlier increase in HCV RNA load, at a median of 96 days (range 21-284) compared with the non-Thymo group (187 days; range 13-838; p=0.044); interestingly, the mean peak RNA load was lower in Thymo patients than in non-Thymo patients (1494 vs 3516 U/L; p=0.017). Histological recurrence of HCV was also earlier in Thymo patients, with a median disease-free survival of 113 days (range 25-204) compared with the other group (327 days; range 31-752; p=0.001). However, no significant difference was observed in mean Ishak histological grading (4 in both groups) or staging (Thymo S=1, non-Thymo S=2) of HCV recurrence. Conclusion: Induction immunosuppression with Thymoglobulin in liver transplant recipients is effective in protecting against rejection, but showed a distinctive relationship with HCV recurrence that deserves further investigation.

Renal dysfunction is a well-known complication in liver transplant patients receiving calcineurin inhibitors. Serum creatinine (Scr) overestimates the glomerular filtration rate (GFR). Several formulas have been developed to estimate the GFR; however, their application in liver transplant patients is not well described. Purpose: To determine the correlation between the radionuclide GFR and serum creatinine (Scr), and between the radionuclide GFR and different formulas accepted for estimating GFR in other settings. Methods: This is a sub-study of a Canadian multicenter randomized study evaluating the safety and efficacy of daclizumab induction plus mycophenolate mofetil, tapered corticosteroids, and delayed low-dose tacrolimus vs. a standard-dose tacrolimus-based regimen in adult liver transplant patients (n=148). Baseline (pre-transplant) and 3-month radionuclide GFR measurements were performed in 32 and 35 patients, respectively. The formulas used to estimate GFR were Cockcroft-Gault, Levey (MDRD, modification of diet in renal disease), and 1/Scr; common forms of the first two are sketched below. Results (means±SD) are shown in the tables. Conclusion: Although Scr was the least appropriate measure for estimating GFR, the use of any "accepted formula" provided a poor correlation with the radionuclide GFR during the first 3 months post-liver transplant. A modification of these formulas is therefore required to better assess GFR after liver transplantation.

Positive (pos) T-cell flow crossmatch (X-match) results in the liver transplant population have been associated with early acute rejection, biliary complications, and lower allograft and patient survival. Although there is little evidence to support waiting for X-match results before liver transplantation, the question remains whether pos results have clinical significance. Purpose: To evaluate whether T-cell flow X-match pos OLT recipients demonstrate any greater degree of rejection, biliary or vascular complications, or lower allograft survival compared with X-match negative (neg) recipients. Methods: Demographic, morbidity, and mortality data were reviewed for OLTs performed at our center over a one-year period to assess the influence of a pos flow X-match on clinical outcomes. Biliary (stricture) and vascular (hepatic artery thrombosis and stenosis) complications were tabulated. Results: 80 patients were identified who underwent 81 liver transplants over the study period. Mean follow-up was 6.2 months. Fifteen (18.5%) pos flow X-matches and 66 (81.5%) neg flow X-matches were noted.
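As referenced in the GFR abstract above, hedged sketches of the Cockcroft-Gault and 4-variable MDRD equations follow, in the forms commonly used around the time of this study (Scr in mg/dL, weight in kg, age in years). The abstract does not specify which exact variants the authors used, so treat these as illustrative assumptions.

```python
# Common forms of two GFR-estimating equations named in the abstract.
# Illustrative only; the study's exact variants are not specified.

def cockcroft_gault(age: float, weight_kg: float, scr: float, female: bool) -> float:
    """Estimated creatinine clearance in mL/min."""
    crcl = (140 - age) * weight_kg / (72 * scr)
    return crcl * 0.85 if female else crcl

def mdrd_4v(age: float, scr: float, female: bool, black: bool) -> float:
    """4-variable MDRD eGFR in mL/min/1.73 m^2 (186-coefficient form)."""
    gfr = 186 * scr**-1.154 * age**-0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

print(round(cockcroft_gault(50, 80, 1.6, female=False)))   # ~62 mL/min
print(round(mdrd_4v(50, 1.6, female=False, black=False)))  # ~49 mL/min/1.73 m^2
```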
The pos X-match group consisted of 4 men (26.7%) and 11 women (73.3%); the neg X-match group comprised 48 men (73.9%) and 17 women (26.1%) (p<0.005). There were no significant differences between the pos and neg X-match groups with respect to age, match and laboratory MELD, CTP, ABO blood type, or etiology of liver disease. Conclusion: A higher proportion of women than men had positive X-matches. There was a similar rate of early rejection in both groups. We did not find any difference in biliary or vascular complications or in allograft survival between the positive and negative X-match groups. Therefore, it would appear that the clinical implications of a positive T-cell flow X-match in the early post-OLT period are minimal.

Background: Sirolimus (SIR) is a new immunosuppressant agent in liver transplantation. We have previously demonstrated favorable outcomes in over 200 patients who received SIR as part of a primary immunosuppressive regimen for liver transplantation. We have noted that many of the patients in our cohort have mild persistent elevations in alanine aminotransferase (ALT). We report the frequency and severity of this abnormality and speculate on its clinical significance. Methods: All patients without hepatitis C who received SIR as part of their primary immunosuppressive regimen were included in this study. Patients were censored from analysis following discontinuation of SIR. The control group included all non-HCV liver transplantation recipients from our institution between 1997 and 1999 who did not receive SIR as primary immunosuppression; these patients are designated "noSIR." Results: While mean AST levels were not significantly different between SIR and noSIR patients at any of the intervals, mean ALT levels were significantly higher in SIR patients at months 12 and 15, as shown in Table 1. The percentage of SIR patients with abnormal ALT (>47 IU/L) was significantly greater at month 12 and approached statistical significance at months 15 and 18, as shown in Table 2. Fewer than 5% of SIR and noSIR patients had ALT >94 IU/L (greater than two-fold normal) at any of the monthly intervals (data not shown; p = ns for each interval). SIR levels (months 3, 6, 9, 12, 15, 18, and 21) were 6.4, 6.0, 6.1, 4.7, 5.6, 5.7, and 5.8 ng/ml. Progressively elevated ALT values were not seen in any of the patients on SIR. Conclusions: 1) Liver transplantation recipients without hepatitis C who receive SIR as primary immunosuppression have mild elevations in ALT. 2) We believe that this observation represents a drug effect of SIR and not hepatotoxicity, since progressive ALT elevations were not seen. 3) Since ALT is one of the laboratory tests used to monitor for acute cellular rejection, physicians administering SIR to liver transplantation recipients should be aware of these findings.

Living kidney donation (LD), as a therapeutic intervention for patients with end-stage renal disease, has markedly increased over the past decade. Despite the relative benefit of this donation type, well-documented risk factors remain associated with characteristics of these organs. To assess the cumulative impact of these factors, we performed an analysis to measure the relative risks associated with identified levels of quality of the living-donated kidney. We examined all first solitary LD transplants from 1995-2001 in the SRTR database in order to create a model for measuring the impact of the donated organ.
We randomly assigned two-thirds of the records to a subset used to construct the model, and tested the group risk designations on the remaining third, representing the naive cohort. Parameter estimates of significant donor risk factors derived from a Cox model for overall and death-censored graft survival (including age, CMV status, relation to recipient, gender match, race, and HLA matching) were combined to generate risk scores. These scores were then assigned to three levels of risk via cluster analysis. Univariate and multivariate models confirmed a significant distinction between the low-, mid-, and high-level risk groups. Kaplan-Meier plots displayed a significant association (p<0.0001) between risk groups and overall graft survival in the test data set. The multivariate model, adjusted for recipient factors, found that with the low-risk group as reference, the mid-level group incurred a hazard ratio of 1.52, 95% CI (1.39, 1.66), and the high-risk group 1.80 (1.57, 2.05). Such a rating system may be applicable in situations in which recipients and clinicians have the luxury of choosing among several living donor candidates, to help assess levels of therapeutic intervention, and for identifying the quality of organs in the context of living exchange programs. (A sketch of this scoring approach follows below.)

[...] There was no difference in delta GFR, change in BP, or post-nephrectomy proteinuria between these groups. Conclusion: Implant biopsies at the time of donor nephrectomy revealed pathological abnormalities in 34% of cases where there was adequate tissue for analysis. Although there were no differences in short-term outcomes between patients with and without abnormal implant biopsies in this study, long-term follow-up information is needed to fully assess the clinical utility of the implant biopsy in living donor follow-up. More rigorous pathological analysis of implant biopsies (e.g., quantification of interstitial fibrosis) may yield additional useful information and is currently being pursued.

It is essential that candidates for living kidney donation are carefully screened to ensure the safety of donation. It is particularly important to rule out any evidence of impairment in renal function, and extensive and often expensive testing (i.e., nuclear medicine GFR determination) occurs to accomplish this task. In the general population, serum cystatin C has been shown to be more sensitive than serum creatinine (SCr) in detecting subtle impairment in GFR earlier (88 ml/min versus 75 ml/min; p < 0.001) (AJKD 2000;36:29). The latter property could potentially be effective in evaluating potential living kidney donors for subtle changes in renal function but has not been previously evaluated. We examined 43 potential donors and 7 recent donors (within the past year) to determine the correlation between serum cystatin C and the following tests: (1) Tc-99 radioisotope nuclear medicine GFR (Tc-99); (2) SCr; (3) [...]

Sixty-eight arterial control problems resulted in: 1 donor death, 2 donors with near-fatal hemorrhagic shock and contralateral renal failure, at least 13 donors receiving transfusion (data not available for all), and 16 urgent reoperations. Vascular clips were involved in 11/18 (61%) of delayed failures. The most common failures were intraoperative (at application or immediately thereafter), associated with stapler dysfunction. Importantly, delayed arterial control problems occurred on 18 occasions, 11 of which involved clips (locking 5 and standard 6).
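A minimal sketch of the risk-scoring approach described in the SRTR abstract above: sum Cox model coefficients for a donor's risk factors into a linear score, then bucket the scores into low/mid/high groups. The coefficients, factor names, and cutpoints here are hypothetical placeholders; the abstract used cluster analysis and does not publish its estimates, so tertile cutpoints serve as a simple stand-in.

```python
import numpy as np

# Hypothetical Cox coefficients (log-hazards), for illustration only.
COEFS = {"age_over_50": 0.30, "cmv_positive": 0.10, "unrelated": 0.15,
         "gender_mismatch": 0.08, "hla_mismatches": 0.05}

def risk_score(donor: dict) -> float:
    """Linear predictor: sum of coefficient * factor value."""
    return sum(COEFS[k] * donor.get(k, 0) for k in COEFS)

donors = [
    {"age_over_50": 0, "cmv_positive": 0, "unrelated": 0, "hla_mismatches": 1},
    {"age_over_50": 1, "cmv_positive": 1, "unrelated": 1, "hla_mismatches": 4},
    {"age_over_50": 1, "unrelated": 1, "gender_mismatch": 1, "hla_mismatches": 6},
]

scores = np.array([risk_score(d) for d in donors])
low_cut, high_cut = np.quantile(scores, [1 / 3, 2 / 3])  # tertile stand-in
labels = np.where(scores < low_cut, "low",
                  np.where(scores < high_cut, "mid", "high"))
print(list(zip(scores.round(2), labels)))
```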
There were an additional 36 reports of failed renal vein control; none proved fatal, but 4 required re-exploration; 21/36 (58%) involved staplers. Significant vascular complications, some resulting in death or renal failure, occur with living kidney donation. Venous stump problems seldom result in life-threatening hemorrhage, but arterial control problems may jeopardize a donor's life, especially if occurring in a delayed (i.e., post-recovery period) fashion. Both locking and standard clips, used to control the renal artery, appear to be associated with the highest risk of delayed arterial complication (although ligature and transfixion of the renal artery was associated with at least one post-operative donor death).

Background. Gender differences in live donor kidney transplantation are well established: females are more often donors than males. Our goal was to examine the potential donor pool and to determine at what point in the donor evaluation process this gender disparity becomes apparent, so as to better understand the reasons for the disparity. Methods. We obtained records from our HLA tissue-typing lab on all patients who came forward for blood typing as a potential kidney donor for recipients being evaluated for live donor kidney transplant at our center between January 2000 and December 2002. We then reviewed the patients' records to determine at which of the following points along the evaluation process drop-out occurred: ABO incompatible, positive cross-match, no medical work-up, medical work-up not completed, medical contraindication, social contraindication, approved but recipient sick, dead, or already transplanted, approved but unwilling to donate, approved and donated. Results. A total of 363 potential donors (54% females, 46% males, NS) were evaluated. Eighteen patients were lost to follow-up and were excluded. Fifty-two patients were ruled out due to ABO incompatibility (42% female, 58% male, NS). Nineteen patients were excluded due to a positive cross-match, which occurred more often in females (63% vs. 37%, NS). Ninety-eight patients never began a medical work-up (56% female, 44% male, NS). Of this group, the potential recipient was already transplanted (65%), was unable to undergo transplant because he or she had become too sick or had died (19%), or was still pre-dialysis (4%). Twelve patients began, but did not complete, a medical work-up (50% female, 50% male, NS). Fifty-eight patients were ruled out for medical reasons (57% female, 43% male, NS), and 6 patients were ruled out for social reasons (67% female, 33% male, NS). This left 100 potential donors who were approved for donation (55% female, 45% male, NS). Of these, 5 patients had a potential recipient who was already transplanted, dead, or too sick to undergo transplant. Of the remaining 95 potential donors, 73 went on to donate (64% female, 36% male, p=0.017) and 22 were unwilling to donate (27% female, 73% male, p=0.03). Conclusion. While males and females initially come forward as potential living kidney donors equally often, and are equally likely to get through each step of the evaluation process, females approved for donation are significantly more likely to donate than approved males. The reasons for this remain to be determined.

Blood pressure fell slightly after nephrectomy, accompanied by a reduction in heart rate. Cardiac index fell while systemic vascular resistance index increased. Body weight did not change (85±2 to 84±2 kg).
Absolute thoracic impedance measures were stable after nephrectomy, while impedance change with posture increased substantially. Conclusion: These results before and after donor nephrectomy establish the magnitude of the hemodynamic changes induced by the removal of this large vascular bed. The fall in heart rate and cardiac output, with increased postural fluid loss from the cardiopulmonary bed, suggests both reduced intravascular volume and reduced sympathetic adrenergic tone, possibly related to partial loss of afferent sympathetic stimuli from the removed kidney. These changes offset a minor rise in systemic resistance and may explain the stable or reduced arterial pressure observed in the first year after donor nephrectomy.

Poster Board #-Session: P238-II Background: Little is known about attitudinal and psychological differences between altruistic strangers and the general public. Identifying differences could lend insight into altruistic strangers' motivations for donation. We performed a case-control study interviewing 34 altruistic strangers (cases identifying themselves to our Live Kidney Transplant Program) and 68 zip code-matched persons from the general public (controls) to assess their attitudes regarding tolerable thresholds for the risks incurred with donation (medical complications and kidney failure, out-of-pocket expenses, and failure of the transplant), other potential motivators for donation (prior donation and religiosity), and psychological factors (depression and anxiety, assessed using validated diagnostic questions). We used descriptive and comparative (Kruskal-Wallis and χ²) statistics to compare altruistic strangers and controls. Results: The mean (SD) age of participants was 45 (12) years; 64% were female, 84% were White, 52% were college graduates, 6% were employed full time, and 37% had household incomes ≥$60,000. There were no demographic differences between groups. Altruistic strangers vs. controls were willing to donate given greater levels of potential risk for: medical complications (>50% vs. 35% risk, p<0.001), their own kidney failure (100% vs. 50% risk, p<0.001), time without monetary compensation (>3 vs. 2.5 months, p<0.001), out-of-pocket expenses (>$4500 vs. $4500, p<0.001), and failed transplant in the recipient (90% vs. 70% risk, p<0.001). More altruistic strangers than controls had donated blood in the past (91% vs. 54%, p<0.001), but they were no more likely than controls to consider themselves "moderately or very" religious (70% vs. 82%, p=0.2) or spiritual (82% vs. 82%, p=0.9). There were no differences in rates of depression (12% vs. 18%, p=0.4), panic disorder (3% vs. 0%, p=0.15), or suicidal ideation (2% vs. 2%, p=0.9) between altruistic strangers and controls. Conclusions: While altruistic strangers have similar attitudes regarding religion/spirituality and similar rates of psychological disorders compared with the general public, they are more likely to have had past donation experiences and are willing to accept greater personal risks (medical and financial) for donation. When presenting potential risks of donation to altruistic strangers, transplant programs should seek to confirm their full understanding of such risks prior to proceeding with donation.

It has previously been demonstrated that transplantation across the ABO barrier is feasible using A2 donor kidneys. However, the practice of A2→O/B transplantation has not been universally accepted due to fears of inferior long-term results.
We report the long-term results of A2→O/B renal transplantation at our transplant centre. Between November 1990 and June 2001, 8 patients with blood group B (3) or O (5) received A2 renal transplants. Five of these kidneys were from living-related donors. The mean %PRA was 1% (0-6%), and the mean number of HLA matches was 3/6 (2-4/6). All patients received cyclosporine/prednisone immunosuppressive therapy. Four patients received mycophenolate mofetil and the remaining patients were treated with azathioprine as a third agent. One patient received Thymoglobulin induction therapy for delayed graft function. In the 4 patients identified as having >1:16 pre-operative anti-A1 or anti-A2 antibody titres, plasmapheresis was carried out. One patient was pre-treated with cyclophosphamide due to the inability to lower anti-A antibody titres using plasmapheresis alone. Prior to our currently established pre-transplant target of anti-A levels of <1:8, 2 patients were not given pre-operative plasmapheresis and went on to lose their grafts from acute humoral rejection within the first week post-transplantation. After a mean follow-up of 7.6 years (2.5-13) in the remaining patients, none had a documented acute rejection episode and all have remained off dialysis. Furthermore, renal reserve remained excellent, with a mean serum creatinine at last follow-up of 1.6 mg/dl (1.0-2.1 mg/dl). One patient died from metastatic squamous cell carcinoma 6 years post-transplant with a functioning graft (creatinine 1.3 mg/dl). In summary, long-term outcomes of A2→O/B renal transplantation are excellent, and the practice should be encouraged in all centres if pre-operative anti-A1 antibody titres can be reduced to <1:8.

The development of safe organ preservation solutions revolutionized the field of transplantation. Currently, the solution most commonly used for organ preservation in the United States is University of Wisconsin (UW) solution. The use of UW solution is not without its price: high cost, high viscosity, and the risk of hyperkalemic arrest in the reperfusion period. An alternative solution, HTK, has lower viscosity and is an adequate buffer. We studied the efficacy (cost-effectiveness and early graft function) of using HTK as a preservation solution in adult patients undergoing live donor renal transplants. Methods: Adult patients undergoing live donor renal transplantation at our institution between 9/01 and 8/03 were included in the study. The patients were divided into 2 groups: those whose organs were preserved using UW solution (n=106) and those using HTK solution (n=102). The total cost of solution used (irrespective of volume, as unused solution was discarded) was calculated for each group, as well as clinical evidence of early graft function. The mean age of patients in the UW group was 42±12 yrs, compared with 47±11 yrs in the HTK group (p=NS). No differences in warm or cold ischemic times were found between groups. Immunosuppression regimens were followed according to institutional protocol.

INTRODUCTION: Motivation may play a role in the low psychosomatic morbidity seen in live renal donors. We objectively evaluated the role of motivation and the psychosocial factors affecting psychosomatic outcome in living renal donation. METHODS: Psychosocial evaluation of the donors was performed using a structured questionnaire preoperatively and at three months postoperatively. Motivation was objectively assessed (score 0-66).
Donors were interviewed for depression, anxiety and social support according to Beck's Depression Inventory, Spielberger's State-Trait Anxiety Inventory, and a social support questionnaire. Postoperative pain was quantified based on a Visual Analog Scale (0-100).

Background. Quantitating the interstitial fibrosis area of a renal allograft biopsy may be a surrogate marker of graft outcome. Some groups measure interstitial fibrosis (VIntFib) using the total matrix area (Trichrome, or Sirius Red imaged by white light (SRWL)), whereas others measure the area of fibrotic change represented by Collagen types 1 and 3, as determined by Sirius Red stain imaged by polarized light (SRPL). In allografts, the total matrix area is increased by acute inflammatory processes with transient, reversible expression of matrix components such as hyaluronan. In protocol allograft biopsies, SRPL is more predictive of long-term outcome than SRWL, especially in allografts with acute rejection. We hypothesize that in organs without injury or inflammation, such as donor biopsies, these measures may be very close. Methods. We studied 46 autopsy and nephrectomy (healthy tissue removed at tumor nephrectomy) specimens (age range 1 month to 88 years). After staining with Sirius Red, cortical portions were outlined with a black pen, and 400X images were obtained and archived using polarized and white illumination. The interstitial fibrosis fraction VIntFib obtained with white light (SRWL) represents the entire matrix, while VIntFib obtained with polarized light (SRPL) represents Collagen 1 and 3. These measurements were compared with age, Banff score and glomerulosclerosis %. The operator was blinded to the clinical data. Results. VIntFib from SRPL and SRWL were tightly correlated (p<0.0001, r=0.85). Donor age was more highly correlated with SRWL (p<0.0001, r=0.59) than SRPL (p=0.0003, r=0.51). This is in contrast to the findings in allograft biopsies, where only SRPL is highly correlated with outcome. The difference between SRWL and SRPL (total vs fibrotic matrix) increased with increasing age (p=0.004, r=0.42). Glomerulosclerosis was highly correlated with age (p<0.0001, r=0.61), SRWL (p=0.0002, r=0.53) and SRPL (p<0.0001, r=0.57). Summary. Glomerulosclerosis, interstitial fibrosis and interstitial matrix area increase with advanced age and are tightly correlated. This is contrary to findings in transplant biopsies, where inflammation leads to marked discrepancies between interstitial fibrosis and interstitial matrix area. Sirius Red staining of potential donor biopsies may be useful in assessing organ quality, especially in biopsies with a limited sample of glomeruli.

Poster Board #-Session: P246-II
Existing information about barriers to living donation in African-Americans (AA) is sparse. Utilizing focus groups of health care workers and interviews with previous living donors, we developed a questionnaire to assess the concerns of potential donors and recipients. The purpose of this study was to prospectively use our living donor organ survey (LDOS) with all potential donors and recipients referred for a transplantation evaluation to identify barriers to living organ donation in AA. The LDOS was pilot tested and revised. Sixty potential donors and recipients referred to the transplant center were asked to complete the LDOS and FACES II. The Brief COPE was administered only to recipients. A factor analysis of the content-specific items was completed. Coefficient alpha was calculated for each instrument.
Test-retest reliability coefficients (Kuder-Richardson 20) were also calculated for each instrument. Using two-sample Student's t-tests, demographics (race, age, occupation, education, income, marital status) were tested against each of the factors and the total score. Because of the multiple tests performed, we adjusted for the number of comparisons made (Bonferroni method). The factor analysis identified six factors from the potential recipient instrument, accounting for 83.44% of the variance, and five factors from the potential donor instrument, accounting for 81.71% of the variance. There were no significant differences for any of the factors on any of the demographic variables for potential recipients. However, potential AA donors differed significantly in relation to personal concerns. The survey captured concerns about making a big sacrifice, time away from family, feeling obligated to donate, living with one kidney, sexual activity and having children, surgical complications, scarring, and care from unfamiliar doctors. General concerns included religious beliefs about donating, improving the health of the recipient, and time away from work (p<0.0001 and p=0.0031, respectively). The mean scores for AA were significantly higher for both groups. Racial differences exist in the concerns of potential donors and recipients. Addressing these concerns by transplant professionals may increase the number of AA living donors.

Poster Board #-Session: P247-II
Background and Aims: Living kidney donors (LD) now outnumber deceased donors in the USA, and acceptance of a wider spectrum of LD is becoming more prevalent. We sought to examine the presence of pre-existing structural abnormalities in a recent cohort of LD kidneys and correlate the findings with relevant clinical indices. Methods: The following histologic analyses were carried out on biopsies taken intraoperatively from 44 LD kidneys: (a) assignment of Banff 97 "chronicity" indices (cg, ci, ct, cv, ah); (b) estimation of % tubulointerstitial fibrosis (%TIF) by a renal pathologist as well as by computerized digital analysis of Sirius Red-stained sections. The biopsies were divided into two groups: those for which cg+ci+ct+cv+ah was 0 or 1 and those for which cg+ci+ct+cv+ah was ≥2. Clinical data were derived from donor and recipient records. GFR was measured by iothalamate clearance. BP readings during the initial clinic visit and by Ambulatory BP Monitoring (ABPM) were separately recorded. Results: See table. Conclusions: (a) A subgroup of LD kidneys have mild chronic histologic abnormalities at the time of transplantation, which are primarily associated with older donor age. (c) We did not observe significant functional differences between LD kidneys with and without mild chronic histologic abnormalities. (d) %TIF was low in both subgroups of LD kidneys. (e) Correlation of the baseline histology of LD kidneys with subsequent graft biopsies, long-term graft function, and donor clinical follow-up will be important for optimization of the practice of LD kidney transplantation.

Since 1998, we have had 360 enquiries about NDD, performed detailed medical and psychosocial evaluations of 42 potential NDDs, and carried out 22 NDD transplants. As our program has evolved, we have noted differences for NDDs and have made changes in our practice: 1) We have insisted that potential NDDs travel to our center for evaluation (unlike directed donors, who may be evaluated at a local center). NDDs, like all donors, must pay their travel costs.
After an early experience with positive viral serology (which was a contraindication), we began to require that the potential NDD obtain an H&P and blood work (including electrolytes, CBC, and hepatitis B, C, and HIV serologies) prior to coming for evaluation. 2) We require that the NDD and recipient remain anonymous (to each other), so we assigned each donor an alias at the time of admission for surgery. This practice led to logistic problems and was abandoned. 3) We had a minimum age of 18 for NDDs (similar to directed donors). The few inquirers who have been under 21 had voiced their parents' concerns about donating, or had avoided telling them altogether for fear of disapproval or an angry response. This has raised concerns about family stress or lack of support post donation. We have now raised the minimum age to 21: any volunteers under 21 are informed about our reasoning, provided with our donor educational materials, and welcomed to re-contact us at 21. 4) One advantage of NDD (in contrast to directed donation) is that there is no family pressure to donate. In fact, we have noticed in a few cases that family and friends have tried to dissuade the potential donor. In the majority, these issues were discussed and resolved, and at the time of surgery family and friends were at the hospital and were supportive. However, in 3 cases we learned that spouses were upset with the donation at some point in the process (1 during surgery because of their own limited support, 1 because the donor was out of work longer than expected, and 1 who we later learned was opposed all along). As a consequence, we have intensified our efforts to involve the family. We encourage family to come to the initial screening (and information-providing) interview, but we recognize that this is not always feasible. We have made a video that is mailed to prospective donors (NDD and directed) that details the process and risks. And we counsel prospective donors on the importance of discussing donation with the family. These issues will be discussed in detail.

All tests were 2-tailed and significance was set at P≤.05. When compared to those who consented to donation, next-of-kin who refused donation were more likely to report that financial incentives would have made a difference in their donation decision, that they personally would be more likely to donate organs if financial incentives were available, and that families should retain veto authority over donation decisions. Those who consented to donation (vs. those who did not) reported more favorable attitudes toward presumed consent and toward proceeding with procurement with or without family permission if the deceased's donation intentions are known. Conclusions: Next-of-kin who have been faced with a donation decision have an important voice in the debate on strategies to increase organ donation. Findings from this study suggest that there may be important differences between next-of-kin donors and non-donors in their attitudes toward financial incentives, presumed consent, and whether families should be permitted to override the donation wishes of the deceased.

INTRODUCTION. Hyperglycemia following pancreas transplantation (PT) is typically due to 1) chronic rejection (CR) of the pancreas, 2) insulin resistance associated with obesity or immunosuppressive medication (calcineurin inhibitors/steroids), or, rarely, 3) recurrence of type 1 diabetes (T1D).
While recurrence of T1D is unusual and occasionally associated with islet cell auto-antibodies, the prevalence of islet cell auto-antibodies following PT is not known. MATERIALS AND METHODS. Since 1990, over 250 SPK transplants have been performed at this center. Over the past 1½ years, five SPK recipients (5-9 years following transplantation) have become hyperglycemic, associated with prior development of islet cell auto-antibodies and loss of insulin-secreting cells on PT biopsy (4/5), while maintaining pancreatic exocrine function (urine amylase) and kidney transplant function. In addition, we have identified 28 other SPK recipients who have become hyperglycemic, as well as 14 patients with normal glucose tolerance, and evaluated them for islet cell auto-antibodies. RESULTS. Anti-GAD65 and anti-IA2 antibodies were assessed retrospectively in these 42 patients:

Group                                      GAD   IA-2   Either Ab
Hyperglycemic: Pancreas CR (n=9)            5     6     7 (78%)
Hyperglycemic: T1D recurrence (n=7)         5     4     5 (71%)
Hyperglycemic: Kidney-pancreas CR (n=4)     0     1     1 (25%)
Hyperglycemic: T2D/IGT (n=13)               2     1     2 (15%)
Normoglycemic: NGT (n=14)                   3     0     3 (21%)

Of the 33 patients with hyperglycemia, 15 (45.4%) were positive for one of the islet cell auto-antibodies. The data show a higher percentage of islet cell auto-antibodies in recurrence of T1D and CR of the pancreas transplant than in the other 3 groups (CR of the kidney-pancreas, T2D/impaired glucose tolerance (IGT), and normal glucose tolerance (NGT)). Therefore, in recipients of pancreas transplants, islet cell auto-antibodies may mark the presence of an auto-immune response in those patients with clinical evidence of T1D recurrence and, surprisingly, CR.

Background. Despite recent improvements in clinical islet transplantation outcome, the current isolation methods have not produced consistent islet yields, and several islet preparations could not be used for clinical transplantation. Many factors could limit the yield of the currently employed islet isolation procedures. A significant number of islets could be lost during pancreas digestion, islet purification and culture. There is a need to develop strategies that can reduce islet damage and improve islet yields. Vitamin E (VitE) and nicotinamide (NA) have cytoprotective and antioxidant properties that can be beneficial to islets. In this study, we investigated the effect of adding VitE and NA to the islet processing medium used during human islet isolation. Methods. 81 pancreata were processed using a modification of the automated method and continuous density gradient purification. Pancreata were divided into 4 groups. Group I (n=19): islets were processed using standard isolation medium, without VitE and NA, following pancreas preservation with UW alone. Group II (n=14): VitE and NA were added to the medium, following preservation with UW alone. Group III (n=10): islets were processed using the same medium as Group I, with preservation by the two-layer method (TLM). Group IV (n=38): islets were processed using isolation medium with both vitamins, following pancreas preservation with TLM. Results. There were no significant differences in donor-related factors (i.e. age, body mass index, pancreatic weight, etc.) between groups. In Groups I and II, where the pancreas was preserved in UW alone, the addition of VitE and NA in Group II resulted in a significant increase in islet yields (

Background/Purpose: Islet transplantation has become more popular for the treatment of type 1 diabetes, and the shortage of donor pancreata has become more apparent. If we were able to use living donor pancreata, the shortage would be alleviated.
The critical issue for using the living donor pancreas is the islet yield obtainable from only part of the pancreas. Currently the whole pancreas is used for islet isolation; however, the pancreatic head is known to be unsuitable for islet isolation due to its complex anatomy. The purpose of this study was to evaluate whether the pancreatic tail could provide a high enough islet yield for transplantation. Methods: After obtaining human pancreata, islets were isolated from the head part (N=20, head group), the tail part (N=23, tail group) or whole pancreata (N=24, whole group). Islets were isolated by enzymatic digestion followed by purification. We compared islet yield, purity, viability using AO/PI staining, and the stimulation index on a glucose challenge test. Results: Fifteen of 20 cases (75%) in the head group, all cases (100%) in the tail group and 23 of 24 cases (96%) in the whole group were successfully completed, and islet isolation was significantly more difficult to complete in the head group than in the tail (P<0.02) and whole (P<0.05) groups. Further analyses were performed on completed cases. The islet yield per gram pancreas was significantly higher in the tail group compared to both the head (P<0.001) and whole (P<0.01) groups (

Introduction: Islet isolation has steadily improved over the past few years, aided by technical modifications and the introduction of new and defined enzyme mixes. Although islet yield is an important consideration, it is crucial that isolated islets be shown to demonstrate in vivo function. We examined factors influencing islet recovery and in vivo function, with emphasis on donor-related factors. Methods and Results: Islets were isolated either by mechanical shaking (n=95) or hand shaking (n=123) from human donor pancreata, the majority of which were unsuitable for whole organ transplant. Islet yield was reported as islet equivalents per gram pancreatic tissue. Donor factors were collected for each isolation, and correlation statistics were performed between donor variables and islet yield. Data analysed in 95 human isolations indicated a differential effect of enzyme mixes on yield, with Collagenase P digestion most suitable for increased ischemic time (R² = 0.1; P<0.08), Liberase for small donor pancreas size and elevated pre-procurement glucose (P<0.05), and Serva for female donor gender (R² = 0.17; P<0.06). In the 123 hand-shaking isolations, regression analysis with donor factors only showed that age correlates with islet equivalents per gram (R² = 0.05; P<0.03), while regression analysis with isolation as well as donor factors showed that hypertension and length of digestion are the main factors correlating with islet equivalents per gram (R² = 0.1; P<0.01). Islets from mechanical isolations (n=20) and hand-shaking isolations (n=64) were further tested by transplantation under the kidney capsule of immune-deficient NOD-SCID mice following short-term culture (≤7 days). In vivo function was assessed by measuring the production of human insulin and C-peptide. Age was the only donor factor to correlate with in vivo function: young donor age (33.3±18) was associated with better function than older age (56.4±6.6; p<0.001) in the mechanical isolation group. In the hand-shaking isolations, using the Wilcoxon two-sample test, cold ischemia time (CIT) was the most significant factor, with mean CIT 11.4±5.6 (function) vs. 14.9±5.

Background Pancreatic islet cell identity, purity, and mass are central components of islet product characterization that may predict insulin-independence in islet recipients.
To analyze the cellular composition of islet products, we modified current immunocytostaining techniques by using microporous polyethylene terephthalate membranes instead of glass slides for cell adhesion. All cells were captured on the membrane without loss; cell detachment during the staining process was avoided and cellular morphology was preserved. Method Islets were prepared from human cadaver donor pancreases using controlled perfusion, Ricordi digestion and COBE purification. We examined the cell composition of islet products by dissociating islets into single cells, fixing the cells on membranes, and applying antibodies to identify β, α, δ, PP, acinar and ductal cells. After staining, photomicrographs were taken using a digital camera and the images analyzed by computer. The ratio of positive cell # to nuclei # was calculated. Total DNA of products was measured to determine the # of β cells (= DNA (pg) / 7 (pg/cell) × % of β cells; a worked example appears below). Islets cultured for 2 days prior to transplant (tx) were infused into the portal vein of Type 1 diabetic patients. We assessed 16 consecutive tx's. Results Ratio of β, α, δ, PP, acinar, ductal, and other cells was 29±6, 17±9, 12±5, 5±2, 23±6, 9±9, and 6±7% respectively (mean±SD); islet purity was 62±12%; DNA content/IE ranged from 4.7 to 31 ng; # of β cells/IE ranged from 150 to 1250. Eleven single-donor islet recipients achieved sustained insulin-independence (I-ID). We retrospectively analyzed correlations of recipient outcome with the # of β cells or # of IE transplanted. Insulin-dependent (I-D) and I-ID patients received 3.1±1.1 and 6.6±2.6 ×10⁶/kg BW of β cells, with ranges of 1.2-4.2 and 3.7-11.0 ×10⁶/kg BW respectively, showing minimal overlap (n=2, p=0.002) between the 2 groups. However, I-D and I-ID patients received 6.6±1.5 and 7.6±1.7 ×10³/kg BW of IE, with ranges of 4.5-8.0 and 6.0-11.5 ×10³/kg BW, showing more overlap (n=7, p=NS) between the 2 groups. The rate of insulin-independence increases linearly with increasing β cell number: <3 (0%), 3-5 (50%), and >5 ×10⁶/kg (100%), but not linearly with IE number: <5 (0%), 5-6 (50%), 6-7 (80%), 7-8 (50%), and >8 ×10³/kg (100%). Conclusion The # of transplanted islet β cells seems to be more predictive than the # of conventional IE for sustained insulin-independence in human recipients.

Poster Board #-Session: P258-II
Rationale: The inability of islet tissue to divide poses a major limitation on clinical islet transplantation, given the scarcity of donor tissues and the variable islet mass acquired per isolation. The genetic engineering of human islets to express telomerase (hTERT) may trigger islet cell division. This application may have potential for islet tissue expansion in vitro and the attenuation of replicative senescence via telomere maintenance. Methods: Telomerase expression and telomere length were determined in fresh and frozen human islets by Western blot and a modified flow-FISH procedure. Western blotting was achieved with mAb (mouse) detection based on a known amount of loaded protein. Flow-FISH was performed using a FITC-labelled telomere-specific PNA probe that hybridized only to the telomeric DNA region of dissociated human islet cells. The resulting fluorescence signal intensity was standardized into molecules of equivalent soluble fluorochrome (MESF) units (M), which directly correlate with telomere length. Telomerase activity was determined using a telomeric repeat amplification protocol (TRAP). A green fluorescent protein reporter gene (EGFP) was delivered into islet tissues using the lipofection reagents DOTAP and FuGene 6.
Both transfection efficiency and the glucose stimulation index were evaluated in hTERT-transfected islets for clinical potential and feasibility. Results: Telomerase gene expression was not detected in either fresh or frozen human islets, but was detected in samples containing exocrine tissue, using a monoclonal hTERT antibody. Telomerase activity in dithizone-stained, handpicked islets from young (8 yrs) and old (64 yrs) human donors was also undetectable. In vivo telomere shortening was found in human islets of advancing age. Young pancreas donors (7-

After the Edmonton experience, islet transplantation is a real alternative to insulin therapy in the treatment of type 1 diabetes. The aim of our study was to analyze the advantages and risks of this procedure. 14 intraportal islet transplants were performed: 10 according to the Edmonton protocol (daclizumab, sirolimus and tacrolimus), and 4 after a 3-6 month period of pretransplant treatment with a statin and sirolimus, followed by the Edmonton protocol. Infusions were: 3 in 2 cases, 2 in 6, and 1 in 6. Patients received 8,210.4 ± 1,076.7 islet equivalents/kg body weight. The rate of insulin independence was 50% (duration 5.7±1.6 months); 3 patients were insulin free for more than 6 months (maximum duration: 14 months). 50% of patients reduced their exogenous insulin requirement to <50% of the pre-transplant dose. Metabolic parameters were considered at the following times: pre-transplantation, after 2 weeks, and after 1, 3, 6 and 12 months. C-peptide (ng/mL): 0.1±0, 1±0.

Currently there are over 30 transplant centers in the world focusing their efforts on the challenges and methods of islet cell transplantation. As the field of islet transplantation matures and the number of islet transplants performed increases, detailed analyses of factors that predict patient and graft survival are needed. In response to the need for more complete information in the field, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) is sponsoring the Collaborative Islet Transplant Registry (CITR). The mission of CITR is to expedite progress and promote safety in islet/beta cell transplantation through the collection, analysis and communication of comprehensive and current data on all islet/beta cell transplants performed in North America. Compiling and analyzing data from all transplant centers in North America will accelerate the identification of both critical risk factors and key determinants of success, and thereby guide transplant centers in developing and refining islet/beta cell transplant protocols, leading to an advancement of the field of islet transplantation. Participation in CITR is voluntary; over 22 transplant centers have been invited to join, with 12 activated centers currently contributing to the effort. All islet transplants performed in North America since January 1, 1996 are planned to be captured by the CITR database. Through an electronic, internet-based data capture system, quality control procedures, and the minimization of duplicate efforts at the transplant center, the most relevant and succinct information is entered. From these data a comprehensive report will be published annually. In addition, special analyses will be performed and published periodically. To date, over 83 islet transplant recipients have been entered in the CITR database and information on over 133 processed pancreata has been reported to the Registry. Data from the first Annual Report will be presented.
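A minimal worked instance of the β-cell mass calculation from the islet product characterization abstract above ("# of β cells = DNA (pg) / 7 (pg/cell) × % of β cells"). The inputs are hypothetical values chosen from within the ranges that abstract reports (DNA content 4.7-31 ng/IE; β-cell fraction 29±6%):

% Assumed inputs (hypothetical, within the reported ranges):
%   DNA content per islet equivalent (IE): 10 ng = 10,000 pg
%   beta-cell fraction: 29%
\[
\#\,\beta\text{ cells/IE} \;=\; \frac{10{,}000\ \text{pg}}{7\ \text{pg/cell}} \times 0.29 \;\approx\; 414
\]

At that composition, a dose of 7×10³ IE/kg would correspond to roughly 414 × 7,000 ≈ 2.9×10⁶ β cells/kg, near the reported insulin-dependent group mean of 3.1±1.1 ×10⁶/kg; this illustrates how two products with the same IE dose can deliver very different β-cell masses.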
Steroid-free immunosuppression with sirolimus and tacrolimus is the hallmark of the successful "Edmonton protocol" for islet cell transplantation. However, sirolimus has been associated with hyperlipidemia and hypertension, and tacrolimus can induce hypertension; these are two major cardiovascular risk factors that are especially important in diabetic patients. The goal of this study was to assess the impact of switching immunosuppression to sirolimus and tacrolimus in type 1 diabetic patients, recipients of renal allografts, prior to islet cell transplantation. Twenty-three type 1 diabetic patients (mean age 45±7 years, duration of diabetes 33±7 years, time since kidney transplant 7±6 years) were switched from cyclosporine-based immunosuppression to sirolimus and tacrolimus. No significant changes were observed in body weight, blood pressure, serum creatinine, fasting lipids or hemoglobin A1c levels. Six patients underwent successful islet cell transplantations (four off insulin after the second transplant and two on reduced insulin doses after the initial transplants), and 7 are currently on the waiting list. Five patients were found ineligible for islet cell transplantation, 1 opted for a pancreas transplantation, 2 could not tolerate sirolimus, one developed acute rejection and is back on dialysis, and one patient had sudden death. The percentage of patients with blood pressure at goal (130/80 mmHg) was 35% before and 52% after the change in immunosuppression, without any obvious changes in antihypertensive regimens and with 70% of patients on anti-hypertensive drugs. Forty-three percent of patients had LDL cholesterol at goal (100 mg/dl) and 87% had LDL <130 mg/dl before changing immunosuppression, versus 52% and 95%, respectively, after the change. Thirty percent of patients were on lipid-lowering drugs at baseline and 50% while on sirolimus and tacrolimus. In conclusion, switching type 1 diabetic patients with renal allografts from a cyclosporine-based immunosuppressive regimen to sirolimus and tacrolimus is not associated with worsening renal function or cardiovascular risk factor profile, although there was an increased need for lipid-lowering drugs. This immunosuppression regimen was tolerated by most patients and allowed successful islet cell transplantation in the six patients who have been transplanted so far.

The Edmonton protocol has significantly increased the rate of insulin independence after islet transplantation. This was achieved in a population of patients with brittle diabetes and preserved kidney function. The purpose of this study is to present the results of a series of patients with type 1 diabetes and an established kidney graft who received islet-after-kidney (IAK) transplants with an immunosuppressive regimen similar to the Edmonton protocol. Five patients (4 female/1 male) with a median age of 45 (30-54) received a first percutaneous islet infusion between August 2002 and April 2003. They had been transplanted with a kidney for terminal diabetic nephropathy 3.5 to 23 years earlier (median 13), and had a creatinine clearance of 32-69 ml/min (median 46) at the time of islet transplantation. They received a total of 10,045-18,134 islet equivalents (IEQ)/kg body weight (median 12,559), isolated from 2 donors each. Immunosuppression was switched to sirolimus (12-15 ng/ml) and tacrolimus (4-6 ng/ml) with daclizumab induction at islet transplantation, with slow weaning of any steroids from the time of waitlisting. All five patients became insulin-independent.
Three patients are still insulin-free 8, 9 and 11 months after transplantation, and 2 went back on insulin 3 and 8 months post-transplant. Islet function was associated with improved metabolic control in all patients, as assessed by a steady decrease of HbA1c (median 8% pre-Tx vs 6.6% at latest follow-up) and fructosamine (median 346 vs 277). One patient lost islet and kidney graft function 9 months post-transplant after immunosuppression withdrawal for pneumonitis. One patient developed transplant glomerulopathy. Two additional patients developed microalbuminuria. Immunosuppression side effects included dyslipidemia (N=4), mouth ulcers (N=3), polyarthritis (N=1) and pneumonitis (N=1). The steroid-free, sirolimus/tacrolimus-based immunosuppressive regimen was successfully applied to a series of IAK recipients receiving a total islet mass >10,000 IEQ/kg. Particular attention must be paid to kidney function in this patient population.

Introduction: The Edmonton protocol, a steroid-free immunosuppressive regimen including sirolimus, low-dose tacrolimus and daclizumab, has greatly improved the outcome of islet transplantation. This regimen is believed to exhibit limited nephrotoxicity. The aim of our study was to assess its real impact on renal function after ITA (islet transplantation alone) and IAK procedures. Materials and methods: From July 2002 to April 2003, 10 patients (female/male: 5/5, median age: 41 years) received their first islet infusion in 5 ITA and 5 IAK procedures, the latter 12.6 years (3.5-22.7) after kidney transplant. A median of 12,885 IEQ/kg were injected in one, two or three infusions (2, 6 and 2 cases). Eight patients achieved insulin-independence, 5 being currently free of insulin. Data were prospectively collected over a median follow-up period of 10 months. Renal function was monitored by measured or calculated creatinine clearance (CCl), proteinuria, micro-albuminuria (MA) and, when indicated, by renal biopsies. Results: Four patients (2 ITA, 2 IAK) experienced a marked decrease of CCl (>20%) by 6 months post-transplant. One additional patient (IAK) had decreased CCl by the same extent at the end of follow-up. MA increased in 7 cases. Two patients (1 IAK, 1 ITA) already had proteinuria at the time they were put on the waiting list, including 1 IAK patient with already impaired kidney function (CCl = 36). The two ITA patients whose CCl decreased were the oldest of the series (ages 56 and 58). Two of the 3 IAK patients whose CCl decreased had an established graft for >15 years. One experienced humoral rejection 5 months after the first islet infusion. There was no obvious relationship between the decrease of CCl and insulin-independence or immunosuppression blood levels. Discussion: In our experience, impairment of renal function is not uncommon after ITA and IAK using an Edmonton-inspired immunosuppression protocol. Age over 55 years, a long-established kidney graft, and borderline renal function appear to be potential risk factors.

Poster Board #-Session: P264-II
Background: Pancreas txs have demonstrable efficacy in enabling recips with type 1 DM to achieve insulin independence. The basis of treatment lies in the allograft providing insulin in patients with an absolute deficiency. What is less clear is the ability of pancreas txs to improve glucose control in type 2 diabetic patients, who produce insulin but are unable to use it effectively. The objective of this study was to examine the effectiveness of pancreas txs in providing long-term glucose control in patients with type 2 DM.
Methods: A retrospective review of all patients with type 2 DM post-pancreas tx at our center between 1994 and 2002 was performed. We used guidelines from the American Diabetes Association to appropriately classify patients with type 2 DM (vs. type 1 DM). Results: A total of 17 recips fitting the study criteria were identified. The mean age at diabetes onset was 35.7 yrs (range=19-48). Most patients had 1 or more secondary complications related to their diabetes: retinopathy (94%), neuropathy (76%), and nephropathy (65%). The mean age of recips at the time of tx was 52.5 yrs (range=38-65); mean duration of diabetes was 16.8 yrs. At the time of tx, 4 (24%) were on oral hypoglycemics alone. The remaining 13 (76%) were on insulin therapy, with a mean daily dose of 64 units. Prior to initiating insulin therapy, these patients had been on oral hypoglycemics for a mean of 9 yrs. Of the 17 txs, 7 (41%) were SPK, 4 (24%) were PAK, and 6 (35%) were PTA. Most recips were male (65%). There was 1 perioperative death due to aspiration. All other recips became euglycemic and had a functioning graft at 1 yr post-tx (patient and graft survival=94%). With a mean follow-up of 3 yrs since tx, patient survival is 14/17 (82%). The 2 additional deaths during follow-up were due to sepsis (n=1) and suicide (n=1); both occurred just after 1 yr post-tx, and both patients had a functioning graft at the time of death. Of the 14 recips currently alive, 12 remain euglycemic without insulin or oral hypoglycemics. One patient (6%) was started on oral hypoglycemics 2 yrs post-tx. Another patient was started on insulin 2 yrs post-tx at a daily dose of 38 U; this patient had an episode of acute rejection 1 yr post-tx, which rendered the graft nonfunctional. Conclusion: These findings suggest that pancreas txs can provide excellent glucose control in recips with type 2 DM. All recips with a technically successful tx were rendered euglycemic. Long-term results were comparable to those seen with tx in type 1 diabetics.

Introduction. Persistent hyperglycemia following pancreas transplantation is commonly attributed to graft loss, but may occur despite an otherwise well-functioning pancreas allograft. The aim of this study was to define the incidence and possible causes of persistent hyperglycemia after pancreas transplantation. Methods. We retrospectively studied all patients (n=88) undergoing pancreas transplantation at our institution between 1/2001 and 1/2003. Persistent hyperglycemia with a functioning pancreas graft was defined as: 1) the need for exogenous insulin therapy to achieve a non-diabetic fasting plasma glucose (<126 mg/dl) and 2) evidence of graft function, i.e. an increase in serum C-peptide from pretransplant levels and a normal allograft ultrasound. Data were analyzed using Student's t-test and the Chi-square test and are expressed as median (25%-75% inter-quartile range). Results. Median follow-up was 25 months (18-30 months). Actual 1-year patient survival was 100%. Of the 88 grafts, 4 were lost to immediate post-operative thrombosis, 2 to late thrombosis, and 5 to late patient death with a functioning graft. Of the remaining 77 patients, 57 (74%) remained normoglycemic, 6 (8%) were hyperglycemic for less than 1 month and 14 (18%) required insulin for longer than 1 month despite evidence of graft function. Hyperglycemia occurred in 23% (16/70) of Type 1 diabetics and 57% (4/7, p=0.0489) of Type 2 diabetics.
Other factors associated with hyperglycemia included high pre-transplant insulin dose, high BMI and increased rejection episodes (p<0.0001, p=0.0005 and p=0.0002, respectively). With respect to insulin dose, hyperglycemia developed in 100% (6/6) of patients requiring >100 U insulin/day pretransplant, in 83% (10/12) of those requiring >75 U/day, and in only 15% (10/65) of those requiring ≤75 U/day (p<0.0001). Despite hyperglycemia, glucose control was significantly improved after transplant: the median hemoglobin A1c for patients developing hyperglycemia was 8.0 (6.7-8.7) pretransplant compared to 6.6 (5.5-7.4) following transplant (p=0.0294). There were no significant differences between the groups with regard to age, gender, pre-transplant hemoglobin A1c, steroid doses or tacrolimus concentrations at 1 month and 1 year. Conclusions. Hyperglycemia in patients with a functioning pancreas allograft is primarily related to high pretransplant insulin requirements. Nevertheless, glycemic control is significantly improved after pancreas transplantation.

Introduction: Diabetic patients undergoing simultaneous kidney-pancreas transplantation (SKPT) have a relatively high rate of CMV seronegativity at the time of transplant, placing them at greater risk for primary CMV exposure. The purpose of this study was to determine whether donor (D) and recipient (R) CMV sero-pairing at the time of SKPT subsequently influences outcomes in a large cohort of patients with long-term follow-up. Methods: Between 1/1/97 and 12/31/99, 1025 pancreas transplants were performed at SEOPF member institutions and reported to the registry, including 746 SKPTs. CMV serology and survival data were available for 740 SKPTs, including 723 primary transplants. For the purposes of this study, retransplants were excluded and 4 groups were defined based upon D and R CMV sero-pairing: D+/R-, N=203 (28%); D+/R+, N=206 (28%); D-/R+, N=156 (22%); and D-/R-, N=158 (22%). Patient and graft survival for the study groups were computed by Kaplan-Meier estimates, and tests of equality of survival curves were performed utilizing both the Log-Rank and Wilcoxon test statistics. A multivariate Cox proportional hazards model was fit to adjust for variables known or suspected to impact patient and graft survival. Logistic regression was used to examine the effect of CMV sero-pairing on rejection. Results: A total of 56% of Ds were CMV+ and 50% of Rs were CMV-. D serostatus was not, but R serostatus was, a significant independent risk factor for patient and kidney, but not pancreas, graft survival in the uncensored analysis. The 5-year survival rates were: patient (90% R- vs 80% R+, p=.01); kidney (81% R- vs 71% R+, p=.008); and pancreas (73% R- vs 64% R+, p=.12). When examining the CMV D/R groups in both univariate and multivariate fashion and adjusting for other covariates, CMV sero-pairing was not an independent risk factor for death, graft loss, or rejection in either censored or uncensored analyses (significant risk factors included donor age and cold ischemia time). However, when considering CMV sero-pairing as a binary variable (D-/R- vs all other D/R groups), the 5-year uncensored survival rates were as follows: patient (92% D-/R- vs 83%, p=.03); kidney (85% D-/R- vs 74%, p=.006); and pancreas (80% D-/R- vs 65%, p=.01). Conclusion: CMV seronegativity is present in half of diabetic patients at the time of SKPT, and protective CMV seronegative matching confers a long-term survival advantage.

Poster Board #-Session: P269-II
This study compared Thymoglobulin (THY) and daclizumab (D) induction.
Two groups were retrospectively studied: Group 1 (G1) (n=48, AA=14), THY/tacrolimus (T)/mycophenolate mofetil (MMF)/prednisone (P); and Group 2 (G2) (n=24, AA=5), D/T/MMF/P. THY (1.5 mg/kg) was given intraoperatively and for six consecutive days thereafter, with a Solu-Medrol taper. D (1 mg/kg) was given intraoperatively and on POD 5, with a Solu-Medrol taper. Maintenance immunosuppression: T (trough 7-10 ng/ml); MMF (2 g/day); and P 10 mg/d. Donor and recipient demographics (age, gender, ethnicity, ABDR match, and cold ischemic time) were similar between groups. Results: Overall patient, renal, and pancreas 1-yr survival was 100%, 99%, and 98%, respectively, with no differences between the groups.

In T1DM patients with uremia who underwent kidney transplantation, both impaired glucose homeostasis and hypertension may significantly affect graft survival. The aim of our study was to assess whether vascular or metabolic markers of kidney dysfunction were detectable non-invasively in vivo in transplanted patients, depending on whether they received kidney-alone (K) or kidney-pancreas (KP) transplantation. 20 KP and 15 K patients, matched for anthropometric features and plasma creatinine but with different HbA1c levels (6.4±0.7% and 8.6±0.6% in KP and K respectively, P<0.05), underwent two procedures: 1) evaluation of renal blood flow with Magnetic Resonance Quantitative-flow (MR-Qflow) in the resting state and after i.v. L-arginine infusion, to assess renal vasodilatory reserve; 2) assessment of high-energy phosphates (HEP) in the resting state using localized 31P-MR spectroscopy of the transplanted kidney. Both MR-Qflow and MRS protocols were performed using a 1.5T system (Gyroscan Intera; release 8; Philips). Basal blood flow was similar in KP and K, and it increased significantly (+12.0±8.1%) after L-arginine administration in KP (p=0.02) but not in K (-1.3±8.6%). At the same time, a trend toward a reduced inorganic phosphate (Pi)/βATP ratio was evident in K (1.17±0.14) in comparison with KP (1.62±0.30). In summary, an L-arginine-dependent renal vasodilatory response was demonstrated in KP patients using the MR-Qflow technique and was blunted in K patients. Altered HEP metabolism in the transplanted organ was also evident in K in comparison with KP patients using 31P-MRS. In conclusion, correction of diabetes along with uremia in KP was associated with better vascular and metabolic features of the transplanted kidney than in K; both MR-Qflow and MRS techniques may be useful tools to reveal early alterations of renal function in transplanted organs.

Background: Anastomotic leaks after PTx are seen in 3-5% of recips, usually occur early post-tx, and are typically due to technical factors. However, leaks may also occur late after tx. We studied such events to determine predisposing factors and management. The study population consisted of all PTx recips diagnosed with a late leak, defined as one occurring more than 3 months after the tx. Excluded were recips with an early leak or a leak immediately after enteric conversion. Results: Between 1994-2002, a total of 25 PTx recips were identified with a late leak (incidence=3.5%). Mean recip age was 40.3 yrs; mean donor age 31.3 yrs. Category of tx was as follows: SPK (n=5, 20%), PAK (n=10, 40%), and PTA (n=10, 40%). The majority of patients were bladder drained (n=23, 92%); only 2 were enteric drained. The mean length of time from tx to presentation of the leak was 20.5 months (range=3.5-74).
A direct predisposing event occurring in the 6 weeks preceding the leak was identified in 10 (40%) of the recips. These factors included: a biopsy-proven episode of acute rejection (n=4, 16%; diagnosed at a mean of 17.8 days prior to the leak), a history of blunt abdominal trauma (n=3, 12%; which occurred at a mean of 3.3 days before the leak), and CMV infection (n=3, 12%; diagnosed at a mean of 19.3 days before the leak). Non-operative management (Foley catheter placement with or without percutaneous abdominal drains) was the initial treatment in 14 (56%) of the recips. This was successful in 9 (64%) of these cases; the other 5 required surgical repair after failure of conservative management, at a mean of 10 days after placement of the Foley. 11 of the recips had surgical intervention as their initial treatment: repair in 9 and pancreatectomy due to severe peritonitis in 2. After appropriate management of the initial leak (conservative or operative), 5 (20%) of the recips had a recurrent leak, at a mean of 5.6 months after the initial leak. All 5 of these patients ultimately required surgical repair. Blunt abdominal trauma or acute rejection was noted in the period leading up to the recurrent leak in 2 of these recips. Conclusions: Late pancreatic leaks are not uncommon and may be more common with bladder-drained grafts. Almost half of the cases have some obvious preceding event, such as acute rejection, abdominal trauma, or CMV infection, that predisposes to the leak. For stable patients with bladder-drained grafts, non-operative treatment will be successful in two-thirds of cases.

After adjusting for immunosuppression (use of induction therapy, initial calcineurin inhibitor used, mycophenolate mofetil use) and stratifying by transplant center, PAK was associated with a higher rate of kidney allograft loss, which was not statistically significant (Table). In adjusted models, PAK was associated with a trend toward a higher rate of the combined outcome of kidney allograft loss or patient death. Conclusion: Among those eligible for an SPK and receiving a KA, our study failed to confirm a benefit of a pancreas after a kidney transplant. Whether the trend toward an elevated risk of allograft loss or patient death in PAK is confined to the perioperative period needs further exploration.

Background: Long-term neurologic recovery and complications in children receiving a liver transplant for FHF are not well described. The purpose of this analysis was to determine whether there were long-term neurologic sequelae once these patients had recovered from their liver transplant. We also tried to determine the impact of this on long-term schooling in these children. Methods: We studied outcomes in pediatric patients receiving a successful liver transplant for FHF. Only patients transplanted after 1990 were included in this analysis. Excluded were recipients who lost their graft or died within the first 3 months after transplant. A total of 10 recipients fitting the study criteria were identified. Mean age at the time of transplant was 8.2 years (range=1-18). There were no documented neurologic abnormalities in these patients prior to the onset of FHF. Cause of FHF was Hep A (1), autoimmune (1), acetaminophen (1), and unknown (7). Seven of the 10 recipients had severe encephalopathy (Gr III or IV) at the time of transplant. Mean waiting time from presentation to transplant was 8.8 days (range=2-23 days). With a mean follow-up of 6.3 years (range 1-12 years), all 10 recipients are alive with functioning grafts.
Neurologic complications in the 10 recipients include persistent seizures in 2: in 1 recipient the seizures are well controlled medically, but the other has an ongoing severe seizure disorder and is now globally aphasic as a result. Of the other 8, 1 has major developmental and language delays, and another has mild cognitive impairment on formal neuropsychological testing. Another 3 recipients have had major behavior issues requiring psychological intervention. With regard to schooling, 5 are at an age-appropriate grade level, 2 are 2 grade levels behind, 1 is 1 grade level behind, 1 does not attend school due to neurologic impairment, and another left school after 2 years due to behavioral problems. Conclusions: Long-term major and minor neurologic problems are not uncommon in pediatric patients transplanted for FHF. These should be anticipated and adequately addressed in the long-term management of these patients.

Poster Board #-Session: P277-II
The success of pediatric liver transplantation (OLTxp) has improved greatly since its widespread application in the 1980s; however, it remains a technically challenging procedure, especially in the very small recipient. We report our experience in infants who weighed less than 5 kilograms at the time of OLTxp. Methods: A retrospective review of the medical records of all children weighing less than 5 kg who underwent OLTxp, transplanted from 1987-2003, was performed. Results: 17 patients were identified. The mean age at the time of OLTxp was 5.5±3.1 months, with a range from 19 days to 1.2 years. The mean weight was 4.21±0.76 kg, with the smallest child weighing 2.5 kg. The indication for liver transplantation was acute liver failure in 41%, biliary atresia in 35%, chronic liver disease in 18% and vascular malformation in 6%. At the time of transplantation, 65% of the patients were Status I, 24% were Status II and 11% were Status III. 10 patients received cadaveric reduced-size grafts, 4 received cadaveric whole organs and 3 received living-related left lateral segments. Vascular complications occurred in 2 patients (hepatic artery thrombosis (HAT, n=1), portal vein thrombosis (n=1)). The patient with HAT expired. Four patients died from sepsis. Two patients required re-transplantation for chronic rejection. Post-transplant lymphoproliferative disease (PTLD) occurred in 24% of the patients and was managed by a reduction in immunosuppression. The one- and five-year survival of these patients was 76% and 65%, respectively. In contrast, in our overall series (n=268 patients), one- and five-year survival was 87% and 82%, respectively. Conclusion: Liver transplantation can be performed successfully in very small infants, but at a higher mortality rate. More of these children required urgent transplantation. Sepsis, not technical complications, was the most common cause of poor outcome. Technical complications occur, but not at a greater rate than that experienced in all other patients undergoing transplantation.

Background. Post-transplant biliary complications occur in 15-35% of pediatric liver transplant recipients. The presence of multiple bile ducts in the graft has rarely been studied as a risk factor for biliary complications in pediatric liver transplantation. Aims. To review the risk factors for the occurrence of biliary complications and to investigate the impact of the presence of multiple ducts in causing biliary complications in pediatric liver transplantation. Methods.
106 children (39 whole allografts, 1 RL, 15 LL, 51 LLS) who underwent primary liver transplantation were included in this study. Patients were divided into two groups: those with a single bile duct (n=80) and those with multiple bile ducts (n=26). For accurate analysis, the number of bile ducts was taken as the number of biliary anastomoses, as described in the operative note. Biliary complication was defined as any deviation from the expected postoperative course caused by a problem in the biliary anastomoses. Conclusions. The presence of multiple bile ducts is an independent risk factor for biliary complications after pediatric liver transplantation. The risk of biliary complications is not associated with the size, anatomical segment or source of the graft, but with the number of bile ducts and the complexity of the biliary reconstruction.

Retrospective review of all children with IFALD who underwent ILTx in a single centre between 1998 and 2003. Patients and grafts surviving more than 1 month were included in the study. Results: 9 children (6 male), median (range) age 11.8 (5.5-14.7) months, with residual bowel length 30-80 cm underwent ILTx. Biliary reconstruction: duct-to-duct (Group I, n=4); duct-to-intestine without Roux loop (Group II, n=5). Surgical complications, biochemistry, radiology, histology, and patient and graft survival were compared and are summarised in the table. Summary: In Group II, 2/5 showed air in the biliary tree; one required re-exploration for a bile leak and one had cholangitis with mild fibrosis at 5 years, with normal liver function tests. Patient and graft survival in both groups was 100%, with normalization of liver function tests within 3 months of ILTx and a similar incidence of complications. The median time to discharge was 71 days, with 7/9 patients requiring partial PN at the time of discharge. Conclusion. Both types of biliary reconstruction are technically possible in children with IFALD undergoing ILTx, with good medium-term outcome.

Purpose: Although split liver transplantation for pediatric patients is increasingly accepted, few data are available on the impact of graft size mismatch on outcome. Methods: Two hundred and thirty-two cases of pediatric primary, isolated, orthotopic liver transplantation were reviewed for graft size matching. One hundred and seventy-one cases (74%) were split liver transplants. Twenty-nine cases were urgent, receiving intensive care preoperatively. Graft weight was available in 145 cases. Patients were categorized into four groups by graft-to-recipient weight ratio (GRWR): small grafts (SG; GRWR ≤ 1.5%, 27 cases), medium grafts (MG; 1.5% < GRWR ≤ 3.5%, 62 cases), large grafts (LG; 3.5% < GRWR ≤ 6%, 47 cases), and extra-large grafts (XLG; GRWR > 6%, 9 cases). Results: SG were associated with larger and older recipients, but not with older donors; LG and XLG from extra-large donors were more often used in urgent cases. Post-transplant bilirubin clearance was delayed in SG. Post-transplant surgical complications requiring relaparotomy were more frequent in SG, while no differences were observed in vascular, biliary and infectious complications or acute rejection rates among groups. Unrelated-death-censored graft survival in XLG (actuarial 75% at 1 year and 63% at 4 years) was significantly lower compared to that in SG (82% and 80%), MG (90% and 88%), and LG (84% and 82%), while differences were not significant when considering elective cases (83%, 91%, 97%, and 75% at 1 year in SG, MG, LG, and XLG, respectively).
Patient survival was similar among groups (actuarial 92%, 94%, 85%, and 81% at 1 year in SG, MG, LG, and XLG, respectively). A strong relationship was observed between GRWR and donor-to-recipient weight ratio (r=0.741). Conclusion: Pediatric split liver transplantation can be safely performed with a wide range of GRWR, and GRWR can be fairly predicted by means of the donor-to-recipient weight ratio (as an illustration, a hypothetical 300 g graft implanted in a 10 kg recipient gives GRWR = 300/10,000 = 3%, a medium graft by the definitions above). The use of SG led to a higher incidence of complications requiring relaparotomy, while the use of XLG resulted in significantly lower graft survival. The analysis of elective cases showed that SG and XLG can be used with results comparable to those obtained with MG and LG.

Background: Elevated levels of serum autoantibodies (AAB) after orthotopic liver transplantation (OLT) can be associated with late graft dysfunction. However, the impact of isolated elevation of AAB on otherwise stable pediatric recipients is unknown. This study aimed to evaluate the prevalence, clinical significance, and risk factors of isolated elevation of AAB in children after OLT. Methods: Children more than 6 months post-OLT with no history of hepatitis C or autoimmune disease were prospectively recruited. A single blood specimen was drawn to determine levels of ANA, anti-SM, anti-LKM-1, and pANCA. Past medical history, physical exam and laboratory data were obtained to identify potential risk factors and to evaluate the clinical significance of the presence or absence of serum AAB. Results: Sixty-eight children, median age 9 (range 1.6-18.2) years and median time since OLT 5 (range 0.6-15.1) years, were consented for the study. Eighteen (26%) patients had at least one positive (titer ≥1/20) AAB test, with ANA being the most common (n=10). AAB prevalence increased over time, with positive AAB in 39% of the patients >5 years post-OLT vs. 14% of the patients <5 years post-OLT. Time since OLT was identified as the only risk factor for development of AAB (RR 3.3, 95% CI 1.2-9). Patient demographics, reason for OLT, immunosuppressive regimens and infection history did not increase the risk for a positive AAB test. Patients with a positive AAB test did not have increased risk for liver dysfunction, occurrence of chronic or acute rejection, or other autoimmune phenomena. The prevalence of AAB increases over time in survivors of pediatric OLT. Isolated elevation of AAB was not a risk factor for graft dysfunction or onset of de novo autoimmune phenomena, and did not require immediate clinical intervention. Longer prospective follow-up is needed to determine whether the presence of AAB can forecast patients at risk for late graft dysfunction.

Introduction: Nephrotoxic side effects of CNI are of critical importance for clinical outcome and "quality of life" after pediatric transplantation. Long-term renal function after OLT in children under CNI-based immunosuppression is not well documented. Patients and methods: Longitudinal data of 31 children (19 male, 12 female) transplanted after 1990 were acquired. Mean age at transplantation was 6.3 ± 5.8 years (mean ± SD), with a range from 0.3 to 24.6 years. Mean follow-up was 5.2 ± 3.5 years (range 0.4 to 15.7). Multifactorial causes led to the liver failure. The immunosuppressive therapy included Neoral (CSA) or tacrolimus (TAC) plus azathioprine/mycophenolate mofetil with prednisone. 16 patients were treated with a TAC-based and 15 with a CSA-based immunosuppressive regimen.
Glomerular filtration rate was estimated using the Schwartz formula: calculated GFR (ml/min/1.73 m²) = k × height (cm) / serum creatinine (mg/dl); for example, with k = 0.55, a height of 110 cm and a serum creatinine of 0.5 mg/dl, the estimate is 0.55 × 110 / 0.5 = 121 ml/min/1.73 m². Results: Before transplantation, GFR was significantly elevated at 159.5 ± 38.2; after OLT, GFR decreased significantly within one month into the normal range in both treatment groups, CSA (105.8±18.1) and, similarly, TAC (108.4±32.8) (p<0.05 pre versus post). In long-term follow-up, GFR remained constant, at 109.6 ± 40.7 two years and 107.4 ± 31.4 six years after OLT. Mathematical modeling of renal function using an exponential one-compartment model (y = a·exp(bx)) did not demonstrate a decrease in GFR even for very long observation times, with a positive b for both treatment groups. This again indicates that in this patient group GFR was maintained. Conclusion: Liver insufficiency leads to glomerular hyperfiltration pre-transplantation. After OLT, GFR normalizes within weeks and remains stable even in long-term follow-up. Unlike in renal transplantation, the potential nephrotoxic side effects of CNI do not necessarily cause renal insufficiency, even over long observation times after pediatric liver transplantation. Whether lower trough levels of CNI in OLT versus renal transplantation, or other factors, explain these striking differences remains to be elucidated.

Background: Post-transplant lymphoproliferative disorders (PTLD) are a well-known complication following organ transplantation. Pediatric liver transplant (LTx) patients are at increased risk, since many of them are Epstein-Barr Virus (EBV) sero-negative at the time of transplant. Much data confirm the positive correlation between EBV viral load and PTLD. The standard of care in managing PTLD is reduction or withdrawal of immunosuppression; however, this is associated with a high incidence of mortality due to graft rejection. Recently, rituximab, a chimeric anti-CD20 monoclonal antibody (MoAb), has opened new treatment possibilities. Purpose: Although much experience has been accumulated on the short-term effects on EBV-associated PTLD, we sought to determine the long-term treatment outcomes with anti-CD20 MoAb and baseline immunosuppression. Methods and Results: From 1998-2003, 5 pediatric LTx patients (median age 73 months; range: 31-105 months) were diagnosed with EBV-associated PTLD. Diagnosis was made by the presence of bulky lesions or progressive disseminated disease, along with proliferation of B-cells expressing the CD20 antigen. At the time of diagnosis, the mean EBV viral load measured by PCR of peripheral blood was 14,076 copies/mcg of DNA. Patients were treated with 4 weekly intravenous doses of anti-CD20 MoAb at 375 mg/m². Immunosuppression was modified to maintain a tacrolimus or cyclosporine trough concentration between 3-6 ng/ml or 50-75 ng/ml, respectively. Clinical remission of PTLD was achieved 4 weeks after completing therapy, with an absence of B-cells and a decrease in EBV viral load to a mean of 73 copies/mcg of DNA. Two patients reached undetectable EBV titers 4 weeks after completion of anti-CD20 MoAb, and of these, 1 patient maintained a negative EBV titer. Five months after therapy, B-cells returned to a level of >1% in all patients, and we noted that EBV titers also increased, to a mean of 1,458 copies/mcg (range: 0-4246 copies/mcg) of DNA. All patients are alive and disease free after a median follow-up of 23 months (range: 10-32 months).
Background: Post-transplant lymphoproliferative disorders (PTLD) are a well-known complication following organ transplantation. Pediatric liver transplant (LTx) patients are at increased risk, since many of them are Epstein-Barr virus (EBV) sero-negative at the time of transplant. Much data confirm the positive correlation of EBV viral load and PTLD. The standard of care in managing PTLD is reduction or withdrawal of immunosuppression; however, this is associated with a high incidence of mortality due to graft rejection. Recently, rituximab, a chimeric anti-CD20 monoclonal antibody (MoAb), has opened new treatment possibilities. Purpose: Although much experience has accumulated on the short-term effects of this approach on EBV-associated PTLD, we sought to determine the long-term treatment outcomes with anti-CD20 MoAb and baseline immunosuppression. Methods and Results: From 1998-2003, 5 pediatric LTx patients (median age 73 months; range: 31-105 months) were diagnosed with EBV-associated PTLD. Diagnosis was made by the presence of bulky lesions or progressive disseminated disease along with proliferation of B-cells expressing the CD20 antigen. At the time of diagnosis, the mean EBV viral load measured by PCR of peripheral blood was 14,076 copies/mcg of DNA. Patients were treated with 4 weekly intravenous doses of anti-CD20 MoAb at 375 mg/m². Immunosuppression was modified to maintain a tacrolimus or cyclosporine trough concentration between 3-6 ng/ml or 50-75 ng/ml, respectively. Clinical remission of PTLD was achieved 4 weeks after completing therapy, with an absence of B-cells and a decrease in EBV viral load to a mean of 73 copies/mcg of DNA. Two patients reached undetectable levels of EBV titer 4 weeks after completion of anti-CD20 MoAb, and of these, 1 patient maintained a negative EBV titer. Five months after therapy, B-cells returned to a level of >1% in all patients, and we noted that EBV titers also increased, to a mean of 1,458 copies/mcg (range: 0-4246 copies/mcg) of DNA. All patients are alive and disease free after a median follow-up of 23 months (range: 10-32 months). No patient has experienced recurrence of disease, even with the resurgence of EBV viremia. Conclusion: Anti-CD20 MoAb along with a reduction in immunosuppression is effective in the long-term management of PTLD. However, long-term follow-up is necessary given the resurgence of EBV viremia and the potential for recurrence of PTLD.

BACKGROUND: Quantitative analysis of the Epstein-Barr virus (EBV) genome has recently been reported to be helpful for early identification of EBV viremia, which could reduce the risk of EBV infection and post-transplantation lymphoproliferative disorder (PTLD). AIM: To demonstrate the significance of serial monitoring of EBV genome load by real-time quantitative polymerase chain reaction (PCR) after pediatric living donor liver transplantation. METHODS: From Sep 1997 to June 2003, the EBV genome load in peripheral blood mononuclear cells was measured serially in a total of 25 consecutive pediatric recipients of living donor liver transplantation (LDLT) with a minimum of 6 months of follow-up. EBV PCR was measured every week for 8 weeks after LDLT and every 2-4 weeks thereafter. EBV DNA levels were expressed in copies/microg DNA. RESULTS: Eight of 25 (28%) developed URS (n=4), diarrhea (n=3), fever (n=2), severe stomatitis (n=1), liver dysfunction (n=1), and megaloblastic anemia (n=1) with significantly elevated EBV DNA levels, with peak values ranging from 531 to 8530 copies/microg DNA (mean 2843 ± 1971). The patients were treated by reduction or discontinuation of immunosuppressants and administration of acyclovir. The EBV DNA levels decreased in all these patients following the recovery from their symptoms. One case developed two episodes of PTLD: the first episode in the intestine and peritracheal lymph nodes with a peak EBV DNA level of 421 copies/microg DNA on POD 103, the second in neck and inguinal lymph nodes with a peak EBV DNA level of 2700 copies/microg DNA on POD 194. Both episodes were successfully treated with immunoglobulin and ganciclovir together with either reduction or discontinuation of immunosuppression. Seven of the 8 recipients with EBV infection, including the one with PTLD, were EBV-seronegative recipients of EBV-seropositive grafts. The other one with EBV infection was EBV-seronegative and received an EBV-seronegative graft. CONCLUSIONS: Serial quantitative analysis of the EBV genome load by means of real-time PCR appears useful for early detection and treatment of EBV infection as well as PTLD through adjustment of the immunosuppression level. The EBV-seropositive donor to seronegative recipient combination seems to carry a high risk of EBV infection and PTLD.
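The surveillance logic described here reduces, in practice, to flagging serially measured loads against an alert level; the sketch below is a minimal illustration with a hypothetical threshold of 500 copies/microg DNA (the abstract reports symptomatic peaks of 531-8530 copies/microg and does not define a universal cutoff).

```python
ALERT_LEVEL = 500  # copies/microg DNA; hypothetical, for illustration only

def weeks_above_alert(series):
    """Return the weeks at which the EBV genome load exceeds the alert level."""
    return [week for week, load in series if load > ALERT_LEVEL]

serial_loads = [(1, 0), (2, 40), (3, 120), (4, 650), (5, 2400)]  # (week, load)
print(weeks_above_alert(serial_loads))  # -> [4, 5]
```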
The SBS patients showed lower IS requirements than the EHBA group and a lower rejection rate. Conclusion: Isolated LTx in SBS is associated with an excellent medium-term outcome and good allograft tolerance. The anticipated problems with the bioavailability of oral IS were not evident.

previously had fractures. 10 were receiving vit D supplements and 7 supplemental feeding. Median height and weight z-scores were -0.7 (-4.8 to 3.0) and -0.2 (-3.3 to 1.8) respectively. Ionised calcium, phosphate and magnesium were low in 4, 7 and 3 children respectively, but no child had a raised serum PTH. Although 9/10 and 5/10 had low 25-OH vit D2 and 25-OH vit D3 respectively, 1,25-(OH)2 vit D was normal in all cases. Median BMD z-score for L2 to L4 lumbar spine was -1.2 (-2.5 to 0.02) and for total body was -1.0 (-1.61 to -0.34). Median BMAD z-score was -0.7 (-3.0 to -1.3). Conclusions: In children aged over 3 years undergoing OLT, bone mineral density and vitamin D and PTH status were better than anticipated. Optimal bone health can be achieved in children with chronic liver disease prior to OLT using vitamin D supplementation and ensuring optimal nutrition.

Background: Little is known about de novo glucose intolerance (GI) after pediatric liver transplantation (LT). Given the potential impact of GI on long-term survival and quality of life, it is important to identify those children at increased risk for GI, and the effects of GI on morbidity. Purpose: To determine the prevalence of de novo GI as defined by the need for insulin treatment within 30 days of transplantation (insulin group), and to define the prevalence of GI at one year post-LT as defined by fasting blood sugar (FBS) >126 mg/dl or insulin therapy. Secondary aims were to identify the predictors of GI in children undergoing liver transplantation, and to assess the impact of GI on graft rejection (acute cellular (AR) and ductopenic (CR)) and mortality. We conducted a single-center retrospective cohort study of 123 children who underwent 126 LT from 7/96 to 7/02. All patients but one received tacrolimus and corticosteroids. Demographic and clinical data were obtained from a prospectively maintained LT database. Independent variables were primary diagnosis, age at LT, and UNOS status at the time of transplant. We performed univariate analyses to identify independent predictors of diabetes and to compare outcomes. Results: Three patients received a second transplant during this period. Twenty-one patients required insulin treatment within 30 days post-LT; at one year post-LT, 3 had a recurrent or continuous insulin requirement, and one had a high FBS. In the non-insulin group, 6/102 required insulin with steroid cycles, and 2 had high FBS at one year. In univariate analysis, early GI correlated with a higher mortality rate (15/102 in the non-insulin group vs. 5/21 in the insulin group). There was no difference in the incidence of AR (55/102 vs. 11/21) or CR (3/102 vs. 0/21). Risk factors included advanced age (mean age 9.3 years vs. 1.5 years), immune and metabolic liver disorders, re-transplantation (8/102 vs. 7/21), and acute presentation (UNOS status one, 31/102 vs. 11/21). Conclusions: GI is an uncommon complication in children who are long-term survivors of LT. Early insulin requirement highlights a high-risk group. Follow-up on long-term complications is also needed in those who improve past the acute period or maintain GI, particularly with regard to kidney function, lipid status and cardiovascular risk.
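As an aside on the univariate mortality comparison above, a sketch of the sort of 2×2 test that could be used follows; it assumes the reported fractions are deaths over group size (e.g. 5 deaths among 21 insulin-group patients), and the abstract does not state which univariate test was actually applied.

```python
from scipy.stats import fisher_exact

# Rows: insulin group, non-insulin group; columns: deaths, survivors.
# Counts assume "5/21" and "15/102" are deaths over group size.
table = [[5, 16],
         [15, 87]]
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")
```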
Poster Board #-Session: P291-II

The purpose of this study was to examine the effects of nutrition, oral vs. nasogastric, on growth and development in children with biliary atresia who are waiting for a liver transplant. Also examined was how the chronic illness impacted the family. The subjects were a convenience sample of 24 children with biliary atresia who were awaiting liver transplant. Twelve of the children received nutrition by mouth only. The other twelve received oral as well as nasogastric feedings. At the time of referral for transplant evaluation, which served as baseline for this study, the ages of the children ranged from 2 months to 30 months. The mean age was 10.6 months among those who received oral feedings and 4.6 months among those who also received nasogastric feedings. A prospective design was implemented to follow the children from their transplant evaluation to six months post evaluation. Anthropometric growth measurements of length, standard deviation scores for height (SDS), weight, head circumference, midarm circumference (MAC), triceps skinfold thickness (TSF), and nutritional assessment were recorded bimonthly. The children served as their own controls. Developmental evaluation using the Mullen Scales of Early Learning was performed at the baseline evaluation and six months later. A family assessment using the Stein Impact of the Illness on the Family Scale was used to evaluate the family's level of stress at six months post baseline evaluation. Repeated measures analysis of variance was performed separately on the oral and nasogastric feeding groups to evaluate changes in body length, SDS, weight, head circumference, MAC, TSF and nutritional intake (calories per kilogram of body weight). Both groups demonstrated significant gains over time in height, weight, head circumference, and midarm circumference. The results of a paired t-test did not show a significant developmental change among the children in the oral feeding group. However, the children in the nasogastric group demonstrated a significant decline in expressive language and visual reception raw scores. Based on an independent t-test of the Stein scale scores, there were no significant differences in the level of stress between the two feeding groups.
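For readers unfamiliar with the paired design used for the developmental scores, the sketch below shows the computation; the Mullen raw scores are invented for illustration and are not the study's data.

```python
from scipy.stats import ttest_rel

# Paired t-test: each child's baseline score vs. the same child's
# score six months later (illustrative values only).
baseline = [28, 31, 25, 30, 27, 29]
six_months = [24, 27, 22, 28, 23, 25]
t, p = ttest_rel(baseline, six_months)
print(f"t = {t:.2f}, p = {p:.4f}")
```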
Maintenance immunosuppression for pediatric liver transplant recipients in our unit is with tacrolimus and prednisolone. We retrospectively reviewed the circumstances for using rapamycin (rapa) or azathioprine (aza) in selected patients. Of the 9 patients on rapa, 3 had autoimmune hepatitis (AIH) and 3 developed 'de novo' AIH. Five of these 6 children had previously been on aza or mycophenolate mofetil (MMF), but were switched to rapa (cross-over) as their AIH was not controlled. A diagnosis of 'de novo' AIH was made in the presence of interface hepatitis on liver biopsy, antibodies (antinuclear antibody, smooth muscle antibody, liver kidney microsomal antibody) in the serum, and increased IgG following transplant in children who did not have AIH before transplant. Two were on rapa as they were transplanted for malignancy. The 9th child had rapa added when she developed late hepatic artery stenosis after an episode of rejection. The dose of rapa was titrated to maintain levels of 5-8 mcg/dl. In 15 other children aza was added: 6 with 'de novo' AIH, 6 with AIH before liver transplant, 2 who were transplanted elsewhere with aza as part of the primary regimen, and 1 who was converted to cyclosporine and aza when she developed tacrolimus toxicity. Aza doses were typically titrated to yield 6-thioguanine levels >235 pmol per 8 × 10⁸ RBC. The only non-responder was a child with severe non-adherence issues. One of the children with abnormal graft function in the rapa group had several other complications, including chronic rejection, and has had rapa withdrawn on suspicion of lymphoproliferative disease. In conclusion, it is possible to achieve stable graft function in liver transplant recipients with a diagnosis of AIH before and after liver transplantation by using rapa or aza. In our experience, rapa has been successful in cases where aza was not, especially in AIH and 'de novo' AIH (the cross-over patients above). Adverse events have been minimal, and monitoring drug levels is essential for optimal drug dosage. Adherence remains an important issue for long-term stable graft function.

Poster Board #-Session: P293-II

Background: Hepatic artery thrombosis (HAT) after pediatric liver transplantation has been attributed to lack of the use of an operating microscope, small recipient size and/or weight, interposition grafts, and lack of the postoperative use of anticoagulants (e.g. heparin or low molecular weight dextran). Aim: The purpose of this review is to report our experience, focusing on the interrelationships between risk factors, surgical technique and the incidence of pediatric hepatic artery thrombosis. Methods: From 02/01/1997 to 06/01/2003, all liver transplants were prospectively tracked for HAT. All hepatic arterial anastomoses were performed utilizing 3.5-6× magnification loupes. All patients underwent intraoperative ultrasound with Doppler flow studies after allograft revascularization, repeated daily for three postoperative days. Anticoagulation consisted of aspirin in all patients for the first three months post-transplant, with or without alprostadil (PGE1) for the first seven post-op days. Results: 141 consecutive liver transplants were performed in 125 pediatric patients. 75 grafts (53%) in 69 patients were whole organ transplants, while 66 grafts (47%) in 56 patients were partial livers. Of the partial grafts, 22 left segments and 1 right lobe were from living donors. Children receiving whole livers were older (9.47 ± 6.76 years vs. 3.25 ± 3.59 years; p = 0.001) and weighed more (35.1 ± 26.7 kg vs. 16.2 ± 13.2 kg; p = 0.001). In the whole liver group, 13 liver grafts (18.8%) were transplanted into children weighing less than 10 kg, while in the partial liver group 21 (37.5%) went to children under 10 kg. Seven iliac jump graft arteries were anastomosed using an interposition technique from the infrarenal aorta to the donor artery, while 136 allografts underwent end-to-end donor hepatic artery reconstruction. In the study period, 3/141 (2.1%) grafts developed HAT: 3/66 (4.5%) partial liver grafts and 0/75 (0%) whole liver grafts. Of the three patients with HAT, two were successfully revascularized in the first 24 hours. Therefore 1/141 (0.71%) liver grafts was lost to HAT. No patient death was associated with HAT. Overall actuarial survival at one and two years was 95.7% and 93.0%, respectively. Conclusions: HAT may be minimized in the pediatric transplantation population without the use of intraoperative microscopes, heparin, urokinase, or dipyridamole. Surgical loupes and anticoagulation utilizing aspirin, with or without alprostadil, were adequate to minimize HAT in the pediatric population.
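With only 3 events in 141 grafts, the uncertainty around the reported 2.1% HAT rate is worth keeping in mind; below is a minimal sketch of an exact (Clopper-Pearson) binomial confidence interval for such a low event rate (the interval itself is computed here, not taken from the abstract).

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact binomial confidence interval for k events in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

lo, hi = clopper_pearson(3, 141)
print(f"HAT rate 3/141 = {3/141:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```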
We examined the efficacy of recombinant soluble P-selectin glycoprotein ligand (rPSGL-Ig) ± sirolimus (SRL) ± cyclosporine (CsA) in treating the accelerated chronic rejection (CR) of renal allografts from brain-dead (BD) F344 donors transplanted into LEW recipients. Materials and Methods: rPSGL-Ig (50 µg iv) was administered to the donor 3 hrs after BD and immediately to the host after transplantation to inhibit initial cellular activity. As shown previously, this strategy inhibits T cell activation and signalling. SRL ± CsA was given to immunosuppress subsequent antigen-dependent responses. Grafts from all recipient groups (n=8-12/Gp) were examined histologically at 150 days. Activated CD4+CD25+ T lymphocytes and T cell signalling (signals 1 and 2: TCR/MHC class II and CTLA-4/B7-1/B7-2/CD28) were assessed using double immunostaining. Gene expression of IL-2, IL-10, IFN-γ, TNF-α, TGF-β, and ICAM-1 was measured with RNase protection assay and real-time PCR. Recipient groups included: Gp1 = isografts; Gp2 = BD donor allografts in rats treated with CsA 1 mg daily × 10 days; Gp3 = SRL 0.4 mg daily × 21 days; Gp4 = SRL (as above) + rPSGL-Ig (as above); Gp5 = SRL + CsA; Gp6 = SRL + CsA + rPSGL-Ig; Gp7 = SRL 0.1 mg × 4-13 days + CsA + rPSGL-Ig. Results: At 150 days after transplantation, Gp1 grafts showed only minor (1/4+) signs of CR (tubular atrophy, interstitial fibrosis, and vascular obliteration). CR of Gp2 allografts was moderate (2/4+) and extensive in Gp3 allografts (3/4+). The addition of rPSGL-Ig protected the allografts of Gp4 hosts substantially (1/4+). Severe tubular injury (4/4+) occurred with high-dose SRL + CsA treatment (Gp5), regardless of rPSGL-Ig (Gp6). However, grafts of Gp7 (low-dose SRL + CsA + rPSGL-Ig) resembled isografts, with inhibited gene expression (IL-2, IFN-γ, and TGF-β), sporadic CD4+CD25+ cells, and signals 1+2. Conclusions: Organs from recipients treated with high-dose SRL ± CsA showed severe chronic changes, regardless of the effects of rPSGL-Ig on initial T cell activity. In contrast, selectin blockade acts synergistically with low-dose SRL + CsA to prevent CR of renal allografts from BD donors over the long term.

Documented reports have demonstrated that ischemia/reperfusion injury (IRI) might promote the development of chronic graft dysfunction (CGD). However, the mechanism has not been well defined. It has been found that MHC class I chain-related antigen A (MICA) and B (MICB), the ligands for NKG2D (an activating receptor of natural killer cells), can be induced by cellular stress and are closely related to CGD. The purpose of this study was to investigate whether IRI could upregulate expression of the RAE-1 and H60 (MICA/B homologue) genes in mice and then trigger CGD by activating NK cells. Methods: Male Balb/c mice weighing 22 to 25 g were anesthetized with sodium pentobarbital; a midline laparotomy was then performed and an atraumatic clip was used to interrupt the arterial and portal venous blood supply to the left lobe of the liver. After 90 minutes of partial hepatic ischemia, the clip was removed, initiating hepatic reperfusion. Sham control mice underwent the same protocol, but without vascular occlusion. The sham and ischemic lobes were taken at intervals of 1, 2, 3, 5, 7, 10, 15, 20, and 30 days postoperation for analysis. Total RNA was extracted from liver tissue and quantified by UV spectrophotometer. RNA (1 µg) was reverse transcribed and amplified by real-time quantitative PCR. All samples were tested in triplicate, and the fold changes of RAE-1 and H60 mRNA were expressed as the ratio of ischemic sample mRNA to sham sample mRNA. Results: Compared with the mRNA level of the sham control, IRI caused a decrease of the RAE-1 mRNA level in ischemic liver from day 1 to day 3 after surgery, while RAE-1 mRNA levels increased from day 5 to day 30. Starting on day 7 and persisting to day 30, RAE-1 mRNA levels were increased 2.5 to 5 times in the ischemic liver over the sham control. On the other hand, IRI caused a 2-6 fold increase in the H60 mRNA level from day 1 to day 20 over the sham control; however, the H60 mRNA level decreased by day 30 after surgery. The change patterns of RAE-1 and H60 mRNA levels caused by hepatic IRI were apparently different. Conclusion: This study is the first to report that hepatic IRI increased RAE-1 and H60 mRNA levels in ischemic mouse liver over the sham control, with different change patterns, though the mechanism remains unknown. The upregulated expression of these NKG2D ligands might activate NK cells, play a significant role in the innate immunity associated with transplantation and thereby promote CGD, and these molecules may become new targets for preventive intervention.
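The ischemic/sham mRNA ratios above are the standard relative-quantification readout of real-time PCR; the sketch below shows one common way to obtain such a ratio, the 2^-ΔΔCt transformation, under the assumption that the assay reports Ct values against a reference gene (the Ct numbers are invented for illustration).

```python
def fold_change_ddct(ct_target_isch, ct_ref_isch, ct_target_sham, ct_ref_sham):
    """Relative expression (ischemic vs. sham) by the 2^-ddCt method."""
    ddct = (ct_target_isch - ct_ref_isch) - (ct_target_sham - ct_ref_sham)
    return 2.0 ** (-ddct)

# Illustrative Ct values for a RAE-1-like target and a reference gene:
print(f"fold change = {fold_change_ddct(24.1, 18.0, 26.6, 18.1):.1f}x")
```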
Background: VEGF, a major angiogenesis factor, is also a pro-inflammatory cytokine and plays a critical role in a variety of physiological and pathological immune responses. It is well established to be induced by hypoxia, suggesting that its expression will be characteristic of transplantation. However, little is known of its role in post-transplantation ischemia/reperfusion (I/R) injury. In this study, we investigated the expression and function of VEGF and its receptors (flt-1 and flk-1) in I/R injury. Methods and Results: Seventy percent partial hepatic ischemia was performed for 75 minutes in male C57BL/6 mice. First, we evaluated the local expression of VEGF and its receptors in the liver using quantitative real-time PCR. VEGF expression was significantly upregulated after 2 h of reperfusion following 75 minutes of ischemia, while flt-1 and flk-1 expression was down-regulated after reperfusion. Next, we administered anti-flk-1 (DC101) and anti-flt-1 (MF-1) monoclonal antibodies (mAb, 1 mg of each) 30 minutes before reperfusion. Interestingly, mAb treatment significantly inhibited hepatic injury compared to control at 6 h of reperfusion (sAST: P=0.0068, sALT: P=0.034, sLDH: P=0.006). Histological analysis also revealed the protective effect of targeting VEGF receptors on hepatic damage. Massive cellular infiltration and extensive hepatic cellular necrosis were observed in control mice, while the lobular architecture was relatively preserved and there was less necrosis in mice treated with mAbs against the VEGF receptors. Since VEGF may function in vivo via alterations in leukocyte trafficking, we also evaluated local expression of chemokines and adhesion molecules in control and antibody-treated livers using real-time PCR. We found that blockade of flt-1 and flk-1 significantly down-regulated the local expression of the VEGF-regulated proinflammatory chemokine MCP-1 and the adhesion molecule E-selectin after 2 h of reperfusion. Conclusion: This study demonstrates for the first time that VEGF is expressed and functional in hepatic I/R injury. We suggest that blockade of VEGF via inhibition of its receptors may represent a novel target for the protection of the liver in the peri-transplant period.

(Background) With the current shortage of cadaver donors and the increasing number of diabetic patients on the transplant waiting list, there is a critical need to optimally use "less-than-ideal" donors for pancreas transplantation. However, there are no objective and rapid means for assessing pancreas graft viability and suitability for transplantation. In this study, we examined the possibility of graft viability assessment and post-transplant outcome prediction using 31P-nuclear magnetic resonance (NMR) spectroscopy combined with two-layer cold storage method (TLM) preservation. (Methods) Segmental canine pancreas grafts were preserved with TLM for 24 hours after 0, 60 or 120 minutes of warm ischemia (Groups 1, 2 and 3, respectively). After preservation, we determined intragraft phosphate metabolites non-invasively using 31P-NMR spectroscopy. The time required for this assessment was only 5 minutes. Since all grafts in groups 1 and 2 were successfully transplanted (the viable group) while all in group 3 failed to survive after transplantation (the non-viable group) based on our previous study, the possibility of post-transplant outcome prediction was examined by comparing these two groups. (Results) The ratios of Pi/γ-ATP and Pi/β-ATP reflected the extent of graft damage, and the differences were statistically significant among groups 1, 2 and 3. Based on analyses of receiver operating characteristic (ROC) curves, the optimum cutoff levels between the viable and non-viable groups were 1.6 and 2.2 for Pi/γ-ATP and Pi/β-ATP, respectively. The accuracy rates of these ratios were both 83%. (Conclusion) 31P-NMR spectroscopy combined with TLM preservation could provide an objective, rapid, and possibly non-invasive means to assess pancreas graft viability and to determine the suitability of damaged pancreata for organ transplantation.
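A cutoff of the kind derived here from ROC curves is commonly chosen by maximizing Youden's index (sensitivity + specificity - 1); the sketch below illustrates that selection on invented Pi/γ-ATP ratios, not the canine study's raw data.

```python
import numpy as np

def best_cutoff(values, is_nonviable):
    """Return the threshold maximizing Youden's J, treating higher
    Pi/ATP ratios as predictive of non-viability."""
    best_c, best_j = None, -1.0
    for c in np.unique(values):
        pred = values >= c
        sens = np.mean(pred[is_nonviable])      # true positive rate
        spec = np.mean(~pred[~is_nonviable])    # true negative rate
        if sens + spec - 1 > best_j:
            best_c, best_j = c, sens + spec - 1
    return best_c

ratios = np.array([0.9, 1.1, 1.3, 1.5, 1.8, 2.1, 2.4, 2.9])
nonviable = np.array([False, False, False, False, True, True, True, True])
print(f"optimal cutoff ~ {best_cutoff(ratios, nonviable):.1f}")
```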
Nitric oxide (NO) is a short-lived chemical mediator which is a vasodilator as well as an inhibitor of smooth muscle cell proliferation, platelet adhesion and aggregation. NO could therefore be an ideal agent to prevent stenosis and thrombosis in dialysis access grafts, as well as to help ameliorate the deleterious effects of cold storage/ischemia-reperfusion injury. Purpose: To develop a local delivery system for NO that could be used in PTFE dialysis grafts, and to use this as a model for developing technology to help prevent preservation, storage, and reperfusion injury during transplantation. Methods: NO-releasing polymers were prepared using diaminoalkyl trimethoxysilane-N2O2 (DACA/N2O2-SR) for dip-coating onto standard PTFE graft material. To prepare grafts releasing a range of NO fluxes, PTFE tubing was dipped into the silicone rubber-diazeniumdiolate solution. NO flux was measured in real time via chemiluminescence. Results: A wide range of steady-state NO fluxes, from 1-20 × 10⁻¹⁰ mol/cm²/min, was achieved by varying the number of dip coats and the concentration of the diazeniumdiolate compound. This followed an initial burst of NO release over the first 30 min of as high as 40-50 × 10⁻¹⁰ mol/cm²/min (Figure). [The basal rate of NO release from stable endothelial cells is 1 × 10⁻¹⁰ mol/cm²/min, while the rate of release from activated endothelial cells is 4 × 10⁻¹⁰ mol/cm²/min.] At 24 hrs, only 4-11% of the total amount of NO available had been released. Analysis of the NO release curve suggests that steady-state NO release is obtainable for at least 10-14 days. Conclusions: Local delivery of NO is a very favorable and logical approach to harness the beneficial effects that NO has on stenosis and thrombosis, without the problems of short half-life and systemic toxicity. The local application of NO may find utility as a means to improve organ function following cold preservation and reperfusion.
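The arithmetic behind cumulative release over 24 hours is easy to reproduce with a simple burst-then-steady-state flux profile; the numbers below merely echo the reported flux ranges and are not a model of the measured release curve.

```python
# Cumulative NO release per cm² over 24 h (assumed two-phase profile).
burst_flux, steady_flux = 40e-10, 4e-10   # mol/cm²/min, illustrative
burst_min, total_min = 30, 24 * 60
released = burst_flux * burst_min + steady_flux * (total_min - burst_min)
print(f"~{released:.1e} mol/cm² released in 24 h")
```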
Recent data in kidney, liver and lung have implicated a direct modulatory role for T cells in the organ response to ischemia-reperfusion injury (IRI). The underlying mechanisms for this important response are largely unknown. Early T cell activation in vivo, particularly in the absence of alloantigen, is not established. Furthermore, conventional immunohistochemistry has not revealed a significant early T cell influx into reperfused tissue. We hypothesized that T cell activation and infiltration occur very early after reperfusion and that enhanced techniques are required to detect this. We have established a tissue collagenase digestion technique followed by sequential lymphocyte isolation techniques to isolate T cells from kidney tissue. We used a murine model of 30 min of warm ischemia followed by one hour of reperfusion, at which time kidneys were harvested. Isolation of T cells was performed from normal mouse kidneys, sham IRI kidneys that had anesthesia and surgery but no renal artery clamping, and IRI kidneys. This was followed by quantification of T cell infiltration per kidney and then flow cytometric evaluation of the T cell activation markers CD69 (early activation marker) and CD25 (later activation marker). T cell counts in kidney increased from normals (6.3 × 10⁵, n=6) and sham IRI mice that underwent surgery (6.7 × 10⁵, n=6) to renal IRI mice (13.8 × 10⁵, n=5) (p<0.01). A significant increase in CD3+CD69+ double-positive cells (14.7 × 10⁴, n=6) was seen in renal IRI mice, but not in sham IRI mice (5.3 × 10⁴, n=6), compared to normal (3.8 × 10⁴, n=6) (p<0.01). The numbers of CD3+CD25+ double-positive cells did not differ. Similar quantification at later time points and further phenotyping of these IRI-activated T cells are underway. This is the first demonstration of very early T cell activation and infiltration into postischemic kidneys in the absence of alloantigen. Mechanisms underlying this response are unknown, but could involve "Signal 3" on T cells. Early T cell activation during IRI could explain why ischemic organs are more susceptible to rejection. Interventions directed against T cell infiltration and activation could reduce allograft injury for both cadaveric and live donor transplants.

Heme oxygenase (HO) has been shown to provide potent protection in numerous models of cellular stress. Both carbon monoxide (CO) and biliverdin (BV), products of heme degradation by HO, have been shown to suppress ischemia/reperfusion (I/R) injury. We hypothesized that co-treatment with CO and BV would show enhanced protective effects against I/R injury following prolonged cold preservation and transplantation. Methods: Syngeneic heterotopic heart transplantation (HTx) and orthotopic kidney transplantation (KTx) were performed in Lewis rats with 24 hrs of UW cold preservation. Recipients received CO inhalation (20 ppm) for 24 hours and/or BV (50 mg/kg, ip) at -2 hrs and immediately after reperfusion. Results (Table): While monotherapy with CO or BV did not alter the survival of heart grafts, the combination of CO and BV significantly reduced myocardial injury (lower CPK levels), inhibited proinflammatory mediator activation (e.g. TNFα), and improved graft function (Langendorff apparatus), resulting in improved graft survival of 80%, from 0% in untreated recipients. Following KTx, there was a significant improvement of creatinine clearance (CCR) with combination therapy. The protective effects of CO and BV were mediated via different mechanisms, since CO, but not BV, effectively improved renal cortical blood flow, whereas BV was more effective than CO in inhibiting lipid peroxidation (lower tissue malondialdehyde [MDA] levels) with maintenance of reductive capacity. Conclusions: These data demonstrate that co-therapy with CO and BV was more effective than monotherapy in protecting HTx and KTx against I/R injury. The enhanced effectiveness appeared to derive from the different mechanisms by which CO and BV exert protection against I/R injury.
The study may also suggest that the potent cytoprotection of HO-1 is attributable to the different actions of the byproducts of heme catabolism in mediating cytoprotection against oxidative stress.

Growing experimental and clinical evidence supports the concept that injury to an allograft activates the immune system. Cyclosporine (CsA)-induced nephropathy is non-immunologically associated with persistent low-grade ischemic injury. However, it has not been tested whether CsA-induced renal injury is associated with activation of the immune system. Based on the above findings, we hypothesized that CsA-induced nephropathy may increase immunogenicity by activating the immune system. To test this hypothesis, we evaluated the expression of HLA class II antigen, Toll-like receptor (TLR) and heat shock protein (HSP) 70 in normal and CsA-treated rat kidneys. Sprague-Dawley rats were used. Chronic CsA nephropathy was induced by administering CsA (15 mg/kg, s.c.) for 4 weeks; control rats were treated with vehicle (olive oil, 1 mg/kg, s.c.) for 4 weeks. Renal function and histological findings (striped fibrosis, interstitial inflammatory cell infiltration, arteriolopathy) were measured. HSP70 and HLA class II antigen expression was detected with immunoblot and immunohistochemistry, and TLR-4 mRNA and protein were detected with RT-PCR and immunohistochemistry. Compared to vehicle, CsA-treated rat kidneys showed a dramatic increase of HSP70 in renal tubular cells of the outer medulla as well as in cortical tubular cells. TLR-4 mRNA was significantly increased in CsA-treated rat kidney, and localization of TLR-4 protein revealed increased immunoreactivity in proximal tubular cells. HLA class II antigen expression was also increased in renal tubular cells and dendritic cells of the outer medulla in CsA-treated rat kidneys as compared with vehicle-treated rat kidneys. The results of our study suggest that CsA treatment increases HSP70 protein, and that this activates the immune system by increasing MHC class II antigen expression via TLR. This finding provides the new concept that CsA-induced renal injury is associated with activation of the immune system, which may increase rejection episodes in the allograft.

Introduction: Ischemia/reperfusion (I/R) injury is an independent risk factor for the long-term outcome of renal allografts. However, current knowledge about the molecular mechanisms underlying renal I/R injury is still limited, although the induction of cytoprotective genes like HO-1 has been shown to improve graft function. Therefore, efficient strategies to prevent I/R injury require a better understanding of the molecular processes occurring during cold ischemia and following pretreatment strategies. We were interested in the gene expression profile of 820 immune-related genes, using cDNA microarrays, in rat kidney allografts undergoing prolonged cold ischemia with or without the induction of HO-1. Material and Methods: In a well-established F344-to-Lewis rat kidney transplant model, F344 donors were either pretreated with the selective inducer of HO-1, cobalt protoporphyrin (CoPP), or remained untreated. Kidneys were engrafted following 20 min or 24 hrs of cold ischemic time. Recipients were sacrificed 12 hrs later and mRNA was extracted. A customized cDNA microarray with 820 rat target genes was used for analysis. Native F344 kidneys served as controls. Results: Cold ischemia of 20 min and 24 hrs induced differential expression of 78 genes and 114 genes, respectively.
A short ischemic time of 20 min led to the induction of adhesion molecules (e.g. ICAM-1) and apoptosis-related genes (e.g. FAS). Prolonged ischemia of 24 hrs resulted in the upregulation of a higher number of genes (e.g. stress proteins like HSP70 and HSP105), stronger upregulation (2-10 times higher than at 20 min), and accelerated chronic graft dysfunction compared to 20-min controls. CoPP pretreatment could not abrogate the enhanced expression of HSP70 but, in contrast, led to the inhibition of macrophage-related markers (e.g. CD68), cell-cycle-associated genes (p53) and transcription factors (STAT1). Conclusion: Microarrays provide a powerful tool to uncover the multitude of molecular events occurring during I/R that contribute to early allograft dysfunction. We have already demonstrated that upregulation of HO-1 by a single donor treatment with CoPP significantly improves long-term graft function. Our recent study suggests that CoPP treatment protects allografts against I/R injury within the first 12 hrs by interacting at the transcriptional level with selected, but not all, inflammation-related markers.
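The differential-expression screen described above reduces to a simple fold-change filter at the stated 2-fold cutoff; the sketch below applies that filter to invented expression matrices (820 genes, triplicate samples) purely to illustrate the computation.

```python
import numpy as np

rng = np.random.default_rng(0)
control = rng.lognormal(mean=5.0, sigma=0.3, size=(820, 3))
ischemic = control * rng.lognormal(mean=0.1, sigma=0.5, size=(820, 3))

ratio = ischemic.mean(axis=1) / control.mean(axis=1)
diff = np.flatnonzero((ratio >= 2.0) | (ratio <= 0.5))  # 2-fold cutoff
print(f"{diff.size} of 820 genes called differentially expressed")
```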
Objectives: The PI3-kinase/Akt pathway is a so-called cell survival pathway, and insulin can stimulate it. However, our previous study showed that the insulin currently used in UW solution was harmful to preserved rat liver grafts. We sought to clarify the role of the PI3-kinase/Akt pathway in ischemia/reperfusion injury during graft preservation in UW solution by stimulating the pathway with insulin or inhibiting it with LY294002. Methods: Rat liver grafts were preserved in UW solution with the addition of insulin or LY294002 for different periods. The downstream proteins of the PI3-kinase/Akt pathway were detected by Western blot. Ischemic injury in the preserved grafts was compared between the two groups in terms of apoptosis and necrosis. Reperfusion injury was investigated by comparing the survival rates, apoptosis, and necrosis of the implanted grafts. Results: Akt was activated by insulin at the beginning of graft preservation, with phosphorylation at Thr308 and Ser473, and the expression levels of phospho-Akt decreased gradually. The higher expression levels of Bad, phospho-GSK-3β, and caspase-12 in the insulin group stimulated caspase-3/7 and cleaved caspase-3/7, leading to apoptosis and necrosis. LY294002 inhibited the PI3-kinase/Akt pathway by downregulating phospho-Akt at Thr308 and Ser473. Downregulation of Bad, phospho-GSK-3β, and caspase-12 was found in the LY294002 group, in which the liver grafts showed less ischemic injury in terms of apoptosis and necrosis after 9 hours of preservation. For the implanted grafts, the one-week survival rate was significantly higher in the LY294002 group than in the insulin group. Severe reperfusion injury, in the form of massive confluent necrosis and apoptosis at 24 hours after operation, was found in grafts preserved in UW solution with insulin for 9 and 24 hours. Conclusions: The PI3-kinase/Akt pathway was dysregulated during graft preservation and related to ischemia/reperfusion injury. Insulin exacerbated ischemic injury through activation of caspase-12, phosphorylation of GSK-3β, and dephosphorylation of Bad during graft preservation. Inhibition of the PI3-kinase/Akt pathway might be beneficial for long-term graft preservation.

Ischemia/reperfusion (I/R) injury has been established as a non-immunologic risk factor for the development of chronic graft nephropathy following renal transplantation. We showed previously that YM, a chemically defined injectable flavonoid extracted from Chinese herbs, acted synergistically with CsA to attenuate the short- and long-term consequences of cold I/R injury in pig renal allografts. Since warm I/R is potentially more damaging than cold storage, this study was undertaken to determine if YM would attenuate warm I/R injury in mice. Methods: C57BL/6 males were subjected to 50 min of left renal ischemia (with right renal resection) in three groups (n=8/group): group I, sham-operated animals; group II, non-treatment animals (saline, i.p., 30 min before I/R); group III, YM (25 mg/kg, i.p., 12 hours and 30 min before I/R). Mice were sacrificed 24 hours post-reperfusion. Serum creatinine and blood urea nitrogen were measured. Myeloperoxidase activity of the kidney was measured for assessment of neutrophil infiltration. Morphological changes of the kidneys were evaluated by histological analyses. Renal expression of TNF-alpha and ICAM-1 was studied using reverse transcription-polymerase chain reaction (RT-PCR) and immunohistology. Results: I/R caused a 6-fold increase in creatinine and urea nitrogen levels 24 hours post-reperfusion in group II. YM reduced the increase by 50%. The saline-treated mice demonstrated an increased infiltration of neutrophils and widespread loss of brush border (dramatically decreased PAS staining). YM-treated mice had only patchy necrosis and dramatically decreased neutrophil plugging. YM significantly reduced kidney myeloperoxidase activity (5.6 ± 0.8 vs. 15.2 ± 2.4 U/g wet tissue). Immunohistochemical staining revealed that the upregulation of TNF-alpha and ICAM-1 was greatly diminished by YM (p<0.01). Using RT-PCR, the enhanced TNF-alpha and ICAM-1 mRNA expression was found to be decreased in the YM-treated animals compared with controls. Conclusion: We observed that administration of YM resulted in morphological and functional protection against renal warm I/R injury. These observations show that YM reduced murine renal warm I/R injury at least partially via decreasing neutrophil infiltration and inhibiting the upregulation of TNF-alpha and ICAM-1. This experiment suggests that pharmacological preconditioning with YM may be beneficial for attenuating renal warm I/R injury.
and can mitigate the negative consequences of cold ischemia (CI) on heart transplants and skeletal muscle. We investigated whether SOE illumination of rat hearts during CI could improve the preservative effect of the University of Wisconsin solution (UW) on high-energy phosphate (HEP) levels. The hearts of 24 Lewis rats weighing 220 g were explanted with standard technique using cold (+4 °C) UW solution. They were subsequently immersed in UW. In half of the grafts, SOE was produced by illuminating the hearts for 10 minutes of each 30-minute period with photons at λ = 634 nm using the Valkion® equipment. After 2 or 4 hours of ischemia, the hearts were snap-frozen in liquid nitrogen before freeze-drying. The samples were then minced to powder and the nucleotides extracted using a 1.5 M perchloric acid solution containing 1 mM EDTA. The samples were then analyzed by in vitro 31-phosphorus magnetic resonance spectroscopy (31P MRS) at 11.75 T, and the relative concentrations in mmol/g dry weight of phosphocreatine (PCr), inorganic phosphate (Pi) and beta-adenosine triphosphate (β-ATP) were obtained. The relative concentrations were obtained by integrating the peak areas and comparing them to the internal standard phenylphosphonic acid (PPA). The phosphorylation ratio, PCr/β-ATP, a known correlate of biochemical and functional outcome, was calculated. Results: After 2 hours of CI, the group where SOE was induced had a higher PCr/β-ATP ratio, 0.79 ± 0.36 vs. 0.31 ± 0.07 (p < 0.05).
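The phosphorylation ratio reported above is a simple quotient of peak integrals; since each metabolite is normalized to the same internal standard (PPA), that normalization cancels in the ratio. A minimal sketch with invented peak areas:

```python
# Illustrative 31P-MRS peak integrals (arbitrary units), not study data.
peaks = {"PCr": 1.10, "beta_ATP": 1.40, "PPA": 2.00}

pcr = peaks["PCr"] / peaks["PPA"]           # concentration relative to PPA
b_atp = peaks["beta_ATP"] / peaks["PPA"]
print(f"PCr/beta-ATP = {pcr / b_atp:.2f}")  # PPA cancels; ~0.79 here
```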
Brain death has been shown to affect hormone regulation, inflammatory reactivity and hemodynamic stability. Previously, we and others have observed that brain death results in progressive organ dysfunction and immune activation. Moreover, in the transplant model, kidneys, livers and lungs retrieved from brain-dead (BD) rats were subject to increased primary non-function and deteriorated graft survival. However, the mechanism(s) by which brain death leads to these processes remain unclear. To further unravel these mechanisms we have now performed DNA microarray studies with pooled RNA isolated from kidneys of normo- and hypotensive 6-hour BD rats, corresponding to optimal and marginal BD donors, respectively, and used RNA from living donor kidneys as control. Oligonucleotide arrays were manufactured using the Sigma/Genosys Rat Oligonucleotide Library harbouring 4854 unique rat sequences. A 2-fold change in expression was regarded as the cut-off point for a gene being differentially expressed. In kidneys from normotensive donors, 72 genes were identified that were either up- (64) or down- (8) regulated, whereas 91 genes were differentially expressed (67 up- and 24 down-regulated) in hypotensive BD donor kidneys. Most of the differentially expressed genes from the normotensive group (87%) were recognized in the hypotensive group as well. For a selected number of genes (E-selectin, MCP-1, KC, Egr-1, KIM-1, HO-1, Hsp70 and Aqp-2), expression changes were confirmed (p<0.05) using semi-quantitative RT-PCR. Moreover, genes were found which, in previous studies, had been identified as differentially expressed (e.g. IL-1beta, IL-6). Analyses of the data enabled us to categorize most of the genes into different functional groups: Inflammation/Coagulation, Cell Division/Fibrosis and Defense/Repair. Genes encoding transcription factors and proteins involved in signal transduction were also identified. Summarizing, the use of DNA microarrays has clarified parts of the process of brain death. It appears that not only deleterious processes such as inflammation but also defense and repair responses are involved.

Prolonged cold ischemia is suggested to exacerbate ischemia-reperfusion (I/R) injury and graft coronary artery disease (GCAD). We investigated the effects of cold ischemia for different periods on cardiomyocyte apoptosis and the inflammatory response during I/R injury, and on the degree of GCAD, in rat cardiac allografts. Methods: PVG rat (RT1c) hearts subjected to cold ischemia for 30, 60, 90, 120 or 150 min (n = 6 each group) were heterotopically transplanted into ACI rats (RT1a). Grafts were procured after 4 hours of reperfusion and analyzed for superoxide generation by the spin-trapping method; for myeloperoxidase activity and TNF-α, IL-1β, and MCP-1 production by ELISA; and for cardiomyocyte apoptosis by TUNEL and by caspase-2, -3, -8, and -9 activities. Additional animals (n = 8 each group) received cyclosporine A 7.5 mg/kg/day for 10 days as chronic rejection models. Indices of GCAD were determined at 90 days. Results: A positive linear correlation was found between cold ischemia time, I/R injury, and GCAD. Superoxide generation, myeloperoxidase activity, TNF-α, IL-1β, and MCP-1 production, cardiomyocyte apoptosis, and caspase-2, -3, -8, and -9 activities increased with ischemia time, peaked at 120 min, and plateaued at 150 min. The percentage of luminal narrowing, the intima-to-media ratio, and the percentage of diseased vessels increased with increasing ischemia time, peaked at 120 min, and plateaued at 150 min. All tested variables in both the acute and chronic phases were significantly increased with 120-min ischemia in comparison to 30-min ischemia (p < 0.01). These data indicate that the degree of cardiomyocyte apoptosis and inflammatory response in cardiac allografts during I/R injury depends on the duration of cold ischemia. More importantly, prolonged cold ischemia correlates with advanced GCAD.

Geranylgeranylacetone increases renal expression of heat shock protein 70 and protects the kidney from ischemia/reperfusion injury. Introduction and Objectives: Heat shock proteins (HSPs) are known as molecular chaperones that protect cells from various stresses. Geranylgeranylacetone (GGA), an antiulcer drug, has been shown to increase expression of HSPs in various cells of humans and other mammals. The aim of this study was to test the effects of GGA on the induction of HSPs in the kidney and on renal ischemia/reperfusion injury. METHODS: Male C57BL/6 mice 8-12 weeks old were orally administered either GGA 600 mg/day (GGA group, n=8) or vehicle (control group, n=8) for 7 consecutive days. These mice were subjected to 27 min of warm ischemia following right nephrectomy. Another group of mice underwent sham surgery after GGA treatment (sham group). Twenty-four hours after reperfusion the mice were euthanized, and the serum levels of creatinine (Cr) and urea nitrogen (UN) were measured. Expression of HSPs (25, 40, 60, 70, and 90) in renal homogenates was measured by immunoblotting assay and semi-quantitated using NIH Image. Renal parenchymal injury was assessed using a histology score (0: none to 4: severe) for each tubule (10 microscopic fields per slide, 3 slides per kidney, ×200 magnification) in hematoxylin and eosin-stained sections. RESULTS: The expression of HSP70 was significantly higher in the GGA group when compared to the control group (p<0.05). The expression of HSP 25, 40, 60, and 90 was comparable in both groups. Serum Cr and UN levels were significantly lower in the GGA group when compared to the control group (Cr 0.56±0.45 mg/dL vs. 1.45±0.40 mg/dL, p<0.05; UN 89.8±14.5 mg/dL vs. 146±10.9 mg/dL, p<0.05). The average histology score was significantly lower in the GGA group in comparison with the control group (2.84±0.49 vs. 3.62±0.18, p<0.05). CONCLUSIONS: Based on these results we conclude that GGA induces the expression of HSP70 in the kidney and protects the kidney from ischemia/reperfusion injury.

Poster Board #-Session: P313-II

Proximal tubular epithelial cells (PTE) are an integral part of the renal histological changes observed in organ transplant recipients treated long-term with immunosuppressive agents. These changes could be reflected in molecular changes in these cells. We studied the effect of CsA, tacrolimus (TAC), sirolimus (SRL) and TGF-beta on mRNA expression of TGF-beta, collagen, fibronectin, CTGF and iNOS. Methods: PTE cells were treated with different concentrations of these agents and examined at 4 h, at which time maximal expression of these genes was observed. Renal cells were treated with 50 ng/ml of TAC and SRL, 250 ng/ml of CsA, and TGF-β protein (5 ng/ml).
The semiquantitative analysis of mRNA expression was performed using RT-PCR. Results: A significant increase in TGF-β mRNA expression in PTE cells treated with CsA (p<0.03), tacrolimus (p<0.05) and TGF-β (p<0.04) is shown in Fig. A. A significant increase in collagen mRNA expression in PTE cells treated with CsA (p<0.05), TAC (p<0.05) and TGF-β (p<0.02) is shown in Fig. B. A significant increase in fibronectin mRNA expression in PTE cells treated with CsA (p<0.03), tacrolimus (p<0.006) and TGF-β (p<0.02) is shown in Fig. C. These agents failed to induce mRNA expression of fibrogenic molecules in TGF-β-deleted PTE cells. We also studied the effect of CsA, tacrolimus and sirolimus on iNOS and CTGF mRNA expression in these cells. The expression of iNOS mRNA was significantly inhibited by these agents, whereas CTGF was increased. Conclusions: These results demonstrate that TGF-β mediates the nephrotoxic effects of both CsA and tacrolimus by inducing expression of pro-fibrogenic molecules in renal cells. These results also demonstrate, for the first time, the effect of immunosuppressive drugs on CTGF mRNA expression in renal cells in relationship to iNOS expression, allowing the use of specific modulators to prevent nephrotoxicity.

Despite the well-known pro-inflammatory effects of tumor necrosis factor (TNF), the role of the TNF cleavage process in the pathogenesis of lung injury following lung transplantation remains unclear. Although TNF plays a critical role in certain physiological defensive responses, it causes severe cellular damage in the host when produced in excess. TNF has two forms with apparently different biological activities: a membrane-associated form and a soluble form generated from the membrane-bound protein by proteolytic cleavage by TNF-converting enzyme (TACE). We hypothesized that TACE inhibition might prevent TNF-induced tissue injury while preserving the benefits of TNF, such as host defense. In this study, we used the TACE inhibitor (TACEI) Y41654 to elucidate the role of the TNF cleavage process in acute inflammation, using a rat model of lung transplantation. Inbred male Lewis rats were subjected to left lung isotransplantation. Donor lungs were kept in Euro-Collins solution with or without 1 mg/ml Y41654 (TACEI and control groups, respectively; n=10 for each group) for 6 hours. The left lung was then transplanted into the recipient rat and reperfused for 4 hours. The animals were injected intravenously with 125I-labeled albumin, as a marker of pulmonary albumin leakage, 3 hours after the initiation of reperfusion. Pulmonary 125I-albumin leakage was assessed using the concentration ratio of lung tissue to plasma (T/P ratio) and that of BAL fluid to plasma (B/P ratio), which were used as parameters of pulmonary endothelial and alveolar septal damage, respectively. The TACEI group showed significantly lower T/P and B/P ratios than the control group. Neutrophil accumulation in the alveolar space and other histopathological findings after lung isotransplantation were significantly attenuated in the TACEI group. Additionally, significantly lower levels of MCP-1, CINC-1, HMGB-1 (high mobility group box-1) and soluble E-cadherin, as well as decreased neutrophil elastase activity, were observed in BAL fluid from the TACEI group. We conclude that the TNF cleavage process plays a critical role in the development of post-transplantation lung injury in the rat.

Ischemia/reoxygenation (I/R) is still a serious concern in organ transplantation.
I/R confers oxidative stress to the cell and induces apoptotic cell death. Signal transducer and activator of transcription-3 (Stat3) is one of the most important molecules involved in the initiation of liver development and regeneration, and has recently been recognized to protect cells from various pathogens. In order to investigate the hepatoprotective effects of Stat3, we examined whether Stat3 protects against hypoxia/reoxygenation (H/R)-induced injury in rat primary hepatocytes (PHC). [Methods] PHC were prepared from male SD rats (250 g) by the collagenase-perfusion method, and seeded at 3 × 10⁶ cells per 10-cm dish. Adenovirus and cytokines were added 2 days and 1 hr, respectively, prior to the H/R insult. [Results] Interleukin-6 (IL-6) and cardiotrophin-1 (CT-1), which function mainly through Stat3 activation, protected cells from H/R-induced apoptosis; this protection was reversed by overexpression of a dominant-negative form of Stat3. Adenoviral overexpression of a constitutively activated form of Stat3 (S3-C), or N-acetyl-L-cysteine (NAC), reduced H/R-induced apoptosis (Figure) as well as the generation of reactive oxygen species (ROS) in hepatocytes. Interestingly, S3-C induced Survivin and Mn-SOD (but not Cu,Zn-SOD) at both the protein and mRNA levels, though S3-C did not affect the expression of other anti-oxidant proteins. Overexpression of Mn-SOD significantly reduced H/R-induced apoptosis by inhibiting redox-sensitive caspase-3 activity. Stat3, activated by cytokines or overexpressed in the cell, functions to protect hepatocytes from H/R-induced cell injury in a redox-dependent manner by up-regulating Mn-SOD and inactivating caspase-3. Considering its strong mitogenic and anti-oxidant/anti-apoptotic properties, Stat3 seems to be a good therapeutic target for preventing injury and promoting regeneration in liver transplantation.

Introduction. Machine perfusion (MP) has proven to be beneficial in the preservation of the liver. Currently, the modified University of Wisconsin solution (UW-G) is used as the MP preservation solution of choice. However, this solution may not contain a sufficient amount of substrates for the decreased liver metabolism at 4 °C. We have therefore developed Polysol, a colloid-based MP preservation solution containing the necessary nutrients for the liver. In a previous study we showed that Polysol results in equal or even better quality of liver preservation when compared to UW-G. We sought to optimize Polysol by substituting the colloid hydroxyethyl starch (HES), which causes microvascular obstruction and is expensive and difficult to obtain. We therefore compared HES with the colloids dextran and polyethylene glycol (PEG). Methods. In an isolated perfused rat liver model, hepatocellular damage and liver function were assessed during reperfusion after 24 hours of hypothermic MP using P-HES, P-Dextran or P-PEG. To determine hepatocellular damage, ALT and LDH levels were measured during 60 minutes of normothermic reperfusion with oxygenated KHB. Liver function was assessed using oxygen consumption, bile production and ammonia clearance. Control livers were preserved for 24 hours by MP using UW-G. Results. Compared to the control (UW-G), MP with either P-HES, P-Dextran or P-PEG resulted in significantly less hepatocellular damage and higher flow, bile production and oxygen consumption during reperfusion. MP with P-Dextran resulted in less hepatocellular damage (U/L) than MP with P-HES: ALT (t=40) 2.17±0.98 vs 3.50±3.51, respectively.
Livers perfused with P-PEG also sustained less hepatocellular damage than those perfused with P-HES: ALT (t=40) 2.20±1.30 vs 3.50±3.51, respectively. Oxygen consumption (mmHg) was increased after MP with P-Dextran as compared to P-HES: (t=10) 447.80±53.10 vs 399.28±30.64. Neither these differences nor the differences in ammonia clearance or bile production were significant. Conclusions: Twenty-four-hour MP of livers using UW-G results in more extensive hepatocellular damage and reduced liver function when compared to MP using either Polysol-HES, Polysol-Dextran or Polysol-PEG. MP using Polysol-Dextran or Polysol-PEG results in equal or even less hepatocellular damage when compared to MP using Polysol-HES. Therefore, substituting the HES in Polysol with either dextran or PEG can be considered feasible.

Purpose: The effect of cold ischemia-reperfusion as a non-immunologic factor in the pathogenesis of chronic rejection was evaluated in a rat cardiac allograft model. Methods: Heart transplantations were performed in the Lew-to-F344 rat model with (CRI) or without (CR) 10 h of cold ischemia (CI) at 0-1 °C prior to transplantation. Both groups were compared with corresponding syngeneic control groups. After evaluation of graft function, hearts were excised at 2, 10, 40 and 60 days after transplantation. The degree of vasculopathy was investigated by H&E histology. Mitochondrial function was estimated in situ by high-resolution respirometry of permeabilized myocardial fibers. Enzyme activities (citrate synthase, complex I, LDH) were assessed in myocardial homogenates. Results: Graft function (cardiac score) declined with duration of reperfusion from normal function (score 4) to score 1.6±0.5. This decrease was faster in the CRI group than in the CR group and was nearly absent in isografts. The main histopathology of allografts undergoing chronic rejection was arterial myointimal proliferation, mononuclear cell infiltration and fibrosis. After 60 days, the CR but not the syngeneic groups revealed graft vasculopathy and fibrosis, both significantly more evident in CRI (2.1±0.8 vs 1.6±1.0). Mitochondrial respiration declined with all substrates used, indicating a general loss of mitochondria rather than a specific defect of respiratory chain complexes. This decline was significantly more pronounced in the CRI group. Consistently, the activity of the mitochondrial matrix enzyme citrate synthase was relatively stable in syngeneic grafts, but decreased substantially (down to 20% of syngeneic controls) in both experimental groups and correlated well with cardiac score. As with mitochondrial respiration, these enzymatic changes were more evident in CRI. In contrast, the activity of the mitochondrial respiratory chain enzyme complex I declined equally in all groups and did not track with cardiac score. The activity of LDH similarly decreased to 50% of syngeneic controls in CRI and CR. Conclusion: Severe mitochondrial loss and cell injury were evident in CR. In addition to the allogeneic response, prolonged CI substantially contributes to the progression of chronic rejection and loss of organ function and is associated with mitochondria-related mechanisms.

Vascular endothelial dysfunction occurs in kidney grafts from marginal brain-dead donors and may be responsible for a low success rate after transplantation. Given that nitric oxide is a potent endothelial cell survival factor, we hypothesise that stimulating NO synthesis with L-arginine could modulate vascular integrity and hence affect the progression of alloreactivity after transplantation.
Brain death was induced in 16 dogs and maintained for 6 hours. Immediately after the inflation of the intracranial balloon, the treated group (n=6) received a 40 mg/kg bolus followed by a 3 mg/kg/min infusion of L-arginine for 30 min. Renal vascular function and hemodynamic and biochemical parameters were determined. During BD, progressive renal dysfunction was observed that coincided with significant vasoconstriction and an increase in renal venous nitrite.

Human monocytes up-regulate co-stimulatory molecules during allogeneic cell-mediated responses. We have previously shown that monocytes engulf membranes from allogeneic endothelial cells (EC), facilitating monocyte activation. In this study we investigated whether this uptake occurs via direct cell-to-cell contact, whether it is receptor dependent, and whether receptor blockade can prevent up-regulation of monocyte-derived co-stimulatory molecules. We used human peripheral blood mononuclear cell (PBMC) responders, and cultured human EC or EC labeled with PKH-26 as targets, in PBMC-EC co-cultures. A PBMC-EC transwell co-culture was used to determine whether this uptake was cell-contact dependent. PBMC were collected following co-incubation with PKH-26-labeled or unlabeled EC monolayers, and analyzed by FACS to detect PKH-26/CD14/CD40/CD80/CD86/HLA-DR-positive monocytes. In additional experiments, polyguanylic acid (poly(G)), a scavenger receptor ligand, was added to the PBMC-EC co-cultures at different concentrations to block scavenger receptors. CD14+/CD86+/HLA-DR+ monocytes were shown to phagocytize live EC membranes, as indicated by PKH-26 positivity. Additionally, PKH-26-positive monocytes up-regulated CD40 and CD80 following 72-hour co-incubation. No PKH-26-positive T cells were observed in the co-cultures. In contrast, CD14+/CD86+/HLA-DR+ monocytes did not become positive for PKH-26 when co-cultures were separated by a transwell membrane. Scavenger receptor blockade with poly(G) demonstrated dose-dependent inhibition of live-EC membrane uptake by monocytes. Poly(G) at 1 mg/ml completely inhibited EC membrane uptake by monocytes in a 12-hour co-culture. In addition, the up-regulation of monocyte-derived CD40 and CD80 was inhibited by poly(G) at 500 µg/ml in a 72-hour co-culture. These data demonstrate that monocyte uptake of live EC membranes during their interaction with EC monolayers is contact dependent. Scavenger receptor blockade by poly(G) completely prevents monocyte uptake of EC membranes, and blocks subsequent monocyte activation and co-stimulatory molecule expression. These data suggest that monocyte uptake of allogeneic membranes occurs via a scavenger receptor blocked by poly(G), and that therapeutic intervention targeting these receptors may limit allorecognition and antigen presentation.

There is an increasing body of evidence that coagulation factors can modulate the immune response, although their ability to influence the Th1/Th2 bias is unknown. DC are antigen-presenting cells that modulate the Th1/Th2 balance. IL-12 produced by DC turns on the Th1 response, while IL-10-producing DC promote a Th2 bias.
The aim of the present study was to evaluate whether monocyte (M)-derived DC express PAR-1 and PAR-2 (thrombin and FXa receptors, respectively) and to investigate the effect of their activation on DC phenotype and on IL-12 and IL-10 expression. As demonstrated by RT-PCR, both PAR-1 and PAR-2 gene expression remained unchanged in M, iDC and mDC, suggesting that the modulation observed by flow cytometry was due to post-transcriptional events or changes in protein trafficking. The latter hypothesis was confirmed by the observation that total PAR-1 and PAR-2 proteins, evaluated by western blot, were increased in iDC and mDC. Thrombin caused a time-dependent increase in IL-12 p40 gene expression, with the maximal effect at 6 hours (2.7-fold over basal), while strikingly reducing IL-10 mRNA abundance. In contrast, DC stimulated with a PAR-2-activating peptide showed a time-dependent increase in IL-10 expression, with a peak at 6 hours (7-fold over basal), and no effect on IL-12 p40 mRNA. In addition, PAR-2 stimulation significantly reduced both CD86 and CD54 expression on iDC and mDC, as demonstrated by flow cytometry. Finally, both thrombin and FXa induced a significant shape change in iDC, with an increase in the length of the dendrites, as demonstrated by confocal microscopy. In conclusion, our data suggest a role for PAR-1 in the modulation of the Th1 response by DC, while PAR-2 seems to induce a Th2 response. In the setting of transplantation, this observation may suggest a potential pathogenic link between the innate and acquired immunologic mechanisms of graft damage. While the frequencies of alloreactive T cells can be defined by various methods, there has been no comparable assessment of B lymphocytes. We developed a procedure for labeling HLA-specific B cells with HLA tetramers, based on the specificity of surface immunoglobulin receptors for epitopes on HLA molecules. The specificity of this approach was verified in patients with either historic or current sensitization to two commonly mismatched HLA antigens, HLA-A2 and B7. Methods: Anti-coagulated blood samples were obtained from 23 sensitized patients: 12 sensitized to A2 and 11 to B7. Controls included healthy, non-sensitized males, patients sensitized to other HLA antigens, and patients with A2 and/or B7 in their own phenotype. B cells were enriched either by depletion of T cells with anti-CD2 magnetic beads or by positive selection with anti-CD19 beads. Enriched B cells were stained with 10 µl of tetramers: HLA-A*0201/MART-1-PE, HLA-A*0201/GAG-PE, and/or HLA-B*0702/p24-APC (Beckman Coulter) at 4°C for 45 minutes. Cells were also labeled with PE- or FITC-conjugated anti-CD3 and anti-CD19 (BD Biosciences/Pharmingen). After washing and fixation, two- or three-color analysis was performed on a FACSCalibur cytometer using Cell Quest software (Becton Dickinson). Results: The mean frequencies (%) of CD19+ cells positive for the A*0201 or B*0702 tetramers (tet+) among patients sensitized to A2 (6.6±6.1; P=0.002) or B7 (5.4±2.5; P=0.006) were significantly higher than those of 18 controls (1.3±0.7). The frequency of binding to non-CD19 cells ranged from 0.3 to 1.0% among both patients and controls. A*0201 tet+ frequencies tended to be higher among patients with current anti-A2 antibodies compared to those with historic, but currently nondetectable, antibodies (8.3±6.9 and 3.1±0.5, respectively; P=0.086). The frequencies of B7 tet+ cells were not significantly different between current and historic samples (5.0±2.4 vs. 5.8±2.8, p=0.63). 
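The group comparisons above (tet+ frequencies in sensitized patients versus controls, and current versus historic sensitization) are simple two-sample tests on per-patient frequencies. A minimal sketch of that kind of comparison, assuming Python with scipy and using synthetic frequencies rather than the study's data:

```python
# Two-sample comparison of tetramer+ B-cell frequencies (synthetic data,
# illustrative only; not the values reported in the abstract).
from scipy import stats

# per-patient tet+ frequencies (% of CD19+ cells)
sensitized = [6.5, 2.1, 12.0, 4.3, 8.8, 3.9, 15.2, 1.9, 7.4, 5.0, 9.1, 3.0]
controls = [1.1, 0.8, 2.0, 1.5, 0.6, 1.3, 2.2, 0.9, 1.7, 1.0,
            1.4, 0.7, 1.9, 0.5, 1.2, 2.4, 1.6, 1.0]

# Welch's t-test; equal_var=False avoids assuming equal group variances,
# which matters here because the sensitized group is far more dispersed.
t_stat, p_value = stats.ttest_ind(sensitized, controls, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```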
Sustained frequencies of B7 tet+ cells may reflect reactivity with the large B7 cross-reactive antigen group (CREG). Broader reactivity with B7 CREG members was supported by respective A*0201 and B*0702 tet+ frequencies of 0.9% and 5.7% in a highly sensitized patient (PRA=65%) who was crossmatch-negative with A2 cells but positive with B7 CREG cells. Conclusions: Frequencies of HLA-specific B cells can identify historic sensitization when antibody is no longer detectable and can also measure the breadth of sensitization to CREG antigens. Many experiments have demonstrated a requirement for both CD4+ and CD8+ T cell responses for efficient and complete rejection of graft tissue. Previously, we have shown that trachea grafts with both H-Y minor antigen (mHAg) differences and one MHC class I mismatch are fibrosed significantly more often than grafts with either mismatch alone. These results suggested the hypothesis that help provided by mHAg-specific CD4+ T cells promotes chronic graft rejection by allo-class I-specific CD8+ T cells. CD4+ T cells responding to mHAg either in the draining lymph node or in the allograft may be responsible for producing molecules that assist in the activation, clonal expansion, and/or differentiation of directly alloreactive CD8+ effector T lymphocytes. In order to further investigate the activation requirements for B6 (H-2b) CD8+ T cells responding directly to BALB/c (H-2d) class I MHC antigens, we examined trachea allograft rejection in either the presence or absence of CD4+ T cell help. Fully mismatched BALB/c trachea allografts became fibrosed significantly less frequently in CD4-deficient B6 recipients compared to wild-type controls. Interestingly, infiltration of allograft tissue by CD8+ T cells did not depend on CD4+ help. However, CD8+ T cells in allograft tissue were significantly less likely to express CD69 (a marker of recent Ag stimulation) in the absence of CD4+ T cells. These findings imply that CD4+ T cell help is required either for the continued activation of graft-reactive CD8+ T cells or for the retention of Ag-specific CD8+ T cells within the allograft. The polyclonal nature of normal lymphocytes prevented a determination of the Ag specificity of the graft-infiltrating CD8+ T cells. Purpose of the study: Serine elastases degrade the extracellular matrix, releasing growth factors and chemotactic peptides, thereby promoting vascular cell proliferation and migration. Increased elastase activity is associated with several cardiovascular disorders and with post-cardiac transplant coronary arteriopathy. Elafin is a serine elastase-specific inhibitor, and its overexpression in transgenic mice results in reduction of experimentally induced arterial injury. The goal of our study was to evaluate the effect of elafin overexpression on cardiac allograft survival. For this purpose, we used elafin-transgenic mice (E-Tg), in which overexpression of the human elafin transgene is targeted to the cardiovascular system. Methods: Elafin-overexpressing transgenic mice (H-2q) and their nontransgenic littermates were either heterotopically transplanted with allogeneic BALB/c (H-2d) hearts or used as donors of cardiac allografts for BALB/c recipients. No immunosuppression was used. At the time of rejection, frequencies of alloreactive and heart tissue antigen (cardiac myosin, CM)-specific T cells were monitored by ELISPOT. 
Results: Our data show that in this fully MHC-mismatched combination, elafin overexpression in the host, but not in the donor heart, results in significant prolongation of transplant survival (24.5 days vs 9.0 days). Prolongation of graft survival in E-Tg recipients is associated with 1) expansion of anti-inflammatory IL-4-producing T cells directly recognizing allo-MHC on donor cells (direct pathway), and 2) an increase in the frequency of IFN-γ-producing T cells reactive to both donor MHC and CM antigens processed and presented by host antigen-presenting cells (APCs) (indirect pathway and tissue-specific immunity, respectively). Conclusions: Our data show that overexpression of the serine elastase-specific inhibitor elafin in the host abrogates acute rejection of a fully allogeneic heart transplant. This suggests that selective inhibition of extracellular matrix antigen processing may favor induction of protective/regulatory immunity in the host, thus improving allograft survival. Introduction: Pancreatic islet grafts transplanted into Type I diabetics are subject to immunologic destruction by both alloimmune and autoimmune mechanisms. Several strategies exist by which tolerance can be induced to islet allografts in the absence of autoimmunity; however, these strategies have been less successful in diabetic recipients. Thus, it is important to identify factors which may regulate autoimmune destruction of islet cell transplants. CD103 is a T-cell integrin which plays a critical role in promoting destruction of epithelial cell compartments by CD8+ T cells, and whose ligand, E-cadherin, is highly expressed by pancreatic β-cells. Recently, a critical role for CD103 in promoting islet allograft destruction by CD8+ T cells was discovered. The aim of the present study was to determine whether CD103 plays an analogous role in the destruction of endogenous islets during the development of autoimmune diabetes. Methods: Standard multi-color and immunohistological analyses were used to monitor CD103 expression by CD8 effector populations in a group of female NOD mice at different time points in the progression to diabetes (age 7 weeks through development of hyperglycemia at 25 weeks of age). Results: Small numbers of CD8 cells (0.2x10⁶ cells/pancreas) were present in the pancreata of female NOD mice as early as 7 weeks of age, and the total number increased progressively with age, reaching maximal numbers at the time of overt diabetes (0.5x10⁶ cells/pancreas). While CD103 was expressed by a subset of CD8 effectors (gated CD8+CD44hi cells) at all time points examined, the level of CD103 expression was maximal at 12 weeks of age, the time at which female NOD mice first develop invasive insulitis. To determine the localization of the CD8+CD103+ subset within the NOD pancreas during the development of diabetes, we harvested pancreata from 12-week-old prediabetic female NOD mice and performed immunohistochemistry utilizing an anti-CD103 antibody. Interestingly, CD103+ cells within the pancreas were densely concentrated within the islets of Langerhans, the critical targets in autoimmune diabetes. Conclusions: These data provide strong support for the hypothesis that autoreactive CD8+CD103+ effector T cells play a key role in islet destruction in Type I diabetes and are consistent with a central role for CD8+CD103+ effectors in regulating progression from peri-insulitis to invasive insulitis. Purpose: It has been reported that apoptosis appears in human islet cells after isolation and has a detrimental effect on islet function. 
The purpose of this study was to determine whether a caspase-3 inhibitor (Z-DEVD-FMK) protects human islets from apoptosis immediately after isolation. We also compared human serum albumin (HSA) and fetal bovine serum (FBS) as protein supplements in culture medium for human islets. Methods: Isolated human islets from 6 cadaver donors were incubated under 4 different conditions (group A: 0.5% HSA; B: 10% FBS; C: 0.5% HSA + 25 µM caspase-3 inhibitor; D: 0.5% HSA + 100 µM caspase-3 inhibitor) for 2 days. Then 1000 IEQ (islet equivalents) of incubated islets from groups A and D were transplanted into diabetic nude mice. The final values of the in vitro assays (mean±SD) were expressed as a percentage of the value for group A. Results: After 2 days of incubation, islet yield was significantly higher than in group A in both group B (126.4±25.1%) and group D (135.8±30.5%). The yield in group C (126.9±29.7%) was also higher than that in group A, but the difference was not statistically significant. The apoptosis index was 73.9±30.6% in group C and 41.1±25.7% in group D, as compared with group A. The caspase-3 inhibitor thus dramatically prevented apoptosis of isolated human islet cells in a dose-dependent manner. Insulin release stimulated by high glucose (300 mg/dl) was 141.6±37.9% in group B, 119.0±29.5% in group C, and 126.5±45.3% in group D, as compared with group A. In the transplant experiments, diabetes was ameliorated in all 6 mice receiving islets from group D, by 9.0±5.3 days posttransplant. However, only 3 out of 8 mice receiving islets from group A became normoglycemic, by 17.7±11.0 days posttransplant. An intraperitoneal glucose tolerance test performed 30 days posttransplant revealed that the glucose tolerance of mice in group D was superior to that of mice in group A. The islet grafts from group A in the hyperglycemic recipients were poorly granulated with insulin, whereas the islets from group D in the normoglycemic mice were intact, with intense insulin staining. Conclusions: The caspase-3 inhibitor prevented apoptosis of isolated human islets and improved their function. Ten percent FBS improved islet yield and insulin secretion more than 0.5% HSA. With the advent of immunosuppression better than ever before, it is debatable whether subjecting kidneys to prolonged cold ischemic injury for the sake of better tissue matching is justifiable. Given the negative effects of cold ischemia, attempts are being made in the US to reduce cold ischemia time (CIT) by facilitating the local utilization of organs. We surveyed the CIT of deceased-donor kidneys from 1987 to 2000 in the UNOS database and assessed its effect on post-transplant dialysis requirement, discharge serum creatinine, and 1-year graft survival. Using all years in the database (1987-2000), there was a significant negative correlation between year of transplant and cold ischemia time (r=-0.18). This was also reflected in the CIT for 3 representative years, 1990, 1995 and 2000: 24±11, 21±9 and 20±9 hrs, respectively (mean±SD; P<0.001). Discharge serum creatinine, on the other hand, continued to be positively correlated with cold ischemia time (r=0.11) and with year of transplant (r=0.15); values for 1990, 1995 and 2000 were 2.4±2.1, 3.0±2.6 and 3.5±3.2 mg/dl, respectively (P<0.001). The frequency of first-week post-transplant dialysis was reduced by 2% over 10 years (25% in 1990, 24% in 1995 and 23% in 2000; P<0.05). 
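The correlation statistics quoted above (year of transplant vs. CIT, and CIT vs. discharge creatinine) are Pearson coefficients over the registry records. A minimal sketch of that computation, assuming Python with numpy/scipy and synthetic records in place of the UNOS data:

```python
# Sketch of the correlation analyses reported above (synthetic data only;
# variable names and magnitudes are illustrative, not the UNOS records).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
year = rng.integers(1987, 2001, size=5000)                # transplant year
cit = 24 - 0.4 * (year - 1987) + rng.normal(0, 9, 5000)   # CIT drifts down over time (hrs)
creat = 2.0 + 0.02 * cit + rng.normal(0, 2.0, 5000)       # discharge creatinine (mg/dl)

r_year_cit, p1 = stats.pearsonr(year, cit)
r_cit_creat, p2 = stats.pearsonr(cit, creat)
print(f"year vs CIT: r = {r_year_cit:.2f} (p = {p1:.3g})")
print(f"CIT vs creatinine: r = {r_cit_creat:.2f} (p = {p2:.3g})")
```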
Despite this modest overall reduction in CIT, first-year graft survival showed a significant improvement over the 10-year period: 84% in 1990, 86% in 1995 and 89% in 2000 (P<0.001). However, CIT persisted as a significant predictor of 1-yr graft survival in a Cox model, with an HRR of 1.01 per hour, even after adjusting for year of transplantation. In conclusion, this analysis supports the view that the reduction in CIT over the years has been very modest, and that despite significant recent improvement in 1-yr graft survival, probably due to better immunosuppression, cold ischemic injury continues to contribute to early graft dysfunction and graft loss. A drastic reduction in CIT is predicted to result in cadaveric renal graft function and survival closely comparable to those of live-donor kidneys. The United Network for Organ Sharing (UNOS), with organ procurement organizations and transplant programs, has defined a class of cadaver kidney grafts for special allocation procedures to enhance utilization of those organs. The criteria defining these expanded-criteria donor (ECD) kidneys are donor age 60+, or donor age 50-59 plus two of the following: donor history of cerebrovascular accident, hypertension, or creatinine >1.5 mg/dl during donor management. Kidney grafts from ECD donors carry an increased relative risk of nonfunction. The purpose of this study was to assess potential best strategies for the acceptance of ECD grafts based on further stratification by donor age and cold ischemia time (CIT). Methods: We queried the SEOPF database for cadaveric kidney transplants between 1/1/1997 and 8/15/2002. We defined delayed graft function (DGF) as dialysis within the first week post-transplant, and primary nonfunction (PNF) as dialysis within the first week plus graft failure in the first year. We defined "good-risk" ECD kidneys as those from donors aged 50-59 yrs and "bad-risk" as those from donors >60 yrs. Results: There were 1,312 ECD transplants and 8,451 non-ECD. Between these groups, there were no significant differences in recipient gender, ethnicity, or peak and most recent PRA. Recipients of ECD kidneys were significantly older (50.9±13.0 years vs. 44.9±13.9, p<0.0001). There were statistically significant but very small differences in DR mismatch (0.82 for ECD vs. 0.87 for non-ECD). We found that ECD kidneys had a significantly (p<0.0001) higher incidence of PNF and DGF. PNF in ECD kidneys appeared to be uniformly distributed across CIT, and while DGF was more CIT-dependent, the DGF differences between ECD and non-ECD kidneys were fairly consistent across CIT. Conclusions: While CIT minimization may be beneficial in reducing DGF, ECD kidneys were not more sensitive to it than non-ECD kidneys. The increased risk of PNF appears to be intrinsic to ECD kidneys, independent of CIT and donor age. Methods: 320 transplants from donors with malignancies diagnosed either pre- or post-transplantation (TXP) were examined for errors in brain death etiology. Diagnostic errors, including intracerebral hemorrhage (ICH) from undiagnosed metastatic disease and brain masses from metastatic vs. primary brain lesions, were examined along with clinical management and outcomes. Results: 42 recipients who received organs from 29 at-risk donors with misdiagnosed primary brain deaths were examined. Mean donor age was 42.0±11.6 yrs, with gender evenly divided between males and females. Eight donors (27%) had an identified history of malignancy. 
Post-TXP identification of donor malignancies included melanoma (23%), renal cell carcinoma (RCC) (19%), choriocarcinoma (12%), sarcoma (10%), Kaposi's sarcoma (KS) (7%), and various other tumors (29%). The majority were kidney recipients (n=37), followed by liver (n=4) and lung (n=1) recipients. The most common diagnostic errors were ICH (62%), tumor mass (21%), and anoxia (17%). A donor-related transmission rate of 74% (31/42) was identified among recipients with misdiagnosed brain death. Tumor transmission was associated with the following donor histologies: melanoma (n=8), RCC (n=7), choriocarcinoma (n=5), sarcoma (n=3), KS (n=3), colon cancer (n=2), lung cancer (n=2), and lymphoma (n=1). While the majority of donor-transmitted cancers were identified in the allograft (71%), 64% of recipients also suffered diffuse metastatic disease. Explantation was performed in 21/42 (50%) recipients in the at-risk group. Of the 31 patients with donor transmission, five-year survival was 32%. When allograft explantation was performed (17/31), a survival benefit was obtained (10/17, 59% vs. 0/14, 0%; p<0.01). Conclusions: Error in the diagnosis of donor brain death has significant and often fatal consequences. In cases of donor malignancy transmission, allograft explantation for kidney recipients or re-TXP for extra-renal recipients should be undertaken to provide a survival benefit. In an effort to reduce the potential for donor transmission of malignancy, potential donors with unclear etiologies for brain death, particularly ICH, should be considered for a limited brain autopsy after donation. The last decade of pediatric liver transplantation (LTx) has been characterized by significant technical advances and improved outcomes, leading us to question whether the profile of candidates undergoing LTx has changed. METHODS: Information regarding all pediatric (<18 yrs) LTx recipients in 1990-92 (Era 1) and 2000-02 (Era 2) in the UNOS database was collected and compared using Chi-squared and Wilcoxon tests. RESULTS: There were 1,509 pediatric LTxs performed in Era 1 compared to 1,736 in Era 2. While there were no changes in recipient age, gender and height, and only a modest change in recipient weight, there were substantial changes in many characteristics of pediatric LTx recipients (Table 1). Ethnicity and the diagnoses leading to LTx broadened significantly. Utilization of partial grafts (reduced, split, or living donor) increased substantially, while utilization of ABO-incompatible grafts decreased substantially. Nevertheless, waiting times increased. The need for retransplantation in infants and young children decreased. There was no consistent trend in laboratory parameters of disease severity (albumin, bilirubin, and creatinine) between the two eras. CONCLUSIONS: The national profile of pediatric LTx recipients has changed substantially during the past decade. Greater diversity in the ethnicity and diagnoses of LTx children suggests a broader application of LTx as a therapeutic modality. Waiting time, albeit increased, remains modest, at least in part because the decade has witnessed substantial surgical advances in reduced, split, and living donor LTx. In spite of these technical developments, the decreased frequency of retransplantation, particularly for infants and young children, during the past decade suggests concomitant improvements in surgical proficiency and/or immunosuppression. 
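The era comparisons in the preceding abstract rest on Chi-squared tests of categorical recipient characteristics across Era 1 and Era 2 (and Wilcoxon tests for continuous ones). A minimal sketch of the categorical case, assuming Python with scipy; the row totals below match the reported era sizes, but the split into categories is invented for illustration:

```python
# Chi-squared test on a 2x2 contingency table of counts (synthetic split).
from scipy.stats import chi2_contingency

#                 partial graft   whole graft
table = [[300, 1209],    # Era 1 (1990-92), n=1,509
         [650, 1086]]    # Era 2 (2000-02), n=1,736

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```

For a continuous variable such as recipient weight, scipy.stats.ranksums or mannwhitneyu would play the role of the Wilcoxon test in the same workflow.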
Introduction: Hypothermic machine perfusion (HMP) provides better protection against ischemic kidney damage than cold storage (CS). A switch from static CS to HMP in liver transplantation could prevent ischemic-type biliary lesions due to incomplete perfusion and allow the use of marginal and non-heart-beating donor livers. HMP could, furthermore, prolong preservation time and improve graft function. An important question concerning the application of HMP in liver preservation is the required perfusion pressure in the hepatic artery and portal vein. Specific aim: To determine the optimal perfusion pressure during HMP preservation of the liver, enabling complete perfusion without inducing endothelial injury. Methods: Rat livers were preserved using continuous perfusion with Belzer-UW. Group A was perfused with a mean arterial pressure of 12.5 mmHg at 360 beats per minute and a portal perfusion pressure of 2 mmHg; group B was perfused at 25 mmHg and 4 mmHg, and group C at 50 mmHg and 8 mmHg, respectively. The UW solution was enriched with 13.5 µM acridine orange (AO), to stain viable hepatocytes, and 14.9 µM propidium iodide (PI), to stain dead cells. After 1 h of preservation, the percentage of liver perfusion was assessed by in vivo fluorescence microscopy (that is, epi-illumination) of the liver surface using a 484/520 nm filter. Image analysis was performed to determine the percentage of perfusion. Cryosections (4 µm) were examined to identify the location of PI-stained cells. Results: Group A showed 70±4% liver perfusion of zones 1 and 2. No PI-positive staining of endothelial cells was found. Groups B and C showed complete perfusion of all three zones. Endothelial cell injury, as measured by PI, was found in group C but not in group B. Conclusion: Hypothermic machine perfusion of the liver could improve liver preservation; however, low-pressure perfusion is insufficient for complete liver perfusion and adequate preservation. Although arterial perfusion at 50 mmHg (perfectly normal for kidney perfusion preservation) and portal perfusion at 8 mmHg achieves complete perfusion, a major drawback is the occurrence of arterial endothelial cell injury, and high-pressure perfusion is thus not suitable for HMP preservation. Perfusion at 25 mmHg arterially and 4 mmHg portally is complete, causes no endothelial injury, and is optimal for HMP preservation. Introduction: The development of an effective preservation solution has been a major obstacle to the successful transplantation of small bowel (SB). The objective of this study was to improve SB preservation by combining known antioxidant agents with a proven amino acid-rich solution (AA solution) tailored to the specific metabolic requirements of the SB. Methods: SB from Sprague-Dawley rats (n=6 in each group) was flushed vascularly with modified UW solution and treated luminally as follows: Group 1: none (control); Group 2: 1 h oxygenated perfusion then static storage with AA solution; Group 3: Group 2 + Trolox; Group 4: Group 2 + superoxide dismutase/catalase. Energetics, oxidative stress and histology were assessed over 24 h at 4°C. Results: Oxidative damage, as assessed by levels of a by-product of lipid peroxidation, malondialdehyde (MDA), was significantly lower in all groups treated with AA solution than in untreated controls (Group 1). The addition of Trolox in Group 3 resulted in a significant reduction in MDA levels compared to all other groups throughout 24 h of cold storage. 
Tissue energetics correlated with reduced oxidative injury; ATP and total adenylates were superior in tissues treated with Trolox (Group 3) versus AA solution alone (Group 2). Group 1 (clinical control) had the lowest levels of the energetic parameters assessed compared to all AA-treated groups. Histologic integrity was markedly improved in Group 3 after 24 h of cold storage. Group 3 exhibited only minor injury, including moderate epithelial clefting of the villi; this was an improvement on treatment with AA solution alone (no additives, Group 2), which exhibited varying degrees of injury ranging from epithelial clefting to villus disintegration and crypt infarction. Control tissues (Group 1) exhibited severe injury, including villus degeneration, crypt infarction and transmucosal infarction. Conclusion: Use of the novel amino acid-rich solution significantly reduces peroxidative damage; this effect is enhanced by supplementation with the water-soluble vitamin E analogue Trolox. These antioxidant effects lead to superior maintenance of mucosal integrity supported by sustained energy metabolism. This combined strategy may have implications for the successful preservation and transplantation of SB in the clinic. CD39/NTPDase-1, the dominant vascular ectonucleotidase, has been shown to inhibit platelet aggregation as well as EC activation following nucleotide-mediated stimulation of P2 receptors in vitro. Increased catalysis of plasma nucleotides by NTPDases is associated with decreased vascular injury and prolonged graft survival in transplantation models. Here we evaluate the effects of vascular NTPDase-1 expression and purinergic mediators on survival and vascular permeability in intestinal ischemia-reperfusion injury (IRI). Methods: Wild-type and cd39-null mice were subjected to superior mesenteric arterial occlusion for 45 or 60 min. Matched control mice underwent sham surgery prior to tissue harvesting at 60 min. Treatment groups received soluble NTPDase (0.2 U/g), adenosine (1 µmol/kg/min)/amrinone (0.5 µmol/kg/min), or saline, continuously infused over 60 min commencing 5 min prior to reperfusion. Evans Blue (EB) (0.8 µL/g, 0.5%) was injected, and its accumulation in jejunal specimens was determined as a measure of vascular permeability. Results: Mortality rates in vehicle-treated cd39-null and wild-type mice were comparable. NTPDase supplementation protected only wild-type animals from death due to IRI (p=0.038 vs. vehicle), while adenosine/amrinone administration did not influence survival in either group. Post-ischemic EB tissue levels were increased in cd39-null mice over wild-type mice (p=0.039), while baseline levels did not differ (p=0.29, NS). Wild-type mice had no increase in EB tissue levels following sham operation, while IRI led to increased EB accumulation (44.6 OD/g) compared to baseline levels (21.6 OD/g, p=0.002). In cd39-null mice, permeability increased both after sham operation (39.7 OD/g, p=0.007) and after IRI (51.7 OD/g, p=0.001), as compared to baseline levels (25.8 OD/g). Treatment with either soluble NTPDase or adenosine/amrinone maintained post-IRI tissue permeability at baseline levels. Discussion: The cd39-null mice demonstrated higher susceptibility to intestinal injury when compared to wild-type animals. Soluble NTPDase as well as adenosine/amrinone effectively prevented the increases in post-ischemic vascular permeability. We therefore conclude that NTPDase maintains vascular integrity in intestinal IRI and that exogenous NTPDase administration has protective effects after ischemic insults. 
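Survival comparisons of the kind reported in the preceding abstract (vehicle- vs. NTPDase-treated mice after IRI) are conventionally made with Kaplan-Meier estimates and a log-rank test. A minimal sketch, assuming Python with the lifelines package and synthetic observation times rather than the study's animals:

```python
# Kaplan-Meier estimate and log-rank comparison (synthetic data only).
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# observation time (hours) and event indicator (1 = death, 0 = censored)
t_vehicle = [2, 3, 3, 4, 6, 8, 10, 24, 24, 24]
e_vehicle = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
t_treated = [6, 12, 24, 24, 24, 24, 24, 24, 24, 24]
e_treated = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(t_treated, event_observed=e_treated, label="NTPDase-treated")
print(kmf.survival_function_.tail(1))  # estimated survival at end of follow-up

result = logrank_test(t_vehicle, t_treated,
                      event_observed_A=e_vehicle, event_observed_B=e_treated)
print(f"log-rank p = {result.p_value:.3f}")
```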
Monza, Italy; 7 Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA. Background: Apoptosis is a hallmark of ischemia-reperfusion injury, and high levels of apoptotic events at reperfusion have previously been correlated with poor initial graft function. This study investigated whether carbon monoxide (CO) administration was able to confer protection against ischemia-reperfusion injury and, possibly, extend the survival of pig organs transplanted into primates. Materials and methods: Kidneys from hDAF transgenic pigs exposed to CO were transplanted into 6 immunosuppressed cynomolgus monkeys in a life-supporting model. The anti-inflammatory effects of the CO regimen used were monitored by evaluating in vitro TNF-α production by PBMC from these pigs. A thorough histopathological assessment, including TUNEL evaluation of the apoptotic index (AI), was conducted on the kidney both in the 30-minute biopsy and at euthanasia. Results: The transplanted primates survived from 2 to 37 days and were euthanized with rejection in 5 cases. CO treatment significantly reduced TNF-α production by PBMC isolated from donor pigs (P<0.02). Similarly, in the biopsy taken 30 minutes after reperfusion, the AI in the tubules and interstitium was significantly lower in the grafts from CO-treated animals than in controls (P<0.05 and P=0.007, respectively). No cases of overt DIC were observed. Conclusions: This study is the first in vivo demonstration in a primate transplantation model that CO preconditioning is an effective treatment to attenuate ischemia-reperfusion injury. Further studies are needed to determine to what extent CO-based regimens can ultimately extend the survival of xeno- or allografts transplanted into primates. RENAL ALLOGRAFTS. C. Du,1,4 J. Jiang,3,4 Q. Guan,2,4 Z. Yin,2,4 R. Zhong,1,2,3,4 A. M. Jevnikar.1,2,3,4 1 LHRI, London, ON, Canada; 2 RRI, London, ON; 3 Med. Surgery, Micro & Imm., Univ. West. Ontario; 4 MOTS, LHSC, London, ON. NO is synthesized from the terminal guanidino nitrogen of L-arginine by a family of NO synthases. iNOS is expressed by glomerular, mesangial and tubular epithelial cells (TEC) and is induced by cytokines (e.g. IFN-γ and TNF-α) present during renal allograft rejection. However, NO has been shown to have disparate functions in cell survival, either inducing or inhibiting apoptosis, which could alter graft survival. While recipient NO inhibition has been beneficial in murine cardiac allografts, the role of renal-derived NO in the survival of renal allografts is not known. CS3.7 tubular epithelial cells (TEC) were used to test the effect of NO in vitro. Treatment with IFN-γ and TNF-α upregulated both iNOS synthesis and release of NO. Addition of sodium nitroprusside (SNP), an NO-donor molecule, to TEC for 24 h induced apoptosis in a dose-dependent manner, with maximal apoptosis of 49% (p<0.05 vs control) at 50 µg/ml. However, at 100 µg/ml, TEC apoptosis was equivalent to the basal level of untreated TEC (20%). We analysed cell survival proteins in these TEC. SNP at 10-50 µg/ml decreased mRNA levels of anti-apoptosis proteins (c-FLIP, IAP-3 and Survivin). In contrast, SNP at 100 µg/ml did not alter c-FLIP or IAP-3 levels but enhanced Survivin mRNA expression. Furthermore, TEC which survived SNP-induced apoptosis were resistant to downregulation of c-FLIP, IAP-3 and Survivin on re-exposure to SNP (10-50 µg/ml) and remained resistant to further apoptosis. These data suggest that NO could induce survival proteins in TEC in vivo. 
To test this in vivo, wild-type (WT) or iNOS-null B6 (H-2b) kidneys were transplanted into nephrectomized, untreated BALB/c (H-2d) mice. WT grafts survived longer (48±10 d, n=4) than iNOS-null grafts (24±5 d, n=8, p=0.04). These data suggest that while NO can induce TEC apoptosis, at higher amounts NO can promote survival of TEC by upregulation of anti-apoptosis proteins. Furthermore, we show that endogenous levels of NO within donor tissue may be important for graft function and survival. Maintaining renal-derived NO, as well as upregulation of survival proteins such as c-FLIP and Survivin, might be of benefit in strategies for long-term survival of renal allografts. Paulo N. A. Martins,1 Anke Jurisch,1 Anja Reutzel-Selke,1 Andreas Pascher,1 Sven Jonas,1 Johann Pratschke,1 Peter Neuhaus,1 Hans-Dieter Volk,2 Stefan G. Tullius.1 1 Dept. of Surgery, Charité, Berlin, Germany; 2 Dept. of Medical Immunology, Charité, Mitte, Berlin, Germany. Events leading to nonspecific inflammatory damage prior to organ transplantation reduce graft survival. It can be speculated that increased immunogenicity accelerates specific immune responses. We tested the effects of CO induction in donor animals immediately prior to organ harvesting. F-344 renal allografts were grafted into LEW recipients. Ischemia was prolonged to 6 h to induce nonspecific inflammatory damage. Recipients received short-term CyA treatment (1.5 mg/kg/d x 10 d) to overcome an initial acute rejection episode. Donor animals were either treated with methylene chloride (MC) 4 h prior to organ harvesting to induce CO or remained untreated. Effects of recipient CO induction were followed after both short-term (10 d) and long-term treatment. Similarly, cellular infiltrates (CD4+ and CD8+ T cells, ED1+ monocytes/macrophages) were significantly reduced following donor pretreatment compared to controls (p<0.0001). Grafts of both short-term and long-term MC-treated recipients demonstrated reduced functional and structural deterioration compared to controls. However, the improvement in morphological changes was even more pronounced in grafts following a single donor pretreatment compared to long-term treatment of recipients (glomerulosclerosis: 20±5% vs. 37±13%, p=0.01). CO induction in the donor shortly prior to organ harvesting was associated with a marked improvement in long-term graft function. These observations may be based on reduced immunogenicity. Background: Stress response genes, such as heme oxygenase (HO)-1 and inducible nitric oxide synthase (iNOS), have been reported to be induced after ischemia/reperfusion (I/R) injury; however, their definitive roles remain undetermined. We have previously demonstrated that carbon monoxide (CO), a byproduct of heme catabolism, protects kidney and small intestinal grafts against I/R injury. This study examined whether CO would protect the liver against cold I/R injury, with particular attention to iNOS. Methods: Orthotopic syngeneic LEW rat liver transplantation (OLT) was performed after 18 hrs of preservation in cold UW solution. Recipients were exposed to air or low-dose CO (100 ppm) for 1 hr before and 24 hrs after OLT and were sacrificed 3-48 hrs later for serum AST and NO levels, histopathology, and hepatic iNOS protein expression. The efficacy of CO was further analyzed in an in vitro primary rat hepatocyte culture system after stimulation with a cytokine mixture (CM) (TNF-α 500 U/ml, IL-1 200 U/ml, IFN-γ 100 U/ml). iNOS protein, NO production and hepatocyte viability after CM stimulation were analyzed. 
Results: After OLT, CO inhalation effectively ameliorated hepatic I/R injury. Serum AST levels and tissue necrosis were significantly reduced with CO compared to air-treated controls. Hepatic iNOS protein expression was significantly decreased in the CO-treated group versus air-treated controls at 3 and 48 hrs, with lower serum NO levels (Table; mean±SE, n=3 animals per group, *P<0.05). The in vitro culture study confirmed the efficacy of CO: CM-induced iNOS protein expression was markedly inhibited, and NO production was significantly reduced with CO compared to air (p<0.05). Hepatocyte viability also was increased with CO (93.5% vs. 63.1% in air, p<0.05). Conclusion: The results demonstrate that exogenous CO treatment efficiently ameliorates hepatic I/R injury. The mechanism by which CO protects the liver against cold I/R may include downregulation of the iNOS/NO pathway. Thus, low-dose CO inhalation could be a novel therapeutic strategy to combat hepatic cold I/R injury. Background: Recurrent HCV after liver transplantation (LT) is a well-described entity occurring in all patients transplanted with HCV. Anecdotal reports suggest that recurrent HCV in LDALT recipients has a more severe outcome than in CAD recipients. Comparative histologic data are lacking, however. In this study we compare histologic outcomes in patients with HCV who received LDALT organs to those in case-matched CAD organ recipients. Methods: Subjects were matched with regard to year of LT and interval to biopsy (bx). All bx were event-driven (abnormal liver enzymes) and reviewed by a single pathologist. Ludwig/Batts grade/stage were recorded for each pair. 66 patients with HCV underwent primary LT between 1/1/00 and 9/3/02; 19 received an LDALT and 47 received a CAD organ. 6 LDALT recipients were excluded (5 had no bx material, 1 required retransplantation), leaving 13 study subjects. Case-matches were found among the 47 CAD recipients. The mean interval from LT to bx was 11.5 months in both groups. Liver allograft recipients were treated according to therapeutic principles of pretreatment and minimum posttransplant immunosuppression consistent with graft survival and function. Thymoglobulin (THY, 3-6 mg/kg) or Campath (CAM, 30 mg) was infused before transplant, and tacrolimus (TAC) monotherapy was used as postoperative treatment, with other agents (e.g. steroids) added for biopsy-proven rejection. Changes in immunological parameters during 6-12 months (M) of follow-up were studied. Methods: Blood samples at 6 and 12M after transplantation were analyzed with four-color flow cytometry and in vitro cell proliferation assays. Results: Pretreatment with THY or CAM resulted in significant posttransplant lymphocyte depletion, which slowly recovered over the next 6-12M. CAM appeared to have a stronger and more prolonged impact on depletion. Recovery of CD4 counts was slower than that of CD8 counts, resulting in remarkably low CD4/CD8 ratios (<0.8) in nearly 40% of recipients at 6-12M, with a higher incidence in the CAM group. Similarly, DC1 and DC2 populations significantly decreased. DC1 gradually recovered and became nearly normal by 12M (49% vs. 57% in normal volunteers). DC2 recovery was remarkably slow: 5.5±5.0% at 12M compared to 15.0±9.0% in normals. CAM also effectively depleted B cells, and significantly low B cell counts persisted for nearly 12M. At 1 year, the majority of patients (74%) were on TAC or cyclosporine monotherapy, with doses spaced to QOD (19%), 3/W (9%), 2/W (30%), or 1/W (14%). 
Detailed in vitro studies (MLR, CML) were performed in 13 recipients with >1 year of follow-up. Of the 10 of 13 patients on spaced monotherapy, 7 showed profound hyporesponsiveness to mitogens and to donor and 3rd-party alloantigens in MLR. One developed donor-specific hyporeactivity in MLR and CML. The remaining 2 showed vigorous antidonor reactivity in MLR and CML, with 13% and 43% donor killing. Of the 3 patients on daily treatment, 2 showed intact antidonor reactivity in MLR, and one had total hyporesponsiveness. Conclusions: Liver recipients treated with this tolerance-enhancing regimen achieved stable engraftment. CAM appeared to have a robust and prolonged impact on immunological parameters. The high frequency of profound in vitro hyporesponsiveness under minimum immunosuppression in liver recipients may be the result of pretreatment and of the complicated immunological status of this population. AIM: The introduction of MELD in 2002 was designed to eliminate subjective parameters from influencing priority on the waiting list. Recognizing that MELD may bias against certain diagnoses, a system is in place to review exceptional cases and adjust the MELD score based on regional consensus. The OPTN recommended HCC, hepatopulmonary syndrome, familial amyloidosis and metabolic liver diseases for standard upgrade consideration. Regional review boards approve or deny individual requests. The aim of this study is to compare regional practices and to review the diagnoses used to justify MELD increases in non-HCC cases. METHODS: The UNOS database was queried to extract all adult cases where exceptions to MELD were requested from 2/27/02 until 8/27/03. For the purpose of this study, only non-HCC cases were included. Request narratives were reviewed by our group and a justification for each exception request was assigned. The data were stratified by UNOS region and justification for exception. RESULTS: Data for 29,510 pts were available. 3,281 exceptions were requested in that period of time. Of those, 827 were for non-standard diagnoses: 477 (58%) were granted, 39 (4.7%) withdrawn, and 302 (37.4%) denied. The percentage of petitions approved varied significantly among regions (28-75%, p<0.0001). The most common diagnoses for these exceptions included ascites (164), biliary complications (78), porto-systemic encephalopathy (73), cholangitis (71), portal hypertensive bleeding (62), regional agreement (31) and hepatic hydrothorax (26). The percentage of patients listed with non-HCC MELD exceptions varies significantly among regions (0.7-8.3%, p<0.0001). Furthermore, the proportion of pts transplanted in this period of time showed significant variability (2.1 to 31.9%, p<0.0001). Pts who faced a re-LT were significantly more likely to be considered for exception (p<0.0001). Gender, ABO group, and ethnicity had no influence on the choice to petition. CONCLUSIONS: Allocation of livers using the MELD score has resulted in fewer deaths on the waiting list and increased numbers of pts receiving organs. Widespread variations exist, however, between the different regions of the country, where a significant percentage of pts are being transplanted using non-MELD criteria. These variations mandate a reform toward standardized exception criteria that can be applied uniformly across the country to maintain equity for our patients. Raymond Reding,1 Christophe Bourdeaux,1 Tran Thanh Tri,1 Jeremie Gras.1 1 Pediatric Liver Transplantation Program, Universite Catholique de Louvain, Brussels, Belgium. 
The pediatric end-stage liver disease (PELD) score, a severity-of-illness assessment, has been proposed as an objective tool to prioritize children awaiting LT, higher PELD scores being associated with increased pre-LT mortality. PELD is in use in UNOS, whereas it is still debated in Eurotransplant (ET). We investigated whether PELD at listing may also impact post-LT results. Patients and Methods: PELD was retrospectively analyzed in 100 pediatric recipients (median age: 1.4 y; range: 0.4-13.9) of a primary LT, transplanted between 05/94 and 04/02. Hepatic malignancy and fulminant hepatitis were excluded from this study, the two main diagnoses being biliary atresia (n=64) and Byler's disease (n=12). PELD was calculated by computation of bilirubin (median: 11 mg/dl; range: 0.3-51.9), albumin (median: 3.8 g/dl; range: 1.7-5.1), INR (median: 1.2; range: 0.9-8.0), age, and growth retardation (the published formula is sketched below). Post-mortem liver grafts were allocated by ET using an allocation system taking into account waiting times as well as medical urgency. Post-LT results were studied according to PELD calculated at the time of listing on the post-mortem donor waiting list (n=51) or at the pre-LT assessment of children included in the living donor program (n=49). Results: Overall 1-y and 5-y actuarial patient survival in this series was 97% and 96%, the corresponding figures for graft survival being 92% and 91%. Individual PELD scores (mean: 13.3; SD: 9.7; range: -5 to +46) showed an almost normal statistical distribution (kurtosis: 1.112; standard error of kurtosis: 0.478). One-year patient survivals were calculated within the following PELD categories (NS): 100% for children with PELD >+2SD (n=5); 91% with PELD +2/+1SD (n=11); 96% with PELD +1/0SD (n=24); 98% with PELD 0/-1SD (n=47); 100% with PELD <-1SD (n=13). Similarly, no statistical impact of PELD could be found for graft survival, retransplantation rate, or acute and chronic rejection-free survival. Conclusion: PELD did not significantly impact post-LT results, which suggests that giving priority to high-PELD recipients may not worsen post-LT outcomes. PELD-based liver graft allocation in children de-emphasizes waiting time and directs organs to the sickest patients. In accordance with our data, we support such a "sickest children first" allocation policy, which should contribute to reducing pre-LT mortality without worsening post-LT results or increasing organ waste. Pirenne,1 F. Van Gelder,1 C. Verslype,1 W. Peetermans,1 F. Nevens.1 1 UZ Gasthuisberg, University Hospital, Leuven, Belgium. Quality of life and performance are impaired by liver failure and improved by liver transplantation (LTx), but no study has compared physical capacity in LTx recipients versus normal healthy subjects. How LTx recipients tolerate strenuous physical activity (and extreme altitude) is unknown. Methods: 6 LTx patients gave consent and participated in a trek up Kilimanjaro, Africa's highest point (5,895 m). Inclusion criteria were: <50 years old, LTx >1 yr previously, normal liver/cardiac/pulmonary function, normal life pattern, and non-professional sportsperson. The LTx recipients were accompanied by 15 control subjects (similar profile, matched for age/body-mass index/gender/VO2max). Daily data recording included physical performance, Borg scale, Lake Louise acute mountain sickness (AMS) score, and cardiorespiratory parameters. Immunosuppression was steroid-free and tacrolimus-based (5-8 ng/ml). Prophylaxis against AMS and infection was given. Results: 83.3% of Tx subjects summited versus 84.6% of controls (NS). 
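For reference, the PELD computation mentioned in the PELD abstract above follows the published UNOS formula; a minimal sketch, assuming the standard coefficients and a simple lower bound of 1.0 on laboratory values (the exact UNOS rounding and capping rules may differ):

```python
# Published PELD formula (sketch; function name and bounds are illustrative).
import math

def peld(bilirubin_mg_dl, albumin_g_dl, inr, age_years, growth_failure):
    """PELD = 10 * [0.480*ln(bilirubin) + 1.857*ln(INR) - 0.687*ln(albumin)
                    + 0.436 if age < 1 yr + 0.667 if growth failure]."""
    bili = max(bilirubin_mg_dl, 1.0)   # lab values below 1.0 set to 1.0
    inr = max(inr, 1.0)
    alb = max(albumin_g_dl, 1.0)
    score = 0.480 * math.log(bili) + 1.857 * math.log(inr) - 0.687 * math.log(alb)
    if age_years < 1:
        score += 0.436
    if growth_failure:
        score += 0.667
    return 10 * score

# Median values from the series above: bilirubin 11 mg/dl, albumin 3.8 g/dl,
# INR 1.2, median age 1.4 y, no growth failure assumed -> about 5.7
print(round(peld(11.0, 3.8, 1.2, age_years=1.4, growth_failure=False), 1))
```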
No difference in Borg scale was seen. The Lake Louise score showed no increased vulnerability to AMS in LTx recipients. O2 saturation (sat) decreased, whereas arterial blood pressure and heart rate increased, with increasing altitude in both LTx recipients and controls. The only difference was a higher arterial blood pressure at all time points in the LTx group. One LTx recipient with hepatitis C abandoned the climb at 4,600 m due to exhaustion, hypoglycemia and low O2 sat. A biopsy done immediately after the trek showed relapsing hepatitis that was clinically silent and compatible with normal activities at sea level but that impaired the liver's adaptive response to exercise at altitude. No infection was seen among the LTx subjects. Conclusion: Selected LTx recipients, free of recurrent liver disease, tolerate extreme physical conditions (strenuous physical activity, extreme altitude) similarly to control subjects, suggesting that today's LTx technology has the potential to restore physical ability ad integrum. INDICATION AND OUTCOME. Gerrit Grannas,1 Martin Strueber,2 Rainer Lueck,1 Thomas Becker,1 Michael Neipp,1 Juergen Klempnauer,1 Bjoern Nashan. Medizinische Hochschule Hannover, Hannover, Germany. Background: Portopulmonary hypertension (PPHT) is often associated with pulmonary fibrosis based on cystic fibrosis or α1-antitrypsin deficiency. The ultimate treatment is a lung transplant (Tx). Since there is a significant co-incidence of liver cirrhosis, combined lung-liver transplantation (Lu-LTx) is then the therapy of choice. The combination of end-stage lung and liver disease leads to patients in poor nutritional condition related to severe intestinal malabsorption and chronic infections of the lung, often with multiresistant organisms. The suitability of these patients, with their particularly poor outcome (1), is called into question, particularly in the context of the global organ shortage. Patients and methods: From 04/1999 to 12/2003, 12 patients (Pat.) (8 male, 4 female) with therapy-refractory PPHT underwent combined liver and lung Tx; in 1 Pat. with secondary PPHT due to restrictive cardiomyopathy, en bloc heart/lung and liver Tx was performed. Median recipient age was 36 years (19 to 55 years). The underlying disease was α1-antitrypsin deficiency in 2 Pat., cystic fibrosis in 4 Pat., sarcoidosis with secondary liver cirrhosis in 1 Pat., and idiopathic pulmonary hypertension in 5 Pat., combined in 1 case each with chronic HBV infection, hemochromatosis, cryptogenic cirrhosis, and autoimmune cirrhosis. In 10 cases a double lung Tx and in 2 cases a single lung Tx was performed. Immunosuppression was based on cyclosporine A in combination with prednisolone and either azathioprine or MMF. Outcome: 3 Pat. died: 1 from initial nonfunction and toxic liver failure, and the other 2 at 2 months post-Tx, one due to rupture of a splenic artery aneurysm and one due to brain edema after resuscitation for severe bleeding from the pulmonary artery. 10 Pat. are doing well (survival: 1 yr 73%, 2 yrs 73%, 3 yrs 73%, 4 yrs 73%). Rejection: In the postoperative course we observed a rejection episode of the liver in 2 Pat. and a rejection of the lung in 1 Pat. The rejections were successfully treated with steroids, and in 1 Pat. additionally with an immunosuppressive switch to tacrolimus. Complications: one rectal perforation, one esophageal bleed, two pneumonias, one CMV infection and one secondary wound-healing complication. 
Conclusion: Combined lung/liver, or even heart/lung and liver, Tx in a selected population demonstrates a favourable outcome given the surgical, immunological and infectious risks. 1 Department of Nephrology, Toho University School of Medicine, Tokyo, Japan; 2 Blood Transfusion Center, Toho University Omori Hospital, Tokyo, Japan. [Purpose] To investigate accommodation in ABO-incompatible kidney transplantation, we studied anti-donor and compatible blood type antibodies in patients' sera and blood type antigen expression in renal allograft biopsies following transplantation. [Methods] Seventy-seven patients, all but 2 of whom underwent splenectomy, were pretreated with immunoadsorption and/or plasmapheresis/plasma exchange. Immunosuppression consisted of cyclosporine or tacrolimus, steroid, and azathioprine or mycophenolate mofetil. In forty-five ABO-incompatible recipients (13 A1 to O, 11 B to O, 9 A1 to B, 11 B to A1, and 1 A1B to O) more than 1 year post-transplantation, anti-donor and compatible blood type IgG and IgM antibody titers were measured. The titers were compared with those of control groups (6 O to O, 7 A1 to A1, and 7 B to B). Blood type antigens were stained with an indirect immunoperoxidase method in 35 renal allograft biopsy specimens. [Results] The titers of anti-A1 IgG and IgM antibodies were significantly lower in the 13 recipients (A1 to O) than those in the 6 O-to-O controls. Purpose: Historically, ABO-incompatible kidney transplantations have only been undertaken following splenectomy and nonspecific plasmapheresis, and with quadruple-drug immunosuppression plus B-cell-specific drugs. We have now evaluated a protocol for ABO-incompatible kidney transplantation without splenectomy, using antigen-specific immunoadsorption, rituximab and a conventional triple-drug immunosuppressive protocol. Methods: The protocol called for a 10-day pretransplant conditioning period, starting with one dose of rituximab, followed by full-dose tacrolimus, mycophenolate mofetil and conventional prednisolone tapering. Antigen-specific immunoadsorption was performed on pretransplant days -6, -5, -2 and -1. Following the last session, 0.5 g/kg of intravenous immunoglobulin was administered. Postoperatively, three more apheresis sessions were given every third day. Furthermore, if there was a significant increase in antibody titers, extra sessions were considered. Results: Eight patients have been transplanted with this protocol. The donor/recipient blood groups have been two each of A2/O, B/O, B/A and A1/O. The ABO antibodies were readily removed by the antigen-specific immunoadsorption before transplantation and were kept at a low level post-transplantation by further adsorptions. There were no side effects, and all patients were discharged with normal renal transplant function. Introduction: The supply of cadaveric and living kidneys is not sufficient to satisfy the increasing number of patients requiring renal transplantation. Expansion of the donor pool by overcoming ABO incompatibility would help to solve this problem. Since 2001, we have been employing one-week pretransplant immunosuppression with tacrolimus (FK)/mycophenolate mofetil (MMF)/methylprednisolone (MP) and have obtained excellent short-term results without any serious complications. Mid-term results of ABO-incompatible renal transplantation under one-week pretransplant immunosuppression with low-dose FK/MMF/MP maintenance immunosuppression are reviewed here. 
Patients and Methods: Thirty-two adult patients underwent ABO-incompatible LKT at our institute between January 2001 and September 2003. There were 16 males and 16 females, with a mean age of 33 years. Plasmapheresis was carried out to remove anti-A/B antibodies prior to kidney transplantation. Since January 2001, we have administered FK (0.1 mg/kg/d)/MMF (1-2 g/d)/MP (125 mg/d) concomitantly with plasmapheresis, starting 7 days before transplantation. Splenectomy was done at the time of kidney transplantation in all patients. In the maintenance phase (6 months after transplantation), all patients were maintained on low-dose immunosuppression with FK/MMF/MP. The average doses of FK, MMF, and MP were 0.05 mg/kg, 1000 mg/day, and 5 mg/day, respectively. The target trough level of FK was 5 ng/ml in the maintenance phase. Results: Patient and graft survival were 100% and 95% at three years, respectively. Only one graft was lost, due to humoral rejection, during the observation period. The incidence of acute rejection was 25%, and the incidence of steroid-resistant acute rejection was 7%. There were no serious infectious complications during the observation period. Protocol biopsies did not show any late-onset rejection. Conclusion: Mid-term results of ABO-incompatible renal transplantation under one-week pretransplant immunosuppression with low-dose FK/MMF/MP maintenance immunosuppression showed excellent patient and graft outcomes. INTRODUCTION: Despite great efforts to promote the donation of cadaveric organs, a serious shortage of cadaveric organs exists in Japan. ABO-incompatible renal transplantation has been performed to expand the donor pool. We review here our results in 141 cases of ABO-incompatible renal transplantation since 1989. PATIENTS AND METHODS: One hundred and forty-one patients underwent ABO-incompatible living kidney transplantation at our institute between January 1989 and December 2001. To remove anti-ABO antibodies, recipients received plasmapheresis before transplantation. Methylprednisolone, cyclosporine (CyA) or tacrolimus (TAC), and azathioprine or mycophenolate mofetil (MMF) were used as basic immunosuppressants. Antilymphocyte globulin and deoxyspergualin were used in most of the cases performed in the pre-MMF era (1989-1999). Splenectomy was done at the time of transplantation in all patients except one. Seven hundred and seventy-seven concurrent ABO-compatible kidney transplant recipients served as controls. RESULTS: The patient survival of ABO-incompatible recipients was 94% at 5 years after transplantation, and 88% and 84% at 10 and 13 years post-transplantation, respectively. Patient survival was not significantly different from that of the 777 ABO-compatible kidney transplant recipients (5 years, 97%; 10 years, 92%; 13 years, 90%). The graft survival of ABO-incompatible recipients was 76% at 5 years and 56% at 10 and 13 years. The graft survival of ABO-compatible recipients was 85% at 5 years, 67% at 10 years, and 58% at 13 years. Although there was a significant difference in graft survival between ABO-incompatible and ABO-compatible renal transplants (log-rank test, P=0.007), 13-year graft survival was almost the same in the two groups. Acute rejection episodes were significantly more frequent in the ABO-incompatible grafts (85 of 141, 60%) than in the ABO-compatible grafts (377 of 777, 49%; P=0.010). 
However, since 1998, when TAC, MMF, and CyA-Neoral were introduced, graft survival has markedly improved: five-year graft survival is now more than 90%, which is not significantly different from that of ABO-compatible cases. CONCLUSION: Long-term results of ABO-incompatible renal transplantation are excellent. ABO incompatibility does not seem to be a difficult immunological barrier to overcome in renal transplantation. 1 Department of Urology, Tokyo Women's Medical University, Shinjuku, Tokyo, Japan. Introduction: Due to the continuing shortage of cadaveric donors in Japan, ABO-incompatible living kidney transplantation (LKT) is being carried out. It is well known that highly sensitized patients with positive panel reactive antibody (PRA) often present with acute rejection. Until now, it has not been clear whether there is an association between the results of ABO-incompatible LKT and the existence of anti-HLA antibody. Therefore, we examined the impact of positive PRA on the results of ABO-incompatible LKT. Materials and Methods: One hundred and seventy-seven recipients underwent ABO-incompatible LKT at our institution between January 1989 and March 2003. Of these patients, 45 who had been examined for PRA before transplantation were included in this study. There were 33 males and 12 females, with a mean age of 37.0 years. Plasmapheresis was carried out to remove the anti-ABO antibodies prior to transplantation. In the induction phase, methylprednisolone, azathioprine or mycophenolate mofetil, and cyclosporine or tacrolimus were used for immunosuppression. Splenectomy was done at the time of kidney transplantation in all patients. PRA was measured using FlowPRA (One Lambda, CA, USA) by flow cytometry. Results: Twelve of the 45 patients had positive PRA before transplantation (class I: 8; class II: 1; class I and class II: 3). The incidence of acute rejection was 36.4% in the patients with negative PRA. The patients with positive PRA showed a slightly higher incidence of acute rejection (50.0%). However, there was no statistically significant difference between the two groups in the incidence of acute rejection (p=0.4090). Graft survival was also similar between the two groups (97.0% vs 80.2% at 5 years, negative PRA vs positive PRA, respectively; p=0.1000, log-rank test). The results in ABO-incompatible patients with positive PRA were thus similar to those in patients with negative PRA. Plasmapheresis and splenectomy may also help eliminate anti-HLA antibody in ABO-incompatible LKT. IMMUNOGLOBULIN. James M. Gloor,1 Joseph P. Grande,2 William R. Macon,2 Mark D. Stegall.1 1 Transplant Center, Mayo Clinic, Rochester, MN; 2 Pathology, Mayo Clinic, Rochester, MN. Introduction: Successful positive crossmatch (+XM) kidney transplantation is possible using a protocol combining the B-lymphocyte-depleting anti-CD20 antibody rituximab with plasmapheresis followed by intravenous immunoglobulin (PP/IVIG). Whether PP/IVIG administered soon after rituximab interferes with B-lymphocyte depletion is undetermined. The purpose of this investigation was to study the effect of rituximab followed by PP/IVIG on peripheral blood and splenic B lymphocytes in +XM kidney transplant patients. Methods: 9 positive crossmatch (+XM) and 10 ABO-incompatible (ABOI) kidney transplant recipients were studied. Both groups underwent a series of pretransplant plasmaphereses followed by 100 mg/kg IVIG as pretransplant conditioning, and both underwent splenectomy at transplant. 
+XM recipients received rituximab 375 mg/m2 1 day prior to beginning PP/IVIG, while ABOI patients did not receive rituximab. Both groups had peripheral blood T and B lymphocyte analysis on postoperative day 3. Peripheral blood B-lymphocytes were identified by expression of CD19. 8 ABOI and 4 +XM patients also had analysis of splenic B-lymphocytes (CD20 and CD79 positive staining using standard immunohistochemical techniques). A semiquantitative 0-3+ scale was utilized to measure the extent of CD20 and CD79 expression in splenic follicles and interstitium. Peripheral blood and splenic B lymphocyte counts and distribution in the rituximab-treated and untreated groups were compared using Student's t-test. Results: +XM patients underwent a mean of 3.7 (range 2-5) PP/IVIG treatments following rituximab administration. Peripheral blood B lymphocytes were significantly decreased in the rituximab-treated compared to the nontreated group (2.3±2.9/µl vs. 33.0±31.6/µl, p=0.01). B lymphocytes were significantly reduced in splenic follicles in treated vs nontreated patients (1.50±0.58 vs. 2.75±0.46 on a scale of 0-3+ expression of CD20 and CD79, p=0.002). CD20 and CD79 positive lymphocytes were densely distributed in splenic interstitium in the nontreated group, and were absent in the treated group. Conclusion: Administration of rituximab 375 mg/m2 results in significant B cell depletion in spleen and peripheral blood even when closely followed by plasmapheresis and IVIG. Background: Plasmapheresis (PP) with immune globulin (IVIg) has been used to abrogate a positive crossmatch (+XM) or ABO incompatibility (ABOI) prior to live donor renal transplantation (LDRT). The effects of PP/IVIg have not been fully evaluated. Purpose: To evaluate the kinetics of immunoglobulin (Ig) G-A-M removal & reconstitution during PP/IVIg therapy in ABOI & +XM recipients. Methods: We analyzed IgG-A-M levels in 4 pts. receiving PP/IVIg therapy for an ABOI (n=2) or +XM (n=2) LDRT. Pts. received 5-8 one-volume PP treatments prior to LDRT & an additional 2-16 treatments post-Tx. Colloid was reconstituted with albumin unless profound coagulopathy ensued. IVIg (CytoGam®) 100 mg/kg was administered after each PP. IgG-A-M levels were drawn before and after each PP & immediately after the IVIg infusion. All pts. received immunosuppression with tacrolimus, mycophenolate mofetil, daclizumab & steroids. Results: Mean Ig levels at baseline were 1176 mg/dL for IgG, 310 mg/dL for IgA and 171 mg/dL for IgM. Following each PP the Ig levels decreased by a mean of 56% (33-64%), 60% (41-67%) and 64% (51-79%) for IgG, A and M, respectively. Despite treatment with IVIg, Ig levels continued to decline. After PP#9, mean Ig levels had decreased to 231 mg/dL, 48 mg/dL and 19 mg/dL, and post-IVIg levels were 432 mg/dL, 53 mg/dL and 15 mg/dL for IgG-A-M, respectively. All pts. are doing well with a mean f/u of 140 days. Background: Live kidney donors are excluded if the recipient has donor specific antibody (DSA) to HLA antigens (+XM) or ABO blood type antigens (ABOI). Plasmapheresis combined with immune globulin (PP/IVIg) to remove DSA allows successful live donor Tx. Antibody-mediated (AMR) and acute cellular rejection (ACR) remain, nevertheless, significant risks. Rejection episodes in these recipients are poorly characterized, and a need exists to better define prognostic indicators and optimal therapy. Purpose: To characterize the features of rejection episodes following +XM or ABOI Tx.
Methods: A retrospective review was performed of consecutive pts that received PP/IVIg to abrogate +XM or ABOI. The extent of PP/IVIg was determined by the pretransplant DSA titer. Immunosuppression consisted of tacrolimus, mycophenolate mofetil, daclizumab, and steroids. Biopsies were obtained on POD#7 and at the onset of graft dysfunction. Characterization of rejection episodes was by histopathology, immunofluorescence, and DSA titer. Antirejection therapy and renal function were assessed. Results: 12 pts were Txd under the PP/IVIg protocol (+XM=9, ABOI=2, +XM/ABOI=1). No hyperacute rejections occurred. Patient and graft survival of 100% has been achieved, with a creatinine of 1.8 ± 1.1 mg/dl after a mean follow-up of 181 days (range 13-430). No rejection occurred in 5 pts; rejection occurred in the remaining 7 pts. In 4 +XM pts, DSA was detected at the time of graft dysfunction despite no histologic features of AMR (7 biopsies) and negative C4d staining. In 1 of these patients, DSA preceded the histologic and immunofluorescent features of AMR. AMR occurred with and without an ACR component (Banff scores 0-IIA). In 2 pts with Borderline changes and DSA, graft function improved after PP/IVIg, despite no histologic or immunofluorescent evidence of AMR. One pt with Banff IIA ACR and DSA treated with antithymocyte antibody but not PP/IVIg had recurrent rejections and poor graft function. Conclusions: In +XM and ABOI recipients with graft dysfunction: 1) DSA may represent AMR in the absence of C4d or histologic features of AMR; 2) DSA can precede C4d or light microscopic features of AMR; 3) AMR occurs independently of ACR; 4) a poor outcome results if DSA is present and not treated despite a negative biopsy, including C4d. Introduction/Methods: Renal transplantation in the presence of low levels of HLA IgG antibody may not influence short- or long-term graft survival. Between 12/15/00 and 3/15/03, 6 of 338 (1.8%) primary recipients of deceased donor organs were transplanted when their flow cytometric T cell IgG donor-specific crossmatch was positive but the AHG T cell crossmatch was negative (Flow T+/AHG-). Immunosuppression of the recipients was center-directed, but no pre- or peri-operative IVIg or plasmapheresis was administered. Five of the 6 recipients were female. We now report on the outcomes with a mean follow-up of 13 months (12-24 months). Results/Conclusions: All organs are currently functioning (see table). High definition antigen beads for flow cytometry allowed the identification of at least one donor-specific HLA class I antibody in each of the six cases. Among the six patients transplanted, there was one episode of acute rejection in the SPK recipient on day 14 and two in the heart recipient on days 300 and 349. Each rejection was successfully treated. We have evaluated the HLA antibody(ies) around 1 year post-transplantation for 3 patients and see evidence that the antibody specificity(ies) present before transplantation are undergoing epitope spreading as well as epitope collapse. Furthermore, we can see in the heart recipient that the strength of the donor antibody is decreasing. We have found that short- and long-term graft survival in primary transplants is not influenced by low levels of donor-specific HLA class I antibody defined as being Flow T+ but AHG crossmatch negative. These findings indicate that there may be a level of diagnostic sensitivity at which patients with HLA antibodies can be safely transplanted without the need to use IVIg and/or plasmapheresis.
We speculate that low levels of HLA antibody may induce graft accommodation rather than portend rejection. Alemtuzumab is a powerful lymphocyte depleting monoclonal antibody that is licensed for the treatment of lymphoreticular malignancy and is under evaluation in transplantation. This paper describes the 5-year follow-up of a study comprising alemtuzumab followed by low dose ciclosporin monotherapy. These patients have been compared to contemporary controls. The only selection criterion for alemtuzumab was the giving of informed consent. Cadaveric renal transplant recipients (n = 33) were given two doses of 20 mg alemtuzumab followed 48 hours later by low dose ciclosporin monotherapy. A cohort of patients (n = 66) transplanted in the same time period in the same centre and receiving conventional ciclosporin-based triple therapy (n = 61) or sirolimus, ciclosporin and prednisolone (n = 5) was selected for comparison. Patients receiving living donor or multi-organ transplants were excluded. Follow-up was complete in the alemtuzumab group, while 2 of the control group were lost to follow-up at varying intervals post-transplant; both had normal graft function at the time. One patient in each group died from PTLD. The other deaths were due to ischaemic heart disease (n = 2) and calciphylaxis (n = 1) in the alemtuzumab group, and to ischaemic heart disease (n = 6), cerebrovascular disease (n = 1), sepsis (n = 2), cerebral tumour (n = 1) and unknown causes (n = 1) in the control patients. Notable events in alemtuzumab-treated patients include one de novo autoimmune haemolytic anaemia and one ileocaecal TB infection. In spite of profound initial lymphocyte depletion in the alemtuzumab group, the total lymphocyte counts in both groups were similar by 6 months. Renal function remained similar in each group throughout. The incidence of acute rejection was also similar in both groups, but the time of the first acute rejection episode was much later in alemtuzumab-treated patients (medians 170 days vs 16 days). Alemtuzumab combined with low dose ciclosporin is a safe and effective regimen that is well tolerated and avoids steroids. Patients need to be followed up closely for late acute rejection. Purpose: The purpose of this study was to evaluate the impact of Campath-1H (Alemtuzumab, ILEX Oncology) therapy on outcomes of renal transplants which experience delayed graft function (DGF). Methods: Outcomes of 23 renal transplants with DGF treated with two doses of 30 mg of Campath-1H (day of transplant and day 1) were compared to outcomes of 195 renal transplants with DGF that received anti-CD25 antibody (n=127), Thymoglobulin (n=41), or other therapy (n=27: anti-thymocyte globulin, ATG, n=18; OKT3, n=4; no antibody, n=5). All patients received immunosuppression with a calcineurin inhibitor, mycophenolate mofetil, and steroids, and were transplanted between 12/97 and 5/03, with a minimum of 6 months' follow-up. Results: Demographic features of the groups were not different. There was no difference in the incidence of DGF according to antibody therapy, with an overall incidence of 15%. Between 95 and 100% of the grafts experiencing DGF were from deceased donors in all treatment groups. At three months, the incidence of rejection was 12.5% in the Campath-1H group, 35% in the anti-CD25 group, 36% in the Thymoglobulin group, and 61% in the "other" group. Log-rank analysis of the incidence of rejection between the groups over time, using the chi-square test, revealed a significant advantage of Campath-1H (p=0.0078).
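The log-rank comparison just described can be reproduced along the following lines. This is a minimal sketch using the lifelines package; the per-patient times and events below are hypothetical placeholders, since the abstract reports only group-level rejection rates:

```python
# Sketch of a multi-group log-rank test on time to first rejection, of the
# kind used above to compare induction groups. All per-patient values are
# hypothetical placeholders; the abstract reports only summary rates.
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

df = pd.DataFrame({
    "days":     [14, 90, 180, 21, 35, 60, 120, 7, 45, 150, 30, 75],
    "rejected": [1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0],  # 1 = rejection observed
    "group":    ["Campath", "Campath", "Campath",
                 "antiCD25", "antiCD25", "antiCD25",
                 "Thymo", "Thymo", "Thymo",
                 "other", "other", "other"],
})

result = multivariate_logrank_test(df["days"], df["group"], df["rejected"])
print(result.test_statistic, result.p_value)  # chi-square statistic and p-value
```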
Graft survival was also significantly better in the Campath-1H group, with no graft loss to date (p=0.012). There was no difference in patient survival, incidence of infection, or incidence of malignancy between the groups. Conclusions: Campath-1H provides a significant benefit compared to other antibody therapies used in transplantation when used in patients who receive renal transplants that experience DGF. DGF is associated with a high incidence of rejection under conventional immunosuppressive regimens, and the profound immune cell depletion associated with Campath-1H therapy reduces the risk of rejection to such a degree that graft survival is also improved. In an attempt to reduce long-term nephrotoxic calcineurin inhibitor dosage and totally eliminate maintenance corticosteroids, Campath-1H was used as induction therapy in first cadaver and non-HLA-identical living related donor renal transplantation. Methods. Forty-one de novo renal allograft recipients were treated with Campath-1H (0.3 mg/kg) on day 0 and day 4, preceded by a methylprednisolone bolus. Twenty-five were in another randomized trial and 16 were non-randomized and chose to be treated with Campath-1H. Maintenance 12-hour Tacrolimus trough levels of 5-7 ng/ml were operational from the outset, as well as a reduced mycophenolate mofetil (MMF) dosage of 500 mg twice daily; no corticosteroids were given after the first week postoperatively. There was at least 1 DR antigen donor/recipient compatibility. Primary outcome measures included delayed graft function, episodes of rejection, complications, incidence of opportunistic infections, and graft and patient survival after a median follow-up of 8 months (range 1-18 months). Results. Patient and graft survival are both 100%. Biopsy-proven rejection was diagnosed in 4 (10%) patients. One patient experienced an acute humoral rejection that was reversed with plasmapheresis and intravenous immunoglobulin. Two patients developed infections that required hospitalization. The geometric mean/S.E. serum creatinine concentrations at 1, 6, and 12 months are 1.53/1.07, 1.39/1.08, and 1.45/1.12 mg/dl, respectively. Four patients had delayed graft function and only one patient required dialysis. There were minimal side-effects related to Campath-1H infusion. Thirty-six (88%) patients have totally avoided maintenance corticosteroids. Conclusions. The combination of Campath-1H with low dose tacrolimus and MMF with avoidance of maintenance steroids appears to be safe and effective for kidney transplant recipients. Campath-1H as induction therapy appears to allow for the safe avoidance of maintenance steroids in renal transplantation. Longer term outcomes will be updated. Background: T lymphocytes are conventionally known to mediate acute rejection in renal transplants, but the role of monocytes in acute rejection has not been well understood. Since Campath-1H, a humanized antibody against CD52, can more profoundly deplete T lymphocytes than monocytes, a study in renal recipients with Campath-1H treatment may uncover effects of monocytes contributing to acute rejection. Methods: A total of 27 renal recipients received induction therapy with 30 mg IV Campath-1H preoperatively. The patients were subsequently treated with maintenance therapy of tacrolimus or mycophenolate mofetil. In addition, all patients received 20 mg of prednisone after operation, which was weaned off at a rate of 2.5 mg/week. All patients were followed with weekly labs.
Renal transplants were biopsied at 2 weeks, 3 months and 6 months, or as needed. Special stains in renal biopsies were done using a Dako Autostainer. Results: Lab data showed that Campath-1H resulted in a dramatic depletion of both lymphocytes and monocytes. Over 7 months, only one clinical plus biopsy-proven acute rejection (Banff Ib) was identified, in a recipient 2 weeks after receiving a living non-related transplant. Inflammatory cells during rejection were composed of 95% CD68-positive monocytes and 5% CD3-positive T lymphocytes. Two additional acute rejections (Ia) were seen in protocol renal biopsies, in the absence of elevated creatinine, 3 or 5 months after transplantation. Eighty percent of inflammatory cells were CD68-positive monocytes, but only 20% of inflammatory cells were CD3-positive. Negative CD34 and CD1a stains for inflammatory cells suggest no involvement of CD34-derived dendritic cells in the acute rejection. Interestingly, all renal transplant biopsies (22/22) showed positive staining for CD3 in renal tubules but were CD3-negative in glomeruli. Renal tubules were markedly positive for CD68 (11/12), while focal mesangial cells were positive for CD68. Our data indicate that the current treatment regimen resulted in a low rate of acute rejection, which was predominantly mediated via monocytes. Findings of positive staining in renal tubules for CD3 and CD68 suggest that renal tubules share similar antigens with T lymphocytes and monocytes, which may explain why acute rejection mainly targets renal tubules rather than glomeruli. Introduction: Several independent studies have shown the safety and efficacy of antibody induction in renal transplantation using several different maintenance immunosuppressive protocols. This study describes an 18-month follow-up of a randomized study comparing the cytoablative effect of Campath-1H® (C), Thymoglobulin® (T) and Zenapax® (Z) on peripheral blood and iliac crest bone marrow of kidney allograft recipients. Methods: In group C, half maintenance dosages of tacrolimus (tacro) and mycophenolate (MMF), but no corticosteroid, were given. In groups T and Z, higher maintenance doses of tacro and MMF, as well as steroids, were given. Depletion of discrete mononuclear cell phenotypes in peripheral blood and iliac crest bone marrow samples taken sequentially post-transplant at 10, 60, and 200 days was compared. The absolute number of mononuclear cells in peripheral blood or iliac crest bone marrow aspirates per mm³ and the CD3, 19, 56/16 (NK cells), 25, 14 (monocytes) and 34 (stem cells) phenotypes in flow cytometry were calculated. In peripheral blood, when compared to normal values, in the C group there was depletion of virtually all lymphocytes: T cells by ∼100%, B cells by ∼98%, NK cells by ∼95%, monocytes by ∼83% and stem cells by ∼40%, with an earlier recovery (in descending order) of stem cells, B cells, monocytes, and NK cells, and later (at >6 months) of T cells. In contrast, T depleted T cells (∼99%) and NK cells (∼98%), but B cells and monocytes were only moderately affected (∼20% and ∼45% respectively). Z depleted all CD25-positive cells, with early modest depletion of T cells, B cells and NK cells by ∼50%, ∼72% and ∼62% respectively, and with no effect seen on monocytes. In iliac crest marrow at 10 days, C also strongly depleted lymphocyte CD3+, CD19+, CD56/16+, and CD14+ counts by ∼99%, ∼98%, ∼96%, and ∼68% respectively. However, virtually normal numbers of CD34+ cells were present.
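The depletion figures quoted in this abstract are reductions relative to normal counts. A minimal sketch of that arithmetic; the cell counts below are hypothetical illustrations, not study data:

```python
# Percent depletion of mononuclear cell subsets relative to normal values,
# as quoted above. All counts (cells/mm^3) are hypothetical illustrations.
def percent_depletion(observed: float, normal: float) -> float:
    """Reduction from the normal count, expressed as a percentage."""
    return (1.0 - observed / normal) * 100.0

normal_counts = {"CD3": 1400.0, "CD19": 250.0, "CD56/16": 300.0, "CD14": 500.0}
day10_counts = {"CD3": 14.0, "CD19": 6.0, "CD56/16": 15.0, "CD14": 160.0}

for subset, normal in normal_counts.items():
    pct = percent_depletion(day10_counts[subset], normal)
    print(f"{subset}: ~{pct:.0f}% depleted")
```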
At 200 days post-transplant, an increased number of B cells (double the normal values) was seen. The early depletion of affected subsets by T in this compartment was 50% that of C, with an earlier return to normal ranges in all phenotypes, while with Z there was only a minimal effect. Conclusion: C is the more potent lymphoablative induction agent (followed by T), allowing in most cases for early and total avoidance of corticosteroids and lower doses of maintenance immunosuppressive agents. Background: Campath 1H is a humanized anti-CD52 monoclonal antibody that has emerged as a safe and effective lymphocyte depleting antibody for induction therapy in renal transplantation. Peripheral T cell depletion after a single dose of 30 mg of Campath 1H usually lasts for 2-5 months. Recent reports by Kirk et al (Transplantation) have suggested that rejection of renal allografts in pts receiving Campath 1H induction may be mediated by an atypical population of monocytes, and not through "classical" T cell dependent pathways of allorecognition. We hypothesized that renal allograft rejection in pts receiving Campath 1H induction would be qualitatively and quantitatively comparable to rejection observed in pts receiving other forms of antibody induction. Methods: 9 biopsies from pts treated with a single dose (30 mg) of Campath 1H at the time of transplant who subsequently developed acute cellular rejection (ACR) were studied with the following cell markers: CD3 (T cell), CD68 (macrophage), and CD20 (B cell). 4 biopsies from pts treated with an IL2 receptor antagonist as induction and with ACR were used as controls. Both groups received the same maintenance immunosuppression (Tacrolimus + Mycophenolate Mofetil without the use of chronic steroids). Results: See table: immunophenotypic analysis and quantification of cellular infiltrate in kidney biopsies (% of CD3+ T, CD20+ B, and CD68+ cells). After elucidating the mechanisms of alloengraftment, we proposed a strategy of tolerogenic immunosuppression that facilitates these mechanisms. Two principles were applied: recipient pretreatment and the minimalistic use of post-transplant immunosuppression. In an earlier series, kidney recipients were preconditioned with Thymoglobulin and given post-transplant monotherapy with tacrolimus. Between November 2002 and September 2003, we substituted 30 mg Campath 1-H for Thymoglobulin in 96 adult kidney recipients (mean age 50.3 ± 16.4 years). 13 (13.5%) were undergoing retransplantation, and 20 (20.8%) had a PRA >20%. Mean HLA mismatches were 3.7 ± 1.6. There were 59 (61.5%) cadaver donors (mean cold ischemia 20.8 ± 7.2 hours) and 37 (38.5%) live donors. Mean donor age was 41.4 ± 15.9 years (range 1-72). Post-transplant immunosuppression was tacrolimus monotherapy (target levels 10 ng/ml), with dose spacing to every other day or longer at various times after about four months. With a mean follow-up of 4.8 ± 2.3 months, the six-month patient and graft survival rates are 100%, with a mean serum creatinine of 1.71 ± 0.99 mg/dl. Only one patient had acute rejection (steroid-responsive) prior to weaning. Weaning was started in 20 patients. One developed a readily reversed rejection; the other 19 are on every-other-day tacrolimus. The incidence of CMV disease, diabetes, and PTLD was 0%. One patient had a BK virus infection.
Although the follow-ups in these cases are still short, we conclude that the use of Campath 1-H for pretreatment has significantly improved the efficacy of our tolerogenic regimen of immunosuppression. Background: Campath-1H has been shown to cause profound depletion of T lymphocytes. A high incidence of early, monocyte-predominant acute rejection has been reported when Campath was used without maintenance immunosuppression in renal transplantation. Recent studies also suggest that steroids can produce apoptosis of monocytes. We postulate that intra-operative Campath induction coupled with minimal immunosuppression and short-term steroid therapy would result in low post-transplant rejection rates. Methods: All cadaveric renal transplants after April 2003 were performed with the following protocol: perioperatively, Solumedrol (1 gm) followed by Campath-1H (30 mg) was given before reperfusion of the kidney. Subsequent immunosuppression was either Tacrolimus (aiming for a trough level of 10 ng/ml) or MMF 1 gram bid (MMF was used for expanded donors, non-heart-beating donors, and if cold ischemia time was greater than 24 hrs). All patients in addition received 20 mg of prednisone from day one post-transplant, which was then weaned off at a rate of 2.5 mg/week. All patients were placed on CMV prophylaxis with oral Valcyte 450 mg, dose adjusted according to creatinine clearance. All patients were followed with weekly labs. Renal biopsies were performed as clinically indicated, as well as within 2 weeks, 3 months and 6 months post-transplant. Results: 20 cadaveric renal transplants were performed, with a follow-up of 40-240 days. Patient and graft survival has been 100%. Mean creatinine is 1.36 mg/dl. Absolute lymphopenia (<1000) was noted in all 20 patients. The median time to development of lymphopenia was 1 day postoperative. The range of nadir lymphocyte count was 0 to 140. To date no patient has returned to an absolute lymphocyte count greater than 1000 (longest follow-up 240 days). No episode of CMV antigenemia has been detected. Only 1 episode of 1A rejection has occurred (detected by protocol biopsy); the rejection was predominantly monocytic (70%). At present, 11 patients are on Tacrolimus monotherapy and 9 patients are on MMF monotherapy. Conclusion: Campath induction followed by rapid steroid taper and monotherapy maintenance immunosuppression has been found by our center to be associated with excellent short-term renal function and only a 5% incidence of acute rejection. Longer-term follow-up is still required to confirm our initial results. Vathsala, 1 Enrique T. Ona, 2 Si-Yen Tan, 3 Shirley Suresh, 4 Yiong-Huak Chan, 4 Huei-Xin Lou, 1 Concesca B. Cabanayan Casasola, 2 Jorgen Seldrup, 4 Roy Calne. 5 1 Renal Medicine, Singapore General Hospital, Singapore; 2 National Kidney Transplant Institute, Quezon City, Philippines; 3 Nephrology, University Hospital, Kuala Lumpur, Malaysia; 4 Clinical Trials and Epidemiology Research Unit, Singapore; 5 Surgery, National University of Singapore, Singapore. CAMPATH-1H, a humanised anti-CD52 monoclonal antibody, is a powerful lytic agent for both T- and B-lymphocytes. A pilot, multicentre randomised controlled trial was initiated in October 2001 to evaluate the safety and efficacy of low-dose Cyclosporine (CsA) monotherapy following lymphocyte depletion with CAMPATH in preventing rejection and maintaining renal function in renal transplant recipients (RTX). RTX were randomised to receive either CAMPATH or standard CsA immunosuppression.
In the Campath group, CAMPATH (iv 20 mg) was administered within 6 hours after anastomosis and again 24 hours later; CsA was started at 72 hours after the first CAMPATH dose to achieve trough levels of 90-110 ng/mL; steroids were only administered at surgery and pre-CAMPATH infusion. In the standard group, CsA doses were adjusted to achieve trough levels of 180-225 ng/mL, with Azathioprine and steroids. The 6-month analysis after recruitment of 30 RTX (50% males; 36.7% Chinese, 46.7% Filipino, 16.6% others; age 39.9±10.8 yrs) is shown (Table). 10 of 14 cadaveric and 10 of 16 live-donor RTX were randomised to CAMPATH. Lymphocyte depletion was profound after 2 doses of CAMPATH (% lymphocytes: 20.6±9.3 pre vs 0.5±0.4 post, P<0.001) and persisted until 5 months post-TX. Likewise, lymphocyte subsets (CD3, CD4, CD8, CD44, CD52) remained depleted to 5-6 months post-TX. These results suggest that CAMPATH is an effective induction agent that permits low dose, steroid-free immunosuppression in RTX. Follow-up to 3 years, as per protocol, will determine the long term safety and efficacy of this regimen. Administration of immature dendritic cells (iDC) that fail to deliver costimulation prolongs allograft survival by promoting apoptotic death of activated T cells, but does not achieve long-term graft survival. In this study we attempted to boost iDC efficacy through adjuvant subdose immunosuppression. DC propagated from B10 (H-2b) BM were transfected with an NF-κB binding site-specific oligodeoxyribonucleotide (ODN), resulting in completely blocked NF-κB activity (gel shift assay) and inhibition of CD80, CD86 and CD40 expression. The in vitro allostimulatory activity of ODN DC was markedly impaired in MLR and CTL assays. To examine in vivo allostimulatory activity, 2 x 10^6 ODN DC were injected i.v. into C3H (H-2k) recipients 7 days before B10 cardiac transplant. ODN DC significantly prolonged allograft survival (MST 25d, vs. 10d in the non-treated control and 6d in the mature (m)DC groups, both p<0.05). In contrast to cyclosporine, which failed to enhance the effect of ODN DC, combination of ODN DC with subdose sirolimus (6 mg/kg/d for 3 days post-transplant) prolonged MST to 43.5d (sirolimus alone, MST 25.7d; p<0.05), while combination with sirolimus at 6 mg/kg/d for 6 days resulted in long-term survival of all allografts (>100d) (n=6 in all groups). CTL activity of either graft-infiltrating cells or splenic T cells (d7 post-transplant) in the sirolimus combined-treatment group was significantly lower than in the control or single-treatment groups. More TUNEL-positive cells were identified in T cell areas of mesenteric lymph nodes in the combined-treatment group than in the control groups (p<0.05), suggesting that sirolimus promotes activated T cell apoptosis. We utilized the Cellomics® system to analyze the influence of sirolimus on activation of transcription factors in T cells stimulated with DC. ODN DC inhibited nuclear translocation of Stat1, Stat3, ERK and ATF-2, but not NF-κB and p38, compared with mDC. The selective inhibition of Stat1, ERK and ATF-2 signal transduction can be enhanced by sirolimus, but not cyclosporine. Stat1 has been shown to be important for cell-mediated immunity. ERK and p38, subgroups of the MAP kinases, regulate cellular proliferation, apoptosis, survival and differentiation. ATF-2 is involved in a feedback mechanism that regulates MAPK signaling. These transcription factors may be useful targets in the development of new immunosuppressive agents.
Sirolimus, but not cyclosporine, can enhance iDC tolerogenicity. Post-transplant lymphoproliferative disorder (PTLD) is a serious complication of solid organ and bone marrow transplantation that is closely associated with Epstein-Barr virus (EBV) infection. In healthy individuals the expansion of EBV-infected B cells is controlled by EBV-specific CD8+ T lymphocytes. However, impaired T cell immunity as a result of immunosuppression places transplant recipients at increased risk for EBV+ B cell lymphomas. Our lab has previously shown that rapamycin (RAPA), unlike other immunosuppressive therapies, directly inhibits the in vitro proliferation of EBV-infected B cell lines (SLCL) derived from patients with PTLD by arresting cells in the G1 phase of the cell cycle. Moreover, we demonstrated that RAPA significantly inhibits B cell lymphoma growth in vivo in a xenogeneic SCID mouse model of PTLD. To determine the mechanism by which RAPA induces cell cycle arrest in EBV-infected B cells, we analyzed G1-associated cell cycle proteins in SLCL cultured in the absence or presence of RAPA (10 ng/ml). Western blot and densitometric analysis revealed that RAPA significantly decreases cyclin D3 protein levels (range=71-78%) in 3/3 SLCL and cyclin D2 protein levels (range=29-63%) in 2/3 SLCL, while cyclin E levels remain unchanged. The activation and association of cyclins with cyclin-dependent kinases (cdk) is essential for the G1/S transition and is regulated by cdk inhibitors. Accordingly, RAPA decreased the protein levels of cdk4 (range=17-56%) and significantly increased the expression of the cdk inhibitor p27 (range=65-88%). In contrast, expression of another cdk inhibitor, p21, was markedly inhibited by RAPA (range=58-74%) in 2/3 SLCL. p21 is a complex regulatory protein that can inhibit cdk4 and cdk6, but is also necessary for proper cyclin D/cdk interaction. These results suggest that p27 is the key inhibitor in the G1 cell cycle arrest in EBV-infected B cells, while p21 promotes the cyclin D/cdk interactions which drive cellular proliferation. Thus, RAPA modulates multiple proteins associated with progression through the G1 phase of the cell cycle, resulting in inhibition of cell proliferation. These findings provide insight into the growth pathways of the EBV+ B cell lymphomas and demonstrate the potential for RAPA treatment to prevent PTLD and other EBV+ lymphomas. Introduction: Rapamycin (RAPA) has recently been shown to have antiangiogenic effects related to its remarkable anticancer activity. Conversely, cyclosporine (CsA) use is correlated with tumor progression. Here, we tested whether the procancer and proangiogenic properties of CsA could be blocked by simultaneous use of RAPA. Methods: Tumor-transplant model: Syngeneic B16 melanoma cells were injected s.c. into C57BL/6 mice on day 0 (d0). On d7, treatment was initiated with 1.5 mg/kg/d RAPA or 40 mg/kg/d CsA, either as mono- or combination therapy (n=4-6/group). Also on d7, allogeneic, heterotopic, C3H mouse heart Tx (HTx) was done in specified experimental groups. In vitro angiogenesis model: Angiogenesis was tested in the presence of CsA (100 ng/ml), RAPA (5 ng/ml), or the combination, using an aortic-ring assay. Wistar rat aortic sections (1-2 mm) were cultured on matrigel-coated wells; vascular sprouting area was measured as a "% of control" after 4d. Results: Untreated C57BL/6 mice with B16 tumors (±HTx) had to be euthanized within 2-3 weeks due to cancer complications.
All HTx in untreated mice were rejected (±tumor) by d11, whereas RAPA and CsA prolonged HTx survival to 75% and 50% at d28 (experimental endpoint), respectively. In tumor-HTx mice, RAPA improved animal survival to 83% at d28 (no controls survived) by reducing tumor growth (d10: control tumor vol.=542±74 mm³ vs. RAPA=192±25 mm³); all surviving mice had a beating HTx at d28, and tumors remained small (574±110 mm³). In CsA-treated tumor-bearing mice, tumor growth was accelerated (d10 vol.=1391±185 mm³), and mice were sacrificed due to tumor effects before HTx rejection. Importantly, when both CsA and RAPA were used, tumor growth was not promoted by CsA; in fact, RAPA exerted its full anticancer effect, with beating allografts on d28. In vitro, CsA promoted (572%), while RAPA inhibited (31%), aortic-ring sprouting vs. controls (100%), and the CsA effect was abrogated in the presence of RAPA (27%). Conclusions: (1) RAPA in a tumor-HTx situation protects the HTx and simultaneously inhibits tumor growth; (2) CsA, in contrast, promotes tumor growth, but this effect is blocked by concurrent use of RAPA; and (3) consistent with these data, the in vitro proangiogenic effects of CsA are blocked by RAPA. Therefore, when a transplant patient has cancer, concurrent use of RAPA may negate tumor progression attributed to CsA-based immunosuppression. We have found that VEGF is expressed in association with chronic rejection and that blockade of VEGF attenuates the development of graft vascular disease (GVD) in a murine cardiac transplant model. Rapa, an immunosuppressive agent, has recently been reported to inhibit VEGF mRNA expression and to have anti-angiogenic effects. We propose that this function of Rapa is important following transplantation, but little is known about the effect of other immunosuppressants (I/S) on VEGF expression and function. Here, we first evaluated the effect of Rapa (1-10 ng/ml), CsA (0.1-1 mcg/ml) or MMF (5-10 mcg/ml) on VEGF transcription in human endothelial cells. Total RNA was extracted at 16 hours and real-time PCR was performed for VEGF mRNA expression. As expected, Rapa-treated cells showed a consistent decrease in VEGF mRNA transcription, whereas treatment with CsA, in general, resulted in an increase in VEGF transcription. In contrast, MMF had minimal or no detectable effect on VEGF mRNA expression. To next evaluate the effect of I/S on VEGF function, EC were stimulated with VEGF (5-10 ng/ml) in the absence or presence of Rapa (1-10 ng/ml), CsA (0.1-1 mcg/ml) or MMF (5-10 mcg/ml). VEGF alone enhanced endothelial cell proliferation, and both Rapa and MMF inhibited VEGF-induced EC proliferation. In contrast, CsA consistently increased VEGF-induced EC proliferation in a dose-dependent manner. The inhibitory effect of Rapa on EC proliferation was less pronounced in the presence of higher concentrations of VEGF (20 ng/ml), whereas MMF inhibited proliferation at all concentrations. This suggests that MMF may be more potent than Rapa at inhibiting post-VEGF-receptor signaling. To further evaluate this effect of MMF, we next evaluated its effect on VEGF-induced expression of EC adhesion molecules. By RNase protection, we found that MMF markedly reduced the expression of EC ICAM-1 and VCAM-1 mRNA, and that the addition of VEGF to MMF-treated cells led to a partial restoration of adhesion molecule transcription. Together, these findings suggest that both MMF and Rapa have anti-VEGF properties, but appear to exert their major effects through different mechanisms.
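Real-time PCR comparisons like the VEGF measurements above are commonly quantified as a relative fold change by the 2^(-ΔΔCt) method. The abstract does not state its quantification approach, so the sketch below, with hypothetical Ct values and a hypothetical housekeeping gene, is illustrative only:

```python
# Illustrative 2^(-ddCt) fold-change calculation for VEGF mRNA, normalized to
# a housekeeping gene. The abstract does not specify its quantification
# method; the method and all Ct values here are assumptions for illustration.
def fold_change(ct_target_treated, ct_ref_treated, ct_target_ctrl, ct_ref_ctrl):
    d_ct_treated = ct_target_treated - ct_ref_treated  # normalize treated sample
    d_ct_control = ct_target_ctrl - ct_ref_ctrl        # normalize control sample
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# e.g. hypothetical Rapa-treated EC: VEGF Ct 26.5 vs control 24.8, GAPDH Ct 18.0
print(f"fold change = {fold_change(26.5, 18.0, 24.8, 18.0):.2f}")  # <1 = decreased
```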
We suggest that the use of rapamycin and MMF in combination as VEGF antagonists, in the absence of CsA, may be therapeutic in attenuating the development of chronic rejection. Innate immunity has a pivotal role in adaptive immune responses. Heat shock proteins (HSPs) activate innate toll receptor signal pathways, leading to inflammatory cytokine production and myeloid dendritic cell (mDC) activation and maturation. Preventing innate immune responses facilitates tolerance induction. Concomitant with promoting allograft tolerance, brief therapy with DSG curbs early proinflammatory cytokines and mDC maturation, and diminishes ischemia-reperfusion injury, suggesting an inhibitory effect on innate immunity. Here we show that DSG blocks innate immune signaling pathways involving HSP70, MyD88, TRAF2, and NF-κB2 in mDC. Rhesus mDC were prepared from monocytes by culture in GM-CSF and IL-4. TNFα was added to activate mDC, with or without DSG. Protein expression was detected by western blotting and normalized to tubulin expression. DSG exerted a dose-dependent inhibition of nuclear translocation of RelB, with significant reduction at 5 mg/ml. In contrast, no suppression was observed for RelA or c-rel, suggesting a selective action of DSG on NF-κB2, a transcription factor critical for mDC. Consistent with this finding, expression and nuclear translocation were 50% inhibited for the NF-κB2 partners p52 and p100, but not for the NF-κB1 partners p50 and p105. DSG also blocked expression and nuclear translocation of HSP70 in response to TNFα, and reduced MyD88 expression by 60%, independent of TNFα activation. Expression of TNF family members was differentially targeted by DSG: anti-apoptotic TRAF2 was inhibited, TRAF6 was unaffected, and the pro-apoptotic TNF death receptor 4 (DR4) was increased. Addition of the DR4 ligand, TRAIL, to DSG-treated, TNFα-activated mDC induced 88% apoptosis, confirming a pro-apoptotic effect of DSG on mDC. Immunoprecipitation studies showed MyD88 and TRAF2 to be complexed in mDC, consistent with the linked inhibitory effect of DSG on MyD88 and TRAF2 expression. These data provide new insights into the molecular action of DSG in downregulating the innate immune pathway in primate mDC. DSG disrupts mDC maturation by inhibiting NF-κB2 and down-regulating expression of MyD88 and anti-apoptotic TRAF2, while up-regulating expression of pro-apoptotic DR4. Since apoptotic DC are tolerogenic, the inhibitory action of DSG on innate immune signal pathways in mDC offers a plausible explanation for the synergistic effect of DSG in tolerance induction and a rational basis for its application. Alla Gomer, 1 Peter S. Heeger. 1 1 Immunology, The Cleveland Clinic Foundation, Cleveland, OH. It has previously been reported that memory alloreactive T cells prevent the graft-prolonging effects of costimulatory blockade. One strategy to overcome this problem and control memory T cells is to inhibit their infiltration into the graft. FTY720 prolongs survival of solid organ allografts through retention of lymphocytes in secondary lymphoid organs without affecting their priming. The effect of FTY720 on the trafficking and functions of alloreactive memory T cells has not been previously addressed and was the focus of these studies. We first tested how FTY720 affects migration of memory T cells, using the ability to rapidly produce IFNg after in vitro restimulation as a marker for memory and not naïve T cells.
Four weeks after placement of C3H skin grafts onto B6 recipients, the animals were either fed FTY720 (3 mg/kg/day) or given water as a control. Four days later, the total numbers of donor-reactive IFNg-producing cells in the LNs and spleen were determined by a short-term recall ELISPOT assay. FTY720 treatment resulted in a dramatic increase in the absolute number of C3H-specific T cells in the lymph nodes (7,324±923 vs 867±302 in the control group) and also facilitated the retention of C3H-specific T cells in the spleen (24,922±498 vs 15,248±883). We next tested whether the addition of FTY720 to costimulatory blockade restored the ability to prolong skin graft survival in primed recipients. B6 recipients were primed to C3H alloantigens through the rejection of C3H skin allografts. Six weeks later, all recipients were given C3H donor-specific transfusion plus anti-CD154 mAb MR1 (DST/MR1) treatment, followed by placement of second C3H skin allografts. One group of recipients was fed FTY720 (3 mg/kg/day on days -2 through 4), and the control group was given water. The control sensitized recipients (given water) rejected the skin grafts with an MST of 10.2±0.2 d despite DST/MR1. Combined treatment with FTY720 and DST/MR1 resulted in a modest but statistically significant prolongation of skin graft survival (MST of 13.4±0.6 d). Notably, this delay of graft rejection was comparable to the duration of treatment with FTY720 (4 days post-transplant). These findings demonstrate that FTY720 results in relocation of alloreactive memory T cells (along with naïve T cells) to the secondary lymphoid organs and, for the duration of therapy, prevents the memory cells from infiltrating the graft and mediating rejection. As human T cell repertoires contain memory alloreactive T cells, this finding may have important therapeutic implications in transplant recipients. Introduction: The immunosuppressive agent FTY720 prolongs allograft survival by promoting lymphocyte sequestration in secondary lymphoid tissues. Its active metabolite FTY720-phosphate (FTY720P) is a structural homologue of the blood-borne bioactive lipid mediator sphingosine-1-phosphate (S1P), which regulates cell migration, proliferation, and survival. G-protein-coupled S1P receptors have been identified as the molecular targets of S1P and its synthetic analogue FTY720P. We hypothesized that the S1P receptor agonist FTY720 might regulate dendritic cell (DC) trafficking and function via the S1P receptor pathway. Methods: Immature murine (C57BL/10) bone marrow (BM)-derived myeloid DC (BMDC) were generated by 7-day culture in rmGM-CSF and IL-4. Blood and splenic DC were isolated from mice mobilized with the DC poietin fms-like tyrosine kinase 3 ligand (Amgen; 10 µg/day for 10 days). Immunobead-purified CD11c+ DC (>90% purity) were then exposed to FTY720P (1 µM, Novartis, Basel, Switzerland) for various times. S1P receptor and adhesion molecule expression were determined by RT-PCR and flow cytometry, respectively. An in vitro chemotaxis assay was used to measure the DC migratory response to S1P. Results: Using RT-PCR, we demonstrate for the first time the differential expression of all 5 S1P receptor subtypes (S1P1-5) by murine blood and splenic DC. S1P5 was not expressed by BMDC. DC (blood, splenic and BMDC) matured in vitro by 24 h exposure to LPS (1 µg/ml) showed increased S1P receptor expression (S1P1-5) compared with immature DC. This correlated with migration of splenic DC to S1P in an in vitro chemotaxis assay.
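Transwell chemotaxis results such as the S1P migration data above are often summarized as a chemotactic index relative to medium alone; that convention, and the counts below, are assumptions for illustration rather than details taken from the abstract:

```python
# Chemotactic index: cells migrating toward the chemoattractant divided by
# cells migrating toward medium alone. This summary statistic is a common
# convention assumed here; the abstract does not report raw migration counts.
def chemotactic_index(toward_attractant: float, toward_medium: float) -> float:
    return toward_attractant / toward_medium

# e.g. hypothetical transwell counts for LPS-matured splenic DC migrating to S1P
print(f"CI = {chemotactic_index(4200, 1050):.1f}")  # CI > 1 = directed migration
```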
In vivo treatment of mice with an immunosuppressive dose of FTY720 (0.35 mg/kg) i.p. for 24 h reduced CD11c+ DC in the spleen but not in the blood. Flow cytometric analysis revealed that treatment of BMDC with FTY720P (4 h, 24 h) downregulated expression of CD11b and CD31/PECAM, surface adhesion molecules required for transmigration from blood into tissue. Further, CD54/ICAM-1 and CD44, costimulatory molecules important during DC-T cell interaction, were also downregulated after FTY720P treatment. Conclusion: In contrast to its well-recognized capacity to deplete circulating lymphocytes, the immunomodulator FTY720 reduced CD11c+ DC in the spleen, but not in the circulation. Our data suggest that FTY720 may regulate DC trafficking via modulation of surface intercellular adhesion molecule expression, a mechanism that may contribute to its immunosuppressive effects. Introduction: FTY720 has a unique immunomodulatory mechanism that targets migration of lymphocytes via sphingosine-1-phosphate receptors (S1P-R) and is highly effective in animal models of organ transplantation and autoimmunity. NIBR713 is a novel FTY720 analogue which displays increased selectivity for S1P receptors by lacking S1P-3 activity. Reduction in S1P-3 activity may potentially result in an improved side-effect profile for this drug class. Method: NIBR713 was selected because of its potent S1P-1 activity, but lack of S1P-3 activity, in [35S]GTPγS binding assays. NIBR713 was compared with FTY720 to determine its effects on peripheral lymphocyte depletion in rats and monkeys, and its efficacy in the DA-to-LEW heterotopic heart allotransplantation model in combination with everolimus. Results: NIBR713 depleted peripheral blood lymphocytes in both rats and cynomolgus monkeys with similar potency and efficacy but a longer duration of action than FTY720. In rats, NIBR713 had an ED50 of 0.2 mg/kg (FTY720: 0.1 mg/kg) and a median plasma half-life of 6d (FTY720: 1.5d). Remarkably, in cynomolgus monkeys one oral dose of NIBR713 at 1 mg/kg reversibly depleted blood lymphocytes for >35d, while the effect of FTY720 at the same dose lasted 5-7d. The median plasma half-life of NIBR713 in monkeys was 16d versus 4d for FTY720. Blood level data on NIBR713 and its phosphate demonstrated almost complete in vivo phosphorylation and a close correlation between phosphate concentrations and the degree of peripheral lymphocyte depletion (PK/PD correlation). In transplantation, the combination of NIBR713 with everolimus was equipotent and equi-efficacious compared to FTY720 in preventing rat heart allograft rejection: NIBR713 at 0.1 mg/kg/d in combination with everolimus at 0.3 mg/kg/d led to graft prolongation of >28 days in all animals treated. Histological and early toxicological examinations indicated good tolerability of NIBR713. Conclusion: NIBR713 is a novel S1P receptor agonist lacking S1P-3 activity that demonstrated equivalent efficacy in a rat heart allotransplantation model and a prolonged duration of action compared with FTY720. This study demonstrates for the first time that the lack of S1P-3 activity does not compromise the immunomodulatory potency of FTY720 analogues. THE NOVEL IMMUNOMODULATOR FTY720 DESENSITIZES AND DISRUPTS ENDOGENOUS S1P RECEPTOR SIGNALING PATHWAYS IN VITRO. Danilo Guerini, 1 Rao Movva, 1 Thi-Thanh-Thao Tran, 1 Christoph Hangartner. 1 1 Transplantation & Immunology, Novartis Institutes for Biomedical Research, Switzerland.
Introduction: FTY720 is a novel immunomodulator whose mode of action differs from that of other current drugs in transplantation. FTY720 was efficacious in allograft protection in Phase II trials of kidney transplantation in humans. FTY720 prevents the egress of lymphocytes from secondary lymphoid organs. A working hypothesis is that FTY720 is phosphorylated in vivo, becoming a potent agonist at four sphingosine-1-phosphate receptors (S1P1, S1P3, S1P4, S1P5), which belong to the family of G-protein-coupled receptors (GPCRs). Some members of this protein family are removed from the cell surface upon activation and transported into intracellular compartments (internalization). This study was designed to evaluate receptor internalization by FTY720 and some close analogues. Methods: We generated stable cell lines (using CHO and HeLa cells) expressing myc-tagged S1P1, S1P3 and S1P4 receptors, and we developed a method to quantify the agonist-mediated internalization process. Results: Phosphorylated FTY720 caused a complete (>95%) and long-lasting (>3 h) disappearance of S1P1 from the surface of the cells, in contrast to equimolar sphingosine-1-phosphate (S1P), which had a partial (40-50%) and reversible effect (reversed in 1-2 h). S1P, however, promoted the complete (>90%) internalization of S1P3, while after treatment with phosphorylated FTY720 nearly half (40-60%) of S1P3 was still present on the cell surface. S1P and phosphorylated FTY720 caused only moderate (35-45%) and transient internalization of S1P4. Since a derivative that cannot be phosphorylated did not promote internalization, our data support the idea that internalization requires phosphorylated FTY720. Internalization mediated by phosphorylated FTY720 at S1P1 was invariably followed by a loss of responsiveness toward agonists that lasted up to 3 hours. In contrast, the complete internalization of S1P3 mediated by sphingosine-1-phosphate was transient, and full surface expression was restored after around 1 h. Conclusions: These results show for the first time that phosphorylated FTY720 promotes the post-transcriptional reorganization of S1P receptors. The long-lasting receptor internalization of S1P1 (the most widely expressed receptor of this family) by phosphorylated FTY720 is expected to dramatically affect signaling by the endogenous agonist S1P. We propose that the agonist-dependent internalization of S1P receptors is an important molecular component of the action of FTY720 in vivo. 3 Keith Rolles, 2 Luciano de Carlis, 1 Andrew K. Burroughs, 2 Giovambattista Pinzello. 1 Ospedale Niguarda, Milan, Italy; 2 Royal Free Hospital, London, United Kingdom; 3 Histopathology Department, Royal Free Hospital, London, United Kingdom. Background: A worse outcome in HCV-positive recipients, with faster progression of recurrent disease to overt cirrhosis, has been reported in recent years, and increased donor age and stronger immunosuppression have been indicated as major contributors to these worse results (M. Berenguer, Hepatology 2002;36:202-210). We have reviewed the experience of two European Centers, with particular attention to histologic disease progression over the last 13 years. Methods: A retrospective analysis was conducted on 404 HCV-positive recipients transplanted between January 1990 and December 2002 (216 pts in Milan and 188 in London). Protocol liver biopsies were available at regular intervals for all patients. Different immunosuppressive protocols were used in the two Centers and over time.
Thirty patients transplanted after 1992 in Milan were treated with combined antiviral therapy (interferon + ribavirin). Results: Overall survival progressively improved over the last 13 years (Table 1; p=0.02). For the disease progression study, the histological outcome was assessed for a total of 327 patients with at least 6 months of follow-up (168 pts in Milan and 159 in London). A total of 960 protocol liver biopsies were reviewed, corresponding to a median of 3 biopsies per patient (range 1-10). The cumulative probabilities of developing severe fibrosis (Ishak ≥ 4) over the last 13 years are reported in Table 2. Finally, severe fibrosis progression was observed in 12 of 39 (30%) and 52 of 288 (18%) pts receiving a graft from donors older and younger than 60 yrs, respectively (p=0.06). Conclusions: In our experience, patient and graft survival in HCV patients improved over the last 13 years despite the increasing number of older donors. The recent adoption of lighter immunosuppressive protocols and the selective use of antivirals may explain these favorable results. Background: Hepatitis C virus (HCV)-related liver failure is the leading indication for liver transplantation worldwide. Post-transplantation, virological recurrence is the rule, but the spectrum of histologic injury is wide, ranging from the development of allograft cirrhosis within a few years to minimal hepatitis despite long-term follow-up. The immunological correlates of this variable natural history are poorly understood. Methods: We studied the kinetics of the cellular immune responses, viral replication, and allograft histology in 20 patients who had undergone liver transplantation for HCV-related liver failure. We prospectively tracked T-lymphocyte responses in 3 groups of HCV-seropositive patients who underwent liver transplantation: patients who received pre-emptive antiviral therapy (or placebo) starting within the first month after transplantation (patients 1-10), patients who received antiviral therapy for severe histologic recurrence more than 3 months after transplantation (patients 11-17), and patients with long-term follow-up who have demonstrated minimal evidence of histologic recurrence and have not required antiviral therapy (patients 18-20). Results: Using direct ex vivo methodologies, i.e., interferon-gamma ELISPOT and major histocompatibility complex class I-peptide tetrameric complexes, we found that patients who experienced viral eradication following antiviral therapy demonstrated restoration of HCV-specific T cell responses, whereas patients with progressive HCV recurrence that failed to respond to therapy showed declining frequencies of these virus-specific effector cells. The HCV-specific CD8+ T cells that peripherally reconstituted after liver transplantation were clonotypically identical to CD8+ T cells present within the recipient's explanted liver (identical Vbeta chain and CDR3 sequence). Moreover, the subset of patients who spontaneously demonstrated minimal histologic recurrence had more vigorous CD4+ T cell responses, targeted predominantly against the HCV nonstructural (NS3, NS4, and NS5) proteins, at 2 and 3 months after transplantation. Conclusions: These results may help identify patients more likely to develop severe HCV recurrence, and therefore more likely to benefit from current antiviral therapy, as well as provide a rationale for the future use of novel immunotherapeutic approaches. The rapid recurrence of severe hepatitis C viral (HCV) hepatitis after liver transplantation remains a major problem.
To avert recurrent HCV in our transplant population, our patients are aggressively treated with interferon alpha and ribavirin following transplantation. Herein we analyze the effect of a sustained response to antiviral treatment (vs. no sustained response). Methods: Using our longitudinal database, survival for all recipients of liver transplants for cirrhosis due to HCV was analyzed (1991 onward). Fifty-three of the 112 patients in the database were treated with anti-HCV therapy. All liver transplant biopsies were analyzed (in a blinded fashion), and activity and fibrosis were graded using the Batts-Ludwig scale. Severe activity was defined as moderate or worse activity, and severe fibrosis was defined as bridging fibrosis or cirrhosis. Graft survival, graft survival until the onset of severe activity, and graft survival until the onset of severe fibrosis were determined using Kaplan-Meier estimates. The log-rank test (LR) and Wilcoxon test (WC) were used to compare survival, and Cox multivariate analysis was used to determine relative risks (RR). Results: Of the 53 treated patients, 13 (25%) had a sustained response, while 40 (75%) did not. Patients who did not achieve a sustained response to anti-HCV therapy were at 3.9 times greater risk for death, graft loss, or severe HCV activity, and at 3.8 times greater risk for death, graft loss, or severe fibrosis. Conclusions: A sustained response to HCV antiviral therapy results in significantly less severe HCV activity and fibrosis after transplantation and improved graft survival. Thus, every effort must be made to treat liver transplant recipients at risk for severe HCV recurrence. Although recurrent HCV infection occurs universally after liver transplantation (LT), disease progression is variable. We hypothesized that recipient class II HLA antigens, which regulate the immune response to HCV, contribute to this variability. OBJECTIVES: To determine whether specific HLA class II antigens and donor-recipient mismatch (MM) are associated with disease progression and graft survival (GS) in LT recipients with recurrent HCV. METHODS: HLA genotypes were determined by PCR in 409 LT recipients and their donors, including 147 HCV RNA (+) recipients with ≥ 1 liver biopsy obtained ≥ 1 yr after LT. Recurrent HCV was defined as necroinflammatory activity of ≥ 3 points (Knodell) at 1 yr. Fibrosis rate (FR) was defined as fibrosis score/yr post-LT. Acute rejection was assessed histologically. Patients with chronic rejection and vascular/biliary complications were excluded. RESULTS: Actuarial 10-yr GS was 45.6% in HCV-infected LT recipients compared to 61.7% in those transplanted for other liver diseases (P=.001). In recipients with HCV, the prevalence of HLA DR3 was higher (18.9 vs 9.8%; P=.0002), and those of HLA DR2 (10.9 vs 17.9%; P=.007) and DR5 (5.9 vs 11.3%; P=.007) lower, than in patients transplanted for other diseases; therefore, these genotypes were chosen for further study. Patients with HLA DR3 had reduced GS and higher regrafting rates, HCV recurrence, and FR than other genotypes. In contrast, patients with HLA DR5 had lower regrafting rates, HCV recurrence and FR than other genotypes. HLA genotype had no effect on acute rejection rates. There were no significant differences between recipients with 0, 1, or 2 DR MM in HCV recurrence or in FR, although GS in 0 MM recipients was significantly better than in those with 1 or 2 MM (75 vs 46%; P<.05).
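Relative risks such as the 3.9-fold figure above come from Cox multivariate analysis, and the actuarial survival figures from Kaplan-Meier estimates. A minimal Cox proportional-hazards sketch using the lifelines package; the data frame is a hypothetical placeholder, not the study's data:

```python
# Sketch of a Cox proportional-hazards fit of the kind used above to estimate
# the relative risk carried by the absence of a sustained antiviral response.
# The data frame below is a hypothetical placeholder, not the study's data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":       [12, 48, 9, 30, 60, 18, 36, 6, 24, 54],
    "event":        [1, 0, 1, 1, 0, 0, 0, 1, 1, 0],  # death, graft loss, or severe HCV
    "no_sustained": [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],  # 1 = no sustained response
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()  # exp(coef) for no_sustained is the hazard ratio (relative risk)
```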
CONCLUSIONS: Recurrent HCV in LT recipients expressing the DR3 genotype follows a more aggressive clinical course, while DR5 confers relative protection, compared to other HLA DR genotypes. These observations occur independently of rejection and suggest that host genetic factors play a significant role in the severity of recurrent HCV after LT. Hepatitis C (HCV) recurrence following liver transplantation (LT) is universal. The speed and severity of recurrence may be linked to the level of immunosuppression. Since HCV infection reduces cellular immunity, we speculated that soluble HCV proteins might potentiate the effect of cyclosporine (CsA) upon T cell responses. This study examined the impact of HCV Core protein upon T cell proliferation, expression of early activation markers, and proliferative suppression by CsA. T cells were activated with anti-CD3 for 2-6 days. Cultivation with 1, 2, 4 and 8 µg/ml Core reduced the stimulation index from 163±41 to 152±11 (p=ns), 60±20 (p<.001), 49±10 (p<.001) and 13±3 (p<.001), respectively, without evidence of toxicity. Actual cell counts (×10^4) confirmed the lack of clonal expansion in untreated vs treated cultures after 2 days (72±11 vs 11±3, p<.05), 4 days (397±50 vs 28±5, p<.05) and 6 days (250±100 vs 29±3, p<.05). Expression of activation markers was reduced in Core-treated cells. Treatment with 4 µg/ml Core for 2, 4 and 6 days reduced CD3+CD25+ from 37±3% to 12±3% (p<.05), 12±1% (p<.05) and 18±4% (p<.05), respectively. Similarly, CD3+DR+ was reduced from 13±6% to 6±2% (p<.05), 7±2% (p<.05) and 6±2% (p<.05), respectively. Thermal inactivation of Core abolished the proliferative suppression (p=ns). Core protein was subsequently titrated into cultures containing various concentrations of CsA. The impact of combining Core with CsA upon proliferation was analyzed by isobologram analysis, a mathematical method which indicates whether interactions are synergistic, additive or inhibitory. Combining Core with CsA resulted in an additive effect upon proliferative suppression. For example, the proliferative suppression induced by the combination (70±10%) was equivalent (p=ns) to the sum of the suppression produced by individual exposure to CsA (30±5%) and Core (43±7%). Linear regression of the data confirmed an additive interaction between Core and CsA, with an r² value of 0.98 (p=ns). The data show that HCV Core protein, at physiologically attainable levels, specifically inhibits T cell activation at an early stage, blocking T cell clonal expansion. In addition, Core protein potentiates proliferative suppression by CsA. These results suggest that release of Core during periods of viremia may potentiate clinical immunosuppression with CsA. Although preliminary, we suggest that reduction in pharmacologic immunosuppression may be feasible and beneficial in HCV+ recipients following LT. Purpose: To assess the effect of mycophenolate mofetil (MMF) and steroid (Pred)-free immunosuppression (IS), identifying the most effective IS to minimize the incidences of HCV recurrence (HCVR), acute rejection (ACR) and adverse events (AE) post-OLT. The trial was designed as an open-label, prospective, randomized multicenter study involving adult HCV-OLT recipients (pt) allocated to 3 IS regimens. Arm 1: tacrolimus (TAC) + Pred; arm 2: TAC + Pred + MMF; and arm 3: TAC + MMF + 2-dose daclizumab, but no steroids. HCV-OLT pt (n = 312) from 17 different institutions are allocated at a 1:1:2 ratio, respectively.
Purpose: To assess the effect of mycophenolate mofetil (MMF) and steroid (Pred)-free immunosuppression (IS), identifying the most effective IS that will minimize the incidence of HCV recurrence (HCVR), acute rejection (ACR) and adverse events (AE) post OLT. The trial was designed as an open-label, prospective, randomized multicenter study involving adult HCV-OLT recipients (pt) allocated to 3 IS regimens. Arm 1: tacrolimus (TAC) + Pred; arm 2: TAC + Pred + MMF; and arm 3: TAC + MMF + 2-dose daclizumab, but no steroids. HCV-OLT pt (n = 312) from 17 different institutions are allocated at a 1:1:2 ratio, respectively. Laboratory data and graft histology (LBx) are evaluated when clinically indicated and by protocol at 90, 365 and 730 days (d). HCVR is staged according to Batts and Ludwig. ACR is graded according to the Banff schema. Primary endpoints are clinically significant HCVR (fibrosis stage [Sg] ≥ 2 at 90 or 365 d, or Sg ≥ 3 at any time) and ACR (Banff grade [Gr] ≥ 2 + RAI ≥ 4). Results: To maintain the validity of the trial, this report is based on the Data Safety Monitoring Board assessment from 11/2003. Of 228 pt enrolled, 182 pt have a median follow-up of 89 d. Day 90 protocol LBx were done in 82% of pt. Current information on 98 pt: HCVR (Sg ≥ 2) was found in 8 pt (8.2%) at a median of 94 d post OLT; these pt were placed on antiviral therapy (endpoint). ACR was found in 10 pt (10.2%; Gr 2 [6], Gr 3 [4]), at a median of 13 d post OLT. Five pt died due to: lung failure + cancer; CVA; idiopathic lung disease; HA thrombosis; sepsis. Study withdrawal = 3 pt. No major AE reported in any arm. Conclusions: This 90-d preliminary report suggests the safety of the IS used (Pred, TAC, MMF, daclizumab and a steroid-free regimen) in the trial, since no major AE has been reported in any group. HCVR and ACR incidences appear encouraging at this point. The high protocol LBx rate will provide invaluable information on the natural course of HCV post OLT in pt treated similarly and followed up with the same standards in multiple centers.
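For interim incidence figures such as the 8.2% (8/98) HCVR rate above, a confidence interval makes the precision explicit. A minimal sketch using the standard Wilson score interval (the counts are taken from the abstract; the interval itself is not reported there):

```python
# Wilson score interval for the interim HCVR incidence above (8/98 = 8.2%).
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_ci(8, 98)
print(f"{lo:.1%} - {hi:.1%}")  # roughly 4.2% - 15.3%
```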
The long-term outcome following transplantation for HCV has deteriorated, raising the question of whether use of more potent immunosuppressant agents contributes to lower patient and graft survival. This study was designed to evaluate outcome differences in patients receiving Neoral C2 as compared to tacrolimus C0 following liver transplantation for HCV. 128 patients underwent liver transplantation for HCV at our center; 48 received Neoral and 80 Tacrolimus. All received Solumedrol/prednisone. C2 levels of Neoral were adjusted to 1000 ng/mL for the first 6 months, whereas C0 levels of Tacrolimus were kept between 10-15 ng/mL for the first 6 months; subsequently, C2 levels were lowered to 800 ng/mL for months 6-12 and 600 ng/mL thereafter, whereas tacrolimus levels were reduced to 5-10 ng/mL after month 6. Rates and severity of rejection were similar for patients receiving Neoral or Tacrolimus (Neoral 37% vs Tacrolimus 34%, p=NS). Patient and graft survival was higher in patients receiving Neoral compared to those receiving Tacrolimus (90% vs 78%, p<0.05). The incidence and severity of recurrent HCV was much higher in patients receiving Tacrolimus, with many patients progressing to severe fibrosis or cirrhosis by year 3 post transplant. Genotype differences could not explain the results seen. HCV viral titers increased more rapidly in patients treated with Tacrolimus than Neoral, and by 3 years mean viral titers were 3.2 million copies in patients receiving Tacrolimus versus 1.05 million copies in patients treated with Neoral (p<0.05). Levels of AST, ALT, ALP and bilirubin were more markedly disturbed in patients receiving Tacrolimus compared to patients who received Neoral. Use of rebetron provided a sustained virologic response only in patients on Neoral, as previously reported by another group of investigators, although the sustained virologic response was seen in only 20% of treated patients. The incidence of diabetes mellitus was significantly higher in patients treated with Tacrolimus. Addition of Cellcept (MMF) resulted in more severe recurrent HCV infection in both Neoral and Tacrolimus treated groups. These data strongly suggest that use of Tacrolimus in comparison to Neoral leads to a poorer outcome following transplantation for HCV. Multivariate analysis of this single-center data, as well as analysis of the Lis2t study, will hopefully provide further insight into the factors contributing to these results and allow us to design better immunosuppressive protocols for this group of patients.

Tae W. Chong, 1 Robert L. Smith, 1 Michael G. Hughes, 1 Christine K. Rudy, 1 Robert G. Sawyer, 1 Timothy L. Pruett. 1 1 Surgery, University of Virginia, Charlottesville, VA. Introduction: Allograft infection after liver transplantation for chronic Hepatitis C (HCV) is nearly universal, and the presumed mechanism of infection is related to HCV envelope protein interaction with putative liver receptors. This represents a unique opportunity to study an early event in HCV infection: quasispecies (QS) selection by the allograft. We have developed a primary hepatocyte culture system derived from donor livers to study QS selectivity in vitro and to correlate it with allograft association in vivo. Methods: In two patients undergoing liver transplantation secondary to HCV infection, the following samples were obtained: blood prior to the removal of the explant (inoculum), biopsy from the donor liver before transplantation (hepatocytes), and allograft after two hours of perfusion (in vivo selection). Hepatocytes were obtained using a two-step collagenase perfusion and cultured at 3.0×10⁵ cells/ml. The cells were inoculated with serum obtained from the patient's blood and harvested after 5 days. RNA was extracted from the samples and reverse transcribed. The cDNA was amplified using a semi-one-sided PCR for the HVR1 region. Single-strand conformational polymorphisms (SSCP) were analyzed by denaturing the PCR products and separating them by electrophoresis. Results: The HVR1 sequences were amplified and demonstrated similarity in band patterns for both patients when analyzed by SSCP. In patient A, there were 3 bands in the serum and 6 bands in the 2-hour post-perfusion biopsy. The post-perfusion sample demonstrated differential selectivity, with 3 new bands represented that were not seen in the inoculum. In vitro, there were 3 bands in the primary hepatocytes, with 2/6 bands from the 2-hour post-perfusion sample represented in the in vitro sample. A single band persisted in all three samples. There was a higher degree of correlation between the in vitro and in vivo samples for patient B. The in vitro sample had 6 bands, which all demonstrated a similar pattern with the 2-hour sample (9 bands). Conclusion: There is a differential association of the viral quasispecies for the allograft, and primary hepatocytes derived from the same liver demonstrate some similarity to the in vivo sample in this selectivity. The differences may represent phenotypic differences in cultured hepatocytes or viral adaptation to the host without immunologic pressure. This represents a unique model to study microenvironment and quasispecies selection.

Hepatitis C virus (HCV) is the most common indication for liver transplantation in patients suffering from end-stage liver disease. Recurrence post liver transplantation is common and frequently results in progressive liver disease requiring treatment with a pegylated interferon and ribavirin. However, relapse upon withdrawal of combination therapy is very common. Therefore, the optimal duration and stopping point of therapy in this patient population remain unclear.
Aim: To determine whether the presence or absence of HCV RNA in liver tissue is a better predictor of treatment success, following 48 weeks of pegylated interferon alpha-2b and ribavirin therapy for recurrent HCV post liver transplantation, than detection of HCV RNA in serum. All recipients received combination pegylated interferon alpha-2b (1.5 mcg/kg) and ribavirin (200-600 mg/d) therapy for at least 48 weeks. The diagnosis of serum HCV recurrence was determined by histopathologic findings of inflammation along with viral recurrence using the COBAS AMPLICOR™ Hepatitis C Virus Test, version 2.0 (HCV RNA qualitative PCR) and COBAS AMPLICOR™ HCV MONITOR Test, version 2.0 (HCV RNA quantitative PCR) assays. Tissue HCV PCR confirmation was done using LightCycler (Roche) real-time PCR for HCV quantitation. SVR was defined as nondetectable HCV RNA on serologic testing at 6 months. Ten transplant recipients were included: 3 (30%) females and 7 (70%) males, mean age 53 yrs, mean time from OLT 29.2 months. Four (40%) were Caucasian and 6 (60%) were Hispanic. All 10 had nondetectable HCV RNA in their blood at the end of therapy. However, 7/10 (70%) remained HCV PCR detectable in their liver tissue and 3/10 (30%) were nondetectable. SVR without relapse occurred only in the 3/10 who were liver tissue HCV PCR negative at the end of therapy. Direct detection of HCV PCR in liver tissue appears to be a more accurate predictor of SVR following PEG interferon and ribavirin therapy for recurrent HCV post liver transplantation. This information will require long-term follow-up and further confirmation.
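The claim that tissue PCR is the better predictor can be made concrete with the reported counts: all 10 patients were serum-negative at end of therapy, the 3 tissue-negative patients all achieved SVR, and (as the abstract implies) all 7 tissue-positive patients relapsed. A minimal sketch of the resulting predictive values, under that assumption:

```python
# Predictive performance of end-of-treatment liver-tissue HCV PCR for relapse,
# using the counts above (assumes all 7 tissue-positive patients relapsed).
tissue_neg_svr, tissue_neg_relapse = 3, 0
tissue_pos_svr, tissue_pos_relapse = 0, 7

sensitivity = tissue_pos_relapse / (tissue_pos_relapse + tissue_neg_relapse)  # 7/7
specificity = tissue_neg_svr / (tissue_neg_svr + tissue_pos_svr)              # 3/3
ppv = tissue_pos_relapse / (tissue_pos_relapse + tissue_pos_svr)              # 7/7
npv = tissue_neg_svr / (tissue_neg_svr + tissue_neg_relapse)                  # 3/3
print(sensitivity, specificity, ppv, npv)  # all 1.0 in this small series
```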
Interactions between T cells and both donor and recipient dendritic cells (DC) characterize alloresponses in organ transplantation, where recipient T cells respond either directly to donor MHC or indirectly to processed donor MHC allopeptides in the context of recipient MHC molecules. The present study evaluates the nature of the DC-CD4 T cell interactions in the lymph node (LN) during AR after murine cardiac transplantation. CX3CR1+/GFP+ BALB/c (I-Ad) donor hearts were transplanted into C57BL/6 (I-Ab) mice; AR occurred by 7 days. LNs were harvested after 12, 24, 48, 72, 120, 144 and 168 hours, and fluorescent immunohistological staining was performed for CD11c, CD4, and Y-Ae mAb (which recognizes the complex of I-Eα 56-73 donor allopeptide in the context of recipient I-Ab, thus evaluating indirect presentation). Confocal microscopy enabled visualization of donor DC (CD11c+GFP+ cells), as well as indirect (CD4+ Y-Ae+) and direct (CD4+ donor DC) interactions. The numbers of donor and recipient DCs, CD4+ and Y-Ae+ cells, and their interactions per 0.6 mm² of LN tissue over time were enumerated. Our data indicate an early increase of DC-CD4+ interactions in the LN following AR. Donor GFP+ DC-CD4+ direct interactions are present as early as 12 hours following engraftment, peak at day 3, but disappear by the 6th day. Conversely, recipient DC-CD4+ indirect interactions increase to a maximum between 48-120 hours, but persist for at least 7 days. Y-Ae-CD4+ interactions, which evaluate indirect presentation of the specific allopeptide pEα:I-Ab, similarly persist and even increase at day 7. In conclusion, both direct and indirect interactions between CD4 T cells and donor and recipient DCs occur shortly after engraftment in AR episodes. However, only donor MHC peptides are still presented in the context of recipient DC to CD4 T cells at posttransplant day 7, indicating persistent indirect allorecognition.

Soren Schenk, 1 Danielle D. Kish, 2 Anton V. Gorbachev, 2 Kiyotaka Fukamachi, 1 Peter S. Heeger, 2 Robert L. Fairchild. 2 1 Biomedical Engineering, The Cleveland Clinic Foundation, Cleveland, OH; 2 Immunology, The Cleveland Clinic Foundation, Cleveland, OH. Heart allografts expressing single MHC II disparities are not acutely rejected. We hypothesized that CD25+ regulatory cells prevent the expansion of alloreactive T cells. The goal of this study was to test whether increased donor-specific priming or direct blockade can impede early regulation. We first confirmed that C57BL/6 (B6) (H-2b) recipients accept MHC II-mismatched B6.H-2 bm12 (bm12) heart grafts (MST >100d, n=10). Control B6 acutely rejected completely mismatched A/J (H-2a) grafts by d8, and priming with A/J dendritic cells (DC) 3 days before transplant reduced graft survival to d5. In contrast, bm12 DC priming of B6 (n=10) resulted in only a 50% loss of bm12 grafts by d60 (P NS vs non-primed B6). Although bm12 grafts of primed B6 had severe cellular infiltration at d7 (ISHLT grade 3A-3B) compared to non-primed B6 (1A-1B), the number of alloreactive T cells increased less than 2-fold (from 74 cells/6×10⁶ in non-primed to 110/6×10⁶ in primed B6), smaller than observed in completely mismatched grafts (950/6×10⁶). By d21, donor-specific T cells had decreased 5-fold in primed vs non-primed B6 (21/6×10⁶ vs 60/6×10⁶, P<0.05), suggesting down-regulatory mechanisms augmented by DC priming. Accordingly, multiple rounds of bm12 DC priming of B6 led to indefinite graft survival. We next blocked CD25+ cells with PC61 mAb (0.25 mg on d−1 to d9), which resulted in a striking 80% loss of bm12 grafts by d25 (n=6, MST 21d, P<0.01 vs control B6). Recall assays on d21 revealed a 15-fold increase of alloreactive T cells compared to untreated B6 recipients (750/6×10⁶ vs 50/6×10⁶, P<0.01). PC61-blocked CD25+ cells were present 1 day after the treatment, but not at 9 days after treatment, when all grafts were already severely infiltrated. We finally transplanted bm12 hearts into T cell-deficient B6 Rag-/- recipients, which were then reconstituted with 30×10⁶ B6 splenocytes that did or did not contain CD25+ cells. Reconstitution with CD25-depleted cells resulted in acute rejection of bm12 grafts (n=6, MST 12d, P<0.001 vs control B6), whereas CD25+-containing B6 cells did not. Thus, donor-specific priming cannot mediate rejection of grafts with a single MHC II disparity, possibly due to the presence of strong regulation. This study is the first to show that temporary blockade of CD25+ cells can lead to increased priming of alloreactive cells and to acute rejection of MHC II-mismatched grafts.

Wei-Ping Min. 1,2,3 We have previously demonstrated that targeting both dendritic cells and T cells simultaneously achieved reliable tolerance in a mouse transplant model, which depended on a regulatory feedback loop between tolerogenic dendritic cells (Tol-DC) and T regulatory cells (Treg). The present study was undertaken to determine whether there is synergy in the induction of tolerance by simultaneous adoptive transfer of Tol-DC and Treg. METHODS: BALB/c mice receiving C57BL/6 cardiac allografts were treated with LF 15-0195, a novel analogue of 15-deoxyspergualin, at a dose of 2 mg/kg for 20 days, and long-term survivors (>100 days) were defined as tolerant recipients.
CD11c+ dendritic cells and CD4+CD25+ and CD8+CD28- Treg cells were isolated from spleens of tolerant recipients. Tol-DC alone (5×10⁶/mouse), Treg alone (1×10⁶/mouse) or a combination of Tol-DC and Treg were intravenously injected into naïve BALB/c recipients prior to and/or after allogeneic (C57BL/6 → BALB/c) transplantation. The recipients received sublethal radiation and no additional immunosuppressants were used. RESULTS: CD4+CD25+ and CD8+CD28- T cells were significantly increased in the tolerant recipients as compared with rejecting recipients and non-transplanted mice. Both ex vivo-isolated CD4+CD25+ and CD8+CD28- T cells potently suppressed ongoing MLR in vitro, suggesting they are Treg. The Tol-DC isolated from tolerant recipients demonstrated immature phenotypes, impaired antigen-presenting function, and inhibited allostimulatory capacity. Adoptive transfer of Tol-DC significantly prolonged allograft survival to a median survival time (MST) of 38 days, while recipients without transfer survived to an MST of 13 days. Transfer of a subtherapeutic dose of Treg did not prolong allograft survival. In contrast, adoptive transfer of a combination of Tol-DC and the subtherapeutic dose of either CD4+CD25+ or CD8+CD28- Treg significantly prolonged survival to an MST of 85 days (P<0.05) and an MST of 105 days (P<0.01), respectively. CONCLUSIONS: This study provides the first direct evidence that Tol-DC synergize with Treg in the induction of "infectious tolerance". These findings shed light on mechanisms of tolerance induction through the regulatory feedback loop between Tol-DC and Treg.

Atsushi Toyofuku, 1,5 Yohichi Yasunami, 1 Kentaroh Nabeyama, 1 Masahiko Nakano, 1,5 Masayuki Satoh, 1 Nobuhide Matsuoka, 1 Junko Ono, 2 Toshinori Nakayama, 3 Masaru Taniguchi, 4 Masao Tanaka, 5 Seiyo Ikeda. 1 1 SurgI, Fukuoka Univ, Fukuoka, Japan; 2 Lab Med, Fukuoka Univ, Fukuoka, Japan; 3 Mol Immunol, Chiba Univ, Chiba, Japan; 4 Allergy and Immunology, RIKEN Research Center, Yokohama, Japan; 5 SurgI, Kyushu Univ, Fukuoka, Japan. Natural killer T (NKT) cells have recently been identified as a novel lymphoid lineage distinct from T, B and NK cells, and their role in transplant immunology remains undetermined. Previously, we have shown that NKT cells are essential for acceptance of rat islet xenografts in mice (JCI 105:1761, 2000). The aim of the present study was to determine the role of NKT cells in islet allograft rejection. For these purposes, CD1d-deficient (CD1d KO) as well as Vα14 NKT cell-deficient (NKT KO) mice were used for the experiments. When BALB/c islets (500) were grafted into the liver of STZ-diabetic wild-type C57BL/6 mice, islet allografts were rejected at 9.2±1.4 days (n=8) after transplantation (tx). Morphologically, islets infiltrated with mononuclear cells were seen in the liver. FACS analysis revealed an up-regulation of intracytoplasmic IFN-γ expression as well as CD25 expression of NKT cells in the liver at the time of rejection. When islet allografts were grafted into CD1d KO mice without treatment, the MST was prolonged to 26.0±9.3 days (n=5). When diabetic wild-type or CD1d KO mice were treated with 0.2 mg/kg rapamycin (ip, day 0-6), the MST of islet allografts was 16.2±6.9 (n=9) or >64.5±24.1 (n=6) days, respectively. When the dosage of rapamycin was increased to 1.0 mg/kg, the MST in wild-type or CD1d KO mice was >41.3±22.2 (n=8) or >86.7±10.9 (n=11) days, respectively.
One out of 8 wild-type and 10/11 CD1d KO mice receiving islet allografts and treated with 1.0 mg/kg rapamycin were normoglycemic at 90 days after tx. Morphologically, intact islets with well-granulated β cells were seen in the liver of normoglycemic mice. Similar findings were obtained when NKT KO mice were used as recipients. The survival rate of islet allografts in CD1d KO mice treated with rapamycin (1 mg/kg) and receiving the intraportal transfer of hepatic MNC (5×10⁶) from wild-type (n=3) or CD1d KO (n=3) mice at the time of islet tx was 0 or 100%, respectively, at 60 days after transplantation. These findings clearly demonstrate that NKT cells play a significant role in acute rejection of islet allografts in the liver of mice and indicate that NKT cells might be considered a target for prevention of rejection. Further studies are in progress to determine whether this is also the case with the use of calcineurin inhibitors instead of rapamycin.

We have previously demonstrated a critical role of multidrug resistance (MDR1) P-glycoprotein (P-gp) in alloimmunity: P-gp blockade inhibits human alloimmune T cell activation in vitro via both T cell- and antigen-presenting cell (APC)-dependent mechanisms. A possible in vivo immunoregulatory role of P-gp has not been investigated to date. We used a vascularized murine heterotopic cardiac transplant model to examine the effects of P-gp blockade on allograft survival and alloimmune T cell responses. RT-PCR detection of mdr1a mRNA and flow cytometric detection of a P-gp antagonist-sensitive calcein-AM efflux capacity demonstrated functional expression of mdr1a P-gp in murine splenocytes. P-gp blockade in BALB/c recipients of C57BL/6 allografts via i.p. administration of the specific P-gp antagonist PSC833 prolonged mean allograft survival from 8.5±0.5 to 11.7±0.5 days vs. controls (p<0.01), and inhibited recipient IFN-γ but not IL-5 production, while enhancing the IgG1/IgG2a ratio of donor-specific alloantibody (p<0.05). The results suggested that delayed rejection in P-gp-blocked animals occurred predominantly via a Th2-dependent humoral response. Concurrent blockade of P-gp and CD86, which regulates Th2 responses, markedly prolonged allograft survival (40.1±3.9 days) vs. either P-gp blockade or CD86 blockade alone, or vs. concurrent P-gp and CD80 blockade (p<0.01). Mice exhibited suppressed Th1-dependent IFN-γ production and blocked Th2-dependent humoral immunity, demonstrating in vivo synergy of P-gp inhibition and CD86-directed costimulatory blockade. When hearts from MHC class II knockout (KO), but not wildtype mice, were transplanted into mdr1a/b KO mice, graft survival was also significantly prolonged compared to wildtype recipients (p<0.05), suggesting that P-gp functions predominantly in indirect allorecognition. Our findings demonstrate that P-gp is a critical regulator of Th1 responses in vivo and show that the molecule functions in indirect allorecognition. Thus, our results raise the possibility that P-gp-targeted strategies may open new therapeutic avenues in clinical allotransplantation, and in particular in chronic allograft rejection.

We have previously shown that CD4 T cells lacking the costimulatory molecule LIGHT ineffectively reject allografts. We now examine the role of LIGHT in rejection mediated by CD8 T cells and contrast its effect on CD8 and CD4 T cell function. Methods: Two approaches were used to examine the effect of LIGHT on T cell-mediated rejection.
1) RAG-/- allograft recipients reconstituted with wild-type (WT) CD8 or CD4 T cells were treated with fusion proteins that block LIGHT (LTβR-Ig or HVEM-Ig). 2) CD4 or CD8 WT or LIGHT-/- T cells were transferred into RAG-/- allograft recipients. The effect of LIGHT on T cell activation after transplantation was assessed by IFNγ ELISPOT and MLR. Results: Blocking LIGHT with LTβR-Ig or HVEM-Ig prolonged skin allograft survival in mice reconstituted with WT CD8 T cells (20 and 16 days, respectively, vs. 11 days for controls, p<0.001) but had no effect on survival in CD4-reconstituted mice. Relative to RAG-/- mice reconstituted with WT T cells, skin allograft survival was prolonged in RAG-/- mice reconstituted with either LIGHT-/- CD8 (52 vs. 16 days, p = 0.0003) or LIGHT-/- CD4 T cells (40 vs. 13 days, p < 0.0001). Intestinal allograft rejection at day 28 was also less severe in mice reconstituted with LIGHT-/- CD8 T cells (absent/mild rejection) compared to WT CD8 T cells (all with severe rejection). The number of T cells recovered from RAG-/- mice reconstituted with WT or LIGHT-/-, CD8 or CD4, T cells was similar, suggesting that differences in graft survival were not due to differences in homeostatic proliferation. LIGHT-/- CD4 T cells isolated prior to rejection demonstrated significantly impaired priming in vivo and proliferation to donor antigen in vitro relative to WT CD4 T cells. Following allograft rejection, LIGHT-/- and WT CD4 T cells demonstrated equivalent priming and proliferation to donor antigen. Surprisingly, even in non-rejecting recipients there were large numbers of primed LIGHT-/- CD8 T cells. These LIGHT-/- CD8 cells also proliferated well to donor antigen in vitro. Although LIGHT contributes to rejection mediated by both CD4 and CD8 T cells, the fusion protein data suggest that LIGHT has a greater effect on CD8 T cells mediating rejection. Following transplantation, LIGHT appears to be important for the initial activation of CD4 T cells. In contrast, LIGHT-/- CD8 T cells appear to undergo initial activation but remain relatively ineffective in mediating allograft rejection, suggesting that LIGHT acts on CD8 T cells downstream of initial activation.

Diabetes, Univ. Colo. Hlth. Sci. Center, Denver, CO; 2 Dept. Pediatrics, Univ. Colo. Hlth. Sci. Center, Denver, CO. A major role for the B7 family members CD80 and CD86 in providing costimulation to T cells is well established. Interestingly, previous studies show that host, but not donor, CD80/CD86 expression is required for cardiac allograft rejection. However, the role of host costimulation by these molecules in the rejection of cellular islet allografts and xenografts is unclear. The purpose of this study was to determine whether islet allografts and/or rat islet xenografts require recipient CD80/CD86 molecules for acute rejection. Methods: Streptozotocin-induced diabetic C57BL/6 (B6, H-2b) or CD80/CD86 double-deficient B6 mice were grafted with allogeneic BALB/c (H-2d) islet allografts or with WF rat (RT1u) islet xenografts. Non-diabetic animals were grafted with BALB/c cardiac allografts. Islet survival was determined by monitoring blood glucose levels, while cardiac allograft survival was assessed by graft palpation. Graft rejection was determined by return to hyperglycemia (islets) or cessation of heartbeat (hearts). Rejection was confirmed by histological examination of the transplant.
Results: Consistent with previous studies, BALB/c cardiac allografts were acutely rejected in wild-type B6 mice (5/5 grafts rejecting in <12 days) but survived >100 days in CD80/CD86-deficient mice (6/6). In stark contrast, both islet allografts (10/10) and rat islet xenografts (6/6) demonstrated acute rejection in both control B6 and CD80/CD86-deficient hosts (p<.01 relative to cardiac allografts). Parallel studies using CD80/CD86-deficient islet donors failed to demonstrate a requirement for donor B7 expression in islet allograft rejection. Thus, neither host nor donor CD80/CD86 expression was required for islet allograft rejection. Conclusion: Varied studies imply that the inherent pathways for rejecting primarily vascularized versus cellular allografts or xenografts may be distinct. The present study illustrates this concept by showing a marked difference in the role of host-derived CD80/CD86 costimulatory molecules in cardiac allograft versus islet allograft/xenograft rejection in vivo. While such costimulation is rate-limiting for cardiac allograft rejection, these same molecules are not necessary for acute rejection of either islet allografts or xenografts.

1 1 Transplantation Branch, NIDDK, Bethesda, MD; 2 Organ Transplant Service, Walter Reed Army Medical Center, Washington, DC. Alemtuzumab facilitates marked depletion of T-cells and monocytes following organ transplantation and allows for reduced maintenance immunosuppression. Peripheral cellular phenotyping of renal transplant recipients showed that naïve T-cells and T-cells with potential regulatory function (CD4+CD25+) were not prevalent following aggressive depletion with Alemtuzumab. Rather, post-depletion T-cells were of a single depletion-resistant effector memory phenotype (CD3+CD4+CD45RA-CD62L-; M1) that expanded in the first month and was uniquely prevalent at the time of rejection. These cells were resistant to steroids, deoxyspergualin and sirolimus in vitro. In this study, we examined patients undergoing profound T-cell depletion with Alemtuzumab following renal transplantation, evaluating the phenotype and transcriptional characteristics of their residual PBMC and the cellular infiltrates present on renal biopsies. Patient PBMC were isolated by Ficoll and characterized by flow cytometry. RNA was extracted from sorted naïve and M1 memory cells, following 2-hour stimulation with PMA/ionomycin, and from renal protocol biopsies for quantitative real-time PCR. Recipients typically experienced a reversible acute rejection within the first month after transplantation, and these rejection biopsies were characterized by a significant elevation of costimulatory molecule transcripts (CD80, CD86 and CD154), proinflammatory cytokines (IL10, IFNg, TNFa, TNFb), and chemokine and growth factor transcripts (MIP1a, RANTES, MIG, IP-10 and GM-CSF). Stimulation of M1 memory cells from patient PBMC, post-depletion, also generated a significantly elevated effector memory transcriptional profile (RANTES, MIP1a, IFNg, MIG, IP-10, GM-CSF, and TNFa), remarkably similar to the transcripts observed in renal biopsies. In addition, CCR7 and CD62L, critical regulators of immune cell trafficking, were significantly upregulated in renal biopsies but strongly down-regulated in circulating, activated M1 memory cells. These data demonstrate that a limited population of functional memory cells survives aggressive depletion with Alemtuzumab and quickly homes to the allograft and mediates rejection following transplantation.
Therefore, strategies to account for and restrict the activity of resurgent T-cell populations following depletion must be considered in designing anti-rejection or tolerance-inducing therapies.

Alloantibodies are a clinically significant component of the immune response to organ transplants. In our experimental model, B10.A (H-2a) cardiac transplants survive significantly longer in C57BL/6 (H-2b) immunoglobulin knockout (IgKO) recipients than in their wild-type counterparts (WT). Passive transfer of a single 50-200 µg dose of complement-activating IgG2b (15-1-P) alloreactive monoclonal antibodies (Allo-mAbs) to IgKO recipients reconstituted acute rejection of cardiac allografts. Although passive transfer of a subthreshold dose of 25 µg of IgG2b or a single 100-200 µg dose of non-complement-activating IgG1 (AF3-12.1.3.) Allo-mAbs did not restore acute rejection to IgKO recipients, a combination of these Allo-mAbs did cause acute graft rejection. Histologically, rejection was accompanied by extensive aggregates of platelets that stained intensively for von Willebrand factor. These platelet aggregates occluded the arteries, capillaries and veins of the rejected allografts. Flow cytometry and ELISA assays demonstrated that the IgG1 Allo-mAbs (AF3-12.1.3.) used in our studies did not activate complement on their own and did not augment complement activation by IgG2b Allo-mAbs. However, IgG1 Allo-mAbs specific for mouse SVEC4-10 endothelial cells stimulated them to produce monocyte chemotactic protein 1 (MCP-1), KC (the mouse homolog of human Gro-α, which has 80% homology with human IL-8), and RANTES. Since MCP-1, KC and RANTES are all chemoattractants for macrophages, we cultured antibody-sensitized endothelial cells with macrophages. Using a protein macroarray assay and then real-time PCR and ELISA, we found that in addition to MCP-1, KC and RANTES, IgG1 stimulated high levels of MIP-2, IL-1α and IL-6 and moderate levels of TNF-α in cultures of endothelial cells with macrophages. Exogenous TNF-α was demonstrated to augment the effects of IgG1 on endothelial cells. Our findings indicate that non-complement-activating Allo-mAbs can augment injury to allografts by complement-activating Allo-mAbs. Non-complement-activating Allo-mAbs stimulate endothelial cells to produce chemokines (MCP-1, MIP-2, KC, RANTES), which in turn activate macrophages. Subsequently, macrophage-secreted cytokines (IL-1α, IL-6, TNF-α) augment antibody-induced activation of endothelial cells. The interaction of antibody-activated complement with complement receptors on macrophages and T cells is under investigation.

Purpose: The use of donor hearts with depressed left ventricular (LV) ejection fraction (EF) for transplantation is a controversial topic. In this single-institution retrospective study, we report our experience. Methods: All patients who were transplanted with a donor heart EF≤50% were identified. Statistical analysis was carried out using SPSS and Kaplan-Meier actuarial survival analysis. Results: Seventy-two patients were identified, with a mean follow-up of 42.4±42.3 months. Mean donor age was 27.8±14.4 years (range, 1-61). Males made up 67.6% of the donors. The donor/recipient height ratio was 1.01±0.08, and the weight ratio was 1.14±0.52. Only the highest EF was analyzed; the mean value was 46.7%±5.3% (range, 22-50). 10% of the donors died of trauma and 21.1% of intracranial bleeding. High-dose inotropes (>10 µg/kg/min dopamine) were required for 5.6%, and LV hypertrophy was present in 1.4%.
Mean recipient age was 48.5±19.4 years; 70.4% were male, 44.5% were UNOS Status I, 45.8% Status II and 9.7% alternate. Actuarial survival at 30 days, 1 year, 3 years and 10 years was 91.4%, 83%, 75.9%, and 63.1%, respectively. Actuarial TCAD-free survival was 97.2% at 1 year and 91% at 3 years; at five and ten years it dropped to 74.9% and 34%, respectively. Thirty-day rejection was 15.2% and overall rejection was 29.6%, all within 3 years. Mean EF at latest follow-up was 56.3%±7.7%. The mean difference between EF at follow-up and at harvest was an increase of 9.26%±9.11. There was no statistically significant difference between the group with an EF of 50% and the group with an EF less than 50% in the incidence of TCAD (p=0.625), rejection within 3 years (p=0.349) or long-term survival (p=0.152). Conclusion: Actuarial survival estimates for donor hearts with EF≤50% are comparable to our overall institutional reported survival of 84% at 1 year. Donor hearts with EF≤50%, with appropriate management and careful patient selection, may present a viable option for the nationwide shortage of donor hearts.

The data in Table 2 show the anti-A IgG titer data (all by non-AHG method) for the 24 patients who did not have anti-A titer reduction therapy before transplantation. The 1-month graft failure rate was greater (50%) in patients whose anti-A titer was ≥8 compared to those with low titers (5%) (Fisher's exact test p=0.06). Twelve patients, 10 of whom had high anti-A titers before transplantation, were treated with either PP alone (n=10) or PP and IVIg.
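The Fisher's exact comparison just reported (50% vs 5% one-month graft failure, p=0.06) can be reproduced with counts consistent with those rates. The 4/20 split below is an assumption for illustration, not taken from the abstract:

```python
# Fisher's exact test for the anti-A titer comparison above. The counts are
# hypothetical but consistent with the reported rates (50% vs 5% failure
# among 24 untreated patients); they are NOT taken from the abstract.
from scipy.stats import fisher_exact

#               failed  survived
table = [[2, 2],    # anti-A titer >= 8  (2/4  = 50%)
         [1, 19]]   # low titer          (1/20 = 5%)

odds_ratio, p = fisher_exact(table)
print(round(p, 3))  # ~0.06, matching the reported p-value
```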
The UNOS algorithm for renal allocation has removed all HLA points for Class I HLA matching. There was some concern in our transplant program that this may increase the level of sensitization to HLA and make re-transplantation much more difficult. Review of UNOS data also suggested that long-term graft survival may be adversely affected by increased mismatching of Class I HLA. The use of cross-reactive groups (CREG) for Class I matching was suggested as an option to decrease sensitization to broad public epitopes without discriminating against recipients with rare HLA antigens. Our local sharing region has a UNOS variance that incorporates CREG matching points into the algorithm. To help justify the use of this variance in allocation, we have examined graft survival retrospectively when stratified by HLA matching and CREG matching. We have also looked at the level of HLA antibody (Panel Reactive Antibody, PRA) in patients activated for re-transplant and compared this with the degree of HLA mismatch of the first transplant. Graft survival was examined in transplants performed from January 1990 through December 2000. The results show very little difference in one-year graft survival, except for a slight decrease in graft survival for the 5-6 HLA-mismatched grafts. Graft survival rates for 0ABDR, 0CREG-0DR, 0DR, and 5-6 ABDR mismatches were 93.5%, 88.9%, 84.5%, and 81.9%, respectively, at one year. Five-year graft survival, however, showed increased survival in the grafts that were better matched: graft survival rates for 0ABDR, 0CREG-0DR, 0DR, and 5-6 ABDR mismatches were 70.9%, 70.4%, 54.5%, and 59.6%, respectively, at five years. This suggests that Class I matching may be important in long-term graft survival. Broad Class I matching using the public epitopes identified as the CREG groups appeared to improve graft survival at five years, especially when there were also no mismatches at the DR locus. 72% of patients re-listed for transplantation were sensitized to HLA, compared to 21% of patients waiting for a first transplant. Fifty re-transplant patients were evaluated for sensitization to HLA following the loss of the first transplant. Only 2 of 10 patients were sensitized to Class I HLA antigens when the first transplant was a 0 CREG mismatch. 78% of the patients were sensitized if there was a CREG mismatch. All six patients with 3 or more CREG mismatches were sensitized when re-listed. Donor-specific antibody was detected in all but three of the sensitized patients.

Background: In heart transplantation, pre-transplant HLA sensitization increases the risk of antibody-mediated rejection and other post-transplant problems. Tissue allografts (homografts) used for aortic arch reconstruction and blood products used in the Norwood procedure may cause HLA sensitization in infants, some of whom will need subsequent heart transplantation. This study aimed to determine the infant immune response to tissue allograft placement. Methods: In this cross-sectional analysis, serum from patients who underwent the Norwood procedure in infancy (n=11) was tested post-surgery and compared with that of control patients who received blood products during infant cardiac surgery without allograft placement (n=4). HLA sensitization was detected using Panel Reactive Antibody (PRA) screening tests and ELISA assays to detect antibody to HLA Class I & II antigens. Development of anti-blood group antibodies (isoagglutinins) was also investigated in patients' serum by reverse blood typing. Results: Median age at surgery was 6 days (0-62d) in allograft recipients and 9 days (0-41d) in controls. Median age at testing was 10 months (4mo-4yrs) in allograft recipients and 4 years (2-6yrs) in controls. 10/11 (91%) allograft recipients were sensitized (PRA≥10%), with 82% highly sensitized (PRA≥4/12); no control patients were sensitized. All allograft recipients with elevated PRA showed positive ELISA for HLA Class I & II antigens. Two allograft recipients have undergone subsequent heart transplantation. Their HLA antibodies were shown in antibody-specificity assays to be directed against the HLA type of their homograft donors and cross-reactive with their heart graft donor. Anti-blood group antibodies developed normally, even in patients whose tissue allografts were from ABO-incompatible donors (n=4). Conclusions: HLA sensitization develops in infants following tissue allograft placement, but not after exposure to blood products during cardiac surgery. ABO-incompatible allografts did not affect normal development of isoagglutinins. These results show divergent effects on the infant immune system of exposure to T-dependent vs T-independent antigens, and have important implications for infants eventually needing heart transplantation after Norwood palliative surgery.

Previously we showed that acute rejection (AR) surveillance by intramyocardial electrogram (IMEG) recordings makes routine biopsies in children unnecessary and reduces their mortality due to ARs early after heart transplantation (HTx) to close to zero. Because of the lower incidence of late ARs and their usually harmless course, IMEG recordings were discontinued after the second post-transplant year. However, the impact of late ARs on the long-term outcome, especially in children, is controversial. Therefore, we reviewed the late ARs in our pediatric patients to evaluate their clinical relevance.
Methods: All children transplanted between 1986 and 1999 with post-HTx time >2 years were analyzed to evaluate the prevalence and severity of late ARs occurring after we discontinued IMEG monitoring. Attention was also focused on the temporal relationship between late ARs and the new appearance or aggravation of transplant coronary arteriopathy (TCA). Results: Of the 68 patients included in the study (age 8.3±5.4 years at HTx), 22 (32.4%) showed between 1 and 5 biopsy-proven and clinically relevant late ARs, which occurred for the first time at 5.0±3.0 years post-HTx. Of all reviewed patients, 16 (23.5%) died during the study period. Of these deaths, 7 (43.7%) occurred during AR, 3 (18.8%) were sudden deaths shortly (2-6 weeks) after an AR episode, 3 (18.8%) were related to severe TCA and 1 (6.2%) to infection. In 2 patients (12.5%) the cause of death was unknown. Of the 10 patients in total who died in connection with biopsy-proven AR, 6 also had TCA that developed after the second post-HTx year. In 11 patients with relevant late ARs but without TCA during their first episode of late AR, diagnosed at 4.6±3.0 years after HTx, the angiogram showed significant TCA lesions at 2.4±1.3 years after their first late AR episode. The mean number of late ARs/patient/year was higher in those with angiographic TCA that developed after the second post-HTx year than in those without TCA after more than 2 years since HTx (p<0.01). Conclusions: ARs occurring beyond the second post-HTx year are the major cause of acute and chronic allograft dysfunction in children late after HTx and are also linked to the development of TCA. Late ARs and TCA are the dominant causes of death after the second post-HTx year in this less compliant age group, and thus more careful rejection surveillance late after HTx is justified to improve the long-term outcome.
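The "late ARs/patient/year" comparison above is a simple incidence-rate calculation: total episodes divided by total follow-up time in each group. A minimal sketch with invented numbers (the abstract reports only the resulting p-value, not the per-patient data):

```python
# Incidence-rate arithmetic behind the "late ARs/patient/year" comparison above.
# The episode counts and follow-up times below are invented for illustration.
def ar_rate(episodes_per_patient, years_per_patient):
    """Late-AR incidence rate: total episodes / total patient-years."""
    return sum(episodes_per_patient) / sum(years_per_patient)

with_tca = ar_rate([3, 2, 4, 1], [6.0, 5.5, 8.0, 4.0])     # patients developing TCA
without_tca = ar_rate([1, 0, 1, 0], [7.0, 6.0, 9.0, 5.0])  # patients without TCA
print(round(with_tca, 2), round(without_tca, 2))  # higher rate in the TCA group
```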
Background: Previous studies showed that ELISA is a sensitive method for the detection of anti-HLA antibodies. Aims: In this study, we investigated allosensitization after pediatric heart transplantation (Tx) to determine whether the presence of ELISA-detected anti-HLA antibodies (ab) pre- and/or post-transplant correlates with acute and chronic rejection. Material and methods: 45 patients who had serial ELISA pre- and post-Tx were studied. Age at Tx was 8 ± 7 years. Acute rejection (AR) was defined as ISHLT grade > 2. Patients were defined as rejectors (group A = 22 cases) if they had recurrent AR (more than 2

Transfusion of red blood cells (RBC) during the administration of calcineurin inhibitors has been shown to enhance regulatory T-cell populations to exogenous antigens. Infants with complex congenital heart disease who face prolonged waits for heart transplantation frequently require transfusion for hemodynamic stability. We reviewed the post-transplant clinical results of infants who were transfused pre-transplant with random donor RBCs while also receiving cyclosporine (TCSA) and compared the results to control infants without transfusion. Two doses of cyclosporine were administered orally (2 mg/kg every 12 hrs) prior to and post transfusion. Thirty infants were in the TCSA group and 30 infants in the control group. Infants in the TCSA group received 1.6 ± 1 (range 1-6) transfusions with cyclosporine while awaiting transplant. The groups were contemporaneous, with similar induction protocols and follow-up of 3.6 ± 2.8 vs 2.6 ± 2.3 years (p=ns). The mean age at transplant was 2.4 months in the control group and 3.9 months in the TCSA group, and wait time was longer in the TCSA group (108 vs 60 days) (p<0.05). Infants were followed identically. Ninety percent of the TCSA group were on cyclosporine as the only immunosuppressant by 1 year post transplant, versus 80% of the control group. Rejection frequency was lower in TCSA infants (0.33 vs 0.63 episodes/patient), with greater freedom from rejection. No untoward events related to transfusion were seen. PTLD was not seen in either group. There was no difference in the incidence of infections. A composite end point of death, retransplantation or coronary vasculopathy was compared, as shown in the figure. The TCSA group had 100% freedom from the composite event to 5 years, which was significantly different from control. We conclude that pre-transplant transfusion with cyclosporine coverage was safe and associated with marked improvement in clinical outcome after infant heart transplantation.

1 1 Cardiac Surgery, Klinikum Grosshadern, Munich, Germany. Background: Sirolimus is an immunosuppressive agent of increasing importance for the prevention of acute and chronic allograft rejection in organ transplantation. We investigated the impact of sirolimus on hormone levels involved in the hypothalamus-pituitary-gonad axis in male heart transplant recipients. Methods: A pair-matched analysis of 132 male heart transplant recipients on either sirolimus-based or calcineurin inhibitor-based immunosuppression was performed. Matching criteria were age, years after transplantation and creatinine levels. Measured parameters were testosterone, luteinizing hormone (LH), follicle-stimulating hormone (FSH), sexual hormone-binding globulin (SHBG) and free androgen index (FAI). Results: Mean testosterone was 3.86 ± 1.41 ng/ml in the sirolimus group and 4.55 ± 1.94 ng/ml in the control (p = 0.025). Serum LH was 12.82 ± 21.19 mIU/ml in sirolimus patients and 6.2 ± 5.25 mIU/ml in the control (p = 0.015). FSH levels were 13.31 ± 18.4 mIU/ml versus 7.32 ± 5.53 mIU/ml, respectively (p = 0.015). The analysis revealed a significant decrease in testosterone and a significant increase in FSH and LH in the sirolimus group. The duration of sirolimus treatment correlated positively with SHBG (p<0.01), LH (p<0.05) and FSH (p<0.05), and negatively with the FAI (p<0.05). Sirolimus trough levels correlated with LH and FSH levels (p<0.01). Conclusion: Our study demonstrates that heart transplant recipients treated with sirolimus had significantly lower testosterone levels and a significant increase in gonadotropic hormones. These effects were trough-level dependent. All candidates awaiting organ transplantation should be informed about these adverse effects.

Michel White, 1 Haissam Haddad, 2 Jacques Genest, 3 Marie Helene LeBlanc, 4 Normand Racine, 5 Peter Pflugfelder, 6 Nadia Giannetti, 7 Ross Davies, 8 Eduardo Azevedo, 9 Debra Isaac, 10 Jeffrey Burton, 11 Ralph Ferguson, 12 Heather Ross.
13 1 Research Center, Montreal Heart Institute, Montreal, QC, Canada; 2 Research Center, QE II Health Sciences Centre, Halifax, NS, Canada; 3 Division of Cardiology, McGill University Health Center/Royal Victoria Hospital, Montreal, QC, Canada; 4 Hopital Laval, Ste Foy, QC, Canada; 5 Montreal Heart Institute, Montreal, QC, Canada; 6 London Health Science Centre, London, ON, Canada; 7 Royal Victoria Hospital, Montreal, QC, Canada; 8 The University of Ottawa Heart Institute, Ottawa, ON, Canada; 9 Section of Cardiology, Health Sciences Centre, Winnipeg, MB, Canada; 10 Foothills Medical Centre, Calgary, AB, Canada; 11 University of Alberta Hospitals, Edmonton, AB, Canada; 12 Fujisawa Canada Inc., ON, Canada; 13 The Toronto Hospital, Toronto, ON, Canada. Background: Cardiac allograft vasculopathy and accelerated atherosclerosis are significant causes of morbidity and mortality in heart transplant recipients. We have reported that despite the use of lipid-lowering agents, stable cardiac recipients may nevertheless exhibit persistent dyslipidemia not satisfying the current guidelines for high-risk patients. Objectives: To investigate the changes in hemostatic and inflammatory parameters, homocysteine, and adhesion molecules in a large cohort of stable dyslipidemic heart transplant recipients treated with Neoral. The observations from transplant patients were compared with dyslipidemic but otherwise healthy control subjects. Methods: One hundred twenty-nine stable heart transplant recipients aged 56.7±10.1 years, 78±42 months post-cardiac transplantation, had blood drawn for lipid profiles, C-reactive protein, Lp(a), homocysteine, ICAM, and hemostatic parameters. The observations were compared with those of 26 age- and sex-matched healthy dyslipidemic subjects taking no medications and presenting no concomitant medical conditions. Results: (See Table.) Conclusions: Compared with healthy subjects with similar LDL levels, stable heart transplant recipients exhibit a biochemical profile consistent with subclinical inflammation. Such abnormalities may contribute to accelerating the atherosclerotic process and may play a role in cardiac allograft vasculopathy following cardiac transplantation.

Background: The objective of the present study was to investigate purine nucleotide metabolism in peripheral blood mononuclear cells (PBMC) of cardiac transplant recipients after a switch to mycophenolate mofetil (MMF) therapy. Methods: Twenty-seven stable heart transplant recipients were switched from azathioprine to MMF 7.1±3.9 years after transplantation. Blood samples were collected before and 3, 6, and 12 months after the onset of MMF therapy. Intracellular concentrations of guanosine 5'-triphosphate (GTP) and adenosine 5'-triphosphate (ATP) in PBMC were determined by means of HPLC. Inosine monophosphate dehydrogenase (IMPDH) activity was detected by measuring the conversion of inosine monophosphate (IMP) to xanthosine monophosphate and guanosine monophosphate (GMP) in PBMC. The activities of the salvage pathway enzymes guanine phosphoribosyltransferase (GPRT) and hypoxanthine phosphoribosyltransferase (HPRT) were determined by measuring the formation of GMP from guanine (for GPRT) and IMP from hypoxanthine (for HPRT). Results: A significant decline of IMPDH activity was observed 3 and 6 months after the switch to MMF, with 48.5% (p=.001) and 18.6% (p<.0001) of enzyme activity compared to the initial value.
Twelve months after the onset of MMF therapy, IMPDH activity was partially restored, to 48% of the initial value (p<.0001 vs. the 6-month value), but still remained significantly lower than the initial rate. Intracellular GTP and ATP levels did not change significantly during the entire observation period. To explain this unexpected finding, we investigated the activities of the salvage pathway enzymes GPRT and HPRT. The activity of GPRT increased from 7.06 nmol/10⁶ PBMC before MMF to 12.03 nmol/10⁶ PBMC (p<.0001), 11.15 nmol/10⁶ PBMC (p<.0001) and 9.44 nmol/10⁶ PBMC (p=.006) after 3, 6, and 12 months of MMF therapy, respectively. HPRT activity was also elevated (p=.003) 6 months after the onset of MMF. Conclusion: A significant decrease of IMPDH activity in PBMC of stable cardiac transplant recipients was demonstrated 3 and 6 months after the onset of MMF therapy, with partial restoration after 12 months. Unexpectedly, intracellular GTP levels were not affected during the observation period. We show for the first time that MMF therapy induces activation of the purine salvage pathway in PBMC, which accounts for the maintenance of intracellular purine nucleotide pools.
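The salvage-pathway activation above is easiest to see as fold-change over the pre-MMF baseline. A minimal arithmetic sketch using the reported GPRT means (no data beyond those means are assumed):

```python
# Fold-change of GPRT activity over the pre-MMF baseline, from the means above.
import numpy as np

baseline = 7.06                        # nmol/10^6 PBMC before MMF
gprt = np.array([12.03, 11.15, 9.44])  # 3, 6 and 12 months on MMF
fold_change = gprt / baseline
print(fold_change.round(2))            # ~[1.70, 1.58, 1.34]
```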
BK nephropathy has emerged in recent years as a significant cause of renal dysfunction in renal allograft recipients. It is currently the most common viral disease affecting renal allografts, with detectable viruria in 10 to 60% of recipients in the post-transplant period. Without treatment, progression to allograft failure can be seen in up to 45% of all patients. The pathogenesis of nephropathy in these patients, however, remains poorly defined. Renal failure in cardiac transplant patients is common and has been attributed to a variety of causes. The question of whether renal disease associated with BK virus plays a role in the renal dysfunction seen in cardiac transplant patients remains to be answered. There are limited data available on the incidence of BKV reactivation in the setting of non-renal solid organ transplantation. Given the correlation of BK infection with the potency of immunosuppression, and the fact that cardiac transplant patients are subjected to high levels of immunosuppression, one would expect a high level of reactivation in this setting. This is the first study to prospectively examine the prevalence of BK viral reactivation in the setting of cardiac transplantation. We performed a cross-sectional analysis of 111 cardiac transplant patients and found decoy cells in 28 patients (25%). Of these, we have thus far tested 16 for the presence of BK viral DNA in the blood and urine by PCR-based assay, and of these, 10 patients have evidence of BK viral DNA in the urine. None of these patients, however, has evidence of BK viremia. The mean serum creatinine of patients with and without BK viruria was 2.01 and 1.71, respectively (p=0.31). Age, gender, pre-transplant creatinine, cardiopulmonary bypass time, and ischemic time were not significantly different between the two groups. From these findings we conclude that there is evidence of BKV reactivation in the setting of cardiac transplantation, at a percentage similar to that seen in renal allograft recipients. However, it remains latent and does not appear to be a cause of renal dysfunction in these patients. Furthermore, none of these patients had evidence of viremia, which is noticeably different from findings in renal transplant recipients. Thus, even in the setting of established BK viruria, immunosuppression alone is insufficient to cause BK viremia and nephropathy; a second, organ-specific hit is necessary, such as kidney inflammation/ischemia or donor-recipient HLA mismatch.

Background: Acute and chronic nephrotoxicity are well-recognized complications of calcineurin inhibitor use following solid organ transplantation. While single-center studies have described the prevalence of renal dysfunction in pediatric liver transplant survivors, attempts to identify risk factors have been hampered by the limitations and biases associated with small populations and single centers. In an effort to validate previous observations, we examined multi-center data from a population representing pediatric liver transplant recipients cared for at centers throughout North America. Purpose: To determine the prevalence and identify the predictors of decreased glomerular filtration rate (defined as calculated GFR < 80 cc/min/1.73 m²) in children 3 years post liver transplantation (LT). Methods: We queried the SPLIT database to determine the prevalence of GFR < 80 in a population of > 200 pediatric patients at 3 years post-LT. The primary outcome was GFR calculated using the Schwartz formula (cGFR). Independent variables were primary diagnosis, age at LT, race, sex, type of insurance, organ type, PELD score, height Z-score at LT, length of hospitalization and primary immunosuppression. We performed univariate analysis to identify predictors of a cGFR < 80 at 3 years post-LT. The median age at LT was 1.8 years, with 37.3% of patients transplanted at < 1 year of age. 97.1% of patients received calcineurin inhibitors as primary immunosuppression (51.9% cyclosporine (CSA), 45.2% tacrolimus (TAC)). At 3 years post-LT, 7.9% of patients had a cGFR < 80. In univariate analysis, only CSA primary immunosuppression was associated with a cGFR < 80, with an odds ratio vs. TAC of 8.4 (CI 1.89, 37.3). This association remained significant after adjustment for transplant era (odds ratio 5.6; CI 1.25, 25.1). Previous studies have suggested an increased risk of post-LT renal dysfunction associated with CSA compared to TAC. However, this association is frequently confounded by time since transplant and transplant era, which correlate strongly with CSA use. Our data, which control for these confounding factors, support the concept of CSA immunosuppression as a risk factor for decreased GFR at 3 years post-LT. These observations are of critical importance as we develop interventions designed to prevent progression from asymptomatic decreased GFR to symptomatic end-stage renal disease.
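The primary outcome above is the Schwartz-formula GFR estimate, cGFR = k × height (cm) / serum creatinine (mg/dL). A minimal sketch; k = 0.55 is the original Schwartz constant for children (the abstract does not state which constant was used), and the cutoff of 80 follows the abstract's definition of decreased GFR:

```python
# Schwartz-formula cGFR, the primary outcome above. k = 0.55 is the original
# Schwartz constant for children; 80 mL/min/1.73 m^2 is the abstract's cutoff.
def schwartz_cgfr(height_cm, serum_creatinine_mg_dl, k=0.55):
    """Estimated GFR in mL/min/1.73 m^2 (original Schwartz formula)."""
    return k * height_cm / serum_creatinine_mg_dl

cgfr = schwartz_cgfr(height_cm=110, serum_creatinine_mg_dl=0.9)
print(round(cgfr, 1), cgfr < 80)  # 67.2, True -> counts as decreased GFR
```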
We reported the first two cases of laparoscopic harvesting of left lateral liver grafts (Lancet 2002; 359: 392-96). The aim of this study is to report our 2-year experience with this technique. Methods: From 2001 to 2003, 7 donors underwent laparoscopic left lateral sectionectomy for transplantation into their children. There were 4 mothers and 3 fathers aged 19-37 years (mean 28). The surgical technique included 5-port laparoscopy (2 × 12 mm, 1 × 10 mm and 2 × 5 mm) with CO2 pneumoperitoneum, dissection of the left portal pedicle and liver transection without clamping. Grafts were retrieved in a bag through an 8-cm supra-pubic incision. The recipients were 2 girls and 5 boys, with a mean age of 15 months (10-19) and a mean weight of 9.3 kg (7-11). The indication was biliary atresia with previous hepatoportoenterostomy in 6 cases and metabolic disease in 1. Donor results: Mean operative time was 4.5 hours (4-6) and mean warm ischemia time was 12 minutes (10-15). There was one intraoperative complication due to hemorrhage from the left portal branch, which was immediately sutured (case 4), but conversion to laparotomy was undertaken because of suspected stenosis of the left portal vein. There were no other conversions or intraoperative complications. There was one postoperative complication, consisting of a pelvic hematoma from the extraction incision, which required no treatment. No patient was transfused intra- or postoperatively. Mean hospital stay was 6.2 days (4-10). Recipient results: All grafts were transplanted and functioned immediately. There were two arterial thromboses. One was asymptomatic (routine Doppler US finding), while the other was associated with poor portal flow leading to the child's death. One child with cholangitis had a percutaneous biliary drain placed. All children but one are alive with a functioning graft.

ABO-incompatible liver transplantation (LT) is associated with lower patient and graft survival compared to ABO-identical or compatible grafts. We have examined the effect of ABO match on outcome after LT in children enrolled in the Studies of Pediatric Liver Transplantation (SPLIT), a consortium of 39 pediatric LT centers in the US and Canada. RESULTS: 46 ABO-incompatible grafts were transplanted into 45 of 1491 children. Of recipients of an incompatible graft, 78.3% were blood type O, 17.4% type B, and 4.3% type A. The blood type of incompatible donors was A in 71.7%, AB in 10.9% and B in 17.4%. Recipient age was <1 year in 38%. The most common diagnoses were fulminant liver failure (33.3%) and biliary atresia (24.4%). 13 children had a prior transplant. 55.6% received a whole-organ graft. 68.9% were in the ICU at LT; 20% were not hospitalized. The PELD score was ≥20 in 55.6%, but <10 in 11.1%. The number of ABO-incompatible LTs decreased by half from 1996 to 2002. Actual patient and graft survival was 60% and 51.1%, respectively. There was no significant difference in patient or graft survival for recipients of ABO-incompatible grafts for children <1 year, 1-5 years or >5 years of age, or for children with fulminant liver failure compared to those with biliary atresia. Of the 18 children who died, 8 died within 30 days of LT. Kaplan-Meier analyses of time to death and graft loss for recipients of ABO-incompatible compared to ABO-identical grafts showed a highly significant difference as early as 3 months after first LT: graft survival 63.8% vs 88.7%, respectively, p<0.0003. Table 1 shows the risk ratios for death and graft loss by blood type match from a Cox regression model (first transplant). 13% of ABO-incompatible graft recipients received mono/polyclonal antibody induction therapy and 28.9% had at least 1 episode of rejection. CONCLUSIONS: ABO-incompatible LT in children is associated with significantly lower patient and graft survival, implying this procedure should be performed only in highly urgent patients. Surprisingly, in this database only 25 of 45 ABO-incompatible recipients had a PELD score >20.

In marked contrast to data reported for adults, access of waitlisted children to kidney, liver, and heart transplantation is not influenced by race or gender. Access to pediatric kidney transplantation varies significantly by OPO, insurance and blood type. Access to pediatric liver and heart transplantation varies significantly by blood type and OPO, but not by insurance.
However, the OPO effect in pediatric liver and heart transplantation is less than that reported in adults. survived more than 5 years. Actuarial patient survival at 6 months/1 year/2 years was 48%/44%/32%, 61%/54%/47%, 91%/84%/74%, and 60%/47%/47% in Groups 1, 2, 3 and 4, respectively. Incidences of severe rejection were 36%, 23%, 11% and 0% in Groups 1, 2, 3 and 4, respectively. 3 George V. Mazariegos, 1 Jorge Reyes. 1 1 Department of Transplant Surgery, Children's Hospital of Pittsburgh, Starzl Transplantation Institute, Pittsburgh, PA; 2 Division of Gastroenterology and Nutrition, Children's National Medical Center, Washington, DC; 3 Department of Psychiatry, Western Psychiatric Institute and Clinic, University of Pittsburgh Medical Center, Pittsburgh, PA. With extended survival and improved function following pediatric intestine transplantation (ITx), evaluating neurodevelopmental outcomes is an integral dimension of post-transplant care. Methods: The Vineland Adaptive Behavior Scale, Child Behavior Checklist (CBCL), and Child Health Questionnaire (CHQ) were used to assess somatization, anxiety, behavior, functional impairments, physical development, social competence and daily living skills (DLS). Descriptive statistics, between-group comparisons and correlations were completed. Results: 35 patients were enrolled (9 ITx, 23 liver-ITx, 3 MVTx); M:F 18:17; mean age 9.4 years (range 1.1-22.7); mean time post-ITx 4.54 years (range 0.25-12.5 years). Weaknesses in all domains of the Vineland were seen for ITx patients compared to the norm (p<.001), with children transplanted at ≤ 4 years of age having greater weaknesses than those > 4 years. There was a significant correlation between age at time of transplant and DLS (r=.368; p=.04) and socialization (r=.442; p=.01). Borderline and/or clinical range scores occurred in all CBCL behavior subscales except anxious/depressed and thought processes. The greatest problems were reported in the domains of school (26.3%), somatic complaints (13.3%), competence (11.8% and 17.6%), internalizing (13.3%) and activities (11.5%). Withdrawn (r=.40; p=.04), internalizing (r=-.40; p=.04), and competence (r=-.53; p=.03) behaviors were significantly correlated with age at time of assessment. CHQ results revealed significant differences in the domains of Physical Functioning (p<.001), Social Limitations: Physical (p<.001), General Health Perception (p<.001), Pain (p=.04), Family Activities (p<.001) and Parental Impact: Emotional (p<.001). Improvements were seen over time at ≤ 3 years and > 3 years post-ITx. Conclusions: Following ITx, children have significant weaknesses in several neurodevelopmental and psychosocial domains and have greater limitations in physical and psychosocial functioning when compared to the norm. At greater risk are children undergoing ITx at a very young age. However, improvements are seen over time in physical functioning, family activities, parental impact, and mental health, with family relationships appearing to be stronger. Early identification of deficits and supporting the child's strengths may lead to an improved QOL following ITx. 1 1 Surgery, University of Wisconsin, Madison, WI; 2 Immunohematology and Blood Transfusion, Leiden University Medical Center, Leiden, Netherlands. Background: The generation of CD8+ T memory cells cross-reactive with alloantigens has been proposed as a major obstacle to clinical allo-tolerance after transplantation.
When CD8+ cells appear in the graft, it is most often in the guise of cytotoxic T effector cells, in association with rejection. This may not always be the case. Using differential HLA-tetramer staining, we have identified a CD8+ memory T cell population with regulatory properties (CD8 TR) in a patient with over 30 years of tolerance. CD8+ memory T effector (TE) cells specific for the same minor H antigen (HA-1), but with much higher apparent avidity for cognate MHC:peptide, were also present at a two-fold lower frequency, and could be suppressed by the TR cells. Because the HA-1 minor antigen is normally leukocyte restricted, we hypothesized that HA-1 microchimerism, in particular in the DC subsets, may influence the type of regulation that develops (TGFβ or IL-10). Methods: We examined three renal transplant patients who have maintained stable graft function. These patients were closely matched for HLA but mismatched for the HA-1 minor antigen, and were classified as regulators by TGF-β or IL-10 cytokine neutralization using the trans-vivo DTH assay. T cells, B cells, monocytes, and dendritic cells (CD11c+, CD123+) were separated by flow sorting and examined for microchimerism by nested PCR for HA-1. HA-1 microchimerism existed almost exclusively in the dendritic cell population. In addition, a group of six regulators and six non-regulators, as determined by the DTH assay, were analyzed for percentages of myeloid (CD11c+) and plasmacytoid (CD123+) dendritic cells by four-color flow cytometry, gating on CD3-, CD14-, CD19-, HLA-DR+ cells. Results: Individuals with a higher percentage of plasmacytoid dendritic cells were more likely to regulate by the production of IL-10, as opposed to TGF-β. There was no difference between TGF-β regulators and non-regulators in the proportion of the DC subsets. Conclusion: The presence of HA-1 microchimerism in the dendritic cell subsets influences the phenotype of the low-avidity CD8+ T regulatory cell population seen in these patients; in particular, a shift towards the plasmacytoid dendritic cell population (CD123+) may predispose the patient toward IL-10 mediated regulation. Regulatory cells (RegC) play an important role in non-deletional tolerance, but their phenotype remains controversial, as does the compartment (graft versus lymphoid tissues) where they mature, expand and operate. To study this, we developed a RegC-based model of tolerance after heart Tx in a fully mismatched RA-to-PVG rat combination via donor-specific blood transfusion (DSBT). Adoptive transfer showed presence and expansion of RegC in spleen and lymph nodes, a phenomenon dependent upon the presence of the thymus, the graft and DSBT. By using selective adoptive transfer (MACS system), we demonstrated that RegC are exclusively CD4+/CD45RC- (similar to RegC involved in control of autoimmune diseases in rats), whereas CD4+/CD45RC+ cells acted as effector cells, capable of accelerating rejection. CD4+/CD25+ and CD4+/CD25- subsets evoked identical function. In tolerized grafts, we found rapid (day 5), progressive and sustained (days 5 to 14) infiltration by these regulatory CD4+/CD45RC- cells and a lower infiltration by effector CD4+/CD45RC+ cells, whereas rejecting grafts displayed exactly the reverse profile (progressive predominance of CD4+/CD45RC+ cells over CD4+/CD45RC- cells). This differential profile between rejecting and tolerant rats was not as clearly seen in the spleen, suggesting that regulatory activity is concentrated in the graft.
Finally, exposure of DSBT-treated rats to high-dose CsA (50 mg/kg) blocked the generation of RegC (assessed by adoptive transfer), caused rejection and transformed the graft infiltrate profile from a tolerant one (CD4+/CD45RC- predominant) into a rejecting one (CD4+/CD45RC+ predominant), suggesting that RegC exert their protective effect in the graft. The intragraft presence of RegC was then unequivocally proven by the observation that 1x10⁷ graft-infiltrating cells (at days 14 and 30) or reTx of a tolerized graft (at day 5) could tolerize donor-specific grafts in naive secondary recipients (n=6 in each experiment; p<.01). Conclusions: Altogether, these data demonstrate that CD4+/CD45RC- RegC are present in tolerated grafts, where they protect transplanted tissues from effector CD4+/CD45RC+ cells via a direct intragraft mechanism, the nature of which is under investigation. Fabienne Haspot, Celine Seveno, Flora Coulon, Marcello Hill, Karine Renaudin, Claire Usual, Jean Paul Soulillou, Bernard Vanhove. U437, ITERT INSERM, Nantes, France. B7 (CD80, CD86) interaction with CD28 is essential for optimal activation of naive T cells. On the other hand, B7 can interact with CTLA-4, which inhibits T cell activation and proliferation. In addition, it was recently shown that CTLA4-Ig stimulates B7 on APC, resulting in the production of indoleamine dioxygenase (IDO), an enzyme that catabolizes tryptophan, which in turn inhibits T cell proliferation. The activation of IDO has been associated with tolerance induction in rodents. Therefore, selectively inhibiting the B7/CD28 pathway without blocking that of B7/CTLA-4 may be strongly immunosuppressive and facilitate tolerance induction. In this study, we monitored the immune responses in a model of acute kidney graft rejection (LEW.1W (RT1Au) to fully mismatched LEW.1A (RT1Aa)) after selective inhibition of the CD28/B7 interaction using the modulating anti-rat monoclonal antibody JJ319. This antibody was previously shown to prevent rejection in the F344-to-LEW model of chronic rat kidney graft rejection. A short-term treatment with 8 doses of 4 mg/kg of JJ319 (days 0 to 7) resulted in 55% of grafts surviving long term (>150 days). Treated animals had an increased alloantibody response skewed towards a Th2 type (IgG1 and IgG2a isotypes) and specifically directed against donor MHC II molecules. This was in contrast with the Th1-type antibody response (IgG2b) directed against MHC I and II molecules found in rejecting untreated recipients. Three to four months after transplantation, kidney graft function was normal and stable (urea: 10 mmol/l, creatinine: 39 µmol/l) and no signs of chronic rejection could be evidenced according to the Banff classification. In these functionally tolerant animals, PBMC and spleen cells were unable to proliferate against donor cells in mixed lymphocyte reactions but could proliferate against third-party cells. Blockade of IDO (using 1-methyl-D-tryptophan) and of NO generation (using N-methyl-L-arginine) fully restored anti-donor reactivity. T cells purified from the same PBMC and splenocytes were fully reactive. Moreover, depletion of OX42+ (CD11b/c) cells did not restore proliferation. In conclusion, the selective blockade of CD28 generates regulatory mechanisms that do not involve classical regulatory T cells and also induces an anti-class II antibody response of the Th2 type. These regulatory mechanisms are associated with normal kidney graft function in the long term, without histological signs of chronic rejection.
Although CD4+ T regulatory cells (Treg) are critically involved in the induction and maintenance of transplantation tolerance, direct evidence of their physiological role in primary recipients (without concomitant exogenous adoptive cell transfers) is still lacking. In this study, we analyzed the immunological mechanisms of allograft prolongation induced by a single dose of anti-CD154 mAb in fully MHC-mismatched recipients (Balb/c to C57BL/6 cardiac model), and focused on the interaction between CD4+ Treg and alloreactive CD8+ T cells during the transplant maintenance phase (>50 days). Donor-specific immune tolerance was established in long-term cardiac allograft recipients, as evidenced by the acceptance of donor-type (MST>30 days; n=6) but rejection of third-party (C3H, MST±SD = 12 ± 4 days; n=3) test skin grafts. In agreement with the allograft survival data, in vivo alloreactive CD8+ T cell activation (% CD44high CD62Llow of the total CD8+ T cell population) induced by donor-type skin was inhibited (3.2%). However, third-party skin grafts readily induced CD8+ T cell activation (3.4%→12.3%). To determine whether anti-donor CD8+ T cells were deleted or remained under dominant CD4+ Treg regulation, a depleting anti-CD4 mAb was administered prior to skin graft challenge. This resulted in prompt rejection not only of donor-type test skin, but also of the original cardiac grafts. Additionally, alloreactive CD8+ T cell activation also recovered (4%→25%). To further dissect the mechanism of CD4 Treg-CD8 interaction, we used a depleting anti-CD25 mAb or a blocking anti-CTLA-4 mAb protocol. Depletion of CD4+CD25+ cells resulted in rejection of secondary skin grafts in ca. 50% of long-term tolerant hosts (MST±SD=10±4 days; n=4). Alloreactive CD8 activation was restored (1.4%→13.2%) in rejecting, but not in non-rejecting, tolerant hosts, despite CD4+CD25+ depletion. Blocking of CTLA-4 uniformly resulted in the rejection of test skin grafts (MST±SD=9±3 days, n=4) and subsequent rejection of the original cardiac grafts, in parallel with activation of alloreactive CD8+ T cells (6.3%→17.7%). In conclusion: 1) CD154 blockade did not delete donor-alloreactive CD8+ T cells in tolerant recipients; 2) CD4+ Treg prevented activation of CD8+ T cells following secondary allogeneic test skin grafting; 3) this active regulation of alloreactive CD8+ T cells by CD4+ Treg was donor-specific; 4) it was mediated by CTLA-4 signaling; 5) CD25+CD4+ Treg were only partially responsible for the dominant regulation. 1 1 Pathology and Laboratory Medicine, Children's Hospital of Philadelphia and University of Pennsylvania, Philadelphia, PA; 2 TolerRx, Inc., Cambridge, MA. Genes such as CD25, CTLA-4, and GITR can be expressed by both regulatory T cells (Treg) and other cells, and expression of these genes even by Treg can vary with activation. By contrast, expression of the Foxp3 transcription factor is restricted to CD4+CD25+ Treg, appears stable irrespective of T cell activation, and is necessary for the maintenance and function of Treg cells. We report in vivo data linking selective Foxp3 expression post-transplant (post-Tx) with specific costimulation blockade protocols that lead to long-term allograft survival and tolerance induction. By real-time PCR, levels of Foxp3 are several hundred-fold higher in normal spleen and thymus than in other tissues.
Analysis of serially harvested cardiac allografts (BALB/c->B6) shows that splenic Foxp3 expression decreases in a stepwise manner with developing rejection and rises within allografts; levels at day 5 post-Tx within cardiac allografts are 160-fold higher than in isografts and, remarkably, 4-fold higher than splenic levels, indicating migration of Treg cells to rejecting allografts. However, Foxp3 expression in day 7 allografts harvested from recipients treated with anti-CD154 mAb plus DST was enhanced >15-fold compared to controls or combined anti-CD28/anti-ICOS mAb targeting. Since both costimulation blockade protocols induce permanent engraftment, comparisons at later intervals were possible; despite similar levels of intragraft T cells, only CD154/DST was linked with high Foxp3 expression, and only that protocol induced actual donor-specific tolerance. Immunohistologic studies localized Foxp3 expression to infiltrating mononuclear cells post-CD154/DST. CD154/DST therapy was not accompanied by increased TGF-β, IL-10 or indoleamine 2,3-dioxygenase (IDO) expression as compared to levels in control cardiac allografts. Studies of islet allografts (BALB/c->B6) also showed levels of Foxp3 expression that were 20-fold higher in conjunction with CD154/DST vs. CTLA4-Ig or combined anti-ICOS/CTLA4-Ig, and again showed no correlation with intragraft expression of TGF-β, IL-10 or IDO, despite similar T cell infiltration. We conclude that analysis of Foxp3 expression, even at early intervals post-Tx, reveals key differences between the various protocols that induce long-term engraftment, as well as with rejecting allografts. Ongoing studies are directed towards assessment of (i) whether Foxp3 expression is required for tolerance induction, and (ii) Foxp3 expression in well-functioning vs. rejecting clinical transplants. In a rat model disparate for one class I antigen, RT1.Aa (PVG.R8 to PVG.1U), we previously showed that intrathymic modulation with donor class I allopeptides or splenocytes resulted in prolonged survival of cardiac allografts associated with chronic rejection. Prolongation correlated with the development of regulatory cells in the primary recipients that were able to prevent both acute and chronic rejection following adoptive transfer into secondary recipients. The goal of this study was to characterize these cells, with particular emphasis on CD4+CD25+ T cells. Tolerant secondary graft recipients had substantially higher percentages of CD4+CD25+ T cells in the spleen (23 ± 3%) and blood (28 ± 6%) as compared to naïve rats (11 ± 3% and 9 ± 6%). RT-PCR experiments showed high expression of Foxp3 in accepted heart grafts compared to no expression in acutely rejected control grafts. CD4+CD25+ T cells inhibited donor-specific proliferation responses in vitro. Importantly, depletion of these cells from splenocytes of long-term secondary graft survivors abrogated their ability to transfer tolerance to tertiary graft recipients. Furthermore, the tolerogenic effect, both in vitro and in vivo, was found to be associated with high IL-10 production. These data demonstrate that cardiac allograft tolerance, established through intrathymic immune modulation, is mediated by CD4+CD25+ Treg that express high levels of Foxp3 and are induced by indirect allorecognition. Jose Torrealba, 1 William J. Burlingham, 1 John H. Fechner, 1 Ewa Jankowska-Gan, 1 Krista Haanstra, 2 Jacqueline Wubben, 2 Margreet Jonker, 2 Stuart J. Knechtle. 1
1 Surgery, University of Wisconsin, Madison, WI; 2 Immunotherapy, Biomedical Primate Research Center, Rijswijk, Netherlands. Tolerance to kidney allografts in the Rhesus monkey after anti-CD3 immunotoxin (IT) induction therapy fits the criteria of "metastable tolerance". TGFβ(latent)+ infiltrates appear in the kidney allograft between 3 months and 1 year and correlate with resolution of low-grade acute rejection (ARej), and with TGFβ-dependent, regulated anti-donor DTH in the periphery. Methods: Monkeys either fully MHC mismatched (n=6) or matched for at least one Mamu-DR (n=5) with their kidney donor were induced with anti-CD3 IT, cyclosporine or anti-CD154 mAb. We investigated by 1- and 2-color IP the subtypes of TGFβ(latent)+ "regulatory" as well as "effector" cells in kidney biopsies. Using a trans-vivo DTH assay, we analyzed the function of effector and regulator graft-infiltrating cells (GIC) harvested from 2 ARej grafts using collagenase. Results: Graft survival of >1000 days was observed in 6/11 monkeys; 5/6 of those surviving >1000 days were matched for at least 1 Mamu-DR antigen, including 1 that has retained its allograft since 1982 (>20 years!). As shown in Table 1, control kidneys were devoid of TGFβ+ cells (<0.1 cells/10 tubules); low numbers were seen in ARej, and a 10-fold higher number in grafts without rejection. Of the TGFβ+ GIC, 50% co-stained for CD4 (Table 1), and these were roughly equal in number to the total CD3+ TGFβ+ GIC (not shown). DTH analysis indicated that even during loss of tolerance and onset of rejection, GIC effectors could not mediate DTH unless TGFβ1 was neutralized; i.e., the small numbers of TGFβ+ GIC in the peritubular areas (mean=0.85) may retard rejection. Conclusion: Metastable tolerance in the Rhesus monkey kidney allograft model is balanced between TGFβ(latent)+ regulatory GIC, both CD4+ T and non-T, and effector CD4+, CD8+ and CD68+ cells. Surprisingly, TGFβ(latent)+ GIC may persist for >20 years without causing chronic rejection. Shuiping Jiang, 1 Dela Golshayan, 1 David S. Game, 1 Robert I. Lechler. 1 1 Department of Immunology, Faculty of Medicine, Imperial College London, Hammersmith Hospital, London, United Kingdom. Naturally occurring autoreactive CD4+CD25+ regulatory T cells play a key role in the prevention of autoimmunity and appear to mediate transplantation tolerance. Although CD4+CD25+ cells are selected on MHC class II-self peptide complexes, data from several transplantation tolerance models indicate that these cells may have indirect allospecificity for donor antigens. CD4+CD25+ cells with specificity for a defined peptide antigen have not been described. Methods: In order to establish HLA-A2 (103-120) peptide-specific CD4+CD25+ regulatory T cell lines, purified peripheral blood CD4+CD25+ cells from HLA-DR1+ A2- individuals were primed with autologous dendritic cells pulsed with the A2 peptide. Results: The cell lines were potent inhibitors of proliferation and IL-2 secretion by CD4+CD25- T cell lines specific for the same peptide. The antigen specificity for the A2 peptide was demonstrated in suppression assays and flow cytometry analysis using a fluorescent tetramer composed of DR1:A2 (103-120) peptide complexes. About 9% of cells in the CD4+CD25+ lines were CD4+ tetramer+, while 68% of the CD4+CD25- cells were CD4+ tetramer+, demonstrating that human CD4+CD25+ regulatory T cells with indirect allospecificity for a defined allopeptide can be selected in vitro.
To extend these studies in vivo, we established similar lines from CBA mouse CD4+CD25+ cells specific for a class I Kb peptide. The indirect allospecificity of the murine lines was tested in the setting of skin transplantation. The CD4+CD25+ cells prolonged survival of CBK (CBA transgenic for Kb) skin grafts in CBA mice, but not of third-party BALB/c skin grafts, without the use of any other immunosuppression, suggesting that the CD4+CD25+ cells have specificity for the Kb peptide. Taken together, these data suggest that self-reactive CD4+CD25+ regulatory cells can be hijacked into allopeptide-specific cells in vitro, and that these cells are able to limit alloresponses in vivo, thus paving the way for using CD4+CD25+ regulatory T cells as cell therapy to promote clinical transplantation tolerance. Brigham and Women's Hospital, Harvard Medical School, Boston. We and others have shown that previously unsensitized B6 mice (H-2b) fail to reject MHC class II-mismatched bm12 (H-2bm12) cardiac allografts, although they do reject bm12 skin allografts. Spontaneous cardiac graft acceptance is not a result of low alloreactive precursor frequency, as TCR transgenic mice directly reactive to I-Abm12 (termed anti-bm12, ABM mice), in which over 95% of T cells proliferate in response to bm12 cells in vitro or in vivo, also do not reject bm12 cardiac allografts. As CD4+CD25+ regulatory T cells have been shown to play a role in maintaining self-tolerance and regulating graft rejection, we asked whether such cells can mediate spontaneous allograft acceptance. To determine if regulatory T cells played a role in spontaneous allograft acceptance, B6 mice were thymectomized and depleted of CD25+ cells in vivo using the anti-CD25 mAb PC61. Mice treated in this fashion rapidly rejected bm12 cardiac allografts, whereas mice thymectomized without CD25 depletion accepted bm12 cardiac allografts, as did control untreated mice. Similarly, treatment of B6 recipients with blocking anti-CTLA4 mAb induced bm12 cardiac allograft rejection. We next studied B6 or ABM mice that were long-term (>100 days) bm12 cardiac allograft recipients. We found that T cells from these mice exhibit reduced proliferative activity (by CFSE dye dilution) and IL-2 secretion against bm12 stimulators compared with naive B6 or ABM responders. Interestingly, the in vitro T cell hyporesponsiveness of long-term cardiac allograft recipients was completely abrogated by depletion of CD25+ T cells from the responding population. Depletion of CD25+ T cells from naive mice had no effect on the anti-bm12 response, suggesting that successful long-term engraftment increased the number and/or potency of CD4+CD25+ regulatory T cells. Strikingly, and consistent with these in vitro findings, B6 mice with spontaneous long-term engraftment of bm12 hearts did not reject bm12 skin transplants, while naive B6 mice rapidly reject bm12 skin (∼12-18 days). To our knowledge, these are the first data showing that pre-existing regulatory T cells can be sufficient to promote spontaneous vascularized graft acceptance. Our results also show that during the process of graft acceptance, regulatory T cells expand in number and/or potency, and this is accompanied by the de novo acquisition of skin allograft acceptance. These observations have therapeutic implications for using autologous regulatory T cells to promote transplant tolerance. Deepali Kumar, 1 Michael Drebot, 2 Susan Wong, 3 Gillian Lim, 4 Harvey Artsob, 2 Peter Buck, 4 Victoria Edge, 4 Atul Humar. 1
1 Infectious Disease and Transplantation, University of Toronto, Toronto, ON, Canada; 2 National Microbiology Laboratory, Health Canada, Winnipeg, MB, Canada; 3 New York State Dept. of Health, New York, NY; 4 Centre for Infectious Disease Prevention and Control, Health Canada, Ottawa, ON, Canada. Background: West Nile virus causes severe neurological disease in approximately 1 in 150 cases. However, severe disease may be more common in immunosuppressed transplant patients. In the summer of 2002, a West Nile outbreak occurred in the Toronto, Canada area, which has a large multi-organ transplant program. To determine the spectrum of disease and clinical impact of community-acquired West Nile infection, and to assess public health behavior patterns in transplant recipients, we carried out a seroprevalence study in this patient population. Methods: Patients were enrolled primarily from outpatient transplant clinics and the transplant inpatient ward. Patients who had not left hospital since transplant were excluded. Sera were initially screened for antibodies to West Nile virus by the hemagglutination inhibition (HI) test. HI-reactive sera were then tested using a West Nile virus IgM ELISA and a plaque reduction neutralization test. A questionnaire was provided to patients to assess knowledge and behavior patterns with regard to West Nile virus. Results: 855 organ transplant patients were enrolled. Types of transplant included kidney (n=419), liver (n=204), lung (n=94), heart (n=83), pancreas (n=46) and others (n=9). Median time from transplant was 50.6 months (range 2 weeks - 410 months). The seroprevalence of IgM antibody to West Nile was 6/855 (0.7%). One patient had IgG antibody alone and likely had remote infection. All 6 patients (100%) had symptomatic disease, and 5/6 (83%) had meningitis, encephalitis, and/or acute flaccid paralysis. Patients' knowledge concerning the risk of West Nile virus infection was incomplete, and behavior patterns reflected a poor rate of compliance. Only 56% knew of at least one protective measure and only 44% had acted on at least one protective measure. Only 33% of patients used insect repellent when outdoors. Conclusions: Community-acquired West Nile virus is an important threat to transplant patients. The rate of severe neurological disease is much higher than reported in the general population. Education regarding personal protection measures is critical in these patients. However, despite high public awareness of West Nile virus and specific educational attempts by the transplant program, incomplete knowledge and poor rates of compliance were reported. Among the 9 patients with solid organ transplants infected with West Nile virus reported thus far in the literature (see Table), 8 developed encephalitis and 2 died. This is alarming compared to the less than 1% incidence of serious neurological manifestations and 2% mortality in the general population. Over the past year, we treated 3 solid organ recipients who developed West Nile fever on long-term follow-up (see Table): a 44-year-old male (kidney-pancreas), a 37-year-old female (pancreas) and a 2½-year-old (living donor kidney). Their immunosuppression is detailed below. Two of them developed serious meningoencephalitis requiring ventilator support for up to a week. The management strategy included prompt reduction of immunosuppression and supportive care. The fever subsided over a week in all patients, but neurological recovery was slower.
The child recovered to near normalcy in two weeks and was discharged home. In contrast, the adult male who developed encephalitis has significant neurological sequelae and is undergoing rehabilitation. Immunosuppression was restored upon recovery in all. Transplant physicians, particularly in West Nile virus endemic areas, must be alert to the possibility of West Nile fever in their patients developing fever in the summer months. Prompt reduction of immunosuppression is probably the most vital step in the successful management of these patients. During this outbreak, we encountered both meningoencephalitis (MENC) and acute flaccid paralysis (AFP) due to community-acquired WNV disease in our otherwise stable population of transplant recipients. Design/Methods: 10 transplant recipients (4 renal, 1 renal/pancreas, 2 liver, 1 lung, 1 bone marrow) were hospitalized with severe WNV infection documented by WNV IgM in CSF or serum. Clinical features and diagnostic and laboratory studies, including CSF examination, neuroimaging, EEG, EMG/NCV, and neuropathology, were characterized. Results: The estimated incidence of MENC+AFP due to WNV infection in Colorado was 0.014% for the general population of approximately 4 million, and 0.25% for the approximately 4000 transplant recipients in the state. All patients acquired infection in the community, none had a history of recent (<3 months) transfusions, and all had received their transplants 8 months to 15 years prior to WNV infection and were on maintenance immunosuppression. 9 of 10 patients had MENC and 3 had associated AFP with quadriparesis. One case had AFP alone. All AFP cases required at least transient mechanical ventilation. CSF examination demonstrated pleocytosis (10/10, counts 5-5400), elevated protein (10/10), and lymphocytic predominance (9/10). Brain MRIs demonstrated abnormalities of white matter (7/8) and of the thalamus, basal ganglia and brainstem (3/8). EEGs were abnormal (7/7) and showed generalized slowing (7/7), triphasic slow waves (2/7), and PLEDs (2/7). 3 patients had seizures. Neuropathology of our fatal case demonstrated multifocal necrosis in the thalami, substantia nigra, pons, cerebellum and anterior spinal cord. Management strategies included reduction in immunosuppression and use of interferon (6), standard IV Ig (4), WNV IV Ig (2), and ribavirin (1).
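The incidence contrast above follows directly from the case counts; as a check (the ~560 general-population cases are back-calculated from the stated 0.014%, not reported directly in the abstract):

\[
\frac{10}{4000} = 0.25\% \qquad \text{vs.} \qquad \frac{\approx 560}{4\,000\,000} \approx 0.014\%,
\]

so transplant recipients carried roughly an 18-fold higher estimated incidence of MENC+AFP (0.25/0.014 ≈ 18).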
Background: SARS is an emerging infection caused by a novel coronavirus (CoV). During the worldwide SARS epidemic, two transplant patients developed infection at our center. SARS posed several problems unique to transplantation. First, transplant patients with SARS may have more severe disease with greater infectivity. Second, SARS could theoretically be transmitted from a donor to a recipient. To prevent the latter, a clinical SARS donor screening tool was implemented. We report a case of SARS and provide our experience with SARS screening. Methods: SARS CoV viral load testing was performed using RT-PCR on tissue obtained at post-mortem and compared to a cohort of 21 non-transplant SARS patients. The donor SARS screening tool was prospectively utilized and assessed during the outbreak. Results: A 57-year-old man with emphysema received a double lung transplant. Postoperatively he had a stroke and was in a rehabilitation facility. At 4 months post-transplant he developed fever, myalgias, diarrhea, and progressive dyspnea requiring intubation. Despite aggressive treatment, he died two weeks after onset of disease. BAL and stool samples were positive for SARS-CoV. The illness was transmitted to several others, including his wife and three healthcare workers. The CoV viral load was higher in all tissues compared with post-mortem tissues from a non-transplant cohort with SARS (Table 1). Specifically, the lung viral load was 10,000-fold higher in the transplant patient than the cohort mean. SARS resulted in the temporary closure of transplant programs until a clinical screening tool for donors was implemented. During the SARS outbreak, 22 cadaveric donors were screened for SARS with this tool. Two donors were determined to be high risk and were refused. No evidence of SARS transmission was seen with use of the remaining donors.
Table 1. SARS CoV viral load from the transplant patient vs. mean tissue viral load from the non-transplant cohort (n=21) with SARS (copies/g):
Lung: transplant patient 8.8 x 10⁹; cohort 3.6 x 10⁵
Lymph node: transplant patient 8.9 x 10⁸; cohort 7.
Background: In patients with CMV disease, a high rate of viral co-infection with other herpesviruses, specifically HHV-6 and HHV-7, has been reported. The effect of herpesvirus co-infections on the rate of response of CMV disease to therapy is largely unknown. We prospectively analyzed herpesvirus co-infections in a cohort of organ transplant recipients with CMV disease and assessed their effect on clinical and virologic outcomes. Methods: In solid organ transplant patients with CMV disease, about to start ganciclovir therapy, samples were collected at baseline (disease onset) and then at a minimum of 1-week intervals. Viral load testing for CMV, HHV-6 and HHV-7 was done using real-time PCR assays. Qualitative PCR for EBV was also done. Results: 50 transplant patients with CMV disease were analyzed. 36/50 (72%) patients had received previous prophylaxis, and therefore the onset of CMV disease was quite late post-transplant (median 152 days). Herpesvirus co-infection was detected in only 5/50 (10%) of patients and included HHV-6 infection in 2 patients (4%) and EBV infection in 3 patients (6%). HHV-7 co-infection was not detected in any patient. CMV peak viral load and viral load at onset of disease were similar in patients with and without other herpesvirus co-infections (p=NS). Time to clearance of CMV viremia (after starting ganciclovir) was 20.5 ± 12.4 days in patients with viral co-infection vs. 17.6 ± 8.6 days in patients without (p=NS). In the two patients with HHV-6 co-infection, the HHV-6 viral copy number was 100-1000-fold higher than the CMV copy number and was unaffected by ganciclovir therapy. Conclusion: Herpesvirus co-infections were uncommon in patients with CMV disease. Specifically, a low rate of HHV-6 co-infection (4%) and no HHV-7 co-infections were seen. This is in contrast to previous reports and may be due to the late onset of CMV disease in a heavily prophylaxed population. Co-infection did not affect CMV viral clearance rates or clinical response to therapy. In patients with HHV-6 co-infection, the HHV-6 viral load was unaffected by ganciclovir therapy. Purpose: To determine whether transplant surgeons believe patients infected with HBV, HCV, or HIV should be candidates for transplantation, and to clarify the factors that influence these views. Methods: We mailed a 3-page questionnaire to all U.S. transplant surgeons included in the American Society of Transplant Surgeons' (ASTS) mailing list, along with a cover letter explaining the purpose of the study and a $10 incentive. We sent a second questionnaire to all surgeons not responding within 5 weeks.
We cross-matched the ASTS list with the American Medical Association's Master File to obtain data on surgeons' practice-related and demographic characteristics, and compared characteristics of responders and nonresponders to assess the potential for nonresponse bias. Results: Of 619 eligible transplant surgeons, 347 (56%) provided complete responses. Overall, 69%, 71%, 36%, and 6% of surgeons believed that HBV+ patients, HCV+ patients, HIV+ patients, and patients with AIDS, respectively, should be candidates for transplantation. For each patient population, surgeons' perceived post-transplant survival among infected patients was a strong predictor of their willingness to allocate organs to these patients (all p < 0.001). In a multivariable logistic regression model, older surgeons (odds ratio (OR) = 1.9; p = 0.02), thoracic surgeons (OR = 20.0, p < 0.001), and surgeons with greater fears of becoming infected with HIV intraoperatively (OR = 1.9, p = 0.02) were less likely to consider HIV+ patients as transplant candidates. Most surgeons (56%) erroneously believed that the intraoperative patient-to-surgeon transmission risk of HIV was greater than that for HBV, HCV, or both. Nonresponse bias is unlikely to have influenced these results, because the only difference between responders and nonresponders was that surgeons board-certified in urology were less likely to respond than surgeons with other certifications (p < 0.0001). Conclusion: The majority of surgeons do not believe that HIV-infected patients should be candidates for transplantation. Surgeons' expressed willingness to allocate scarce organs to such patients is most strongly associated with their estimates of post-transplant survival, but is also associated with practice type and fear of patient-to-surgeon transmission. The availability of more clinical data may ultimately influence surgical opinion.
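The odds ratios quoted above come from exponentiating the coefficients of a multivariable logistic model; the sketch below shows that computation on a hypothetical, randomly generated dataset (predictor names mirror the abstract, but none of these values are study data, and this is not the authors' analysis code):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 347  # number of complete responses, as in the abstract

# Illustrative binary predictors (1 = yes), randomly generated:
predictors = np.column_stack([
    rng.integers(0, 2, n),  # older surgeon
    rng.integers(0, 2, n),  # thoracic surgeon
    rng.integers(0, 2, n),  # high fear of intraoperative HIV infection
])
# Hypothetical outcome: 1 = would not consider HIV+ transplant candidates.
unwilling = rng.integers(0, 2, n)

fit = sm.Logit(unwilling, sm.add_constant(predictors)).fit(disp=0)
# Exponentiated coefficients are the odds ratios reported in such models.
print(np.exp(fit.params[1:]))
```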
BACKGROUND: Solid organ transplantation is frequently complicated by viral infections. Cidofovir is a broad-spectrum antiviral agent with activity against CMV, HSV, VZV, EBV and HHV-6, -7 and -8, but also against adeno-, pox-, BK and papilloma viruses. It is also active against UL97 (phosphotransferase) mutants of CMV showing ganciclovir (GCV) resistance. AIM: To retrospectively review our experience with the use of systemic and topical cidofovir in transplant recipients. PATIENTS AND METHODS: Between 1.1.2000 and 30.11.2003, more than 700 solid organ transplants were performed at our centre. Standard immunosuppression consisted of calcineurin inhibitor-based triple drug therapy with or without ATG or IL-2 receptor antagonist induction. A total of 6 patients (1% of patients transplanted during the study period) received systemic cidofovir, including 1 kidney, 1 pancreas, 1 lung and 1 small bowel recipient and both limb recipients. In addition, three transplant recipients were treated with topical cidofovir for papillomavirus-associated skin lesions. RESULTS: In all three cases, regression of skin lesions was achieved. Cidofovir was given as prophylaxis in one bilateral hand transplant recipient after development of GCV-associated neutropenia. The remaining 5 patients received cidofovir for CMV infection or disease. Four patients had developed breakthrough CMV disease during GCV prophylaxis; in two cases a UL97 mutant was isolated, and in two patients we observed clinically GCV-resistant CMV disease. In the remaining patient the indication was GCV-associated neutropenia. The patient who received cidofovir prophylaxis developed CMV infection after withdrawal, but responded to a second antiviral course. All cases of CMV disease responded to therapy. A relapse was observed only in the lung recipient, and one kidney recipient who received only one therapy cycle developed repopulation with a wild-type CMV strain and required ValGCV therapy. No severe side effects were observed in this cohort, in particular no sustained renal impairment. CONCLUSIONS: Cidofovir was found to be highly active in the treatment of papillomavirus-associated skin lesions and GCV-resistant CMV disease in solid organ recipients and was well tolerated. Infectious diseases and cancer development are major complications in immunosuppressed transplant patients. The diminished cellular immunity is thought to be causative for EBV-associated lymphomas, skin cancers and severe HCMV infections. Restoring immunity by adoptive lymphocyte therapy has been shown to be effective in tumour and infectious diseases. In SOT, the infused lymphocytes are of recipient origin and generated to recognize relevant proteins presented by self MHC molecules. However, current generation procedures have major limitations. In this report we present a novel time- and cost-effective generation procedure for HLA-type-independent production of specific T cells. It is based on short-time stimulation with pools of 15-AA overlapping peptides, selection of activated, IFN-γ-secreting cells and non-specific expansion (A). We applied the protocol to generate T cells specific for the HCMV proteins pp65 and IE-1 and compared it with current procedures. Generation of pp65- and IE-1-specific T cells from the same volunteers was successful in 7/8 experiments. Cell lines consisted of CD8+ and CD4+ cells with multiple pp65/IE-1 epitope specificities that showed lysis of autologous pp65+ or IE-1+ targets, with no killing of pp65-/IE-1- auto/allo targets (B). The presented procedure allows the generation of tumour- and pathogen-specific T cells, even with unknown epitopes, for a broad patient population. We evaluated our laparoscopic donor-recipient database to study the factors influencing the outcome of allografts from laparoscopically procured kidneys. Records of all laparoscopic donor-recipient data were retrieved from the transplant database. Short-term allograft function was determined based on serum creatinine at days 1-5, 10, 20 and 30, serum creatinine >2.5 at day 10 and serum creatinine >1.5 at day 30; delayed graft function (DGF) and long-term outcome were also assessed. Donor factors analyzed were age, sex, BMI, side (left versus right), warm ischemia time (WIT), cold ischemia time, use of heparin, transperitoneal versus retroperitoneal approach, and multiple versus single artery. A statistical model was built and logistic regression was used to determine the factors influencing the outcome. Univariate and multivariate analyses were performed to determine the factors influencing the functional outcome. There were 207 patients in the study, with a mean donor age of 42.5 ± 9.2 years, mean BMI of 26.7 ± 4.0 kg/m², and mean WIT of 249.3 seconds (range 120-540 seconds). Mean recipient age was 44.4 ± 13 years and BMI was 26.2 ± 4.7 kg/m². The table shows the details of significant results in each category. Higher recipient BMI, right kidney and retroperitoneal surgery were found to be risk factors for early graft dysfunction. Older donor age and obese recipient were risk factors for DGF. Donor BMI directly influences long-term allograft function.
Allografts also did better in female and white recipients (P<.001). Warm ischemia time, total operating room time and cold ischemia time did not play a significant role in the outcome. Conclusions: The outcome of laparoscopic donor nephrectomy is largely dependent on non-surgical factors: older donor age, higher recipient BMI and male recipient sex are important for short-term outcome, while an obese donor is a risk factor for long-term outcome. Introduction: The aging donor (D) and recipient (R) population has led to new challenges in kidney transplantation (KT). Controversy exists regarding the optimal approach to the elderly D or R. A number of strategies have been proposed, including matching by age, medical risk, serology, HLA, size, or nephron mass. The purpose of this study was to retrospectively review our single-center experience in deceased donor (DD) KT with respect to age. Methods: From 10/1/01 through 11/15/03, we performed a total of 129 DD KTs, including 33 (26%) in Rs ≥60 years and 96 (74%) in Rs 19-59 years of age. The DD pool included 51 expanded criteria donors (ECD; defined and allocated according to UNOS policy) and 78 standard criteria donors (SCD; defined as non-ECD). ECD kidneys were utilized by matching estimated renal functional mass to recipient size (BMI <25 kg/m²), including the use of dual KTs (N=8). ECD kidney Rs were further selected based on age >40 and low immunologic risk. Rs received rATG or alemtuzumab induction in combination with tacrolimus, MMF, and steroids; those ≥60 had lower tacrolimus targets and MMF doses. Results: The mean age differed significantly between R groups (65 vs 46 years, p<.001), including 7 patients >70. Of Rs ≥60, 22 (67%) received KTs from ECDs, compared to 29 (30%) of Rs <60 (p<.001). Other demographic and transplant characteristics were similar among groups. Patient survival is 97% in Rs ≥60 compared to 99% in Rs <60 (p=NS) with a mean follow-up of 12 months. Kidney graft survival rates are 94% in Rs ≥60 vs 89% in Rs <60 (p=NS). Initial and subsequent graft function, rejection, infection, re-operations, length of stay, re-admissions, and resource utilization were similar among groups. Nine patients had opportunistic viral infections (5 CMV, 3 polyomavirus nephropathy, 1 EBV-associated PTLD), including 4 (12.5%) in Rs ≥60 compared to 5 (5%, p=NS) in Rs <60. However, all 9 cases occurred in ECD Rs (18% of ECD vs 0 SCD Rs, p<.001). Conclusions: By matching nephron mass with R size and avoiding the use of ECD kidneys in high immunologic risk Rs, short-term outcomes comparable to those of SCD kidneys in younger patients can be achieved with either older Ds or Rs, regardless of age. However, the risk of viral infection is higher in ECD Rs, and long-term follow-up is needed to ultimately determine the risks and benefits of KT in this setting. Background: An elevated serum creatinine value in a young deceased donor can be due to multiple causes, including dehydration, acute tubular necrosis, the effects of brain death, rhabdomyolysis, and drug or contrast nephrotoxicity. There is wide variation in clinical practice regarding the acceptance of kidneys from young donors with elevated creatinine, and it is not clear what the impact of this acute insult is on long-term graft survival following kidney transplantation. Methods: Using the United Network for Organ Sharing (UNOS) database, all primary kidney transplants (N=27650) performed between 1994-2001 from deceased donors aged 10-39 were analyzed.
Kidney transplants from donors with a history of diabetes, hypertension, or kidney disease, as well as multi-organ transplant recipients, were excluded from this analysis. Other donor and recipient factors that can influence long-term graft survival were included in the multivariate model along with donor creatinine. Results: The distribution of donor creatinine levels (mg/dl) was as follows: 0-0.9: 48%; 1.0-1.4: 37%; 1.5-1.9: 9%; 2.0-2.4: 3%; and >2.4: 3%. Unadjusted Kaplan-Meier graft survival rates are shown in the following table. In the multivariate Cox regression analysis, the relative risk (RR) of graft loss within 6 months associated with a donor creatinine 0.5 units higher than that of another donor is 1.12. For kidneys still functioning at 6 months, the RR of long-term graft loss per 0.5-unit increase in donor creatinine is 1.04 when death was treated as graft failure and 1.05 when death with a functioning graft was censored. Conclusions: Kidneys from young deceased donors with elevated creatinine provide excellent short- and long-term graft survival. The terminal creatinine value in young donors is not a risk factor for long-term graft survival. Although these results represent a select group of kidneys actually transplanted, further evaluation aimed at expanding the use of this group of donors is warranted.
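Under the usual Cox model assumption that the log-hazard is linear in donor creatinine, the per-0.5-unit RRs above compound multiplicatively, so a full 1.0 mg/dl difference in donor creatinine corresponds to:

\[
\mathrm{RR}_{1.0} = \left(\mathrm{RR}_{0.5}\right)^{2} = 1.12^{2} \approx 1.25
\]

for 6-month graft loss, and analogously \(1.04^{2} \approx 1.08\) to \(1.05^{2} \approx 1.10\) for long-term graft loss among kidneys functioning at 6 months.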
A predominant economic focus in renal transplantation has been to reduce costs for the initial hospitalization and first 90 days. We determined the factors that affect early readmission, and their financial impact. Methods: We performed a retrospective analysis of all kidney transplants performed from July 1, 2001 to June 30, 2003. The following variables were analyzed: demographics, cold ischemia time (CIT), HLA mismatches, ATN, induction therapy, diabetes, coronary artery disease, BMI, education level, and insurance status. Chi-square and Student's t-tests were performed for all variables under investigation versus a dependent variable of readmission within 30 days. Logistic regression was performed with demographic variables and variables considered to be clinically significant. All variables were dichotomized at clinically important levels. Causes of readmission, hospital days, number of readmissions, and time from discharge were also collected. Charges were captured for first-time readmissions in ATN recipients. Results: 195 kidney transplants were performed and a 28% readmission rate was observed. Multivariate analysis revealed ATN and African American race to be statistically significant predictors of readmission within 30 days (p=0.0017 and p=0.0412, respectively). CIT greater than 420 minutes had a 2. Oxidative stress associated with ischaemia/reperfusion and operative trauma influences clinical outcome in kidney transplantation. Older recipients tend to have more complications and a less favourable outcome in kidney transplantation. While this may be associated with increased oxidative stress, little is known about a possible imbalance of the antioxidative capacity in these patients. Blood samples were drawn from 31 patients undergoing kidney transplantation pre- and perioperatively. Markers of oxidative stress and of the antioxidative system were measured. Median age was 55 years (range 21-67); nine patients were older than 60 years. The antioxidative capacity TORC and the vitamin E equivalent Trolox were lower in older recipients perioperatively and at every other time point up to six hours after reperfusion. From 15 min after reperfusion, 4-HNE and GSSG were higher in older recipients and were increased in the effluent from the renal vein compared to systemic values. GSH, malondialdehyde (MDA) and thiobarbituric acid-reactive substances (TBARS) increased after reperfusion, with even higher values in the venous effluent. They were normal again after 6 h, with no difference between the two groups. Irrespective of age, patients with delayed graft function had lower preoperative TORC than those with primary function. In these patients, 4-HNE was higher from reperfusion to the end of the observation period. Patients with acute rejection had higher MDA, 4-HNE and PLA2 activity in renal vein blood than patients without acute rejection. After reperfusion, markers of oxidative stress and lipid peroxidation are increased in systemic blood and in blood drawn locally from the renal vein. They are further increased in patients with delayed graft function and acute rejection. However, this does not cause a noticeable decrease of the systemic antioxidative capacity. In contrast, older recipients have a lower antioxidative capacity even preoperatively. This patient-dependent imbalance of the antioxidative system could be responsible for the poorer outcome in older patients and offers the possibility of prophylactic treatment. Prakas T. D'Cunha, 1 Ravi Parasuraman, 1 K. K. Venkat. 1 1 Nephrology and Hypertension, Henry Ford Hospital, Detroit, MI. The magnitude of proteinuria correlates with poor outcome in native kidney disease and after renal transplantation. Since preemptive renal transplantation (PETx) is on the rise, many patients receive renal transplants with residual urine output and proteinuria from the native kidneys. Delineation of the source of proteinuria (native vs allograft) in the immediate post-transplant period (IPTP) is important for appropriate interventions. However, it is not known whether native kidney proteinuria persists post-transplantation (postTx). Methods: We prospectively evaluated 11 live donor Tx recipients with urine output pre-transplantation (preTx), immediate graft function and stable creatinine postTx. Random urine protein:creatinine (UPr:Cr) ratio was measured immediately preTx and weekly thereafter. Group A (n=5) included PETx recipients and Group B (n=6) were on dialysis preTx. All recipients had UPr:Cr > 0.5 preTx, and total resolution was defined as UPr:Cr < 0.2. Results: Demographic features included a mean age of 45 years, 4 female, 7 male, 3 Caucasians and 8 African-Americans. The causes of ESRD were diabetic nephropathy (n=4), hypertension (n=3), reflux nephropathy (n=1), and chronic glomerulonephritis (n=3). The mean serum creatinine preTx was 8.8 ± 2.3 mg/dl in the patients on dialysis, 6.2 ± 1.7 mg/dl in the preemptive patients, and 1.3 ± 0.37 mg/dl at 5 weeks postTx. The immunosuppressive regimen was based on tacrolimus, mycophenolate mofetil and corticosteroids ± induction with Thymoglobulin®. The mean tacrolimus level at 4 weeks postTx was 10.4 ± 2.7 ng/ml. No patient was on ACE inhibitors, angiotensin receptor blockers or nondihydropyridine calcium channel blockers during the study period. One patient not included in the above analysis had proteinuria that did not resolve; allograft biopsy at 3 weeks showed acute humoral rejection with glomerulopathy. The evolution of UPr:Cr in the IPTP is shown in the table.
Conclusion: Our results show complete resolution of native kidney proteinuria at a mean of 5 weeks postTx; persistent proteinuria beyond 8 weeks postTx is therefore allograft in origin and cannot be attributed to the native kidneys. Introduction: Without protocol biopsies, early allograft dysfunction in live donor renal transplantation is difficult to detect. We asked whether mathematical models such as Cockcroft-Gault, MDRD and Nankivell could be used to predict post-donation creatinine as a mechanism for identifying early allograft dysfunction. Methods #1: A retrospective chart review of living donor renal transplant patients at our University Hospital from 2001 to 2003 (76 donor pair charts) was performed. Exclusion from the study was based on lack of available information (9 donor pair charts). We applied the above-mentioned mathematical equations to predict early serum creatinine in the recipient by using anthropometric information from the recipient and the known transplanted renal function from the donor. Linear regression analysis was performed to evaluate for possible correlation between early post-transplant serum creatinine measurements in the recipient and estimated serum creatinine from the mathematical equations, in all patients and in patients without identified allograft dysfunction. Results #1: We found no correlation between estimated and measured serum creatinine in recipients of live donor kidneys using the Cockcroft-Gault, MDRD or Nankivell equations (R² = 0.270, 0.082, and 0.034, respectively, in patients without allograft dysfunction). Methods #2: Compensatory hypertrophy and hyperfiltration are known to occur in solitary and post-transplant kidneys. Hence, we used the donor as a control and asked whether the estimated clearance or GFR of the donor correlated with the estimated clearance or GFR of the recipient 1 week after transplantation, using anthropometric data from each coupled with measured laboratory values. Results #2: We found no correlation between recipient and donor estimated creatinine clearance or GFR using the Cockcroft-Gault, MDRD or Nankivell equations (R² = 0.041, 0.033, and 3.86e-4, respectively, in patients without allograft dysfunction). Conclusion: None of the Cockcroft-Gault, MDRD or Nankivell equations was able to reliably predict serum creatinine, clearance or GFR in recipients. Possible explanations include: 1) 24-hour creatinine clearance is not a reliable indicator of GFR, or collections may be over- or under-collected. 2) Medications used early post-transplantation may alter creatinine clearance or GFR. 3) Variable muscle catabolic rate related to medications or early post-operative status. 4) The demographics used to derive these equations were different.
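For reference, two of the estimating equations named above are sketched below as commonly published (the Nankivell formula is omitted rather than risk misquoting it); this is a minimal illustration of the kind of calculation the study applied, not the authors' code, and the example inputs are hypothetical:

```python
def cockcroft_gault_crcl(age_yr: float, weight_kg: float,
                         scr_mg_dl: float, female: bool) -> float:
    """Estimated creatinine clearance (mL/min), Cockcroft-Gault."""
    crcl = (140 - age_yr) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_gfr(age_yr: float, scr_mg_dl: float,
             female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2), abbreviated 4-variable MDRD
    (the original 186-coefficient version)."""
    gfr = 186.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

# Hypothetical example: a 45-year-old, 70-kg male with creatinine 1.0 mg/dl.
print(cockcroft_gault_crcl(45, 70, 1.0, female=False))  # ~92 mL/min
print(mdrd_gfr(45, 1.0, female=False, black=False))     # ~86 mL/min/1.73 m^2
```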
The majority of current immunosuppressive protocols for solid organ transplant recipients are based on calcineurin inhibitors, increasingly combined with the antiproliferative agent mycophenolate mofetil (MMF). We have recently demonstrated that indirect T cell recognition of donor-specific HLA peptides plays an important role in the immunopathogenesis of chronic allograft rejection (CR). We have also previously generated HLA allopeptide-specific T cell lines and clones from renal transplant recipients with CR treated with cyclosporine (CsA), azathioprine (Aza) and corticosteroids, which were of a Th1 phenotype (IFN-γ), while lines and clones from stable patients were of a Th2 (IL-10) phenotype. For this study we generated T cell lines using peripheral blood lymphocytes (PBLs) from renal transplant recipients treated with MMF+CsA (n=12) or MMF+Tac (n=6) with either CR (biopsy; serum creatinine > 2 mg/dl) or stable renal function (SRF; serum creatinine ≤ 2 mg/dl). The generated lines showed a significant response to donor-specific peptide (CR: 24,125 ± 3,256 cpm vs SRF: 4,283 ± 731 cpm). T cell lines from patients with CR produced IFN-γ but only minimal amounts of IL-10 (548±94 vs. 58±12 spots/10⁶ cells) in response to the donor-specific peptide. In contrast, T cell lines from patients with SRF produced IL-10 but minimal IFN-γ as assessed by ELISPOT (533±37 vs 52±11 spots/10⁶ cells, respectively) and confirmed by ELISA, and were CD4/CD25 positive (FACS analysis). In this study of MMF-based immunosuppression, we confirm that CR is associated with a Th1 pattern of cytokine production, while stable renal function is associated with a Th2 phenotype, regardless of the immunosuppressive regimen used (CsA/Tac+MMF vs CsA+Aza). This analysis may provide an invaluable tool to clarify the potential of current immunosuppressive protocols with MMF to enable the induction of T regulatory cells. Abstract has been withdrawn. Background: Previous work demonstrated that surgical manipulation, ischemia, or adenovirus vector infection induced innate immune responses in mouse islets, resulting in production of chemokines and impaired islet engraftment. CD4+CD25+ T cells with regulatory functions participate in immune tolerance and the regulation of inflammatory responses. Aims: To evaluate the ability of CD4+CD25+ T regulatory cells to modulate adenovirus vector-induced chemokine expression in murine islets, and to determine their effect on islet engraftment. Methods: CD4+CD25+ T cells were sorted from BALB/c splenocytes. Murine islets were isolated from BALB/c mice by a stationary collagenase digestion method. Freshly isolated islets were transduced with the AdCMVLacZ vector at a multiplicity of infection (MOI) of 100 for 1 hour, and then cocultured with CD4+CD25+ or CD4+CD25- T cells at an islet cell:T cell ratio of 2:1 (assuming 1000 cells/islet). Cultures were harvested after 48 hours for determination of chemokine gene expression profiles. Additionally, groups of islets and T cells were cotransplanted in a marginal islet mass model into diabetic recipients (150 islets/mouse). Group I: control; group II: AdCMVLacZ transduced; group III: CD4+CD25+ T cells + AdCMVLacZ; group IV: CD4+CD25- T cells + AdCMVLacZ. Results: AdCMVLacZ transduction induced a wide variety of chemokines in islets. CD4+CD25+ T cells inhibited chemokine expression, including fractalkine, MIG, MIP-1α, MIP-1β, MIP-3α, CXCR5, CXCR6 and CX3CR1. The marginal islet mass transplants showed that vector transduction prevented islet engraftment and diabetes cure, while CD4+CD25+ T regulatory cells, but not CD4+CD25- control T cells, permitted engraftment and cure. Conclusion: CD4+CD25+ T cells modulate innate immune responses in murine islets by inhibiting chemokine expression and improve transduced islet engraftment. B lymphocytes are the most abundant antigen-presenting cell (APC) population of the immune system. By virtue of their clonally expressed surface immunoglobulin, these lymphocytes are capable of highly specific alloantigen uptake and presentation. Here, we determined the impact of disrupting cognate T-B collaboration on the fate of islet allografts in a non-human primate (NHP) model.
Cynomolgus monkeys were rendered diabetic (150 mg/kg BW STZ) and developed a stable state of hyperglycemia (>300 mg/dl) and insulin dependence. All recipients were transplanted intraportally (day 0) with 20,000-35,000 IEQ/kg of isolated allogeneic islets. Recipients in the control group (n=4) were treated with an induction regimen consisting of 5 mg/kg Thymoglobulin (on days 0, 2, 5 and 10) followed by maintenance monotherapy with Rapamycin (to achieve a therapeutic level of 8-15 ng/ml). Recipients in the experimental group (n=5) were treated with a combined induction regimen including 5 mg/kg Thymoglobulin (on days 0, 2, 5 and 10) and the B lymphocyte-depleting mAb Rituxan (375 mg/m² on days 0 and 5), followed by Rapamycin maintenance. All control recipients rejected their islet grafts within 14-23 days. In contrast, the experimental group enjoyed long-term allograft survival: 266d, >300d, >400d, >40d (×2). Rapamycin was discontinued at day 200 post islet transplantation in three long-term islet allograft survivors; 2/3 of these long-term survivors continue to remain euglycemic for >300 days. Kinetic flow cytometric analysis of recipient PBL following induction immunotherapy demonstrated that Thymoglobulin alone promotes transient T cell depletion for up to 3 weeks, as expected, without depleting B lymphocytes. In contrast, induction immunotherapy with combined Rituxan and Thymoglobulin led to a complete depletion of B lymphocytes lasting 50-60 days, in addition to the expected degree of Thymoglobulin-mediated T cell depletion. These results indicate that combined transient T- and B-lymphocyte depletion can induce long-term islet allograft survival while minimizing the need for chronic immunosuppression with calcineurin inhibitors and steroids in NHPs. Collectively, this preclinical study indicates that early abrogation of cognate T-B lymphocyte interaction may induce long-term allograft acceptance by disrupting the APC function of B cells and/or preventing the development of anti-donor antibodies.

Background: Rejection, occurring in up to 70%, remains the main problem in clinical small bowel transplantation, even with modern immunosuppression. Treatment with a nondepleting anti-CD4 mAb induces indefinite survival in heart and kidney transplantation; in small bowel transplantation, however, only a marginal improvement in survival could be detected. In contrast to specific immune reactions, the early inflammation after transplantation is due to nonspecific ischemia-reperfusion injury. We hypothesized that abrogating the early nonspecific inflammation by blocking TNF-α receptor activation, in combination with anti-CD4 treatment, would increase survival. Material and Methods: Orthotopic small bowel transplantation was performed in the strongly mismatched DA (RT1a) to Lewis (RT1l) combination. Group I were untreated controls (n=7); group II was treated 10 times (20 mg/kg iv, day -1 to +21) with the nondepleting anti-CD4 mAb Rib5/2 (n=7). Group III received 4 doses of anti-TNF-α receptor mAbs (0.3 mg/kg iv; 60 min pretransplant to day +9) (n=6). Group IV was treated with the combination of both antibodies (n=12). The follow-up included survival time, clinical outcome, histology and cytokine quantification by real-time PCR (TaqMan). Results: Non-treated controls rejected after a median of 8.3 days. TNF-α blockade alone showed similar median survival (8.8 days).
The anti-CD4 mAb treatment resulted in moderately prolonged median graft survival (19.7 days), and the combination of anti-TNF and anti-CD4 resulted in significantly prolonged graft survival of 103.4 days (range 12-387), with 2/3 of the animals surviving longer than 50 days. Acute rejection was observed in groups I, II and III, whereas only moderate cellular infiltration could be found after combined treatment. At day 3, grafts were harvested and analysed for cytokine expression. The lowest levels of IFN-γ and CD25 (a marker of activated T cells) were found in the combination group, representing diminished cellular infiltration due to the absence of nonspecific inflammation. A shift towards a Th2 response was not responsible for the beneficial effect (no differences in IL-4, IL-10 and TGF-β expression). Conclusions: The data clearly showed that early treatment of nonspecific inflammation resulted in long-term survival in a highly immunogenic transplant situation. The use of available TNF-α inhibitors may deliver substantial benefit to clinical small bowel transplantation.

MR-1 (anti-CD40L, anti-CD154) has been shown to be effective in prolonging heart allograft survival in mice. The combination of MR-1 and KBA (anti-LFA-1) has been shown to prolong islet allograft survival and lead to tolerance. Recently, LFA-1 has been shown to be capable of intracellular signaling independent of CD28 costimulatory signals. We asked whether the combination of MR-1 and KBA monoclonal antibodies could induce long-term cardiac allograft survival in mice. Balb/c hearts were heterotopically transplanted into C57BL/6 mice, which received control Abs, MR-1 alone, KBA alone, or both MR-1 and KBA. KBA was administered at 200 micrograms on days 0, 1, 7 and 14 (4 doses) by i.p. injection. MR-1 (250 micrograms) was administered on day -1 and then every 3-4 days for 9 doses by i.p. injection. Rat IgG and hamster IgG were administered to control mice in identical fashion, respectively. Rejection was defined as cessation of heart beat by daily palpation and was confirmed by H&E microscopy. Balb/c hearts were rejected in C57BL/6 at a mean of 6.3 days (n=3; survival 4, 7, 8 days), which was not different from mice treated with control antibody alone: hamster IgG, 8.7 days (n=3; 7, 9, 10), rat IgG, 11 days (n=3; 7, 8, 18), or both, 8.3 days (n=3; 7, 9, 9). However, immunotherapy with MR-1 resulted in prolonged allograft survival, 2/6 >60 days (n=6; >60 ×2, 19, 20, 22, 42); KBA alone resulted in 4/6 >60 days (n=6; >60 ×4, 28, 33); and combination therapy with KBA+MR-1 resulted in 5/5 >60 days. No data are available on whether long-term mice demonstrate donor-specific tolerance. We conclude that combination therapy with KBA and MR-1 is extremely effective in promoting long-term cardiac allograft survival in mice. These results support the findings that LFA-1 can transmit costimulatory signals and is an attractive target in cardiac transplantation.

Background: Recently, we discovered that the novel cyclophilin-binding immunosuppressant Sanglifehrin A (SFA) blocks bioactive IL-12 production by human dendritic cells in vitro. Sanglifehrin A is structurally related to cyclosporine but, unlike the latter, does not inhibit calcineurin activity. The molecular mechanism of SFA is currently unknown. Here, we have analysed the capacity of SFA to affect proinflammatory (IL-12p70, TNF-α) and immunomodulatory (IL-10) cytokine production in vivo.
Using independent in vivo models that employ different IL-12 inducers, we provide evidence that SFA abrogates IL-12p70 production in vivo while having minor or no effects on IL-10 and TNF-α production. Methods: Mice (C57BL/10, H2K^b) were injected intraperitoneally (i.p.) for 3 days with SFA (10 mg/kg/d) or drug vehicle to study drug effects under steady-state conditions. Additionally, to explore SFA's effects on dendritic cells (DCs) specifically under dynamic conditions, we expanded DCs 20-30 fold in vivo by injecting the endogenous growth factor Flt3L (10 mg/d; 10d, i.p.) and co-injecting SFA (10 mg/kg/d; 10d, i.p.). Subsequently, under either condition, animals were stimulated i.p. with CpG DNA or with LPS and IL-4. Four hours later, peripheral blood was drawn by cardiac puncture, and splenic and bone marrow DC subsets were analysed by flow cytometry. Results: The data show that a 3-day course of SFA inhibited 70% of in vivo IL-12p70 production induced by either CpG or LPS/IL-4 stimulation. Under dynamic conditions, a 10-day course of SFA blocked 95% of LPS/IL-4-induced and 98% of CpG-induced IL-12p70 production in vivo. These effects are not due to suppressive effects of SFA on total DC numbers or DC subsets, as indicated by four-colour flow cytometry. In direct contrast, the production of TNF-α and the immunoregulatory cytokine IL-10 was only moderately (<20% compared to vehicle-injected controls) affected by SFA. Conclusion: We propose that SFA functionally represents a novel class of immunophilin-binding immunosuppressants with high selectivity and potency in abrogating production of the major proinflammatory and Th1-skewing cytokine IL-12p70.

Polyclonal rabbit anti-thymocyte globulin (rATG) induces rapid apoptosis of human B cells in vitro and has been used successfully to treat alloantibody-mediated renal allograft rejection. Antibodies directed at B cell antigens in rATG are likely due to the presence of thymus-resident B cells in the immunizing preparation. Apoptosis induced by rATG occurs within 2 hours in the absence of complement. In this study, we identify several B cell surface proteins linked to specific apoptotic pathways and demonstrate caspase-dependent and -independent mechanisms of rATG-induced apoptosis in naive and activated (CpG or CD40L) B cells and in normal human bone marrow-resident plasma cells (PCs). Methods: Naive and memory human B cells were obtained by phlebotomy followed by CD19 magnetic bead isolation. Normal human PCs were obtained by bone marrow aspiration from volunteers followed by CD138 magnetic bead isolation. B cell blasts were obtained by culture with CpG DNA or CD40L in the presence of IL-2, IL-10, and IL-15. PCs were also generated in vitro by secondary culture with IL-6, IL-15, IFN-α, and hyaluronic acid. Apoptosis was assessed after 18h incubation in rATG (Thymoglobulin; 100 µg/ml) by annexin-V/TOPRO-3 staining, caspase 3/8/9 activation by fluorochrome-tagged caspase substrate labelling, and mitochondrial membrane depolarization by MitoTracker Red dye. rATG binding specificities were assessed by rATG competitive inhibition of monoclonal antibody binding (CD5, CD19, CD20, CD27, CD38, CD40, CD80, Fas, HLA-ABC, HLA-DR). Results: rATG induced rapid apoptosis of naive (94%) and activated (98%) B cells, as well as PCs (87%). Activation of caspases 3, 8, and 9 could be identified in all cell types after 3-6 hours of incubation in rATG. The pan-caspase inhibitor zVAD-fmk reduced, but did not eliminate, rATG-induced apoptosis.
T cell-adsorbed rATG had high anti-B cell activity, suggesting that apoptosis may be specific for B cell antigens. Competitive binding experiments indicated the presence of antibodies to several pro-apoptotic surface signalling molecules, including CD20, CD38, CD95, HLA-DR, and CD27. Similar results were obtained at all stages of B cell differentiation using an in vitro B cell-to-plasma cell differentiation system. Conclusion: rATG contains antibodies to multiple apoptosis surface signaling proteins (CD20, CD27, CD5, HLA-DR, CD95) and activates multiple apoptotic pathways in B and plasma cells. These findings support the clinical use of rATG to treat alloantibody-mediated allograft rejection.

Treatment of diabetes-prone yet clinically healthy individuals with immunosuppressive drugs is not a desirable form of therapy because of potential complications associated with chronic use of the drugs. However, transient use of immunosuppression in already diabetic patients may be well justified if the therapy has the potential to cure diabetes. We investigated whether antilymphocyte serum (ALS), a polyclonal anti-T cell antibody, effectively cures full-blown diabetes in NOD mice, a mouse model of type 1 diabetes. Methods: NOD mice with >300 mg/dl blood glucose on 3 consecutive measurements were considered diabetic and divided into 4 treatment groups: ALS alone; ALS + exendin-4, an agent that stimulates beta cell replication and differentiation; exendin-4 alone; and no treatment. ALS (0.5 ml, i.p.) was given twice, three days apart, immediately after diagnosis of diabetes. Exendin-4 was given i.p. at 12 nmol/kg for 4 consecutive days, twice, with 3 days of rest in between. No further treatment was given except insulin, which was given to all diabetic mice. Persistent blood glucose <200 mg/dl was considered a cure. Results: ALS alone achieved reversal of overt diabetes in 50% of mice within 115 days. Addition of exendin-4 to ALS increased the cure rate to 89% within 75 days. Cure of diabetes was accompanied by progressive normalization of glucose tolerance, improvement of islet histology, increased insulin in the pancreas, and release of insulin in response to glucose challenge, all of which indicate an increased beta cell mass. Syngeneic NOD/SCID islets transplanted into cured mice remained intact, suggesting long-term abrogation of autoimmunity. No treatment or exendin-4 alone failed to produce disease remission. Conclusion: A transient short course of polyclonal anti-T cell antibody combined with beta cell growth factors such as exendin-4 presents a novel approach to the cure of new-onset type 1 diabetes.

Assessing and monitoring transplant outcomes is an important public policy goal. Current outcomes monitoring is episodic and time-delayed. We explored the use of the cumulative summation technique (CUSUM) to provide real-time, continuously updated, risk-adjusted monitoring of outcomes. Methods: Data from the Scientific Registry of Transplant Recipients were reviewed for adult kidney and liver transplants performed between 1/97 and 12/01. Multivariate logistic regression models were fitted to predict graft failure (kidney) and death (liver) risk. These models were used to construct a CUSUM outcome chart for 258 renal and 114 liver transplant centers. Using a risk-adjusted cumulative sum of 3 as a control limit, we compared centers flagged by CUSUM for potentially substandard performance to centers flagged by the current SRTR/OPTN method.
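Before the results, a minimal sketch of the risk-adjusted CUSUM just described (Python). This uses the common observed-minus-expected formulation, with the expected risk taken from the logistic regression model; it is an assumption that the authors used this exact variant, and the function name is illustrative.

def risk_adjusted_cusum(outcomes, predicted_risks, control_limit=3.0):
    """Track cumulative excess events for one center; flag if the limit is crossed.
    outcomes: 1 for graft failure/death, 0 otherwise, in transplant order.
    predicted_risks: per-case failure probabilities from the regression model."""
    score, flagged = 0.0, False
    for observed, expected in zip(outcomes, predicted_risks):
        # Accumulate excess over the case-mix-adjusted expectation; floor at zero.
        score = max(0.0, score + (observed - expected))
        if score >= control_limit:
            flagged = True
    return score, flagged

A center whose failures repeatedly exceed its case-mix-adjusted expectation accumulates score with every transplant and is flagged once the running sum reaches the control limit of 3.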
Results: During the study period, 60,470 kidney transplants were performed with a 1-year graft failure rate of 9.2%. CUSUM signaling flagged 57 centers and identified 20 of 22 centers identified by the current methods. CUSUM-flagged centers that were not identified by the SRTR/OPTN method had deteriorating performance, as demonstrated by an increasing CUSUM score (Figure). During the study period, 18,277 liver transplants were performed with a 1-year mortality rate of 13.9%. The CUSUM method identified 27 centers as having potentially declining performance, including 9 of 11 centers identified using the SRTR/OPTN method. CUSUM monitoring can identify transplant centers with significant changes in performance over time. This technique appears to be more sensitive than existing methods because it includes data from all previous transplants. Adopting CUSUM monitoring techniques may lead to improved transplant outcomes by rapidly identifying significant performance changes at transplant centers.

This study analyzed 126 outcomes of 120 pregnancies (including twins and triplets) fathered by 97 kidney transplant recipients on newer immunosuppressive medications reported to the National Transplantation Pregnancy Registry (NTPR). Data were collected via questionnaires, phone interviews and hospital records. In the general U.S. population, birth defects occur in approximately 3-5% of children. Of the 126 outcomes, there were 115 livebirths (91%) and 11 spontaneous abortions (9%, one due to Turner's syndrome). There were no ectopic pregnancies, therapeutic abortions or stillbirths. The mean gestational age was 39 ± 2.4 weeks (range 31-43 weeks), with a mean birthweight of 3244 ± 649 grams (range 1417-4848 grams). Included in the table are the immunosuppressive regimens administered around the time of conception. One recipient received two doses of Campath-1H post-transplant. Another recipient was participating in an investigational drug trial (ERL). Current paternal graft function was adequate in 87.6%, reduced in 2.1%, not functioning in 4.1% and unknown in 6.2%. Among the 115 children, the majority were developing well, with 7 children reported as having continuing problems. There was 1 major malformation (spina bifida) and 4 minor anomalies (tongue-tie (2), polydactyly, undescended testicle), for an overall rate of 4.3% (5/115). CONCLUSIONS: Compared to the general population, reports to the NTPR do not reveal an increase in the incidence of birth defects in the newborns of pregnancies fathered by kidney transplant recipients on newer immunosuppressive medications.

This study analyzed pregnancy outcomes in female kidney recipients who had changes in adjunctive immunosuppression before conception or during pregnancy. Data were collected via questionnaires, phone interviews and hospital records. Analysis was by 1-sided Fisher's exact test or, for continuous variables, by 2-sided independent t-test (see the sketch below). Of the female kidney recipients reported to the National Transplantation Pregnancy Registry, there were 24 kidney recipients (29 pregnancies, 31 outcomes; group A) with an adjunctive immunosuppressive discontinuation or switch before or during pregnancy. In group A, the changes were: discontinuing azathioprine (Aza) or mycophenolate mofetil (MMF), switching MMF to Aza, or switching sirolimus to Aza. These recipients were compared to 80 recipients (99 pregnancies, 103 outcomes; group B) in whom adjunctive immunosuppressive therapy was not switched or changed.
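A minimal sketch of the kind of categorical comparison used in this registry analysis (Python with SciPy; the 2×2 counts below are illustrative only, not the registry's actual cell values):

from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = group A / group B,
# columns = outcome present / outcome absent.
table = [[16, 7],    # illustrative counts for group A
         [40, 60]]   # illustrative counts for group B
odds_ratio, p_value = fisher_exact(table, alternative="greater")  # 1-sided test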
All recipients in both groups were receiving Neoral® or tacrolimus in combination with other agents. MMF was the adjunctive therapy in 19 group A and 7 group B pregnancies; sirolimus was the adjunctive therapy in 2 group A and 1 group B pregnancies. Significant outcomes included: transplant-to-conception interval, p=0.03 (A 3.6 vs. B 5.2 years); prematurity, p=0.01 (<37 wks; A 69.6% vs. B 40%); and neonatal complications, p=0.03 (A 64% vs. B 39%). Graft loss within 2 years of delivery approached significance: p=0.08, A 17% (4/24) vs. B 5% (4/80). Mean serum creatinine was similar before, during and after pregnancy between the groups. There were no significant differences in hypertension, rejection (A 3.5% vs. B 2%), or diabetes mellitus during pregnancy, or in livebirths, stillbirths, spontaneous abortions, or birthweights. Structural malformations were reported in 4% of group A and 5.1% of group B newborns (p = not significant). There was one neonatal death reported in group A. CONCLUSIONS: When compared to recipients who continued their adjunctive immunosuppressive therapy, female kidney recipients who switched or discontinued their adjunctive therapy before or during pregnancy did not have a significant increase in the incidence of rejection during pregnancy, although somewhat poorer graft outcomes were reported. Whether graft survival is related to adjunctive immunosuppressive changes before or during pregnancy requires further study. While the incidence of newborn structural malformations was similar between groups, pregnancy exposures remain limited with the newer agents.

(N=2), and severe adverse events (one or more) in 12 patients: 1 acute pancreatitis, 1 hepatitis, 1 abdominal abscess, 1 delayed wound healing, 1 stroke, 1 AR, 1 severe hypertriglyceridemia, 1 raised SCr, and 7 de novo nephrotic-range proteinuria (median 2.9 g/day; range 2.5-9.8) that occurred a mean of 9 days after conversion (range 5-117). In the univariate Cox analysis, SRL discontinuation was associated with a previous AR (RR=2.5), retransplantation (RR=2.8), the number of HLA A/B/DR mismatches (RR=1.6) and the highest SRL trough level recorded during the first month post-conversion (Max SRL) (RR=1.36). In the multivariate Cox analysis, only the Max SRL was a risk factor for SRL discontinuation (RR=1.36). The nephrotic-range proteinuria was reversible after SRL discontinuation in 6/7 patients; a previous AR was the only associated risk factor. We conclude that conversion to SRL as the main immunosuppressive drug in RT was associated with a considerable incidence of major side effects, leading to SRL discontinuation in more than 2/3 of the patients. Nephrotic-range proteinuria, which appears reversible after SRL discontinuation, occurred in almost 1/3 of the patients. High exposure to SRL after conversion might have contributed to the high incidence of side effects.

Anal cancer has been associated with viral stimulation, including HPV and anal condyloma. It is usually a limited malignancy compared to de novo malignancies of the colon and rectum. Squamous cell cancer of the anorectum is known to occur with higher frequency in transplant recipients. Purpose: To examine anal cancers in renal txp recipients. Methods: We examined all cases of anal canal cancers arising de novo in patients reported to our database for both patient demographics and tumor characteristics. Results: 30 renal allograft recipients presented with anal canal cancers. In txp recipients, there was a 12% incidence of anal canal cancers (30/234), which is significantly higher than the 2% noted by SEER.
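The incidence comparison just quoted can be illustrated with an exact binomial test of the observed count against the background rate (Python with SciPy ≥ 1.7; a sketch, assuming a fixed 2% population rate as the null):

from scipy.stats import binomtest

# 30 anal canal cancers among 234 recipients vs. an assumed 2% background rate.
result = binomtest(k=30, n=234, p=0.02, alternative="greater")
print(result.pvalue)  # vanishingly small: ~12.8% observed vs. 2% expected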
The incidence of higher-stage malignancies was considerable: stage I (n=9), stage II (n=10), stage III (n=4), and stage IV (n=7). The average age at cancer diagnosis (55.6 yrs) was significantly less than that of the general (non-txp) population as reported by the SEER registry (74 yrs). The majority of patients were male, 23/30 (77%), in direct contrast with SEER data, which note a 2/3 majority of females (p<0.001). The time from txp to cancer diagnosis was shorter for anal canal tumors than for rectal SCC or colorectal tumors. Stage I tumors were treated with local excision in the majority of cases, with colonic sleeve resection reserved for 2 cases, all with universal survival. Stage II cancers were also treated conservatively with local resection, except for 2 cases treated with APR, with one death due to malignancy in an APR pt. Stage III disease was treated in 75% of cases with APR, with universal survival. However, of the 7 stage IV renal txp recipients, only 2 pts had colonic sleeve resection, with the remaining receiving only palliative therapy; all succumbed to disease within 14.0 months following diagnosis. Conclusions: Compared to the general population, anorectal cancers in renal txp recipients were found more frequently, at a younger age, in males much more so than females, at a much more advanced stage, and at a markedly greater incidence. As these lesions have a strong viral etiology, more thorough and intense surveillance of viral infection should enable better prognostication of disease and allow for treatment prior to the development of cancer.

In the patients reported to the Registry, skin cancer was the most common malignancy. SCC was observed with greater frequency in reported transplant recipients compared to that seen in the general population. Skin malignancies occur after a long interval of immunosuppression. SCC alone or in combination with BCC appears aggressive and often carries with it significant mortality. However, the highest incidence of recurrence was demonstrated in the mixed tumor group.

and 2164±496, p<0.05) groups. Histological damage was ameliorated selectively in livers with increased CoPP-induced HO-1 expression. Conclusion: This study is the first to document that HO-1 induction protects mice against warm hepatic I/R injury and, in parallel, downregulates the expression of pro-inflammatory cytokines. Unexpectedly, the HO-1 cytoprotective mechanism remains TLR4-independent.

Background: Current mouse models of liver injury resulting from ischemia and reperfusion (IRI) are largely limited to an in situ "warm" ischemic component. Indeed, the "cold" IRI model in mice requires a demanding OLT. As "cold" ischemia is critical for the full-blown liver IRI sequel seen in the clinic, a reliable mouse model of "cold" hepatic ischemia followed by OLT is warranted. Here, we report on the development of such a mouse "cold" IRI/OLT model.

We investigated the significance of the EP2 and EP4 receptors and the efficacy of each selective agonist in hepatic ischemia/reperfusion injury (I/R injury). Methods and Results: Seventy percent partial hepatic ischemia was performed for 90 or 120 minutes in male C57BL/6 mice. First, we evaluated the local expression of EP2 and EP4 in the liver. Both EP2 and EP4 were expressed in the naïve liver. EP4 expression was significantly up-regulated after 6h of reperfusion following 90 minutes of ischemia, while EP2 expression was not changed. Furthermore, EP2 agonist treatment did not show any protective effect on liver function.
By contrast, EP4 agonist treatment significantly inhibited hepatic injury compared to control at 6h of reperfusion (sAST: 854±117 vs. 4683±1043; sALT: 1535±292 vs. 6252±1638; sLDH: 3884±569 vs. 19653±4917). Histological analysis also confirmed this protective effect of the EP4 agonist. Massive cellular infiltration and extensive hepatocellular necrosis were observed in control mice, while the lobular architecture was well preserved and there was little necrosis in EP4 agonist-treated mice. To address the underlying mechanism, we evaluated local expression of cytokines, chemokines and adhesion molecules using real-time quantitative PCR. EP4 agonist treatment significantly down-regulated the local expression of several pro-inflammatory cytokines (TNF-α, IL-1β and IFN-γ), chemokines (MCP-1 and IP-10) and adhesion molecules (E-selectin and ICAM-1) after 2h of reperfusion following 90 minutes of ischemia. By contrast, IL-10, an anti-inflammatory cytokine, was significantly up-regulated. Finally, removal of the shunted liver after 120 minutes of ischemia resulted in the death of 86% of saline-treated mice within 48h. By sharp contrast, 80% of mice treated with the EP4 agonist survived (p=0.016 compared to control). Conclusion: This study demonstrates for the first time the inhibitory role of the EP4 receptor in hepatic I/R injury and the therapeutic efficacy of a selective EP4 agonist for protection of the liver.

Ischemia/reperfusion (I/R) injury often results in dysfunction of steatotic orthotopic liver transplants (OLT). Expression of fibronectin (FN) by sinusoidal endothelial cells is an early event after liver injury. We hypothesized that FN-leukocyte interactions are critical in liver I/R injury. We have recently shown that α4β1-FN interactions preferentially regulate T cell recruitment in OLT. The present work was designed to assess the function of α5β1-FN interactions in steatotic OLT. Methods and Results: Cyclic RGD (cRGD) peptides (500 µg/rat), with high affinity for the α5β1 integrin, were administered through the portal vein of steatotic Zucker rat livers before and after 4h of cold ischemic storage. Lean Zucker recipients of steatotic OLTs received an additional 3-day course of cRGD peptides (1 mg/rat, i.v.). Administration of cRGD peptides significantly increased the 14-day OLT survival rate as compared with respective controls (100% vs. 50%, n=8 rats/group; p<0.001). cRGD-treated OLTs showed no signs of vascular congestion or necrosis, contrasting with extensive centrilobular pallor and necrosis in control livers (score: 0.3±0.4 vs. 2.5±0.5; p<0.008) at day 1 post-OLT. Hepatic function was improved in cRGD-treated recipients, as evidenced by decreased levels of sGOT (717±33 vs. 2437±903; p<0.02) and MPO (0.96±0.12 vs. 1.89±0.35; p<0.03). Moreover, cRGD therapy diminished intragraft expression (mRNA) of iNOS (~8-fold), without affecting eNOS expression, and of pro-inflammatory cytokines such as TNF-α (~1.4-fold) and IFN-γ (~4-fold). Interestingly, blockade of α5β1-FN interactions had a reduced effect on the initial recruitment of T lymphocytes (7±0.8 vs. 11±2.8; p<0.06); however, it strongly inhibited intragraft infiltration of monocytes/macrophages (74±5 vs. 173±34; p<0.0006). Leukocyte migration requires adhesion and focal matrix degradation. Metalloproteinase-9 (MMP-9), a gelatinase implicated in FN breakdown, was profoundly depressed in cRGD peptide-treated livers (MMP-9/actin mRNA: 0.2 vs. 1.8). Indeed, MMP-9 was highly expressed by macrophages in control steatotic OLTs.
Conclusion: cRGD peptide therapy down-regulated MMP-9 expression, decreased monocyte/macrophage recruitment, and inhibited the expression of iNOS and pro-inflammatory cytokines. Importantly, it significantly improved steatotic liver function and recipient survival. This is the first study to document a function for α5β1-FN interactions in OLT.

Background: Ischemia/reperfusion (I/R) injury is one of the major factors leading to dysfunction or loss of graft function following small intestinal transplantation (SITx). IL-13 has been shown to modulate the inflammatory response by down-regulating the production of proinflammatory cytokines. This study was designed to evaluate the cytoprotective effects and putative mechanisms of IL-13 against injury in a mouse intestinal I/R model. Methods: Male C57BL/6 (WT) mice were anesthetized and underwent 100 minutes of warm ischemia induced by clamping the superior mesenteric artery. Mice received either sterile saline or 1 µg recombinant murine IL-13 (rIL-13) via the lateral tail vein before the induction of ischemia. Separate groups were used for survival and for analysis. For the latter, intestinal tissue was harvested at 4 hr, 24 hr, day 3 and day 7 and assessed for histology, myeloperoxidase, expression of Stat6 and anti-oxidant (HO-1) genes by Western blot, and cytokine gene expression by competitive-template RT-PCR. Results: 100% of mice pretreated with rIL-13 survived >7 days (vs. 50% of saline controls; n=6 mice/group). rIL-13 treatment resulted in near-normal histopathological architecture. In contrast, controls demonstrated mucosal erosion, villous congestion/hemorrhage, and apoptosis. rIL-13 treatment also significantly decreased intestinal myeloperoxidase (a surrogate marker for neutrophil accumulation) as compared with saline controls (1.72±0.14 vs. 4.55±0.34). The expression of Stat6 and HO-1 was markedly increased in WT mice treated with rIL-13 as compared with saline controls. Unlike in controls, the expression of mRNA coding for TNF-α/IL-1β and IL-2/IFN-γ remained depressed, whereas that of IL-13/IL-4 reciprocally increased, in WT mice treated with rIL-13. Conclusion: IL-13 treatment plays a protective role in mouse intestinal warm I/R injury, leading to reduced tissue injury and improved survival. IL-13 appears to protect the intestine by reducing inflammatory and Th1-type cytokine expression while facilitating Th2-type cytokine expression via a Stat6 pathway. Upregulation of the anti-oxidant HO-1 by IL-13 may exert synergistic cytoprotection against intestinal inflammatory injury, and these results may suggest an HO-1-dependent regulation of intestinal Stat6 inflammation after I/R injury.

We developed a protocol combining neoadjuvant radiotherapy, chemosensitization and orthotopic liver transplantation (OLT) for patients with operatively confirmed stage I and II hilar cholangiocarcinoma (CCA) in 1993. We reviewed our experience with the specific AIMS to 1) delineate the role of the staging operation, 2) assess efficacy, and 3) determine whether results warrant the use of donor livers. METHODS: All patients had unresectable CCA without evidence of metastases, or CCA arising in the setting of primary sclerosing cholangitis (PSC). Diagnostic criteria were intraluminal brush cytology or biopsy, or a CA 19.9 level >100 ng/ml with a radiographically malignant stricture. Neoadjuvant therapy included external beam irradiation, transcatheter Iridium-192 brachytherapy and, thereafter, protracted IV 5-FU infusion or oral capecitabine.
A staging abdominal operation was performed prior to OLT. We compared survival after OLT for patients with CCA versus hepatitis C (HCV), hepatocellular carcinoma (HCC), and PSC who underwent OLT at our institution during the same time period. RESULTS: Fifty-six patients were enrolled in the protocol. Four patients had disease progression and another four patients died prior to completion of neoadjuvant therapy. Fourteen of 48 patients (29%) who underwent operative staging had findings precluding OLT.

Status 1 for children awaiting liver transplantation differs from adults by including children with chronic liver disease, a policy unchanged with the introduction of PELD/MELD. This study reports the number of children transplanted with deceased donor grafts at status 1 in the UNOS database, either meeting the standard criteria or 'by exception' (i.e. approved by the regional review board). Comparisons of demographics between the pre- and post-PELD eras, regional differences, PELD scores of status 1 patients, and patient and graft survivals are examined. RESULTS: Data from 18-month periods pre-PELD (8/27/00-2/26/02) and post-PELD (2/27/02-8/27/03) showed that pre-PELD 339 of 710 (47.7%) children were transplanted at status 1, compared to 306 of 741 (41.3%) children post-PELD. The most common diagnoses were acute hepatic necrosis (23.1%) and biliary atresia (22.3%). Of pts with biliary atresia, 31.2% were at status 1 by exception. The percentages of pts at status 1 by exception in the 2 eras were 23.3% pre-PELD and 29.7% post-PELD. For 9 of 11 regions performing >20 pediatric LT, the percentage of children transplanted at status 1 was 29.6-61.2% pre-PELD and 22.4-65.7% post-PELD; of these, 6.2-43.3% were at status 1 by exception. The percentage of retransplants was 21.8% pre-PELD and 21.2% post-PELD. Between the 2 eras the median age of status 1 by exception pts was the same (1.0 yrs) and was lower than for standard status 1 pts (2.0 yrs pre-PELD, 5.0 yrs post-PELD). The mean PELD score of status 1 by exception pts was 14.0, compared to 22.6 for standard status 1 patients (p<0.0001). Table 1 compares pt and graft survivals at 6 months for deceased donor recipients pre- and post-PELD for all children (overall), status 1 (standard), status 1 (exception) and non-status 1. There were no significant differences between groups either within or between eras. CONCLUSIONS: In this analysis pt and graft survival was not affected by status 1 vs non-status 1 at LT. However, the practice of transplanting almost half of pediatric recipients at status 1 has an important effect on donor liver allocation. Questions to be addressed are time waiting and death on the list for status 1 and non-status 1 children, and the appropriate definition of status 1.

Background: The MELD/PELD score was adopted to improve liver allocation by establishing an objective, verifiable system that reduces subjectivity in listing practices and advantages patients with a higher probability of waiting list mortality. The PELD score was validated to predict mortality and transfer to an intensive care unit (ICU) in children with chronic liver disease. Aim: To determine whether the PELD system has improved liver allocation for children, as measured by changes in recipient characteristics and regional variation in listing practices.
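For reference, the PELD score discussed in these abstracts combines bilirubin, INR, albumin, age and growth failure. The sketch below follows the standard UNOS formula as commonly published; it is an assumption that this exact variant applies here, and by convention bilirubin and INR values below 1.0 are set to 1.0 before taking logarithms.

import math

def peld(bilirubin_mg_dl, inr, albumin_g_dl, age_under_1yr, growth_failure):
    """Approximate UNOS PELD score (sketch; see caveats above)."""
    bili = max(bilirubin_mg_dl, 1.0)   # conventional floor before log
    inr = max(inr, 1.0)
    score = (0.480 * math.log(bili)
             + 1.857 * math.log(inr)
             - 0.687 * math.log(albumin_g_dl)
             + (0.436 if age_under_1yr else 0.0)
             + (0.667 if growth_failure else 0.0))
    return 10.0 * score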
Methods: Data reported to the UNOS registry for children (0-18 years) receiving primary liver or liver/kidney transplantation in two time periods, January 1, 2000 to December 31, 2001 (era 1, n=788) and March 1, 2002 to July 31, 2003 (era 2), were included. Patients coded as acute hepatic necrosis of any etiology were excluded to focus the analysis on patients with chronic liver disease. Results: Waiting-list time was similar (221 days in era 1 vs. 231 days in era 2, NS). The new system has not reduced the percentage of children transplanted while in an ICU (28% in era 1 vs. 27% in era 2, NS) or as status 1 (31% in era 1 vs 28% in era 2, NS). Active exception letters were used for allocation in 15% of patients. Therefore, a calculated PELD score was used for allocation in only 57% of children with chronic liver disease. The mean PELD score at transplant was 16±14.33. Regional means were compared to national means by t-test to examine variability in patient status and listing practices. Significant regional variation was found for the rate of children transplanted as status 1 in 3 regions, the mean PELD score at transplant in one region, the rate of exception letters in 3 regions, and the percentage of children in the ICU at transplant in 3 regions. Patient and graft survival computed by Kaplan-Meier at 3, 6 and 12 months were not significantly different under the new system. Conclusions: The new allocation system appears to serve only 57% of children with chronic liver disease, with 15% of these bypassing the scoring system with exception letters. The new system has not reduced the percentage of children with decompensated chronic liver disease who require ICU support at the time of transplant. Regional variation in listing practices suggests that the system may not have reduced subjectivity in listing practices in pediatric liver transplantation.

Experienced transplant professionals may predict mortality in their sickest patients better than the (Pediatric) Model for End-stage Liver Disease score (MPS). This score has never been tested in the highly selected cirrhotic patients referred to regional review boards (RRB) for accelerated listing. Physician requests for accelerated listing are frequently denied; denied patients may thus have a greater risk of dying before receiving a transplant. This study was done to establish: 1) whether such denials increased mortality; and 2) whether experienced physicians could predict mortality better than the MPS. The requesting physician's assessment of mortality risk had no significant effect on predicting mortality in the Cox proportional hazards model (p=0.23), whereas the MELD had a highly significant effect (p=0.0003). Conclusion: Regional review boards are able to accurately distinguish patients at low risk of death from those at high risk, indicating that the process is fair. The (Pediatric) Model for End-stage Liver Disease remains the most significant predictor of mortality, but the RRB process adds to its predictive ability, providing additional safeguards to sick patients. The referring physician's risk assessment was not predictive of patient mortality.

Background: Split liver transplantation (SLT) has been used to effectively expand the cadaveric donor pool and provide size-appropriate left lateral segment (LLS) grafts for children. To optimize use of a limited resource, the remaining right trisegmental (RTS) graft can be transplanted into adolescents or adults.
Little data exist addressing the outcomes of RTS allografts, and no US report describes a multi-center, multi-OPO cooperative SLT sharing alliance. Methods: Recipients at the 5 participating adult liver transplant programs who received a split RTS liver allograft over a 5-yr period were identified. Prospective donor and recipient information (Table 1) was collected from the individual transplant programs. Results: 63 RTS grafts were generated with the implementation of SLT at 5 Texas centers. Donors were generally young, healthy, and stable, with 70% traumatic deaths. 83% of livers were allocated to pediatric recipients. Splitting occurred via the in-situ (64%) and ex-vivo (36%) techniques. The celiac axis was often maintained with the LLS (52%), requiring vascular reconstruction in 48% of RTS grafts. HCV was the most common indication for OLTx. 33% of RTS grafts were shared among different centers. Mean cold ischemic time was 7:22 ± 3:00 hours. 1-yr patient and graft survival is comparable to UNOS results for primary OLTx (Table 2). Both RTS allografts transplanted into status 1 patients were lost: one due to hepatic artery thrombosis, with subsequent successful re-transplantation with a second RTS graft; the other was itself a second graft, placed for PNF of a whole organ, and the patient expired. MELD score tended to be higher for RTS grafts that were lost (23.4 vs 20.7, p=0.23). There were no cases of primary non-function in the RTS allografts. Complications included HAT (6), portal vein thrombosis (0), and biliary complications requiring reoperation (7). Conclusions: SLT consistently generates 2 functional allografts from one cadaveric donor, thus expanding the donor pool. 1-yr patient and graft survival with RTS grafts compares favorably with UNOS averages. Non-function of the RTS graft is rare. Use of RTS allografts in status 1 patients, or those deemed to be high risk, should be avoided when possible. The Texas alliance shows that broader application of SLT with inter-OPO and inter-programmatic sharing successfully expands the donor pool for adults and children.

Patients with severe portal hypertension and refractory ascites may have normal bilirubin, PT and creatinine and thus a low MELD score. It has been suggested that MELD may not serve this patient population well. Elevation of creatinine has a major impact on MELD; however, renal failure is a late event in patients with ascites and may have an accelerated course leading to death before a donor becomes available. Dilutional hyponatremia reflects an impairment of renal perfusion and has been found to be a predictor of hepatorenal syndrome and death in cirrhotics with ascites. In this study we investigated the prognostic value of serum sodium (Na) and hyponatremia (Na ≤130) in patients listed for liver transplantation (OLT), and whether the addition of Na may increase the accuracy of MELD in estimating waitlist mortality. The c-statistic scores were 0.753 for hyponatremia, 0.784 for Na and 0.894 for MELD. MELD and Na together yielded a score of 0.908. This increase was statistically significant (p=0.026) and represented a net improvement, because it allowed better identification of patients who survived or died during the study period. Conclusions: In this study hyponatremia and Na were significant predictors of 3-month waitlist mortality. More importantly, addition of Na to the MELD significantly increased the accuracy of the score.
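The c-statistic reported above (0.894 for MELD alone vs. 0.908 with Na added) is the probability that a randomly chosen patient who died carries a higher score than a randomly chosen survivor. A minimal sketch of that calculation (Python; an illustrative pairwise implementation, quadratic in the number of patients, not the authors' software):

def c_statistic(scores, died):
    """Concordance between a risk score and a binary death outcome.
    scores: risk score per patient; died: 1 if the patient died, else 0."""
    pairs = 0.0
    concordant = 0.0
    for s_dead, d_dead in zip(scores, died):
        if d_dead != 1:
            continue
        for s_alive, d_alive in zip(scores, died):
            if d_alive != 0:
                continue
            pairs += 1               # one (death, survivor) pair
            if s_dead > s_alive:
                concordant += 1      # score ranks the death higher
            elif s_dead == s_alive:
                concordant += 0.5    # ties count half
    return concordant / pairs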
Na appears an attractive variable for a numerical score because it is objective, quantitative and easily available. The benefit of incorporating Na into the MELD score requires further confirmation.

Objective: The impact on the health care system of liver transplantation (LT) in patients with high disease severity, based on the model for end-stage liver disease (MELD), is undetermined. This project analyzed the association of high MELD scores with clinical outcomes and health resource utilization. Patients and methods: A retrospective review of 682 adult single-organ LT recipients (ICD-9 50.51, 50.59) transplanted in the 15-month period ending March 2003 was performed. Data were voluntarily provided by 34 member institutions and analyzed independently by UHC. Recipients were classified into 4 groups (<10, 10-20, 20-30, >30) based on their MELD scores. Results: MELD scores of recipients at LT were <10 in 100, 11-20 in 322, 21-30 in 176, and >30 in 84 patients. At 3 months, patient survival was similar between all groups (96.5-98%, p=NS). However, graft survival was significantly lower in patients with MELD >30 than in patients with MELD <10 (85.7% vs 96%, p<0.05). Post-transplant complication rates exhibited a stepwise progression from 38% to 60.7% and were highest in recipients with MELD scores >30. Average ICU lengths of stay (LOS) were 4.3, 5.1, 7.2, and 9.8 days for patients with MELD <10, 10-20, 20-30, and >30, respectively. Average hospital LOS post-LT also steadily increased, from 11.6 to 13.4, 16.7 and 21.1 days. More importantly, only 50% of patients with MELD >30, compared to 85% with MELD <10, were discharged to home. Cost analysis for the initial transplant period averaged $96,943, $99,658, $118,581, and $160,182 for recipients with MELD <10, 10-20, 20-30, and >30, respectively. The most significant cost increases were in blood product utilization ($5,215 to $16,534), pharmacy ($11,425 to $21,598) and dialysis ($4,541 to $8,512) as MELD increased from <10 to >30. In contrast, the cost of surgical and medical services did not differ significantly among the MELD groups ($15,025 to $18,011, p=NS). Conclusions: LT in recipients with high severity of illness is performed with good short-term survival outcomes. Recipients with higher MELD scores exhibit extended ICU stays, prolonged hospital LOS, increased hospital costs and a requirement for extended care following hospital discharge. This study argues for adjusted compensation based on the severity of disease. Additionally, the impact of cost should be considered by policymakers when determining allocation priority systems.

Background: The Eastern Cooperative Oncology Group (ECOG) performance status is a simple tool to assess a patient's disease progression, assess the impact of disease on the patient's daily living abilities, and determine appropriate treatment and prognosis for cancer patients. It is measured on a scale from 0 to 4, with 0 for patients who are fully active and able to carry on all normal activity without restriction, and 4 for patients who are completely disabled, unable to carry on any self-care, and totally confined to bed or chair. It is evident that the ECOG score reflects the physiological status of a patient. The objective of this study was to investigate the possible association between ECOG performance status and post-liver transplantation mortality in patients receiving a first liver transplantation in the UK & Ireland.
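The unadjusted odds ratios reported in the Results below come from 2×2 tables of 90-day death against ECOG score relative to score 0. A minimal sketch of that calculation (Python; the counts in the example are hypothetical, not the study's data):

def odds_ratio(deaths_score_k, survivors_score_k, deaths_score_0, survivors_score_0):
    """Unadjusted odds ratio for 90-day mortality, ECOG score k vs. score 0."""
    return (deaths_score_k / survivors_score_k) / (deaths_score_0 / survivors_score_0)

# Hypothetical counts for illustration only:
print(odds_ratio(30, 170, 50, 550))  # -> ~1.94, i.e. roughly doubled odds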
Methods: We included all 4112 adult patients who underwent a non-urgent first liver transplantation between 1 March 1994 and 31 March 2003. As a first step, the impact of the ECOG performance score on 90-day patient mortality was assessed by univariate analysis. Multivariate logistic regression modelling was then employed to adjust the ECOG score for other risk factors, using bootstrap sampling techniques. Results: The overall 90-day patient mortality was 9.7% (95% confidence interval 8.8% to 10.6%). Compared with patients with a score of 0, the unadjusted odds ratios for 90-day mortality for scores of 1, 2, 3 and 4 were 1.2, 1.8, 2.3 and 6.8, respectively (p<0.001). After adjustment for other risk factors the ECOG score remained a significant predictor of 90-day mortality (p<0.001). Recipient female sex, serum creatinine, cold ischaemia time, abnormal donor organ appearance and use of partial organs were also significantly associated with mortality (p<0.05). Conclusions: The ECOG performance status indicator is a simple score which can be easily measured. Patients with higher ECOG scores have a worse outcome after liver transplantation than those with lower scores. The results of this study suggest that the performance status of the patient should be taken into consideration when selecting patients for liver transplantation.

Campbell,2 P. Scheiner,4 A. Fisher,1 M. Korogodsky,1 K. Noto.3 1Surgery, New Jersey Medical School, Newark, NJ; 2Surgery, New York University Medical Center, New York, NY; 3Surgery, Albert Einstein Medical Center, Philadelphia, PA; 4Surgery, New York Medical College, Valhalla, NY.

UNOS policy 3.6.4.4, requiring chest CT (CCT) and bone scan (BS) in liver transplant candidates with HCC, is consensus-based rather than evidence-based. We hypothesized that this policy does not improve patient selection and is not cost-effective. This retrospective study included all patients with HCC evaluated for transplant between Jan 1999 and Dec 2002 at 4 centers and excluded HCC diagnosed after transplant. The majority of CCT/BS were performed at transplant centers. Scans with an indeterminate result were repeated a few weeks later. Evaluation outcomes were: accepted for listing; not accepted, advanced HCC; or not accepted, other reasons. Outcomes following listing were: transplantation; waiting for transplant; delisted, HCC progression; or delisted, other reasons. The charges were $1,608/CCT and $1,689/BS. Physician fees and personnel costs were not included. 187 patients with HCC were evaluated at the 4 centers. The numbers of scans performed and their interpretations are shown. (*One patient died from biopsy of a mediastinal mass, later proved benign, seen on CCT but not on chest x-ray.) Only one patient was declined for listing based on the results of the initial scans, after 3 indeterminate BS followed by biopsy of a metastatic lesion in the hip. 158 patients were listed; 18 and 11 were declined for listing due to HCC and other reasons, respectively. After listing, 120 patients were transplanted, 13 are waiting, and 16 and 9 were delisted due to HCC and other reasons, respectively. Excluding invasive procedures, CCT/BS incurred $803,901 in charges. After transplantation, 5/120 patients had recurrence; all 5 had negative CCT before transplant, and 1 BS was indeterminate.

We have previously demonstrated that dendritic cells (DC) actively rearrange their cytoskeleton to form the immunological synapse (IS) with allogeneic CD4+ T cells.
We have now examined the localization and function of DC costimulatory molecules during IS formation. We used a TCR-transgenic model that recognizes a specific OVA peptide in the context of MHC class II. Mature DC were pulsed with different doses of agonist or control peptide (10 nM-10000 nM) or media alone, and conjugates were formed at 1:3 to 1:10 ratios with naive CD4+ DO11.10 transgenic T cells by low-speed centrifugation. The time course of DC-T cell conjugate formation was determined by labeling DC with CMFDA and T cells with CM-DiI dyes and examining conjugates at different timepoints using an inverted fluorescence microscope. We performed time-lapse video microscopy to determine conjugate stability. To determine the functional role and duration of costimulation during IS formation, T cells were loaded with CFSE dye, and then anti-CD80, anti-CD86 or both were added to DC-T cell conjugates 0, 0.5, 1, 2, 6, 12 or 24 hours after conjugate formation. T cells were isolated after 48, 60 or 72 hours and evaluated for a first cell division by flow cytometry. DC-T cell conjugates were stained using antibodies against CD80 and CD86, examined by confocal microscopy, and scored blindly. DC-T cell conjugation occurred rapidly, reaching a plateau by 30 minutes. The majority of conjugates were stable for >12 hours. Blockade of CD80 inhibited progression of naive T cells through the first cell division by 26%, while inhibition of CD86 did not significantly affect cell division. Blockade of both costimulatory molecules inhibited the first cell division by 43% at 300 nM peptide and 74% at 30 nM peptide. Inhibition of cell division and of IL-2 production (by intracellular staining) occurred when antibodies were added up to 12 hours following conjugate formation, indicating that T cell activation requires prolonged costimulation. Interestingly, there was only modest polarization of either CD80 or CD86 into the IS in the presence of peptide, not significantly different from control. These results demonstrate that prolonged synapse formation is critical for delivery of an appropriate costimulatory signal to naive T cells. The relatively low level of CD80/CD86 polarization suggests that sufficient molecules are present in the IS for lattice formation with CD28, resulting in optimal signaling.

Background: Inductive anti-CD40L mAb is highly effective as a monotherapy in preventing allograft rejection, indicating that the CD40-CD40L pathway is critical to the rejection process. Most studies have employed anti-CD40L mAb to investigate the role of this pathway. While this approach is often referred to as a "blockade" of CD40-CD40L interactions, the precise mechanism of action of anti-CD40L mAb remains to be established, and very little is known regarding the role of CD40 in the rejection process. This study was designed to test the hypothesis that disrupting CD40-CD40L interactions by targeting CD40, as opposed to CD40L, will yield distinct outcomes. Methods: Wild-type (WT) C57BL/6 (B6) or BALB/c cardiac allograft recipients were injected i.p. with 1 mg anti-CD40L mAb (MR1) on days 0, 1 and 2 relative to transplantation. Experimental results were compared to CD40-/- B6 or BALB/c recipients of CD40-/- allografts. Primed and precursor donor-reactive Th1 and Th2 responses were monitored by ELISPOT assays, and the impact of donor-derived dendritic cells (DC) on outcome was assessed.
Results: Inductive anti-CD40L therapy was remarkably effective at promoting graft survival (>60 days) and inhibiting T cell priming in both B6 and BALB/c recipients. Similarly, graft survival was prolonged and T cell priming was prevented in CD40-/- B6 recipients. In contrast, CD40-/- BALB/c recipients acutely rejected their allografts and mounted both Th1 and Th2 responses. CD40-/- B6 mice could be "forced" to acutely reject their allografts by injection of CD40-/- donor-derived DC, which induced Th1 and Th2 responses. However, donor-derived DC failed to induce rejection in WT B6 mice treated with anti-CD40L mAb. To assess the impact of donor-derived DC on "accepted" allograft function, B6 CD40-/- or anti-CD40L-treated WT recipients were injected with donor DC on day 30 post-transplant. DC induced rejection in the majority of CD40-/- recipients, but not in anti-CD40L-treated mice. However, DC failed to induce Th1 or Th2 responses in either setting, suggesting the induction of Treg. The observation that anti-CD40L mAb prevents rejection in BALB/c recipients, while CD40 is not required in this strain, supports the notion that CD40 and CD40L play roles independent of their interaction with one another. The observation that DC induce rejection of "accepted" grafts in B6 CD40-/- recipients but not following anti-CD40L therapy further points to differences between targeting CD40 and targeting CD40L to prevent rejection.

Memory T cells have properties that are beneficial for host protection but may be deleterious for the transplanted organ. It has been shown that the presence of donor-reactive memory CD4 T cells prevents the beneficial effect of donor-specific cell transfusion plus anti-CD154 mAb MR1 (DST/MR1) on heart allograft survival in mice. The goal of this study was to understand the mechanisms of resistance to costimulatory blockade mediated by memory CD4 T cells. We first transferred into B6 mice either naïve or C3H-reactive memory CD4 T cells, followed by C3H DST/MR1 treatment and C3H heart allografts. As anticipated, recipients of naïve CD4 T cells had prolonged graft survival (>60 d, n=3) compared to non-treated mice (8 d, n=6), while recipients of memory CD4 T cells rejected the grafts by day 13 (n=5). We next evaluated the effect of CD4 memory cells on endogenous donor-reactive CD8 T cells. CD8 T cells from animals that received naïve CD4 T cells plus DST/MR1 (prolonged graft survival) responded weakly to donor stimulator cells (200 IFN-γ producers/million by ELISPOT and 18% lysis by in vivo CTL assay), similar to naïve non-transplanted mice. In contrast, the anti-donor CD8 T cell response was strong in animals transferred with memory CD4 T cells (1050 IFN-γ spots/million and 98% lysis), comparable to non-treated rejecting recipients. These findings suggest that endogenous CD8 T cells participate in graft destruction under these conditions. To test this, we transferred memory CD4 T cells into B6 recipients of C3H heart grafts treated with anti-CD8 depleting Ab. Depletion of CD8 T cells resulted in significant prolongation of heart graft survival (MST 21.6±0.7 days), although all grafts were ultimately rejected, suggesting that cells other than CD8 T cells mediated graft destruction under these circumstances. Consistent with this, the rejecting grafts in the depleted recipients were heavily infiltrated with CD4 but not CD8 T cells. Furthermore, serum obtained from both anti-CD8-depleted and non-depleted recipients contained anti-donor alloantibodies.
These data clearly indicate that DST/MR1 treatment did not affect the ability of memory CD4 T cells to provide help for activation of anti-donor CD8 T cells and B cells, and that the induced alloimmunity functioned in multiple ways to mediate graft destruction. As memory T cells comprise a significant portion of the alloreactive T cell repertoire in humans, our results may translate into improved therapies for human transplant recipients.

CTLA4.Ig-treated controls, p<0.001), indicating the cooperative interaction of these 2 pathways in T cell activation and the relevance of dual targeting to prevent acute rejection. Lastly, a sub-therapeutic course of rapamycin significantly prolonged allograft survival in BTLA-/- recipients (mean >40 d vs. 10 d in RPM-treated controls, p<0.001). Our data indicate that in the context of alloresponses, BTLA promotes both CD28-dependent and CD28-independent acute rejection, and its targeting can contribute to long-term allograft survival.

Blockade of the CD40-CD40L costimulatory pathway results in long-term allograft survival but does not prevent chronic rejection; new immunotherapies are therefore needed to obtain tolerance. ICOS and its ligand, B7RP-1, are recently described members of the CD28-B7 families which play an important role in T cell activation and survival. A previous study of blockade of CD40-CD40L interactions in a rat heart allograft transplantation model, using an adenovirus coding for CD40Ig, resulted in long-term (>300 days) survival concomitant with the development of chronic rejection. We analyzed the effect of co-treatment with an anti-ICOS mAb (JTT1, 1 mg/week) and CD40Ig on chronic rejection. CD40Ig and anti-ICOS treatment also led to indefinite graft survival in all recipients (>120 days, n=7). Treatment with anti-ICOS alone resulted in a modest but significant prolongation of allograft survival (17±1.5 vs 7.6±1.6 days, p=0.002, n=4). Analysis of chronic rejection lesions at day 120 after transplantation showed that in the CD40Ig+anti-ICOS group (n=7) lesions (p<0.001), but not fibrosis, were significantly lower. Importantly, 4 of the 7 CD40Ig+anti-ICOS-treated recipients showed complete absence of chronic rejection lesions, whereas all CD40Ig-treated recipients showed signs of chronic rejection. The CD40Ig+anti-ICOS group showed decreased graft infiltration by TCRαβ+, CD4+ and CD8α+ cells, as well as macrophages and mast cells. Recipients in the CD40Ig+anti-ICOS group displayed significant inhibition of anti-donor CTL activity. Alloantigenic proliferative responses of splenocytes in the CD40Ig+anti-ICOS group were strongly inhibited and were reversed by IL-2. CD40Ig treatment strongly but incompletely inhibited total IgG, IgG1 (Th2), IgG2a (Th2) and IgG2b (Th1) alloantibody production vs. untreated rejecting recipients. Residual total IgG and IgG2b, but not IgG1 and IgG2a, were significantly reduced in the sera of CD40Ig+anti-ICOS-treated allograft recipients at day 120, suggesting a preferential inhibition of Th1 responses. These data indicate that the chronic rejection mechanisms that are CD40-CD40L-independent are ICOS/B7RP-1-dependent, and that operational tolerance can be obtained by simultaneous blockade of these two costimulatory pathways.

Purpose: To provide the first efficacy and safety data on the malononitrilamide FK778 in transplant patients. This new agent is structurally unrelated to known and currently used immunosuppressants. In animal and in vitro models, it inhibits acute rejection, modifies vasculopathy and shows anti-viral activity.
In this European, multicentre, phase II study, FK778 was administered to kidney transplant recipients at two concentration-controlled ranges. Methods: In a double-blind manner, 149 patients were randomized to a 12-week treatment with FK778 in combination with tacrolimus (Tac) and corticosteroids (S). 49 patients of the high-level group (H) received 2 x 600 mg/day FK778 and continued on 150 mg/day, 54 patients of the low-level group (L) received 1 x 600 mg/day followed by 75 mg/day, and 46 patients received placebo (P). Subsequent FK778 doses were adjusted to trough levels of 100-200 µg/mL (H) and 10-100 µg/mL (L). An independent unblinded pharmacokinetic panel advised the necessary dose adjustments. The primary endpoint was the incidence of biopsy-proven acute rejection (AR). In phase I and II clinical trials a transient, reversible, first-dose effect of FTY720 on heart rate (HR) was evident, causing bradycardia in some patients but no difference in cardiac morbidity. This study investigated cardiac effects in renal transplant recipients treated with FTY720 or MMF for a minimum of 12 months. METHODS: 421 recipients (FTY720, n=94; MMF, n=327) pooled from phase II studies underwent ECG and 24-hour Holter monitoring. Results were compared between pooled FTY720, MMF and the FTY720 dosing groups (2.5 and 5 mg). RESULTS: Demographic characteristics were comparable except for a higher prevalence of patients >60 yrs in the MMF group vs FTY720 (18% vs 9%, respectively). Analysis of mean hourly HR yielded no significant difference between groups. Analyses based on discrimination by day/night, concomitant use of β-blocker, gender or age revealed no differences. Bradycardia (HR <50 bpm), whether sustained (for >1 min), severe (<35 bpm), or sustained and severe (HR <35 bpm for >1 min), was observed more frequently with MMF. Among patients maintained on β-blockers the incidence of bradycardia was comparable between the FTY720 and MMF groups. Serious Holter findings were observed only in the MMF group: ventricular tachycardia (n=1), Torsade de Pointes (n=1) and second-degree atrioventricular block (n=2). Autonomic cardiovascular responsiveness (measured by lying and standing systolic and diastolic BP, and HR) was comparable between groups. ECG-derived intervals did not differ significantly between the FTY720 and MMF groups for mean PR (155.1 vs 158.0 msec) and QTc interval (Bazett 409.7 vs 404.1 msec; Fridericia 401.9 vs 395.8 msec), respectively. The lack of significant differences for these parameters supports the absence of a dose-dependent effect. CONCLUSIONS: This study demonstrates the absence of any clinically significant effect of FTY720 on cardiac rhythm in patients on chronic therapy, thus confirming that the transient reduction in HR after the first dose of FTY720 does not persist in the maintenance phase. Campath-1H (C1H) induction in renal transplant (RTx) recipients (recips) has been associated with a state of 'near tolerance' reflected in good immunologic outcomes coupled with use of less than normative levels of maintenance immunosuppressive therapy (MIRx). Recently, no acute rejection (AR) or graft loss was reported in 23 recips receiving C1H (30 mg); 3 doses of Solu-Medrol (500 mg intraop, 250 mg POD #1, 125 mg POD #2); and dual-drug, Pred-free MIRx with mycophenolate mofetil (MMF) (1 g bid) and low-dose Tac (Am J Transplant 2002; 2(Suppl 3)(A):397).
In view of the nonoverlapping toxicities associated with CsA vs Tac, the former may be preferable in selected recips receiving such dual-drug maintenance minimization protocols. Some controversy exists, however, regarding the relative immunologic efficacy of CsA vs Tac. To determine if CsA can be utilized as effectively and safely as Tac in the dual-drug MIRx regimen described above, we have undertaken a comparative study alternating use of the two calcineurin inhibitors in our RTx recips. Results from the first 17 of these patients are reported herein. Mean recip age is 47 yrs (21-73). Recip race: Cauc (n=9), Hisp (n=4), AA (n=3), other (n=2). Donors: deceased (n=6) and living (n=11). Target C0 levels from 0-3 months were 8-10 ng/ml for Tac (n=8) and 150-200 ng/ml for CsA (n=9). Mean follow-up is 2.3 ± 1.4 mo. No AR has occurred in either cohort. There has been one infectious complication (cellulitis surrounding a drain site [resolved]) and one lymphocele (drained via a peritoneal window). No CMV or other opportunistic infections have occurred. To date, two recips have been converted from MMF to Pred due to leucopenia; both were from the Tac cohort. Creatinine (Cr) levels did not differ significantly between the CsA and Tac recips. Combining the cohorts, Cr at 1 mo, 2 mo, and 3 mo was 1.7 ± 0.6 mg% (n=12), 1.4 ± 0.4 mg% (n=9), and 1.1 ± 0.3 mg% (n=6), respectively (mean ± SD). The data suggest that CsA and Tac may be used interchangeably in maintenance minimization protocols following C1H induction without compromising either immunologic efficacy or safety. Further modifications of the present protocol will focus on reducing target C0 for both CsA and Tac and on the possibility that drug interactions may predispose Tac-MMF recips to more severe leucopenia. CONTROLLED STUDY TO REDUCE THE IMMUNOSUPPRESSIVE LOAD AFTER KIDNEY TRANSPLANTATION GUIDED BY DONOR-SPECIFIC CTLp MONITORING. Jacqueline van de Wetering, 1 Barbara J. van der Mast, 1 Petronella de Kuiper, 1 Nicolle M. van Besouw, 1 Jacqueline Richen-Vos, 1 Jan N. M. IJzermans, 2 Willem Weimar. 1 1 Internal Medicine, Erasmus MC, University Medical Center, Rotterdam, Netherlands; 2 General Surgery, Erasmus MC, University Medical Center, Rotterdam, Netherlands. Background: Tapering of immunosuppressive medication is indicated to prevent long-term side effects. Recently we have shown that renal transplant recipients can safely be converted from calcineurin inhibitors to MMF or AZA when their donor-specific cytotoxic T lymphocyte precursor frequencies (CTLpf) are below 10/10^6 PBMC. We wondered whether the CTLpf also has predictive value when immunosuppression is reduced in patients only on MMF or AZA and steroids. Methods: Renal transplant recipients with stable renal function, without proteinuria, and at least two years after transplantation were included. If their CTLpf was low (<10/10^6 PBMC), the MMF or AZA dose was reduced to 75% at 4 months and to 50% at 8 months after inclusion (the decision rule is sketched below). If their CTLpf was high (≥10/10^6 PBMC), they were randomized into two groups: in one group immunosuppressive medication was tapered, while the other group served as control. Sixty-eight patients have reached the one-year follow-up end point. Their median time after transplantation was 4.2 years (range 2.0-15.5 years). CTLpf was low in 41 (60%) and high in 27 (40%) patients. All patients had detectable cytotoxicity against third-party cells. In all patients with low and in 15 patients with high CTLpf the MMF or AZA dose was reduced, while 12 patients served as controls.
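As flagged above, the tapering protocol lends itself to a compact decision function. Below is a minimal Python sketch under the stated rules (CTLpf threshold of 10/10^6 PBMC; dose reduced to 75% of baseline at month 4 and 50% at month 8; high-CTLpf patients randomized to taper or control); the function names, the 1:1 randomization hook, and the dose-as-fraction convention are illustrative, not from the study.

# Sketch of the CTLp-guided tapering protocol described above.
# Assumptions (not from the abstract): dose is expressed as a fraction of
# the baseline MMF/AZA dose, and arm assignment happens once at inclusion.
import random

CTLPF_THRESHOLD = 10.0  # donor-specific precursors per 10^6 PBMC

def assign_arm(ctlpf: float) -> str:
    """Low CTLpf always tapers; high CTLpf is randomized 1:1."""
    if ctlpf < CTLPF_THRESHOLD:
        return "taper"
    return random.choice(["taper", "control"])

def dose_fraction(arm: str, month: int) -> float:
    """Fraction of baseline MMF/AZA dose at a given month after inclusion."""
    if arm == "control" or month < 4:
        return 1.0
    return 0.75 if month < 8 else 0.5

arm = assign_arm(4.0)              # CTLpf of 4/10^6 PBMC -> "taper"
print(arm, dose_fraction(arm, 9))  # 0.5 of baseline from month 8 onward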
During tapering of immunosuppression, reversible acute rejection was observed in 1 of 41 (2%) patients with low and 2 of 15 (13%) patients with high CTLpf. Biopsy-proven chronic allograft nephropathy was diagnosed in 1 patient of each group. The majority of patients on MMF or AZA long after kidney transplantation have undetectable donor-specific CTLpf. In this group a 50% reduction of immunosuppression is safe, and further decreasing their immunosuppressive load is the obvious next step. Patients with high CTLpf seem to have a higher risk of developing acute rejection, but in most of them (87%) reducing the immunosuppression is possible. except for one child lost on day 159 from tumor recurrence. Incidence of early (<28 days post-LT) AR was 7/20 (35%) in group TS, versus 1/20 (5%) in group TB (p=0.044). Gene polymorphisms for IFNγ, TNFα, and IL10 could be correlated neither with clinical outcome nor with pre- and post-LT levels of the corresponding cytokines. The immunological monitoring showed: (1) a peri-operative IL10 peak (as assessed by variations from baseline), which was significantly more sustained in the 32 children without early AR (+1h: p<0.001; +2h: p<0.001; day 1: p=0.004) when compared with the 8 children with AR (+1h: NS; +2h: p=0.018; day 1: NS); (2) a statistically significant decrement in IL2, IL4, IFNγ, and TNFα blood levels between day 1 and day 28 in the children without AR, whereas the corresponding levels remained stable in children with AR. Conclusions: Early AR was associated with a lower peri-operative peak of the Th2-type cytokine IL10, and with a subsequent lack of post-LT decrease of Th1-type cytokines. This study suggests the clinical relevance of Th1/Th2 immune deviation as a predictor/indicator of early AR in pediatric liver transplantation, to be confirmed with the currently ongoing intra-graft analysis of cytokine mRNA precursors using a quantitative PCR technique. LTx (n=7), liver-kidney (LKTx, n=1), and SBTx (n=4) recipients, median age 6.5 years, who received rATG and steroid-free tacrolimus/sirolimus (TAC/SRL), were screened for T-sup. Four control LTx recipients, median age 15 years (5-20), in whom conventional IM without rATG was successfully discontinued for at least one year (no-IM), were also screened as a reference population. Magnetic-bead-sorted, >90% pure B cells (CD19+), which present antigen, were stimulated by a T-helper cell line (D1.1) transfected to overexpress the costimulatory molecule CD40 ligand. The resulting frequency (%) of B cells expressing CD86 (B7.2), a costimulatory/activation marker, represented baseline expression. T-sup function was present if this was decreased to <90% of the baseline value after addition of recipient CD8+CD28- cells; no change or CD86 upregulation indicated absence of T-sup (this criterion is sketched below). Results: Median time to T-sup analysis was 16 months in rATG subjects, and 168 months in no-IM subjects. Nine of 16 subjects demonstrated T-sup (%CD19+86+ reduced to 69±28% of baseline) while 7 subjects did not (%CD19+86+ increased to 114±16% of baseline). T-sup were present in 4/4 children with no IM, and absent from 3 of 4 subjects with rejection (REJ), all of whom experienced >1 REJ episode. An 8-year-old subject with one previous REJ episode demonstrated T-sup cells at admission for diarrhea, 6 months after SBTx. TAC trough levels (C0) were reduced to 5-7 ng/ml for adenoviral enteritis, in the absence of biopsy-proven REJ. No rejection occurred. In the reduced-IM group (n=8), 4 children demonstrated T-sup, and 4 did not. In this group, a 6-year-old LKTx recipient demonstrated T-sup at annual follow-up, during clinically indicated reduction of TAC C0 to 5-8 ng/ml to protect declining renal allograft function. Conclusions: Donor-specific T-suppressor cells may characterize transplant recipients in whom graft function can be maintained with minimal or no immunosuppression. Their absence from some subjects who are on reduced immunosuppression suggests that alternative adaptive mechanisms, such as negative selection or novel homeostasis, must be entertained. Such assays may also permit safe evaluation of prospective immunosuppression withdrawal strategies.
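The suppression readout defined in the T-sup abstract above reduces to a single ratio against baseline. A minimal sketch, assuming the only inputs are the baseline %CD19+CD86+ and the value measured after adding CD8+CD28- cells (names and example values are illustrative):

# T-sup call from the CD86 assay described above: suppression is present
# when CD86 expression falls below 90% of the baseline value after
# recipient CD8+CD28- cells are added. Example values are illustrative.

def t_sup_present(cd86_baseline: float, cd86_with_cd8_28neg: float,
                  cutoff: float = 0.90) -> bool:
    """True if %CD19+CD86+ drops below `cutoff` of baseline."""
    return (cd86_with_cd8_28neg / cd86_baseline) < cutoff

print(t_sup_present(60.0, 41.4))   # 69% of baseline -> True (T-sup present)
print(t_sup_present(60.0, 68.4))   # 114% of baseline -> False (absent)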
Operational tolerance (graft acceptance in an immunosuppression (IS)-free environment) after LDLT occurs in a substantial number (approximately 40%) of patients under our elective protocol (72:449, 2001). There is, nevertheless, no reliable parameter to monitor and select patients who may discontinue IS without a risk of rejection. To identify such parameters, we systematically analyzed the peripheral blood lymphocyte profiles of liver transplant (Tx) patients exhibiting operational tolerance. Methods. Seventeen liver Tx patients (Gr-tol) regarded as tolerant were enrolled in this study. FACS analysis was performed to determine the phenotype of peripheral blood lymphocytes. As controls, twenty-four age-matched volunteers with normal liver function (Gr-vol) were analyzed. Results. An increase was observed in the percentages of B cells, CD8+ T cells, CD4+CD25high cells and Vδ1 γδ T cells in peripheral blood lymphocytes from Gr-tol, compared with those from Gr-vol (p<0.01, p=0.14, p<0.05, p<0.01, respectively). On the other hand, the percentages of CD4+ T cells, NK cells, NKT cells, and Vδ2 γδ T cells were decreased in Gr-tol compared with those in Gr-vol (p<0.05, p<0.05, p<0.01, respectively). There was no significant difference between the two groups with respect to the percentages of pan-T cells, total CD4+CD25+ cells, αβ T cells and γδ T cells (NS, respectively). A similar tendency was found in the absolute numbers of each lymphocyte phenotype. Conclusion. The present results revealed several intriguing features of peripheral blood lymphocyte subsets in patients exhibiting operational tolerance after LDLT. Although the contribution of those subsets to the tolerant state remains elusive, the results may provide clues for reliable indicators of tolerance after human liver Tx. A. The purpose of this study was to develop a method whereby livers rejected for whole-organ transplantation could be processed in a regulatory-compliant manner suitable for the manufacture of human cell therapy products. B. Donated livers, not suitable for orthotopic liver transplantation, were obtained from federally designated organ procurement organizations. Cells were isolated utilizing a two-step collagenase digestion protocol. Separation of live from dead cells was facilitated by using a density gradient within the closed system of a commercially available cell washer. Cryopreservation methodologies were used to preserve the live cells. Commercially available primary antibodies and fluorochrome-conjugated secondary antibodies were employed for determination of cell phenotypes. C. Processing donated livers yields a suspension containing billions of viable cells. These liver cell isolates can be reliably cryopreserved and thawed with a high degree of cell viability and recovery. The majority of the recovered cells are hepatocytes, based on expression of albumin.
The remaining cells are a mixture of Kupffer cells, biliary cells, T cells, and progenitor cells, based on their expression of a panel of specific cell-surface and cytoplasmic markers. In vitro expansion of mature hepatocytes has been achieved using hormonally defined medium. D. We have successfully developed a process to manufacture a cell therapy product containing predominantly mature hepatocytes, and have received FDA allowance to begin Phase I clinical trials of this product. It is anticipated that several patients can be treated using the mature hepatocytes obtained from just one donated organ. Expansion of these mature cells should enable treatment of tens to hundreds of patients. Methods: Pediatric and adult EBV D+/R- transplant recipients were enrolled at 4 North American centers. Patients were randomized to receive either ganciclovir + placebo/no product or ganciclovir + immune globulin (IG) for 3 months. Following this, patients were unblinded, and IG patients received additional IG therapy up to 6 months. All patients received oral antiviral therapy until 1 year post-transplant. EBV viral loads were measured once monthly until 12 months post-transplant. Results: A total of 34 patients completed the study protocol (16 in the placebo arm and 18 in the IG arm). 25 patients were pediatric and 9 were adults (22 male, 12 female). Transplant types included liver (n=13), kidney (n=12), lung (n=8) and pancreas (n=1). Both groups were comparable at baseline in terms of age, sex, and donor/recipient CMV status. All patients had an undetectable EBV viral load at baseline. The incidence of detectable viremia within the first year post-transplant was 13/16 (81.3%) in the ganciclovir arm vs. 13/18 (72.2%) in the ganciclovir+IG arm (p=0.8). Time to first detectable viremia and time to high-level viremia (viral load ≥3 log10 copies) were not significantly different between the two arms. By repeated-measures ANOVA and by estimation of viral load AUC, no significant effect of randomization group was observed on EBV viral loads. Viral loads in adults were lower than in pediatric patients in the first 6 months, but no difference was observed by 12 months. PTLD developed in 3 (8.8%) patients (all 3 in the IG arm; p=0.23) within the first 12 months. Conclusions: No significant difference in EBV viral load suppression was observed when ganciclovir was compared with ganciclovir + IG in high-risk EBV D+/R- patients. Viremia was common in both arms despite anti-viral prophylaxis.
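One of the endpoints above, the viral-load AUC, is typically a trapezoidal integral of log-transformed loads over the monthly sampling times. A minimal NumPy sketch with invented sample values; the 3-log10 high-level cutoff is from the abstract, everything else is illustrative.

# Trapezoidal AUC of log10 EBV load over monthly samples, plus the
# high-level-viremia flag (>= 3 log10 copies). The load values below are
# invented; undetectable samples are coded as 0.
import numpy as np

months = np.array([1, 2, 3, 4, 5, 6], dtype=float)
log10_load = np.array([0.0, 2.1, 3.4, 3.0, 2.2, 0.0])

auc = np.trapz(log10_load, months)          # log10-copies x months
high_level = bool((log10_load >= 3.0).any())
print(f"AUC = {auc:.1f}, high-level viremia: {high_level}")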
The objective of this multi-center study was to assess the impact of antiviral exposure on risk of PTLD. Methods. All patients received a renal-only transplant on or after July 1, 1995 and were treated at one of 20 U.S. transplant centers. PTLD cases were confirmed by external review; pathologic information was available for 92%. Up to four controls were identified by UNOS for each case and matched on center, date of transplant and age. Center personnel abstracted data from medical records; data included initial and maintenance immunosuppression and rejection therapies, demographics, pre-transplant EBV serostatus, rejection, risk factors (black race, cadaver donor, prior transplant, HLA mismatches, dialysis time) and antiviral exposure. Antiviral exposure was coded as days on acyclovir and ganciclovir separately (discounting the 30 days prior to PTLD diagnosis and a comparable time for controls, to avoid including therapy that was given in response to PTLD symptoms). Multivariate conditional logistic regression models (stratified by first-year PTLD (early) and post-first-year PTLD (late)) were used to determine the influences of cumulative time on acyclovir and ganciclovir, adjusted for each other as well as other abstracted data. Indication for the antiviral therapy was not taken into account, although the majority of the recorded indications were for prophylaxis. Results. Data were collected for a total of 108 PTLD cases (67 early and 41 late) and 404 controls. For early PTLD, cumulative time on ganciclovir was significantly associated with lower risk of PTLD compared to patients with no antiviral exposure. For every 30 days of ganciclovir, risk of PTLD was lowered by 41% (odds ratio [OR]=0.71; 95% CI=0.51-0.99). There was no such effect for patients on acyclovir. Negative EBV serostatus of the recipient and a history of rejection or prior non-lymphoma malignancies were also associated with early PTLD, with ORs of 24.56 (6.21-97.18), 4.83 (1.91-12.20), and 10.19 (2.50-41.58), respectively. Only rejection history was associated with late PTLD after adjusting for cumulative antiviral exposure and other abstracted information (OR=2.58; 95% CI=1.05-6.36). Conclusion. These results provide support for the role of ganciclovir in reducing PTLD risk during the first year post-transplant. Ganciclovir use was independent of the other predictors of PTLD, suggesting that these risk factors could be useful in identifying patients most likely to benefit from extended treatment with ganciclovir.
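The "per 30 days of ganciclovir" odds ratio above comes from scaling the model's per-day log-odds coefficient: the OR over k days equals exp(k·β). A small Python sketch of that arithmetic, back-calculating β from the reported 30-day OR purely for illustration:

# Rescaling a logistic-regression coefficient to a different exposure
# window: OR over k days = exp(k * beta). Here beta is back-calculated
# from the reported per-30-day OR of 0.71, purely to show the arithmetic.
import math

or_30 = 0.71
beta_per_day = math.log(or_30) / 30
print(f"beta per day = {beta_per_day:.5f}")                   # ~ -0.01142
print(f"OR per 60 days = {math.exp(60 * beta_per_day):.2f}")  # 0.71^2 ~ 0.50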
Post-transplant lymphoproliferative disorder (PTLD) is the leading cause of late morbidity and mortality in the pediatric orthotopic liver transplant (OLT) recipient. This is due to the convergence of multiple factors, including primary EBV infection, EBV donor/recipient mismatch, and long-term immunosuppression. Purpose: To determine if prospectively collected and analyzed EBV titers and titer-driven immunosuppressive modifications could reduce the incidence of PTLD in our OLT population. Between 4/2001 and 10/2003, 34 pts (14 female, 20 male, all <18 yrs old [median age 5.8 yrs]), 21 of whom were EBV seronegative, underwent OLT (25 EBV-seropositive and 9 EBV-seronegative donors) and were enrolled into this prospective study. Maintenance immunosuppression consisted of tacrolimus and prednisone. Tacrolimus troughs were maintained at 10-12 ng/ml, 8-10 ng/ml, and 5-8 ng/ml for post-OLT months 1-3, 4-6, and after 6, respectively. Prednisone was tapered off by 6 months following OLT. All pts were followed for EBV infection via serial clinical examination and monthly serum EBV titers utilizing real-time EBV PCR. EBV titers were considered elevated at >4000 copies/µg DNA. Upon the development of 2 EBV titers >4000 copies/µg DNA, tacrolimus was reduced to a trough of 3-5 ng/ml and, if not already completed, prednisone was discontinued. There was no concurrent use of ganciclovir or other treatment modalities. Immunosuppression was left at this level as long as liver function remained normal, irrespective of lower EBV titers. Follow-up for these pts ranges from 8 to 922 days. Results: Prior to 4/2001, 30 pts underwent OLT and 5 pts (16%) developed biopsy-proven PTLD (Table 1). Since 4/2001, 34 children have entered the protocol and have had their immunosuppressive management assisted by serial serum EBV titers. 21 EBV-seronegative recipients underwent OLT with 17 EBV-seropositive and 4 EBV-seronegative allografts. Since the initiation of this protocol no patient has developed PTLD. This is statistically significant (p<0.05, chi-square test) when compared to the pre-protocol PTLD incidence (see the sketch below). Conclusion: The use of EBV titers as a tool to monitor EBV viral loads and assist in immunosuppressive management has an important role in the care of pediatric liver transplant recipients and leads to a decreased incidence of PTLD.

Table 1.
                PTLD   Non-PTLD
Pre-protocol      5       25
Post-protocol     0       34
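The pre- vs post-protocol comparison in Table 1 is a 2x2 chi-square test. A minimal SciPy sketch reproducing it (SciPy applies Yates' continuity correction to 2x2 tables by default):

# Chi-square test on Table 1 above: PTLD in 5/30 pre-protocol vs 0/34
# post-protocol patients.
from scipy.stats import chi2_contingency

table = [[5, 25],    # pre-protocol:  5 PTLD, 25 without
         [0, 34]]    # post-protocol: 0 PTLD, 34 without
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # p < 0.05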
Abstract# 1056 IFN-γ GENOTYPE AND TGF-β INTERACTIONS CONTRIBUTE TO EBV-ASSOCIATED LYMPHOPROLIFERATIVE DISORDER (EBV-LPD) DEVELOPMENT. Anne M. VanBuskirk, 1 Tyler C. Hoppes, 1 Amy K. Ferketich, 3 Sameek Rowchowdhury, 2 Robert Baiocchi. 2 In post-transplant lymphoproliferative disorder (PTLD), EBV-reactive T cell memory responses are unable to control EBV-driven lymphoproliferation and transformation. The mechanisms leading to the memory immune response failure in PTLD are not well delineated, although a reduction in EBV-reactive CTL activity is important. IFN-γ contributes to T cell and macrophage activation. In contrast, TGF-β antagonizes the effects of IFN-γ, modulating responses by T cells and antigen-presenting cells, and contributes to EBV reactivation. Previous data from a small study indicated an association of a specific IFN-γ polymorphism (A/A at base 847) with both PTLD and TGF-β-mediated inhibition of EBV-reactive CTL in vitro. To more completely test the association of cytokine genotype and EBV-LPD, we performed a prospective study comparing cytokine genotype and LPD development by engrafting peripheral blood leukocytes (PBL) from 49 EBV-reactive, IFN-γ-genotyped donors into SCID mice (hu PBL-SCID mice). The A/A genotype for IFN-γ was significantly more prevalent (p=0.014) in rapid LPD producers compared to intermediate/late LPD producers or donors whose PBL did not produce LPD. All rapid LPD donors exhibited the A/A or T/A genotype, indicating that the A allele is significantly more prevalent in rapid LPD donors (p=0.025). Thus, LPD in hu PBL-SCID mice is associated with the same IFN-γ polymorphisms as PTLD and inhibition of CTL restimulation. Because TGF-β has been shown to counteract the effects of IFN-γ, we went on to assess the contribution of TGF-β to LPD in vivo. Hu PBL-SCID mice were randomized to receive therapy with anti-TGF-β antibody or vehicle. ELISA demonstrated significant reductions in serum TGF-β (p=0.02) in treated mice compared to controls. Preliminary data indicate that TGF-β neutralization results in expansion of human CD8+ cells in hu PBL-SCID mice. Importantly, this pilot study also shows that fewer treated mice (25%) develop LPD compared to controls (90%). Thus, TGF-β appears to contribute to LPD development at least in part by inhibiting CD8+ cell expansion. Given our previous in vitro data demonstrating the ability of TGF-β to inhibit EBV-reactive CTL restimulation in A/A and T/A genotype PBL donors, and the association of these genotypes with PTLD and LPD development, we suggest that TGF-β can inhibit CTL restimulation and expansion in people with the IFN-γ A/A or T/A genotype, thus rendering them more susceptible to PTLD. Renal cell carcinoma (RCC) is one of the most common donor-transmitted malignancies encountered. Still, as little information is available defining the most effective treatment modalities, donor-derived malignancies present a difficult therapeutic dilemma. Purpose: To examine the outcomes and therapeutic options of transplant patients with donor-derived renal cell carcinoma. Methods: Treatment and mortality information on all solid organ transplant recipients with potential donor-derived renal cell carcinoma was reviewed from our database. Results: 72 patients were identified to have received organs from donors with RCC. 47 recipients (65%) had tumor transmission of RCC. In 14 of these patients (30%), incidental tumors of the renal allograft were noted prior to implantation and local excision was performed. No tumor recurrences were noted, and there was a 93% patient and graft survival rate. In 3 patients (6%), incidental RCC of the allograft was recognized within 6 months of transplantation. Each underwent partial nephrectomy with no recurrence and 100% patient and graft survival. The remaining 30 recipients (64%) were diagnosed with extensive allograft involvement and/or diffuse metastatic disease. Of these patients, 17 (57%) had isolated allograft involvement. The mean time to diagnosis was 2.9 months. All of these patients underwent explantation with 89% survival (neither of the 2 deaths was due to malignancy). 9 patients (30%) had allograft involvement and metastatic disease. The mean time to diagnosis was 9.5 months. 2 patients (22%) survived with combined therapy of explantation, immunosuppression discontinuation (ISD) and immunotherapy (ITX). The final 4 patients (13%) had metastatic disease alone. The mean time to diagnosis was 21.6 months. 1 patient (25%) survived with explantation, ISD and ITX. Conclusions: The presentation of donor-transmitted RCC can differ greatly. As a small, allograft-confined lesion, the tumor can be excised safely while preserving renal function. In cases of extensive allograft involvement, explantation and ISD are essential to salvage the patient. Finally, in patients with metastatic disease survival is poor but is associated with explantation, ISD and ITX. Introduction: Malignancy such as PTLD is a major complication of solid organ transplantation. Others have reported an increased risk of PTLD associated with certain immunosuppressive agents, such as OKT3 and tacrolimus. Our analysis of the North American Pediatric Renal Transplant Cooperative Study (NAPRTCS) registry was the first to show that tacrolimus, when used appropriately, was not a risk factor for PTLD (Pediatric Transplantation 2002; 6:396-9). In all cases, the higher risk was not noted in early prospective clinical trials with short primary endpoints, but became apparent with longer follow-up through retrospective analyses, either single-center-based or registry-based. Daclizumab and basiliximab are newer anti-IL2R blocking agents that are used as induction therapy in solid organ transplants. With longer follow-up, the risks for malignancy due to these agents have not been reported. Methods: In this study, we queried the NAPRTCS registry for the rate of malignancy development in patients who received either daclizumab, basiliximab, OKT3, or other (polyclonal) induction. These agents have been listed separately in the database since 1997 onwards. Results: In a previous analysis (Transplantation 2001;71:1065-8), we had reported no increased risk for PTLD with OKT3 use in pediatric renal transplant recipients. A re-review of the pre-1997 data showed a malignancy rate of 3.32% for OKT3 induction recipients versus 2.81% for those who did not receive OKT3 (P = NS). In the post-1997 era, the rate of malignancy in patients receiving OKT3 was higher at 3.9%, but this was not significant versus other strategies by Fisher's exact test (P = 0.23).
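The registry comparison above uses Fisher's exact test, the usual choice when a 2x2 cell count is small. A minimal sketch; the counts below are invented (the abstract reports only rates), so only the choice of test comes from the text.

# Fisher's exact test on a hypothetical induction-vs-malignancy table.
from scipy.stats import fisher_exact

table = [[4, 98],     # hypothetical: OKT3 induction (malignancy, none)
         [20, 980]]   # hypothetical: other induction (malignancy, none)
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.2f}")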
OKT3 use has declined significantly in recent years, with only 7 recipients after the year 2000. The malignancy rates with daclizumab and basiliximab are not higher than in those receiving other induction antibodies (Table), nor are they different from each other. Conclusions: In this early analysis, the first to look at malignancy rates with longer follow-up times, daclizumab and basiliximab do not appear to be associated with higher malignancy rates than other induction therapies or versus each other. INTRODUCTION: Suppression of the αGal antigen-antibody reaction is important for successful pig-to-human xenotransplantation. Genetic modification of donor pigs has been actively pursued worldwide, and consequently homozygous knockout cloned pigs for α1,3-galactosyltransferase (GT) have been successfully produced. However, there is a possibility that such cloned pigs might still express low levels of αGal epitopes through the function of a second GT gene such as iGb3 synthase. We have focused our attention on the production of cloned pigs expressing endo-β-galactosidase C (EndoGalC), which can effectively digest αGal antigens on pig cells. The purpose of this study was to analyze cells isolated from cloned piglets expressing EndoGalC and to assess its potential value for xenotransplantation. METHODS: The coding sequence of EndoGalC, derived from Clostridium perfringens, was ligated into the pCAGGS expression vector. The sequences of the cytoplasmic tail, transmembrane domain and stem region of GT were fused upstream of the EndoGalC gene to localize EndoGalC effectively to the trans-Golgi network. The EndoGalC gene was introduced into primary cultured cells isolated from Meishan or Landrace neonate pigs. Two pig cell lines in which αGal expression was reduced to below 2% were obtained. These cells were used for nuclear transfer. Normal cleavage and development were observed in the nuclear-transferred embryos after two days of in vitro culture. 1282 embryos that had developed to the 2-cell to 16-cell stage were transferred into 15 recipient pigs. RESULTS: Two cloned piglets were born by natural birth. Flow cytometric analysis revealed that more than 98% of αGal epitopes were removed in fibroblasts and red blood cells from the cloned piglets. EndoGalC gene expression was detected in various organs. Immunohistochemical staining with GS-IB4 demonstrated that the level of αGal expression in the heart, kidney and liver was reduced to an undetectable level. CONCLUSIONS: The introduction of the EndoGalC gene could eliminate αGal antigens almost completely. Its efficacy for αGal elimination was comparable to targeted disruption of the GT gene. The project of producing cloned pigs for xenotransplantation is now being implemented, and EndoGalC gene expression appears to be useful as an alternative method to GT knockout. Survival times following miniature swine-to-nonhuman primate renal xenotransplantation (XTx) have remained limited despite removal of natural antibodies (nAb) directed toward the α-1,3-gal (Gal) epitope and the use of T cell tolerance induction regimens aimed at preventing elicitation of new, T cell-dependent antibodies. The inexorable recurrence of anti-Gal nAb leads to accelerated humoral xenograft rejection (AHXR) and coagulation disorders within 2 weeks following XTx. We now demonstrate that neither hyperacute rejection nor AHXR occurs with Gal-knockout (GalT-KO) inbred miniature swine kidneys, even without treatment to remove nAb and/or complement.
Methods: Renal xenograft function in the early postoperative period in baboon recipients of GalT-KO grafts (n=9) was compared to historical control recipients of Gal+ grafts from miniature swine. GalT-KO graft recipients received donor vascularized thymic grafts, splenectomy, and a full immunosuppressive (IS) regimen designed to induce T cell tolerance (including T cell depletion, MMF, steroids and a human anti-human CD154 mAb). Gal+ graft recipients received extracorporeal immunoadsorption (EIA) of anti-Gal nAb and complement inhibition (CI) with a full IS regimen. GalT-KO graft recipients did not receive EIA, and 2 of the 9 received no CI. Results: Recipients of Gal+ kidneys rejected their grafts within the first 2 weeks, showing evidence of humoral rejection. Life-supporting GalT-KO donor renal xenografts showed no evidence of humoral rejection (with the possible exception of mild thrombotic microangiopathy), and graft survivals were extended to a mean of ≥31 days, with the longest survival at 81 days with a functioning xenograft (death was secondary to pneumonia). A slightly higher creatinine (Cr) level was noted in the first week in the two recipients of GalT-KO xenografts that did not receive CI (1.12 mg/dl without CI vs. 0.67 mg/dl with CI on POD 7). However, by POD 14 the Cr levels were similar (1.51 vs. 1.27 mg/dl). Conclusions: GalT-KO donor kidneys appear resistant to hyperacute rejection and AHXR, without the need for removal of nAb or CI, although the latter may protect the grafts from non-specific injury during the early post-transplant period. Tolerance induction strategies may prevent subsequent T cell-dependent antibody responses to other determinants on the GalT-KO kidneys. We have previously reported that vascularized thymic grafts, transplanted either as composite thymokidneys (TK) or vascularized thymic lobe (VTL) grafts, induce tolerance across allogeneic barriers in inbred miniature swine, and survive up to 30 days in pig-to-baboon xenotransplants using normal pig donors. We have now extended the latter studies using α-1,3-gal knockout (GalT-KO) pig donors. Methods: Baboon recipients of either TK (n=5) or VTL plus kidney (n=6) xenografts from GalT-KO pigs were treated with a protocol directed at tolerance induction (thymectomy, splenectomy, and T cell depletion, followed post-transplant by a human anti-human CD154 mAb, low-dose steroids, and MMF). Control baboons received a GalT-KO kidney alone (n=3); one was similarly treated and 2 received chronic immunosuppression (CIS). Nine of 11 experimental recipients, and all controls, received CVF for the first 2 weeks. Results: Currently, 2 recipients of kidney plus VTL are at POD 16. Immunosuppression was discontinued due to infection on PODs 5 and 20 in 2 recipients, and they rejected their kidneys on PODs 13 and 33, respectively. The other 7 recipients did not show evidence of rejection; two survived >50 days (plasma creatinine <1.5 mg/dl), and five expired from other causes (PODs 4, 16, 18, 26, and 31), but with normal renal function (although with some proteinuria). Patchy thrombotic microangiopathy was seen histologically. One long-term survivor demonstrated excellent renal function for 68 days before expiring during surgery to replace an infected IV catheter. Another experienced an apparent rejection crisis starting on POD 53, which reversed with anti-T cell rejection therapy; however, the recipient expired on POD 81 from pneumonia.
Control baboons receiving CIS rejected their grafts on PODs 20 and 33, and the remaining control rejected its kidney on POD 34. Neither hyperacute nor accelerated humoral rejection was observed in any animal. Conclusion: Use of GalT-KO donors has extended the maximum survival of vascularized thymus plus renal xenografts in baboons from 30 to over 80 days. Although the induction regimen still needs to be modified to reduce complications, these initial results are encouraging. Co-transplanting vascularized thymic tissue with xenografts holds the potential of achieving long-term tolerance across pig-to-primate barriers. Cardiac allograft vasculopathy (CAV) is the major factor limiting long-term survival in heart transplantation (HT). The evaluation of coronary flow reserve (CFR) provides valuable information on the status of the coronary vessels. Intracoronary Doppler flow analysis has suggested progressive CFR impairment in HT, although its correlation with CAV is controversial. We assessed the potential role of noninvasive CFR evaluation as a predictor of CAV. Methods: To evaluate the CFR in the left anterior descending coronary artery (LAD) of the cardiac allograft, we examined 26 HT recipients (13 male, aged 46 ± 12 years at HT) without wall motion abnormalities or hypertrophy by transthoracic echocardiography. Coronary blood flow velocity in the LAD was noninvasively detected at rest and during intravenous infusion of adenosine (0.14 mg/kg/min) at 7 ± 5.5 years after HT. CFR was obtained as the ratio of hyperaemic diastolic mean velocity (DMV) to resting DMV (see the sketch below). Noninvasive assessment of CFR was performed blinded to the coronary angiography results and within 24 hours of catheterization. A CFR >2 was regarded as normal. Comparison of means was made by Student's t test; a p value <0.05 was considered significant. Results: CAV was diagnosed in 12 of 26 patients (pts) (46%) (group A), while 14 pts had normal coronary angiograms (group B). Septal and posterior left ventricular wall thickness were similar in the two groups (9.6 ± 0.9 vs 9.8 ± 1.16 mm; 8.9 ± 0.6 vs 9.4 ± 0.8 mm, respectively, p=NS). Blood haemoglobin, heart rate, and systolic and diastolic blood pressure were also similar. CFR was 2.7 ± 0.7 in all pts and reduced in group A vs group B (2.19 ± 0.46 vs 3.13 ± 0.64, p=0.0001). Conclusions: Noninvasive assessment by transthoracic color Doppler echocardiography reveals that impaired CFR is associated with angiographically detectable CAV. CFR impairment seems related to dysfunction of the coronary microcirculation rather than enhanced myocardial oxygen consumption. Contrast-enhanced transthoracic echocardiography may represent a promising noninvasive diagnostic tool in CAV.
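As flagged above, the CFR is a plain velocity ratio with a fixed normality cutoff. A minimal sketch; the example velocities are invented:

# CFR = hyperaemic diastolic mean velocity / resting diastolic mean
# velocity, with CFR > 2 regarded as normal (per the abstract above).

def coronary_flow_reserve(dmv_hyperaemic: float, dmv_rest: float) -> float:
    return dmv_hyperaemic / dmv_rest

cfr = coronary_flow_reserve(62.0, 28.0)   # example velocities, cm/s
print(f"CFR = {cfr:.2f} -> {'normal' if cfr > 2 else 'impaired'}")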
The development of left ventricular hypertrophy in patients after heart transplantation is usually associated with rejection, either cellular or humoral. The long-term outcome of these patients has not been established. We therefore reviewed 160 heart transplant patients with normal baseline echocardiograms one month after transplant surgery. 11 patients were subsequently found to develop posterior wall thickness ≥15 mm within the next 6 months. These patients constitute the Hypertrophy Group, and the remaining 149 patients the Control Group. There was no difference between groups in 5-year survival, freedom from cardiac allograft vasculopathy, or freedom from any treated rejection. In addition, the mean arterial pressure (MAP) at 6 and 12 months and the echocardiographic left ventricular ejection fraction (LVEF) at 6 months post-transplant did not differ between the two groups (see table). The Hypertrophy Group was 90% male, with a mean age of 53 years and a mean donor age of 38 years, all similar to the Control Group. 6/11 patients of the Hypertrophy Group normalized their increased wall thickness within 2 years; the remaining 5 patients had persistent hypertrophy. Conclusion: The development of left ventricular hypertrophy after heart transplantation appears transient for most patients and does not appear to be a marker for poor outcome. Systemic hypertension does not appear to be the cause of the left ventricular hypertrophy, as blood pressures were similar/normal in both groups. The exact cause of left ventricular hypertrophy remains unclear but may be related to a self-limiting inflammatory state. The field of heart transplantation continues to progress with improved outcomes. As more of the large baby-boomer generation reaches older age and develops end-stage heart disease, we are finding that an increasing number of older patients are undergoing heart transplant surgery. However, several registries cite older age as a risk factor for less optimal outcome, as the co-morbidities of older age may compromise outcome. We reviewed 251 adult heart transplant patients between 1/90 and 9/97 to determine if older age (≥65 years) is a true risk factor for poor outcome. Patients were divided into groups ≥65 and <65 years of age. The mean age of the older group was 67±2 years with 15% female, while the younger group had a mean age of 51±11 years with 31% female. The 5-year outcome for the older group vs the younger group was comparable in survival and freedom from angiographic cardiac allograft vasculopathy (see table). Although not significant, the older group had a higher incidence of cancer (33% vs 17%), which was expected. The older group vs the younger group had a significantly lower first-year average biopsy score (ISHLT grade 0=0, grade 1A=1, grade 1B=2, grade 2=3, grade 3A=4, grade 3B=5, grade 4=6; see the sketch below) and less infection (see table). This is consistent with immunosenescence in the older age group with regard to less rejection and subsequently less infection (due to reduced rejection therapy).
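The first-year average biopsy score referenced above uses the stated ISHLT mapping averaged over a patient's biopsies. A minimal sketch; the sample grades are invented:

# Average biopsy score with the ISHLT mapping given in the abstract above
# (0=0, 1A=1, 1B=2, 2=3, 3A=4, 3B=5, 4=6), averaged over first-year EMBs.
ISHLT_SCORE = {"0": 0, "1A": 1, "1B": 2, "2": 3, "3A": 4, "3B": 5, "4": 6}

def mean_biopsy_score(grades):
    return sum(ISHLT_SCORE[g] for g in grades) / len(grades)

print(mean_biopsy_score(["0", "1A", "0", "1B", "0"]))  # 0.6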
Earlier studies have shown that the pre-transplant use of amiodarone may increase the risk of both mortality and re-transplantation following cardiac transplant. We therefore explored whether amiodarone use was independently associated with the risk of cardiac re-transplantation. A retrospective analysis was conducted using the Organ Procurement and Transplantation Network data as of 5/31/2002 on the United Network for Organ Sharing (UNOS) Thoracic Registry. The analysis was restricted to first-time, single-organ cardiac transplant recipients between 4/1/1994 and 3/31/2001 with a known history of amiodarone treatment prior to transplant (n=13,204). The primary outcome was re-transplantation within 60 months following cardiac transplantation. Follow-up time was truncated at the time of death or at 60 months after transplantation. The probability of re-transplantation was calculated using the Kaplan-Meier method. Hazard ratios were estimated using a covariate-adjusted Cox proportional hazards regression model. For risk factors exhibiting non-proportional hazards over time, hazards were modelled either as a function of time or by stratification. Results: 13,204 patients met the study criteria, with 2,720 (20.6%) having a history of amiodarone use prior to transplantation. There were 120 re-transplantations in the 60-month follow-up period. The cumulative probability of re-transplantation at 60 months was 1.32% among amiodarone users compared to 1.53% among nonusers (p=0.61; see figure). Under the proportional hazards assumption, multivariate analysis revealed no significant difference in the risk of re-transplantation for amiodarone-treated patients, with a hazard ratio of 1.00 (95% CI: 0.62-1.62, p=0.99). Pre-transplant use of amiodarone does not appear to be associated with an increased risk of cardiac re-transplantation.
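The survival machinery in the amiodarone analysis above (Kaplan-Meier cumulative probability and a covariate-adjusted Cox model, with follow-up truncated at 60 months) can be sketched with the lifelines package; the eight-row toy dataset is invented, and lifelines is one possible implementation, not the authors' software.

# Kaplan-Meier and Cox PH sketch for a re-transplantation endpoint,
# mirroring the design above: follow-up capped at 60 months, with death
# or reaching 60 months treated as censoring. The toy data are invented.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [60, 14, 60, 33, 60, 7, 60, 52],  # time to retx or censoring
    "retx":   [0, 1, 0, 1, 0, 1, 0, 0],         # 1 = re-transplanted
    "amio":   [1, 0, 1, 1, 0, 0, 0, 1],         # pre-transplant amiodarone
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["retx"])
print(kmf.survival_function_.tail(1))   # event-free probability at 60 mo

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="retx")
print(cph.hazard_ratios_)               # HR for amiodarone use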
Background: Anti-cytomegalovirus (CMV) prophylaxis significantly decreased the morbidity and mortality associated with CMV disease after heart transplantation (HT), but the impact of asymptomatic CMV reactivation is unknown. In this prospective study we hypothesized that asymptomatic CMV activation predisposes to acute rejection (AR). Methods: 36 consecutive HT recipients (43±18 yr, 72% male) were enrolled. Immunosuppression consisted of prophylaxis with daclizumab and a regimen of cyclosporine, prednisone and mycophenolate or sirolimus. All seropositive recipients received CMV prophylaxis, consisting of 4 weeks of intravenous ganciclovir. In addition, seronegative recipients receiving a graft from a seropositive donor received CMV IgG. Endomyocardial biopsies (EMB) were taken weekly during the first month, and monthly thereafter. At each EMB time point peripheral blood leucocytes (PBL) were isolated for CMV-DNA PCR analysis. Results: Patients were followed for 251±153 days; 490 EMBs and 454 blood samples were available for analysis. Although none of the patients presented clinical CMV disease, we found 24 (66%) patients with multiple positive CMV PCRs in PBL, indicating CMV reactivation. This group of patients had a higher 6-month rejection score index than CMV-negative ones (0.79±0.44 vs. 0.51±0.35, P=0.07). Similarly, the estimated incidence of AR ≥1B was 90±7% in CMV-positive patients and 53±17% in CMV-negative ones (P=0.02). Limiting the analysis to AR ≥3A, the estimated AR incidence was 50±11% in CMV-positive vs 20±13% in CMV-negative (P=0.09) patients. Time to first positive CMV PCR was significantly shorter than time to first AR-positive biopsy (53±51 vs. 104±92 days, P=0.05), consistent with CMV reactivation preceding AR. Among the multiple clinical and demographic characteristics evaluated, cold ischemia time (RR for each 30 min = 1.81; P<0.01) and CMV reactivation (RR=5.31; P=0.04) were the only independent predictors of early AR. Conclusions: Early asymptomatic CMV reactivation is associated with the subsequent development of AR. These data suggest a role for CMV in rejection, which may result from direct consequences of viral replication or from an active interplay between the host adaptive immune response to CMV and the alloimmune response to the graft. Anti-viral prophylaxis with ganciclovir appears to prevent acute CMV disease but leaves the recipient open to other consequences of viral persistence. The brunt of ischemia/reperfusion injury (I/R) after heart transplantation is taken by endothelial cells and can have long-term consequences leading to atherosclerosis. Because endothelial cells contribute only a fraction of the total cells in transplanted hearts, we resorted to laser microdissection of endothelial cells from the hearts to determine the molecular adaptations of endothelial cells to I/R after transplantation. Vascularized heart allografts were transplanted from C57BL/6 donor mice into BALB/c recipient mice. Animals were sacrificed after 24 hours, and coronary vessels were isolated from heart allografts and from normal C57BL/6 mice. Endothelial cells were dissected out from the coronary vessels using the Leica® laser dissector software. Total RNA was extracted and amplified for hybridization to the Mouse U74A GeneChip. We used Robust Multichip Analysis to normalize and quantify the expression levels of each gene. Results: In day 1 transplanted heart endothelial cells (THEC), there were 59 and 32 genes with 1.5-fold increased and decreased expression, respectively. These genes were involved in critical cellular metabolic pathways (Table 1). Conclusion: In response to oxidative stress, THEC may increase expression of electron transport chain (ETC) genes to boost production of ATP, and increase vesicular trafficking to eliminate oxidatively damaged proteins and improve cell viability. Because recent evidence indicates that supplementation with serine protease inhibitors (such as aprotinin) greatly reduces ischemia/reperfusion damage, the reduced expression of this class of transcripts in THEC may be an important event in the initiation of the ischemia/reperfusion cascade, and the basis for the efficacy of this treatment.
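The 1.5-fold screen in the microarray abstract above is a simple thresholding of RMA-normalized (log2-scale) values. A NumPy sketch with a random matrix standing in for the roughly 12,000-probe U74A data; all values are invented.

# Fold-change screen after RMA normalization: RMA outputs log2 expression,
# so a 1.5-fold cutoff is |log2 FC| >= log2(1.5). Random data stand in for
# the real U74A probe-set values.
import numpy as np

rng = np.random.default_rng(0)
log2_normal = rng.normal(8.0, 1.0, size=12488)              # normal-heart EC
log2_thec = log2_normal + rng.normal(0.0, 0.3, size=12488)  # day-1 THEC

log2_fc = log2_thec - log2_normal
cutoff = np.log2(1.5)
print("up:", int((log2_fc >= cutoff).sum()),
      "down:", int((log2_fc <= -cutoff).sum()))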
The Queen Elizabeth Hospital. SYNERGY OF IMMUNOSUPPRESSION BY PG490-88 AND FK506 IS THROUGH DOWNREGULATION OF NF-κB AND IL-2. It has been demonstrated that PG490-88 combined with FK506 or cyclosporin A prolongs allograft survival in rodent models. The present study was undertaken to determine whether PG490-88 and FK506 would prolong renal allograft survival in a non-human primate model. Extensive pathological studies were conducted to explore the possible mechanisms of this novel immunosuppressive agent. Materials and Methods: 17 cynomolgus monkeys were divided into five groups, group A receiving PG490-88 (0.03 mg/kg/day for 30 days) plus FK506 (1 mg/kg/day). Three out of 8 monkeys in this group have been alive for more than 150 days. Pathological studies showed that lymphocyte infiltration was markedly decreased in the groups treated with PG490-88 plus FK506 compared to the control group and the group treated with FK506 alone. PG490-88 plus FK506 treatment also significantly inhibited IL-2 and NF-κB expression in infiltrating lymphocytes and reduced tubulitis and necrosis. Immunohistochemical staining showed that IgG, IgM and C3 deposition was much less in Groups A, B and C than in Groups D and E. Conclusion: PG490-88 combined with FK506 synergistically prolonged renal allograft survival and attenuated cellular infiltration, with inhibition of NF-κB and IL-2 expression in infiltrating lymphocytes under combined PG490-88 and FK506 therapy. 1 Yves Vanrenterghem, 2 Marian Klinger, 3 Zbigniew Wlodarczyk, 4 Jean-Paul Squifflet, 5 the European FK778 Kidney Transplant Study Group. 1 Interne Geneeskunde. In order to suggest the roles of these genes, we performed a functional cluster analysis guided by PathwayAssist software (Iobion). This finding correlates well with the inflammatory, pro-apoptotic microenvironment we noted histologically, and therefore suggests targets for future therapeutic interventions. EVIDENCE THAT TH1/TH2 IMMUNE DEVIATION IMPACTS ON EARLY GRAFT ACCEPTANCE AFTER PEDIATRIC LIVER TRANSPLANTATION: RESULTS OF IMMUNOLOGICAL MONITORING IN 40 CHILDREN. The conventional immunosuppressive drug cyclosporine A (CsA) affects cancer growth and metastasis, but it is still unclear how cancer progresses under general immunosuppressive therapy after organ transplantation. Here we addressed how CsA promotes the dissemination of cancer cells carried to the grafted liver. The metastatic rat colon cancer cell line RCH-H4 was stably transduced with firefly luciferase and then syngeneically inoculated (7 x 10^6 cells) into the liver of Fischer rats through the portal vein. The tumor-bearing liver was then transplanted into Lewis rats. The animals were treated daily with 15 mg/kg of CsA (p.o.). Tumor progression was followed and evaluated by an in vivo luciferase imaging system. Substantial luciferase expression was observed in the grafted liver of the CsA-treated animals at day 15, but not in the grafts of mock-treated animals. In untransplanted Fischer rats under this immunosuppressive regimen, tumor cells metastasized to the liver (nearly 100%) and mesenteric lymph nodes (∼50%) by day 15. Luciferase expression was not observed in the lung, bone, or brain. This CsA regimen did not alter the number of natural killer cells or their killing function. Mock treatment with olive oil showed low metastatic frequency in the liver (17%). 1 Israel Penn International Transplant Tumor Registry/Div. of Transplantation. Analysis of patient demographics, tumor stage, tumor location, cadaveric vs living-related (LR) renal TXP grafts, and immunosuppression was performed. Age at diagnosis and survival rate were compared to those of CRC patients in the SEER National Cancer Institute (NCI) database. Results: 150 TXP recipients with de novo CRC were identified, among which were 93 (62%) kidney, 29 (19.3%) heart, 27 (18%) liver, and 1 (0.7%) lung recipient. Median age at TXP was 54 years, and median age at diagnosis of CRC was 59 years, compared to a median age of 72 years for CRC patients in the SEER NCI database. Time from transplant to diagnosis of CRC did not differ significantly by sex, race, tumor location, or tumor stage. Recipients of cadaveric renal TXP grafts were compared to those who received LR renal grafts; the results indicated a significantly shorter time from TXP Breast cancer (CA) is the most common malignancy in women and the second most common cause of cancer-related death. However, little data exist regarding the outcomes of de novo breast CA in solid organ transplantation. The purpose of this study was to define the course of de novo breast CA in transplant recipients. We examined 151 pts diagnosed (Dx) with de novo breast CA post-transplant. Results: 97% (147) were female and 2.6% (4) male, Dx with breast CA at a median age of 52 ± 10.3 yrs and a median of 40.3 (3-300) months post-transplant. 56% (85) were Caucasian, 11% (17) African American, and 32% (48) unknown.
Transplanted organs consisted of 119 kidneys, 17 hearts, 10 livers, 3 lungs, 1 pancreas, and 1 intestine. Tumor histology, staging, and lymph node involvement were similar to the general population (GP): 66% intraductal, 15% lobular, 51% stage I, 34% stage II, 9% stage III, 5% stage IV, and 19% (28) lymph node (+). Secondary sites of tumor involvement included liver Three grafts continue to function well at 8, 19 and 105 days, respectively. The 105-day surviving heart shows some thrombotic microangiopathy (TM) on biopsies. Mixed lymphocyte reactions to pig cells have been unresponsive, and elicited antibody has been detected in only one baboon while receiving immunosuppressive therapy. Cellular infiltration of the grafts has been minimal or absent, and immunoglobulin and C4d deposition has been variable. Conclusions: GalT-KO hearts are not susceptible to hyperacute rejection, but TM develops that can lead to graft failure. Diagnostic CAs and/or EMBs were carried out whenever the systolic velocity (Sm) at the LV basal posterior wall showed a drop of >15% from previous values. Results: The 107 routine CAs revealed a new appearance or aggravation of TCA, also associated with a Sm drop of >15%, in 28 patients (26.2%). The 58 diagnostic CAs revealed a new appearance or aggravation of TCA in 31 patients (53.4%). Of the other 27 patients without angiographic changes despite Sm reduction, 26 (96.3%) showed relevant ARs on their EMBs. Among the 102 routine EMBs, 84 (82.4%) were ISHLT grade 0 and another 15 (14.7%) were ISHLT grade 1A or 1B. Sm reduction was detected in 12 of these patients, all with evidence of TCA Background: Many centers still recommend invasive endomyocardial biopsy (EMB) monitoring to detect acute rejection in pediatric heart transplant recipients. Aim: The aim of the study was to determine the usefulness of routine EMB early and late after heart transplantation (HTx) in children. Material and methods: Thirty-eight patients (mean age at HTx = 11.8 years, mean follow-up = 6.8 years) underwent 278 EMBs, 227 as routine surveillance EMBs (R-EMB) and 51 as non-routine EMBs (NR-EMB) for symptoms or changes in non-invasive tests. All were given cyclosporine-based triple immunosuppression. A histological ISHLT grade >II was defined as high-grade acute rejection (AR) and ISHLT grade ≤II as low-grade. Results: During the first 3 months post-HTx, 44 of 93 EMBs were positive for AR, comprising 32 of 80 R-EMBs (40%) and 12 of 13 NR-EMBs (92%). Between the 3rd and the 12th month post-HTx, 3 of 8 ARs occurred on R-EMB (2 high-grade). Fifteen were high-grade rejections, 14 occurring on NR-EMB; only one of the 117 late R-EMBs was positive for high-grade AR (0.9%). Changes in echocardiographic and Doppler measurements were the most frequent indication for NR-EMB; Echo-Doppler abnormalities were recorded in 81% of high-grade AR and 63% of low-grade AR. Clinical symptoms were present in 54% of high-grade AR but only 5% of low-grade. ECG changes were less frequent: 18% of high-grade and 26% of low-grade AR. Ten NR-EMBs were performed for suspected AR and were negative: echographic changes were present in 50%, clinical symptoms in 40% and ECG modifications in 20% of these EMB indications. Conclusion: In our experience, routine EMB could detect unexpected significant acute rejection in the first 3 months after HTx in children (biopsy yields are summarized in the sketch below).
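The biopsy yields reported above reduce to simple proportions; the sketch below recomputes them from the stated counts.

# Positive-biopsy yields from the counts in the abstract above.
def pct(pos: int, total: int) -> str:
    return f"{pos}/{total} = {100 * pos / total:.1f}%"

print("early routine (R-EMB):     ", pct(32, 80))   # 40.0%
print("early non-routine (NR-EMB):", pct(12, 13))   # 92.3%
print("late high-grade on R-EMB:  ", pct(1, 117))   # 0.9%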
Abstract# 1083 Poster Board #-Session. CARDIAC ALLOGRAFT RECIPIENTS IS DISTINCT FROM THAT SEEN DURING ALLOGRAFT REJECTION. Mario C. Deng, 1 Mandeep Mehra, Ochsner Clinic; Expression Diagnostics Inc. Background: Overexpression of heme oxygenase (HO)-1, hsp32, protects against cellular stress in many inflammatory events, including ischemia/reperfusion (I/R) injury. The toll-like receptor (TLR) system provides a triggering signal in the pathogenesis of infection and inflammatory diseases, and can promote activation of intracellular pathways leading to cytokine/chemokine expression. This study was designed to explore the cytoprotective effects of HO-1 induced by cobalt protoporphyrin (CoPP), and its relationship with the TLR system, in a model of mouse hepatic warm I/R injury. Methods: Partial warm ischemia was produced in the left/medium hepatic lobes for 90 min followed by 6 h of reperfusion in C57BL/6 mice. Experimental animals were divided into 4 groups: 1) sham (n=3); 2) mice treated with saline 24 h before ischemia (n=5); 3) mice treated with CoPP (5.0 mg/kg i.p.) 24 h before ischemia (n=5); 4) mice treated with CoPP as in group 3 plus ZnPP (1.5 mg/kg i.p.) 25 h and 1 h before ischemia, respectively (n=5). Western blot analyses of HO-1 and TLR4 protein expression in the liver were performed before and after I/R. Serum GOT/GPT levels were measured, and liver samples were collected for histology. The mRNA expression of cytokines (TNF-α, IL-1β, IFN-γ, and IL-10), IFN-γ-inducible protein (IP-10), and TLR4 was analyzed by RT-PCR. Results: CoPP treatment significantly increased hepatic expression of HO-1 protein, as compared with untreated/sham controls. Adjunctive ZnPP diminished HO-1 levels before and after reperfusion. TNF-α, IL-1β, IFN-γ and IP-10 mRNA expression was significantly inhibited in animals treated with CoPP (p<0.05), whereas IL-10 mRNA expression was upregulated (p<0.05). Although TLR4 mRNA and protein expression was upregulated by I/R, it was not significantly inhibited by CoPP monotherapy. I/R-induced hepatocellular damage, as measured by sGOT/GPT release (IU/L), was significantly lower in animals treated with CoPP ( Ischemia/reperfusion injury (I/R-I) of the liver is a common occurrence in resectional surgery and liver transplantation and may impair regeneration, ultimately leading to liver failure. The three major mitogen-activated protein kinases (MAPK): ERK, p38 and c-Jun N-terminal kinase (JNK) are critical in the transmission of signals triggered by proinflammatory cytokines, stress, and growth factors. Previous studies demonstrated that estradiol reduces I/R-I. In this study, we assessed the effects of estradiol on liver function, survival and activation of MAPK in a murine model of reduced-size liver I/R-I. Methods: 70% of the liver mass in C57BL/6 mice was subjected to ischemia for 45 minutes. After reperfusion, the non-ischemic lobes were resected. Estradiol or the estrogen antagonist ICI-182780 was given 1 hour before the injury (n=7). ALT was assessed at 6 hours and apoptosis by ELISA at 24 hours. The activation of JNK, p38 and ERK was assessed by Western blot. Results: Females presented lower initial hepatocellular injury (ALT=714±110) and 70% had indefinite survival after a reduced-size I/R-I, whereas no significant changes were observed in males or ovariectomized mice. A higher incidence of apoptosis was observed in male animals given saline (enrichment factor=7.22±0.8) versus males treated with estradiol (5.85±0.3, P<0.05).
Conversely, estradiol promoted p38β (5.8±1.2 normalized to actin) and ERK (7.12±2.6 normalized to actin) activation compared with controls (1.28±0.8 and 3.36±1.12 normalized to actin, respectively, P<0.05). Conclusion: Estradiol limited hepatocellular injury and promoted survival following reduced-size I/R-I to the liver. Estradiol inhibited activation of the proapoptotic JNK pathway, induced activation of the antiapoptotic p38β pathway, and promoted proliferation through ERK. Estrogen therapy may be important in clinical conditions associated with I/R-I, especially split or living donor liver transplantation.

Mice deficient in expression of a newly described CD28 homolog, B and T lymphocyte attenuator (BTLA), show increased antibody production and enhanced sensitivity to EAE, suggesting that BTLA, like CTLA-4 and PD-1, is an inhibitory receptor on T cells. We report the rationale for studying BTLA expression post-transplant, and the first and unexpected data concerning BTLA function during alloresponses. Flow cytometric studies using an anti-murine BTLA Ab showed weak expression by resting CD4 (36%) and CD8 (25%) T cells, plus >90% of B cells. T cell expression was markedly upregulated upon activation in vitro using CD3/CD28 mAbs. BTLA mRNA was also upregulated within the spleen and allograft of unmodified recipients; e.g., 17-fold upregulation of BTLA mRNA expression in murine cardiac allografts (BALB/c to B6.129) vs. native hearts, and BTLA protein was localized to infiltrating lymphocytes. These data indicated upregulation of BTLA by naive T cells undergoing priming, as well as by graft-infiltrating effector T cells, leading us to analyze the function of BTLA in this context using BTLA-/- vs. wild-type mice. Studies of CFSE-labeled T cells undergoing in vivo activation (parent to F1 adoptive transfer) showed delayed entry into the cell cycle of alloreactive T cells of BTLA-/- mice, and decreased production of IL-2 and IFN-γ. Comparable effects on activation and cytokine production were seen upon in vitro stimulation of BTLA-/- vs. control T cells using CD3/CD28 mAb. Hence, studies using CFSE-labeled cells indicated that T cells mount an attenuated Th1 response in the absence of BTLA costimulation. Survival of cardiac allografts was significantly prolonged in BTLA-/- mice.

Background: Under current policy, pediatric deceased donor livers are preferentially allocated to high-risk (PELD >46; >50% 3-month mortality) pediatric candidates in the local OPO area. Very few pediatric candidates have high PELD scores in any given OPO. Thus, lowering the PELD threshold alone may not result in many additional pediatric donor to pediatric recipient transplants (PED-PED LTx). Regional sharing may be required to achieve this goal. Methods: We used the Liver Simulated Allocation Model (LSAM) to study effects of regional and PELD-threshold allocation on PED-PED LTx compared to current policy. Data from 22,323 waitlist candidates and 2,719 deceased donor livers available between 4/1/02 and 9/30/02 were included in the simulation. PELD thresholds (46, 30, 20, 10) were tested with regional sharing. After status 1, LSAM rules offered pediatric livers first to pediatric candidates on the regional list above the threshold, then to adults with MELD >32 (>50% 3-month mortality) in the region, then to children below the threshold, and finally to adults with MELD <32.
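The tiered offer sequence described in the Methods can be summarized as a simple priority sort. The sketch below is illustrative only: the Candidate fields, the tie-breaking by severity score, and the offer_order helper are assumptions made for exposition, not part of LSAM itself.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    status1: bool
    pediatric: bool
    score: float  # PELD for children, MELD for adults

def offer_order(candidates, peld_threshold):
    """Order regional candidates for a pediatric donor liver under the
    simulated rules: status 1 first, then children at or above the PELD
    threshold, then adults with MELD > 32, then the remaining children,
    and finally the remaining adults."""
    def tier(c):
        if c.status1:
            return 0
        if c.pediatric and c.score >= peld_threshold:
            return 1
        if not c.pediatric and c.score > 32:
            return 2
        if c.pediatric:
            return 3
        return 4
    # Assume that, within a tier, sicker candidates (higher score) are offered first.
    return sorted(candidates, key=lambda c: (tier(c), -c.score))
```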
Results: Compared to current policy, regional allocation of pediatric donor livers using the existing PELD 46 threshold increased the predicted 6-month number of PED-PED LTx by a mean of 18 (11%) (average of 10 LSAM runs) and was associated with one fewer pediatric waitlist death (Figure). Under regional sharing, lowering the PELD threshold to 10 resulted in 4 additional PED-PED LTx (2%) and 1 more averted pediatric waitlist death. Conclusions: Regional allocation of pediatric donor livers is predicted to increase PED-PED LTx opportunities and decrease waitlist deaths for children. There is a small incremental effect of lower PELD thresholds beyond that achieved by regional sharing for higher-risk children.

Introduction: FTY720 is the first in a new class of immunomodulators and may represent a better tolerated therapy for post-transplant maintenance regimens. A phase 2 study reported equivalent efficacy for FTY720 5 mg/RDN vs MMF/full-dose Neoral (FDN) in de novo renal transplant recipients. The safety/tolerability profile for FTY720 at 12 months post-transplant was analysed. Methods: Adult patients undergoing a primary cadaveric or living donor transplant were randomised to daily doses of FTY720 5 mg/RDN, FTY720 2.5 mg/RDN, FTY720 2.5 mg/FDN or MMF 2.0 g/FDN. All patients received corticosteroids but no antibody induction therapy. For the RDN regimens, the Neoral dose (C2 level) was adjusted to achieve exposure 50% below the C2 targets for the FDN regimens. Results: 261 patients who received at least one dose of maintenance medication were included. Patient demographics were similar across treatment groups except for the inclusion of more male subjects who received FTY720. Although the overall incidences of adverse events and serious adverse events were comparable, the metabolic profile for FTY720 5 mg/RDN showed a trend towards a potential safety advantage compared with MMF/FDN: among patients without lipid-lowering therapy, HDL was >1.55 mmol/L in 48.3% vs 30.0% and LDL was ≤3.35 mmol/L in 51.7% vs 60.0%, respectively. The incidence of clinically relevant infections is reported below. Owing to the expected pharmacodynamic effect on heart rate, bradycardia was reported for all FTY720 groups (28.9-29.2%); this was transient, self-limiting and not associated with a relevant increase in cardiac morbidity. No difference in malignancies was reported between FTY720 5 mg/RDN and MMF treatments (2.8% vs 2.6%). Conclusions: FTY720 5 mg/day plus RDN is well tolerated in renal transplant patients and may confer safety advantages over conventional regimens with respect to its improved metabolic profile and reduced incidence of infections.

Studies have provided conflicting results as to the protective role of calcium channel blockers (CCB) in cyclosporine (CsA)-treated patients with regard to blood pressure control and preservation of renal graft function. We conducted a multicentre prospective, randomised, placebo-controlled study in de novo recipients of a primary cadaveric renal graft on CsA therapy.
The aim of this 2-year study was to demonstrate that lacidipine could prevent deterioration of renal graft function and to assess the effect of lacidipine on graft function (plasma iohexol clearance), renal plasma flow (plasma PAH clearance), anastomotic arterial blood flow (Doppler), systolic/diastolic blood pressure, acute rejection and hospitalisation rate. A total of 118 recipients were available for an intention-to-treat analysis of efficacy (lacidipine: n=59; placebo: n=59). Three patients on lacidipine therapy and 4 on placebo experienced treatment failure, defined as an increase in serum creatinine from baseline of more than 60% (p=0.57). Graft function, assessed by serum creatinine concentration and measured GFR, was better in lacidipine-treated patients from 1 year onwards (p<0.01 and p<0.05). RPF and anastomotic blood flow were not persistently higher in lacidipine-treated patients. Study groups did not differ in acute rejection rate, trough blood CsA concentrations, systolic/diastolic blood pressure, number of anti-hypertensive drugs, hospitalisation rate and adverse event rate. The use of lacidipine in CsA-treated renal recipients results in significantly better graft function at 2 years, and this effect is independent of blood pressure. [Table: plasma iohexol clearance (GFR), PAH clearance (RPF), anastomotic blood flow and serum creatinine at 1, 12 and 24 months post-transplantation.]

Both sirolimus (RAPA) and corticosteroids (CS) have been shown to adversely impact wound healing. Moreover, since RAPA and CS may have additive effects on wound healing, we sought to determine whether corticosteroid avoidance (CSAV) would ameliorate the wound healing complications associated with RAPA. METHODS: 109 patients (pts) treated with a CSAV regimen (no pre- or post-transplant CS) were compared to a historical control group (n=72) that received cyclosporine (CsA), mycophenolate mofetil (MMF) and CS. The CSAV regimen included thymoglobulin (mean 2.6 doses), RAPA (8-12 ng/ml), MMF (2 grams/day), and low-dose CsA (trough 100 ng/ml, discontinued at 4-6 months), along with arginine and canola oil nutritional supplements. Complications were classified as wound healing complications (WHC) or infectious wound complications (IWC). WHC include lymphocele, hernia, dehiscence, and skin edge separation. IWC include wound abscess and empiric antibiotic therapy for wound erythema. RESULTS: The CSAV group was largely CS free: 11% of pts received CS for rejection, 12% received CS for recurrent disease and 85% of pts are currently off CS. Five percent of patients who received CS developed wound complications; however, in all cases CS were started after the wound complication had already developed.

Ischemia-reperfusion injury (IRI) represents a serious problem affecting transplantation outcomes. Both innate (e.g., PMNs, complement) and adaptive (T lymphocytes) immunity contribute to the development of liver IRI. However, the molecular mechanisms interlocking these cellular cascades remain to be elucidated, in particular: (1) which innate immune system is triggered by IR? and (2) how does its downstream mechanism trigger T cell activation? We identified TLR4 as a possible initiating innate immune system responsible for hepatic IRI. By microarray analysis of IR-injured livers, we also identified a single chemokine, IP-10, significantly upregulated in and associated with irreversible IRI. Indeed, post-ischemic neutralization of IP-10 did ameliorate hepatocellular injury.
Here, we determined the intracellular signaling pathway of TLR4 activated by IR, which led to IP-10 upregulation/hepatocellular injury, in a mouse partial liver warm IR model (90 min ischemia, followed by 6 h of reperfusion). At least three distinct intracellular signaling pathways have been identified downstream of TLR4 activation, each of which produces somewhat overlapping but different proinflammatory profiles. To differentiate between these pathways, TLR4-, MyD88-, and IRF3-KO mice were used. Liver damage was evaluated by sALT levels and histology (Suzuki criteria). Intrahepatic induction of IP-10 and TNF-α was measured by quantitative RT-PCR. While WT B6 mice suffered severe IRI, as evidenced by increased sALT levels and liver pathology, TLR4 KO mice were IRI resistant (sALT IU/L=276±50 vs. 2149±485 WT controls). MyD88 KO mice remained IRI sensitive (722±479 vs. 1122±306 WT controls), whereas IRF3 KO mice were IRI resistant (396±158 vs. 4467±1737 WT controls). Correlated with the liver damage, intrahepatic IP-10 induction was abolished in TLR4 KO and IRF3 KO mice but sustained in MyD88 KO mice. As a putative effector molecule, intrahepatic TNF-α expression followed the same pattern as IP-10 in all groups. Thus, liver IR activates TLR4 through a MyD88-independent but IRF3-dependent pathway to induce IP-10, which may then be responsible for recruitment/activation of T cells, leading to full-scale IRI. Our study identifies a key signal transduction pathway during liver IRI, which links innate and adaptive immunity in liver IRI. Targeting molecules along this pathway may provide much needed and novel anti-IRI therapeutic approaches in the clinic.

Though the field seems to be awash with new costimulation molecules, in many cases their regulation, distribution and function in vivo are unknown. One such example, B7-H3, was reported to costimulate T cell proliferation through binding to an unknown receptor on T cells and to enhance IFN-γ production, but recent data suggested this ligand dampens host immune responses. We report the first clinical and experimental data concerning expression and function of B7-H3 in alloresponses. Immunohistologic studies showed florid B7-H3 expression by infiltrating mononuclear cells during human renal allograft rejection. In addition, immunohistology and real-time RT-PCR showed a 4-5-fold upregulation in murine cardiac allografts vs. isografts, and flow cytometry showed induction of surface B7-H3 expression by activated CD4 and CD8 T cells (CD3/CD28), as well as activated macrophages and mature dendritic cells. To analyze the significance of B7-H3 in rejecting allografts, we generated B7-H3-/- mice by homologous recombination and used them, along with wild-type mice (B6/129), as recipients of cardiac BALB/c allografts. Though cardiac allografts in knockout or control mice were rejected with comparable speed (7-8 d) in this fully MHC-mismatched model, synergistic effects of targeting B7-H3 in conjunction with limited immunosuppression were observed. Thus, a subtherapeutic course of CsA led to rejection by 10 days in control mice, whereas CsA use in B7-H3-/- mice led to a mean survival of 35 d (p<0.01). Moreover, a regimen of RPM which gave 12-14 days of survival in controls led to permanent engraftment (>100 d, 80%, p<0.001) in B7-H3-/- mice.
Studies by real-time RT-PCR showed that allografts in B7-H3-/- mice had decreased production of cytokine (IL-2, IFN-γ), chemokine (IP-10, MCP-1) and chemokine receptor (CXCR3) mRNA compared to wild-type controls, and in each case the decreases in mRNA expression were enhanced in a stepwise manner by immunosuppression (RPM>CsA). Consistent with this, analysis of CFSE-labeled T cell responses following adoptive transfer in vivo (parent to F1) showed B7-H3-dependent production of IL-2 and IFN-γ by CD4 and CD8 T cells, respectively. Lastly, in contrast to controls, B7-H3-/- mice treated with anti-CD40L mAb showed an absence of chronic rejection. We conclude that the B7 homolog B7-H3 promotes Th1-mediated immune responses and the development of acute and chronic allograft rejection in small animal models and, likely, clinical allograft rejection.

Two of the best characterized costimulatory pathways are CD28/B7 and CD40/CD154. Blockade of these pathways simultaneously, using fusion protein constructs (e.g., CTLA4-Ig) and/or mAbs, can promote prolonged allograft survival. Interestingly, there are some models in which costimulation blockade is less effective, primarily due to the ability of CD8 T cells to act in a CD28/CD154-independent manner. In recent years numerous novel costimulatory pathways have been described. In the current study we tested whether signaling through any of these alternative costimulatory pathways could be critical in "CD28/CD154 blockade resistant rejection." For the initial evaluation we employed a well-described GvHD model. Irradiated BALB/c mice received B6 T cells which had previously been labeled with CFSE to track cellular division in vivo. Experimental groups included animals which received no treatment, costimulation blockade (CoB) alone (CTLA4-Ig/anti-CD154) or CoB in combination with an additional agent. Additional agents tested included anti-ICOSL, anti-CD70, anti-4-1BBL, or anti-CD134L. The combinations of CoB + anti-ICOSL (p<0.02) and CoB + anti-CD134L (p<0.03) significantly inhibited both CD4 and CD8 T cell proliferation when compared to CoB alone. In the following experiments we evaluated these agents in a fully allogeneic mouse skin graft model.

It was shown that simultaneous blockade of CD28- and CD40-mediated costimulatory signals significantly prolonged allograft survival. Although these results led to an expectation of the establishment of specific immuno-tolerant therapy for organ transplantation, it became evident that these treatments rarely resulted in indefinite allograft survival. In order to uncover the mechanisms underlying these costimulation blockade-resistant allograft rejections, we first studied the process of allogeneic skin graft rejection using CD28 and CD40L double-deficient (dKO) mice. It was found that skin grafts from MHC-mismatched donors were rapidly rejected by dKO mice, and these rejections were mediated by CD8+ T cells that were primed with donor antigens via direct antigen presentation. These data indicated that some elements besides CD28- and CD40-mediated costimulatory signals provide stimulatory signals for the activation of donor-specific CD8+ T cells. We postulated that nonspecific inflammation associated with transplantation results in the production of various inflammatory cytokines, which facilitates the immune response against the graft. Thus, second, we investigated the role of non-specific inflammation in graft rejection.
RAG-/- mice were transplanted with BALB/c skin grafts and adoptively transferred with B6 CD8+ T cells in the presence or absence of blockade of CD28- and CD40L-mediated costimulatory signals. If CD8+ T cells were transferred at the time of transplantation, grafts were acutely rejected (MST: 12.3 days). Costimulation blockade had only a limited effect on prolongation of graft survival in this setting (MST: 28.6 days). In contrast, if CD8+ T cells were transferred 50 days after transplantation, at a time when transplanted grafts were well healed, costimulation blockade effectively prolonged skin graft survival (MST: 81.4 days) compared to the no-treatment group (MST: 15.7 days). These results indicated that in the absence of CD28/CD40 costimulation, factors associated with inflammation play an important role in the priming of donor-specific CD8+ T cells, and suggested that control of inflammation at the graft site may be the key element for successful treatment of graft rejection by costimulation blockade.

with CsA and none with LEA29Y. Conclusions: LEA29Y, a potent co-stimulation blocker, demonstrated similar efficacy at preventing AR, better renal function, and reduced hypertension, hyperlipidemia, and PTD compared with CsA at 6 months in renal transplant recipients. These data suggest that co-stimulation blockade with LEA29Y may offer a new paradigm for improving long-term outcomes in maintenance immunosuppression following renal transplant.

BACKGROUND: The effect of immunosuppression with co-stimulatory blockade and anti-IL2 receptor antibody on regulatory T cells is not known. This study examines regulatory T cells in the peripheral blood of kidney transplant patients who participated in a Phase II multicenter trial that involved induction therapy with anti-IL2Rα (CD25) antibody and maintenance with a calcineurin inhibitor-free regimen using LEA29Y. LEA29Y is a modified version of CTLA4Ig specifically designed to provide potent immunosuppression via blockade of the B7/CD28 co-stimulatory pathway. METHODS: Peripheral blood was collected between 6-30 months post-transplant from 21 of the 33 kidney transplant recipients at our site enrolled in a prospective, randomized trial studying the use of LEA29Y. All patients received induction therapy with anti-IL2Rα antibody (basiliximab) as well as standard maintenance with MMF and steroids. 18 of these patients also received LEA29Y, while the other 3 received cyclosporine A (CsA). Peripheral blood was also collected from 3 healthy donors. Flow cytometry was used to determine the number of CD4+CD25hi cells as a proportion of all CD4+ lymphocytes. Using magnetic bead separation, CD4+ lymphocytes were first isolated and then separated into CD4+CD25hi and CD4+CD25lo/CD25- populations. Proliferation in culture was measured by ³H-thymidine uptake in response to plate-bound anti-CD3. Suppressive activity of the CD4+CD25hi cells was measured by titration into the CD4+CD25lo/CD25- population. RESULTS: The numbers of CD4+CD25hi lymphocytes as a percentage of CD4+ lymphocytes were 4.5%, 4.3%, and 4.2% for healthy donors, patients receiving LEA29Y, and patients receiving CsA, respectively. The CD4+CD25hi T lymphocytes exhibited properties characteristic of regulatory T cells, namely reduced proliferation in response to nonspecific stimulation by anti-CD3 and dose-dependent suppression of proliferation of CD4+CD25lo/CD25- lymphocytes. Overall efficacy in patients treated with LEA29Y was comparable to that in CsA-treated patients.
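The suppression readout from a titration assay like the one just described is commonly expressed as the percent reduction in responder proliferation. The sketch below is purely illustrative: the cpm values, the Treg:responder ratios, and the percent_suppression helper are invented for exposition.

```python
# Percent suppression from a hypothetical Treg titration assay.

def percent_suppression(cpm_responders_alone, cpm_coculture):
    """Relative reduction in responder proliferation (3H-thymidine cpm)
    when CD4+CD25hi cells are titrated into the responder culture."""
    return 100.0 * (1.0 - cpm_coculture / cpm_responders_alone)

cpm_alone = 42000.0  # CD4+CD25lo/CD25- responders alone, anti-CD3 stimulated
for ratio, cpm in [("1:1", 6300.0), ("1:2", 11800.0), ("1:4", 21500.0)]:
    print(f"Treg:responder {ratio} -> "
          f"{percent_suppression(cpm_alone, cpm):.0f}% suppression")
```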
CONCLUSIONS: CD4+CD25hi T cells are present in the peripheral blood of kidney transplant patients who had induction therapy with monoclonal anti-IL-2Rα antibody and maintenance with either CsA or the novel co-stimulation blocker LEA29Y. These cells appear to possess characteristics of regulatory T cells, similar to those found in healthy donors.

Introduction: Development of modern immunosuppressive regimens in transplantation has focused on maintaining efficacy while improving long-term outcomes by preserving renal function and managing CV risk factors. Minimizing or avoiding calcineurin inhibitor (CNI)-related toxicities has gained recognition as a means for achieving these goals. This multicenter Phase II non-inferiority study compares the safety and efficacy of the potent co-stimulation blocker LEA29Y vs. cyclosporine A (CsA) in a quadruple immunosuppressive drug regimen in renal transplantation. LEA29Y is a modified version of CTLA4Ig, designed to provide immunosuppression in transplantation via blockade of the B7/CD28 co-stimulatory pathway. Methods: Renal allograft patients received mycophenolate mofetil, corticosteroids and basiliximab and were randomized 1:1:1 to maintenance treatment with CsA (N=73) or one of two LEA29Y dosing regimens (more or less intensive). The CsA and LEA29Y treatment arms were open-label. Assignment to the lower- and higher-intensity LEA29Y groups remained blinded, and these groups were combined for analysis (N=148). The primary 6-month endpoint was incidence of biopsy-proven acute rejection (AR). Results: Baseline recipient and donor characteristics were comparable among groups. Biopsy-proven AR rates were similar (19% for LEA29Y vs. 18% for CsA). Incidence of Grade I or Grade II AR was also comparable (4% vs. 6% and 15% vs. 12%, respectively, for LEA29Y vs. CsA). 6-month renal function with LEA29Y treatment was significantly better compared with CsA: median measured GFR was 14 mL/min/1.73 m² higher in LEA29Y-treated patients (p<0.05). Graft loss was 3% (LEA29Y) vs. 4% (CsA), and the death rate was lower with LEA29Y (1% vs. 5% for CsA). Discontinuation rates and adverse events, including infections and malignancies, were similar across groups. LEA29Y infusions were well tolerated. LEA29Y was less often associated with typical CNI-related toxicities such as hyperlipidemia, hypertension, and diabetes. Conclusions: Compared with CsA, potent co-stimulation blockade with LEA29Y as part of a CNI-free maintenance regimen provided similar efficacy, was generally safe and well tolerated, showed less decline in renal function, and reduced CV/metabolic risk. These data suggest that co-stimulation blockade with LEA29Y may offer a new paradigm for improving long-term outcomes in maintenance immunosuppression following renal transplantation.

Introduction: Few published data address the relationship between mycophenolate area under the curve (AUC) and allograft rejection episodes. Protocol allograft biopsies have allowed us to study this question. In this study, AUCs were obtained in 35 kidney transplant recipients who underwent protocol biopsy. Mycophenolic acid (MPA) AUC values were calculated using the trapezoidal AUC method. Methods: Fasting MPA troughs were obtained in 35 consecutive kidney transplant recipients (57% male) from March 2002 through August 2003. The average dose of mycophenolate mofetil (MMF) was 2 grams daily (maximum 3.5 grams/day; minimum 0.25 grams/day). The standard immunosuppression regimen was prednisone, tacrolimus, and MMF in >90% of patients.
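The trapezoidal method named above approximates the area under the concentration-time curve by summing trapezoids between successive sampling points. A minimal sketch follows; the sampling times and MPA concentrations are hypothetical.

```python
def trapezoidal_auc(times_h, conc):
    """Linear trapezoidal rule: AUC = sum of (t2 - t1) * (C1 + C2) / 2
    over successive sampling intervals."""
    auc = 0.0
    for (t1, c1), (t2, c2) in zip(zip(times_h, conc),
                                  zip(times_h[1:], conc[1:])):
        auc += (t2 - t1) * (c1 + c2) / 2.0
    return auc

# Hypothetical 12-h MPA profile (mcg/mL) after a morning MMF dose.
times = [0, 0.5, 1, 2, 4, 8, 12]
mpa = [1.8, 12.5, 9.0, 5.2, 3.1, 2.2, 1.9]
print(f"MPA AUC0-12 = {trapezoidal_auc(times, mpa):.1f} mcg.hr/mL")
```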
The majority of patients also received phosphorus 250 mg 4 tabs 4 times daily and calcium 650 mg 4 tabs 4 times daily for electrolyte abnormalities; these supplements are known to interfere with MMF absorption, although patients were instructed to take them at least 1 hour before or 2 hours after MMF doses. Results: Three levels of AUC were defined: low (AUC <20 mcg·hr/mL; N=16), mid (AUC 20.1 to 40 mcg·hr/mL; N=15), and high (AUC >40.1 mcg·hr/mL; N=4). Subclinical rejection episodes occurred in 75% of patients in the low AUC group, 47% in the mid group, and 25% in the high group. The early AUC appears to determine the frequency of subclinical allograft rejection. The standard one gram twice daily MMF dose resulted in 60% of AUCs falling in the low group. Late chronic allograft nephropathy may be due to inappropriately low early AUCs resulting in subclinical rejection. AUCs should therefore be used to achieve appropriate exposure early and to obtain the maximal benefit of MMF, thereby decreasing these rejections.

Hepatocellular carcinoma (HCC) is the fifth most common cancer and the third leading cause of cancer death in the world. We investigated the genes involved in viral carcinogenesis and tumor progression in patients with hepatitis C virus (HCV) and HCC awaiting liver transplantation. We performed an analysis of hepatic gene expression in HCV-infected liver transplant candidates with different stages of disease, with and without HCC. Methods: Gene expression profiling using microarray technology was performed in different RNA pools from liver tissue including early HCV-cirrhosis, late HCV-cirrhosis, early HCV-HCC, HCV-HCC (T3-T4), and normal liver tissues. In addition, we compared the gene expression profile of HCV-cirrhotic livers with that of alcoholic cirrhotic livers. Expression summaries for every probe set were calculated using three different algorithms. Results: Transcripts more highly expressed in HCV-cirrhosis than in alcoholic cirrhosis included IFN-inducible genes and those associated with antigen presentation and processing. The genes down-regulated in HCV-cirrhosis relative to alcoholic cirrhosis were those related to extracellular matrix production and deposition. The transcript DLK1 was upregulated 86-fold in the advanced HCC pool vs. early HCC. The expression of alpha-fetoprotein in the advanced HCC pool was more than 50-fold higher than in the early HCC pool. Glypican 3 was highly expressed in the T3-T4 pool. Five probe sets showed upregulation of CD24 in the comparison of the HCC pools with the normal liver pool. In addition, the IGF-2 transcript was elevated 5-fold in HCV-HCC samples. The expression of CYP3A4 and CYP3A7 was higher in the HCV-HCC pool, whereas it was significantly lower in normal liver tissues. Conclusions: We found consistent differences between the gene expression patterns in HCV-HCC compared with early HCV-cirrhosis, late HCV-cirrhosis, and normal control livers. The expression patterns also readily distinguished early from advanced HCC tumor stages. We found different gene expression patterns between early cirrhosis and late cirrhosis. These findings confirm the presence of multiple molecular alterations during HCV-HCC hepatocarcinogenesis and provide a practical and testable method to identify prognostic factors associated with HCV-HCC progression and recurrence.

The clinical consequences of hepatic ischemia-reperfusion injury (HIRI) in liver allografts can be profound. However, little is known about the pathogenesis of this process.
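Both the tumor-stage comparisons above and the ischemia time-course study that follows report results as fold changes between expression summaries. A minimal sketch of that computation, with hypothetical signal values (real microarray pipelines normalize across arrays and filter by significance first):

```python
import math

# Hypothetical normalized expression summaries for one probe set in two pools.
advanced_hcc_signal = 5160.0
early_hcc_signal = 60.0

fold_change = advanced_hcc_signal / early_hcc_signal  # here ~86-fold
log2_fc = math.log2(fold_change)  # log2 scale is the usual filtering metric
print(f"fold change = {fold_change:.0f}x (log2 = {log2_fc:.1f})")
```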
Methods: We performed sequential liver biopsies over the course of 17 living donor liver transplant operations and defined hepatic gene expression profiles at initial laparotomy (OPENING), after liver transection but before clamping of vascular inflow (PRECLAMP), and 2 hours following arterial reperfusion of the liver (post hepatic artery, PHA). Total cellular RNA was extracted from each biopsy and interrogated using either a 19K human microarray or a 2K ImmuneArray specific to human inflammatory genes. Results: Histological analysis confirmed that livers at OPENING were entirely normal; PHA biopsies had evidence of inflammatory change and early hepatocyte apoptosis. Tissue glutathione (GSH) levels decreased from the PRECLAMP to the PHA phase, confirming the underlying ischemic insult (relative to OPENING, GSH 1.30±3 PRECLAMP vs 0.70±2 PHA, p<0.03 ANOVA). Microarray gene expression data were normalized using GeneTraffic (Iobion) and in-house software, and fold changes were calculated in relation to baseline OPENING data. Nine annotated genes were increased ≥2-fold with a p value ≤0.01 in the PRECLAMP phase (43 were ≥1.5-fold), and 13 in the PHA phase (39 were ≥1.5-fold).

Steatotic livers have a greater risk of primary non-function (PNF). Consequently, approximately 25% of livers are discarded secondary to steatosis. Evaluation of H&E-stained biopsies by visual interpretation is subjective. We hypothesized that H&E staining of frozen sections fails to accurately estimate the degree of steatosis in a liver biopsy from a potential donor organ. We developed a computer program to objectively assess fat content based upon differential quantification of color pixels in Oil Red O (ORO)-stained liver biopsies. This was then compared with standard pathologist-read H&E and ORO biopsies. H&E and ORO stains were performed on 25 consecutive frozen sections of donor liver biopsies to determine fat content. Slides were read by a pathologist and a transplant surgeon, and evaluated by the computer program. Adobe Photoshop 5.5 was used as the base software in conjunction with ORO-stained slides to objectively measure fat (Fig. 1). Results from the 3 methods were grouped into 3 categories based on computer-determined ORO fat content. The human-estimated ORO slides were compared to human-estimated H&E slides and to the computer-evaluated slides. Samples with a fat content >20% showed marked variation between human interpretation and computer analysis. There was also a significant difference in the human interpretation of fat based upon staining method; this difference ranged from 3% to 37% with H&E.

Although the metabolic response after partial hepatectomy has been well studied in animal models, there are few studies examining restoration of metabolic capacity after right hepatectomy in humans. We used 13C-labeled phenylalanine (13C-P) administered orally or IV in 7 living liver donors and measured exhaled 13C-labeled CO2 to determine the extent of metabolic impairment and the time course of its return. For orally administered 13C-P, all subjects had dramatic drops in 13C-P metabolism 2-4 days after surgery. Two of the 7 patients had restoration of 13C-P metabolic capacity 4 and 7 days after right lobectomy. The remaining 5 did not achieve pre-op levels of 13C-P oxidation by as long as 58 days after surgery for orally administered substrate. Those recovering 13C-P metabolism had significantly higher dose recovery 60 minutes after ingestion by day 4 (0.97 vs. 3.06, P=0.033) and day 7 (1.50 vs. 5.02, P=0.031).
One patient given intravenous 13C-P exhibited only a 43% reduction of 13C-P metabolism on day 2, 25% on day 4 and 35% on day 7, compared to an 86% reduction on day 3, 92% on day 5 and 94% on day 8 for orally administered 13C-P (Figure 1). We conclude that orally administered amino acids may not be well absorbed and/or metabolized in some subjects for weeks after partial hepatectomy, whereas intravenously delivered substrates are much better oxidized by the regenerating liver. These findings may be due to impaired gut motility or portal venous flow that reduces delivery of oral agents after liver surgery. These preliminary findings have wide implications for nutrition and drug delivery in the early recovery phase for living liver donors.

SVM analysis of B-cell receptor expression in a test subset of patients (n=9) yielded a classification algorithm, which identified rejectors and non-rejectors with 100% accuracy in a validation subset (n=6). Conclusions: In children treated with rATG, rejection despite T-cell hyporesponsiveness was associated with enhanced donor-specific alloreactivity, and not with decreased tacrolimus levels. A relative sparing of APCs from the depleting effects of rATG, and the ability of B-cell responses to distinguish rejectors from non-rejectors, suggest that antigen-presenting mechanisms and their modulation are pivotal determinants of the balance between rejection and graft adaptation in pediatric liver recipients pretreated with rATG.

Background: Pediatric and adult transplant recipients who are EBV seronegative and receive an organ from a seropositive donor (EBV D+/R-) are at increased risk for EBV and may benefit from specific antiviral prophylaxis. Viral load testing has become common in this setting and can be used as a surrogate outcome for clinical trials assessing prophylaxis. We performed a multi-center RCT assessing two different antiviral regimens and their effect on EBV replication.

Tumor histology, stage and survival appear equivalent to those observed in the general population. Despite this, a more aggressive surgical approach is utilized in TX pts than in the GP. We thus believe breast conservation should be encouraged in appropriate transplant patients with breast cancer.

While we have previously demonstrated that Th2 cytokines are detrimental and Th1 cytokines are beneficial to xenograft survival (Nature Medicine 2000), the immunological factors controlling cytokine profiles of anti-xeno responses remain unknown. We previously reported that CD11c+CD8α+ cells (DC1-like) were predominant in C57BL/6 recipients, in which Lewis rat heart grafts were slowly rejected with typical cell-mediated rejection (CMR) in 21 days. In contrast, CD11c+CD8α- cells (DC2-like) were predominant in BALB/c mice, in which the Lewis heart grafts were rapidly rejected with typical acute vascular rejection (AVR) in 6 days. These data suggest that DCs may play an important role in directing xenograft immune responses. The present study was undertaken to elucidate the role of distinct subsets of DCs in controlling xenograft rejection. Purified CD8α+ or CD8α- DCs isolated from wild-type BALB/c mice were adoptively transferred into naive BALB/c mice, which subsequently received Lewis heart grafts.
Adoptive transfer of CD8α+ DCs into BALB/c mice shifted the response to Th1, thereby altering the pattern of rejection from AVR to CMR and significantly prolonging xenograft survival to 14.2 ± 0.8 days, while transfer of CD8α- DCs did not change the pattern of rejection and the grafts were rejected at 6 days. In addition, transfer of exogenous CD8α+ DCs, rather than CD8α- DCs, from wild-type C57BL/6 mice into IL-12-/- C57BL/6 mice prevented AVR and prolonged xenograft survival to 16.4 ± 0.9 days, while IL-12-/- C57BL/6 mice without transfer rapidly rejected the xenograft with AVR in 6 days. Furthermore, adoptive transfer of CD8α- DCs from IL-12-/- mice into wild-type C57BL/6 recipients shifted the immune response to Th2, thereby altering the pattern of rejection from CMR to AVR, and the grafts were rapidly rejected in 9.5 ± 0.6 days. We conclude that transfer of CD8α+ DCs (DC1-like) induces Th1-mediated CMR and prolongs xenograft survival, whereas administration of CD8α- DCs (DC2-like) facilitates Th2-mediated AVR and accelerates xenograft rejection. These data suggest that distinct DC subclasses differentially affect the mechanism of xenograft rejection and ultimately affect graft survival.

Vincenzo Mirenda, 1 Joseph Read, 1 Ivan Berton, 1 Anthony Warrens, 1 Robert Lechler. 1 1 Immunology, Imperial College London, London, United Kingdom. Although species incompatibility underlies most of the hurdles to xenotransplantation, it can also be exploited to achieve donor-specific immunosuppression. With this in mind, we cloned the pig homologue of CTLA4 (pCTLA4) and constructed a fusion protein consisting of the extracellular regions of pCTLA4 and the constant regions of human IgG1 (pCTLA4-Ig). This failed to inhibit costimulation provided by human B7. The aim of this study was to confirm the species specificity of pCTLA4-Ig in a model of pig-to-mouse islet transplantation. We demonstrated that pCTLA4-Ig bound poorly to murine CD80 and CD86 expressed on transfectants, and gave minimal staining of the CD80+ murine cell line DAP.3 and of mature mouse dendritic cells, as revealed by cytofluorometric analysis. In a T cell proliferation assay, pCTLA4-Ig blocked mouse T cell responses to pig but not mouse stimulator cells, whereas control murine CTLA4-Ig inhibited responses to both. In vivo, a single injection of 200 µg pCTLA4-Ig was incapable of inhibiting the delayed hypersensitivity response to oxazolone in mice. We therefore tested the efficacy of pCTLA4-Ig at preventing pancreatic islet rejection. Pig islets were transplanted under the kidney capsule of streptozotocin-induced diabetic mice. Mice injected with pCTLA4-Ig exhibited prolonged islet survival (36.5±9.4 days, n=6) compared to controls treated with isotype-matched antibody (9.8±2.9 days, n=6). When combined with two subsequent injections of murine CTLA4-Ig to inhibit the indirect T cell xenoresponse, indefinite survival was achieved. Our results indicate that pCTLA4-Ig is a relatively specific inhibitor of the direct mouse T cell response to porcine tissues and is therefore a potential therapeutic reagent for use in clinical xenotransplantation.

Anthony Dorling, 1 Daxin Chen, 1 Robert I. Lechler. 1 1 Dept. of Immunology, Imperial College London, London, United Kingdom. Acute humoral xenograft rejection (AHXR) remains a significant immunological problem after xenotransplantation.
Histologically, organs rejected by AHXR show prominent widespread microvascular thrombosis, often accompanied by systemic coagulation disturbances in the recipient. This has led to the hypothesis that abnormal activation of clotting might be of primary importance in the pathophysiology of AHXR. We have tested the effect of inhibiting coagulation in a mouse heart-to-rat model of AHXR. Control C57BL/6 hearts, transplanted heterotopically without immunosuppression, were rejected after a mean of 2.8 (±0.4) days. Histology showed widespread intravascular fibrin deposition and features typical of AHXR. Hearts were harvested from one of two novel strains of transgenic mice expressing membrane-tethered human tissue factor pathway inhibitor or hirudin fusion proteins under the control of a modified CD31 promoter for expression on activated endothelial cells (EC). These hearts survived for a mean of 6.6 (±0.49) days and 6.4 (±1.02) days, respectively. Histology on the day of rejection showed no evidence of thrombosis, but the grafts were infiltrated with CD3+ cells. Experiments were repeated under cover of daily cyclosporin. Control hearts were still rejected at 2-3 days by AHXR. In contrast, hearts from both transgenic donors are still beating more than 28 days post-transplantation. These experiments are ongoing. These data show that efficient inhibition of intravascular coagulation by expression of anticoagulants on EC completely inhibits AHXR in this small animal model, implying that activation of coagulation factors is an important element in the pathophysiology of humoral rejection.

Background: We have to date performed 8 GalT-KO heart transplants in baboons. Methods: Baboons received induction therapy with anti-thymocyte globulin, thymic irradiation (700 cGy, n=7), cobra venom factor (CVF) from days -1 to 14 (n=2) or days -1 to 3 (n=3), and maintenance therapy with a human anti-human CD154 mAb, mycophenolate mofetil, and methylprednisolone. Heparin was administered to all baboons, anti-thrombin from days 1-12 to 3 baboons, and aspirin to 4 baboons. Two hearts from Gal compound heterozygous pigs expressing low but detectable levels of Gal were transplanted into baboons receiving the same regimen (without CVF) as controls. Results: Both control hearts were rejected hyperacutely within 20 minutes. Hyperacute rejection was not seen in any GalT-KO heart, even in the absence of CVF therapy. Two grafts failed from a thrombotic microangiopathy (TM) on days 59 and 67, respectively, one of which showed focal interstitial hemorrhage and edema. One baboon died (day 56) and two were euthanized (days 23 and 16) for unrelated causes, all with functioning grafts.

Abstract# 1069 THROMBOTIC MICROANGIOPATHY IN HDAF AND GalT-KNOCKOUT PIG HEARTS FOLLOWING TRANSPLANTATION INTO BABOONS. Stuart L. Houser, 1 Akira Shimizu, 2,3 Kenji Kuwaki, 2 Frank J. Dor, 2 Yau-Lin Tseng, 2 Christoph Knosalla, 2 Jane Cheng, 3 Henk-Jan Schuurman, 3 David H. Sachs, 2 David K. C. Cooper. 2 1 Pathology, Massachusetts General Hospital, Boston, MA; 2 Transplantation Biology Research Center, Massachusetts General Hospital, Boston, MA; 3 Immerge Biotherapeutics, Cambridge, MA. Background: Acute humoral xenograft rejection (AHXR) is an immunologic barrier in pig-to-baboon organ transplantation (Tx). We report the histopathology of 3 groups (A, B, and C) of xenograft hearts which failed with microvascular thrombosis and myocardial necrosis.
Methods: Eight baboons underwent heterotopic heart Tx from pigs transgenic for human decay accelerating factor; 4 (A) received heparin from day 2 after Tx, and 4 others (B) from the day of Tx (day 0). Six (C) received hearts from GalT-knockout pigs and were heparinized from day 0. All recipients received thymic irradiation, anti-thymocyte globulin, cobra venom factor (only 2 in C), mycophenolate mofetil, and methylprednisolone. A and B underwent anti-Gal antibody depletion with synthetic Gal oligosaccharide conjugates. Grafts were removed when palpable contractions stopped. Stained tissue sections from harvested grafts were studied by light and fluorescence microscopy. Results: Grafts survived 12-36 (mean 23; median 14), 25-139 (mean 68; median 54), and 56-67 (mean 63; median 63) days in A, B, and C, respectively. In C, two grafts failed, 3 recipients with beating grafts died of unrelated causes (mean 32; median 23), and one graft is ongoing at >104 days. In all grafts, multiple platelet-rich fibrin thrombi occluded myocardial vessels. Interstitial neutrophils were focal in A and C; mononuclear cells were present but rare in all groups. Hemorrhage and edema were diffuse or regional in A and focal in B and C. Ischemic injury included myocytolysis, contraction band necrosis, coagulative necrosis, and myocyte dropout. Marked intimal thickening like that of allograft vasculopathy was seen in the longest-surviving graft (139 days). Vascular IgG deposition occurred in 3 A grafts and 1 B graft; IgM, in 2 C and 2 A grafts. Patchy, sparse C4d staining of capillaries was seen in all groups. Notable C3 staining was absent. Conclusions: At varying times after Tx, AHXR and/or a non-immunologic coagulopathy caused microvascular thrombosis and myocardial infarction. Cardiac xenograft vasculopathy (chronic rejection) can occur with prolonged graft survival. Ongoing studies are directed toward determining the immunological or non-immunological basis of the observed findings.

Binding of human xenoantibodies (XAb) to the α-Gal antigens on pig endothelial cells (PEC) is considered one of the initial steps in the hyperacute rejection of pig-to-human solid organ xenotransplants. The recent development of transgenic pigs lacking the α-Gal epitope (GalT-KO) holds exciting prospects. We analyzed the long-term anti-pig XAb response developed by acute liver failure patients exposed to pig hepatocytes during bioartificial liver treatment (BAL). Methods: Plasma from 5 patients was collected before and at different intervals after BAL. Flow-cytometric and ELISA assays (1:10 plasma dilution) were used to assess patients' IgM and IgG XAb cytotoxicity and binding to PEC from wild-type (wt) and GalT-KO miniature swine. Results: The binding of patients' IgG and IgM XAb to PEC from GalT-KO pigs was 30-57% lower than to PEC from wt pigs. The natural anti-pig XAb levels fell during multiple BAL treatments (25-50%), the reduction being more pronounced for cells from GalT-KO pigs. After the BAL treatment ended, the XAb titers rebounded by 35-50% in the 1st week and remained stable afterwards. The XAb cytotoxicity results of one patient are depicted in the Figure. The increase in cytotoxicity post-BAL was markedly attenuated using cells from GalT-KO pigs. Conclusions: Human XAb recognize and bind to non-αGal epitopes on PEC from GalT-KO miniature swine. The anti-pig XAb response to GalT-KO pigs is reduced by 30-50% compared with wt pigs.
Despite the rebound in XAb titers post-BAL, there was a marked reduction of cytotoxicity toward cells from GalT-KO pigs.

Purpose: A common consequence of left ventricular assist device (LVAD) implantation is the development of anti-HLA alloantibodies. While alloantibodies have been shown to be associated with vascular rejection, their ultimate impact on vasculopathy is still ill-defined. We previously observed that the presence of Class I anti-HLA donor-reactive antibodies was associated with significant early post-transplant vasculopathy. To examine this, we obtained coronary angiograms from 196 patients bridged with an LVAD and compared their post-transplant coronary angiograms to those of a non-LVAD cohort (n=112) examined in a previous study. Methods: Adult patients undergoing orthotopic heart transplant between 1999 and 2000 were retrospectively studied and compared to our LVAD population. Coronary angiograms were retrospectively reviewed and the severity of coronary vasculopathy was categorized as normal, mild, moderate or severe. Other variables studied included cytotoxic panel reactive antibodies (PRA) against T-cell targets and flow cytometric crossmatching against donor T lymphocytes. Results: As anticipated, there was increased sensitization in those patients receiving an LVAD. Twenty-one percent of LVAD patients had a T-cell PRA greater than 10% at the time of transplant compared to 3.6% of the controls (p<0.0001, Fisher's exact). Likewise, 25% of the LVAD patients had positive T-cell flow crossmatches compared to 5.4% of the controls (p<0.0001). Despite this, coronary angiograms revealed no significant difference in the degree of coronary vasculopathy between the groups at follow-up. Normal coronary anatomy was detected in 72.5% of LVAD patients and 64.6% of non-LVAD patients (p=0.79). These results were nearly identical at 2- and 3-year follow-up (71.8% vs. 64.6% and 58.3% vs. 59.7%). Conclusion: While preoperative LVAD use is associated with a risk of recipient sensitization, these patients develop transplant vasculopathy at the same rate as those not receiving an LVAD.

Background: Allograft coronary vasculopathy is a major risk factor for mortality following cardiac transplantation. Several factors including recipient-donor characteristics, immunosuppressive agents, and non-immune mechanisms have been evaluated as risk factors. Objective: We evaluated the influence of donor gender on the progression of coronary vasculopathy in heart transplant recipients. Methods: Eighty-nine heart transplant recipients (67 male, 22 female, mean age 56±12 yr) underwent serial intravascular ultrasound (IVUS) analysis at baseline (within one month) and at one year after transplantation. Patients were divided into 2 groups: forty-five recipients of female allografts (Group A) and forty-four recipients of male allografts (Group B). Ultrasound images were recorded on super-VHS tape during a distal-to-proximal automated pullback. Side branches were used to match the sites at baseline and at 1 year after transplant. Volumetric analysis of the matched sites was performed, and an equal number of slices (average = 22) was evaluated. The following IVUS indices for the left anterior descending artery were measured for each patient: change in maximal intimal thickness (MIT, mm), total plaque volume (Total PV, mm³), average intimal area (Av IA, mm²) and intimal index (%).
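A rough sketch of how such volumetric IVUS indices can be derived from per-slice planimetry follows. The per-slice areas, the slice thickness, and the definition of intimal index as intimal area over total vessel area are illustrative assumptions; actual IVUS analysis software and index definitions vary.

```python
# Hypothetical per-slice cross-sectional areas (mm^2) from a matched
# pullback segment; each slice is assumed to span a fixed axial thickness.
SLICE_THICKNESS_MM = 1.0
vessel_areas = [14.2, 14.8, 15.1, 14.9]  # external elastic membrane areas
lumen_areas = [11.0, 11.1, 11.6, 11.4]

intimal_areas = [v - l for v, l in zip(vessel_areas, lumen_areas)]

# Total plaque volume: intimal area times slice thickness, summed (mm^3).
total_pv = sum(a * SLICE_THICKNESS_MM for a in intimal_areas)

# Average intimal area (mm^2) and intimal index (% of vessel area
# occupied by intima) across the analyzed slices.
av_ia = sum(intimal_areas) / len(intimal_areas)
intimal_index = 100.0 * sum(intimal_areas) / sum(vessel_areas)

print(f"Total PV = {total_pv:.1f} mm^3, Av IA = {av_ia:.2f} mm^2, "
      f"intimal index = {intimal_index:.1f}%")
```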
Results: Patients were similar in baseline characteristics including age, gender, diabetes mellitus, hyperlipidemia, etiology of heart failure, ischemia time, use of assist device, cytomegalovirus disease, rejection episodes, and immunosuppressive therapy. At 1 year after transplantation, measurements of allograft vasculopathy were found to be significantly higher in recipients of female allografts (Table). Recipient gender had no influence on coronary vasculopathy progression. Conclusion: Female donor allografts are associated with an increased risk of coronary vasculopathy independent of recipient gender.

UNOS 1987-2001: Of the ten patients undergoing third-time retransplants, pre-operatively one was VAD-dependent, four were on IV inotropes, and two had creatinine levels greater than 2.5. Additionally, four were male recipients of female donor hearts, and the mean donor ischemic time was 2.6 hours. In our center experience, 3 patients underwent a third heart transplant. There was no early or hospital mortality. One patient died late from TCAD and another following a fourth allograft. Conclusion: The mortality rate for third-time heart allograft recipients is acceptable. These favorable results may be influenced by small sample size, younger age, case selection, and operations at select, high-volume institutions with significant experience. Further study is warranted.

The aim of this study was to assess the effect of growth factor gene polymorphisms on acute rejection (AR) and graft coronary disease (CAD) in pediatric heart transplant (HTx) recipients. Genotyping using PCR with specific primers was performed for TGFβ1 (codons 10 and 25) and VEGF-2578 gene polymorphisms in 111 patients who underwent HTx at a mean age of 7.5 ± 6.7 years. Acute rejection (AR) was defined as ISHLT grade > 2. Patients were defined as rejectors if they had recurrent AR (more than 2 episodes), steroid-resistant AR or graft dysfunction within the first year post-Tx; the other cases were defined as non-rejectors. Thirty-nine patients were considered rejectors (35%) and 31 developed graft coronary disease (CAD) (28%), 22 to 149 months post-transplant (mean follow-up 79 months). The proportion of rejectors was higher in the TGFβ1 codon 10-25 high-producer group than in the intermediate/low-producer group (40% versus 14%, respectively; p=0.026). The AA genotype for VEGF was associated with a higher incidence of acute rejection (55.5% rejectors versus 30.6% rejectors in patients with the AC or CC genotype, p=0.05). CAD was observed in 2/21 (9.5%) of HTx recipients who were TGFβ1 codon 10-25 low-producers, while 29/90 (32.2%) of TGFβ1 codon 10-25 high-producers developed CAD (p=0.037). This difference was also observed when the TGFβ1 codon 25 gene polymorphism was analyzed (11 low-producers with no CAD and 90 high-producers with 31 CAD cases, p=0.03). VEGF-2578 genotype did not differ between patients with and without coronary disease. In summary, TGFβ1 codon 10-25 high-producers have an increased risk of both AR and CAD. The AA genotype at VEGF-2578 was linked with AR but did not influence CAD frequency. Further, larger studies are needed to confirm these results and to analyze the impact of VEGF on pediatric graft coronary disease.
Currently, the diagnosis of acute cardiac rejection (AR) is based on the histology of endomyocardial biopsies (EMB). However, its main drawback is low sensitivity. Moreover, AR is morphologically identical to post-transplant cardiac Chagas' disease reactivation (CHR) unless the parasite is seen. The aim of this study was to determine possible differences in gene expression patterns among rejection, no rejection, and CHR. We divided 54 EMBs into the three groups on the basis of their histology and positivity for T. cruzi. We isolated RNA, amplified it, labeled it with Cy5 or Cy3, and hybridized it to microarray chips containing 14,000 genes. We then divided the samples into two sets, using the first as a training set to build gene predictors and the second as a test set to validate the results. Gene analysis of the training set allowed us to correctly classify rejection vs. no rejection in 87% of cases, based on 70 discriminating genes (p<0.001). Using the genes revealed by the training set, 83% of the test-set EMBs were predicted correctly, and all EMBs with rejection were correctly classified by the array (100% sensitivity). Interestingly, 75% of CHR samples were also classified as rejections (p<0.01), revealing the similarity of immune/inflammatory expression profiles between AR and CHR. However, a comparison of genes expressed by CHR and AR biopsies disclosed 25 genes that discriminated between them (p<0.001), leading to 93% correctly classified samples in the training set and 80% in the test set. Based on these microarray results, we used real-time RT-PCR to evaluate the expression of 10 target genes encoding immunity/inflammation- and cellular energy-related molecules and 2 control genes in the non-amplified RNA samples analyzed by the microarray and in 30 additional EMBs. Among these 10 target genes, the expression of 7 confirmed differences observed in the microarrays, 2 showed a trend, and 1 did not show any difference. Analysis of differences in gene expression between clinical situations revealed a high correlation between the microarray and RT-PCR results (r=0.94, p<0.001), and a detailed gene-by-gene analysis showed that the difference between the molecular profiles of AR and CHR lies within an area of non-immune but tissue-related gene expression. These results are valuable not only for the diagnosis of AR and the differential diagnosis of AR versus CHR, but also for a better understanding of these pathological processes.

A subset of these genes had previously been correlated with CMV infection, though a subset of genes not previously known to be associated with CMV infection was also identified. No correlation was detected between plasma CMV PCR positivity and acute rejection (ISHLT Grade 2 or greater) at the time of sample acquisition or at future visits. Gene expression profiles seen in PBMCs from subjects with CMV detectable by PCR did not overlap with those seen in acute rejection using a quantitative, clinically validated 14-gene acute rejection diagnostic assay. Conclusions: CMV infection in cardiac allograft recipients has a distinct molecular signature in PBMCs that is independent from that of acute rejection. The gene expression profile of CMV viremia confirmed previously described genes but also identified a new subset of genes associated with antiviral immune response patterns and with CMV-specific gene expression, indicating that the unique molecular signatures for CMV and acute rejection may be clinically useful in determining optimal immunosuppression.
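The train/validate design used in the rejection-vs-CHR classifiers above (fit a gene predictor on one subset of biopsies, then score held-out samples) can be sketched as follows. The random expression matrix, the labels, and the use of scikit-learn's linear SVC are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(54, 70))     # 54 biopsies x 70 discriminating genes
y = rng.integers(0, 2, size=54)   # 1 = rejection, 0 = no rejection

# Hold out a test set, fit on the training set, validate on unseen samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="linear").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```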
Cardiac allograft vasculopathy (CAV) after heart transplantation may have different presentations. It can progress fulminantly over weeks or be indolent with gradual progression over years. Patients who have angiograms before 1 year usually do so because of unstable clinical symptoms, abnormal ECGs, or positive cardiac enzymes. To determine whether early (less than 1 year) versus late (greater than 1 year) angiographic development of CAV affects outcome, we reviewed 114 heart transplant patients (on cyclosporine-based immunosuppression) between January 21, 1990 and July 15, 1998 who were found to have initial CAV (>30% stenosis) on angiography. The mean age of the patients was 52.2±12.1 years, and 22.8% were female. These patients were followed for 5-year survival from the time of initial diagnosis of CAV. Patients with CAV in the first year had significantly lower 5-year follow-up survival compared to those patients with CAV diagnosed after 1 year, p<0.002 (see figure). Conclusion: Heart transplant patients with early angiographic development (less than 1 year) of CAV appear to have decreased 5-year follow-up survival compared to those patients who develop CAV later. More aggressive immunosuppression and risk factor reduction should be considered in these high-risk patients.

With managed-care pressure to reduce costs following cardiac transplantation, many centers are considering omitting routine hemodynamic measurements at the time of endomyocardial biopsy (EmBx). To evaluate the safety of such a practice, we retrospectively reviewed 704 consecutive heart transplant recipients who underwent scheduled EmBx. Among these patients, 57 (8%) were identified as having rejection, as defined by a reduced ejection fraction, reduced cardiac output or significantly elevated pulmonary capillary wedge pressure, but in whom the ISHLT EmBx score was ≤2. These patients received augmented immune suppression for cellular-negative rejection (CNR) and subsequently improved. There was no difference in long-term survival following this therapy. There were no other pre- or postoperative predictors that would have predestined these patients to allograft dysfunction. Allograft vasculopathy was not affected by the presence of CNR, and there was a trend toward a higher incidence of CMV, but this occurred at any time and was not necessarily associated with the episode of CNR. In conclusion, we believe the routine use of hemodynamic measures at the time of EmBx is an essential part of the screening procedure. While omitting these measurements would reduce costs, approximately 8% of patients would be placed at an increased risk of allograft loss, and no routinely employed pre- or postoperative measure would identify the at-risk group.

BACKGROUND: BNP levels are elevated in heart failure and are a predictor of cardiac mortality. Recent findings suggest that elevated BNP plasma levels could form the basis for a noninvasive test for cardiac allograft rejection. The aim of this study was to assess the correlation between BNP levels and acute rejection. METHODS: From 2001 to 2003, a total of 472 BNP measurements were taken immediately before routine endomyocardial biopsies. For statistical analysis, measurements were divided into four quartiles (n=118 per group) according to BNP level. Further covariables were age, fractional shortening (FS), left ventricular end-diastolic diameter (LVEDD) and serum creatinine.
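The quartile split used for the analysis can be sketched as below; the BNP values are randomly generated stand-ins, and the use of numpy's quantile/digitize functions is an assumption about tooling, not a description of the authors' statistics package.

```python
import numpy as np

rng = np.random.default_rng(1)
bnp = rng.lognormal(mean=4.2, sigma=0.9, size=472)  # hypothetical pg/ml values

cutoffs = np.quantile(bnp, [0.25, 0.5, 0.75])
quartile = np.digitize(bnp, cutoffs)  # group index 0..3, ~118 per group

for q in range(4):
    grp = bnp[quartile == q]
    print(f"Q{q + 1}: n={grp.size}, mean BNP={grp.mean():.1f} pg/ml")
```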
RESULTS: Mean BNP was 105.7±127 pg/ml overall, with 28.8±9, 55.3±6.9, 89±15.9, and 248.6±186.5 pg/ml in the four quartiles, respectively. Creatinine levels differed significantly among the quartiles (1.50±0.53, 1.59±0.64, 1.61±0.56, and 2.22±1.13 mg/dl; p<0.001). The strongest correlations were found between BNP and creatinine levels (p<0.001) and between BNP and age, whereas differences in ISHLT grading between the quartiles were evident (p=0.044) but not highly significant. There was no correlation between BNP and FS or LVEDD. CONCLUSION: Circulating BNP is elevated in heart transplant recipients and increases with renal insufficiency. As a marker for acute rejection, BNP is not reliable, because its level depends on many influencing factors and large interindividual differences occur. This study is limited by the low number of rejection episodes (none of grade 3A or higher occurred during the investigational period), but BNP still appears to be a marker of wall stress and volume overload rather than primarily of allograft rejection.

Background. Allograft adaptation to a foreign circulation is imperfect, as reflected in decreased cardiac reserve and persistent limitations under stress. Effective arterial elastance (Ea), a measure of afterload, provides a reliable estimate of aortic impedance. End-systolic elastance (Ees) is a load-independent measure of ventricular performance and its interaction with the periphery. Their ratio (Ea/Ees) characterizes ventricular-vascular coupling, and a value close to unity signifies maximal ventricular work (and thereby poor mechanical efficiency). The purpose of this investigation was to correlate the mechanical efficiency of work with expression of B-type natriuretic peptide (BNP), a specific marker of ventricular stress and strain. Methods. We studied 40 consecutive primary heart transplant recipients who were stable and free from rejection. Echocardiography was performed in all patients, and Ea, Ees, and their ratio (Ea/Ees) were obtained by the single-beat method. BNP levels were measured by a point-of-care assay. We examined correlates of BNP expression by assessing Ea/Ees while correcting for mean arterial pressure (MAP), body mass index (BMI), left ventricular mass index (LVMI), ejection fraction (EF), and serum creatinine. Results. BNP levels were significantly and positively correlated with an increasing Ea/Ees ratio (see figure). On multivariable analysis, this relationship persisted independently (t=2.1, p=0.04), while BMI, MAP, LVMI, EF, and serum creatinine were not significant predictors. Conclusion. This investigation indicates that the transplanted heart demonstrates poor contractile efficiency and operates at maximal left ventricular work. This is paralleled by a tandem increase in BNP expression and suggests that elevation of this stress peptide is at least partly explained by ventriculo-vascular uncoupling in heart transplantation, independent of alterations in blood pressure.
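The multivariable analysis in the Ea/Ees study above can be sketched as an ordinary least-squares model. This is a sketch under assumptions, not the study's code: the file, all column names, and the log transform of BNP (a common choice for a right-skewed marker, not stated in the abstract) are illustrative.

```python
# Minimal OLS sketch: BNP vs. the Ea/Ees coupling ratio, adjusted for the
# covariates named in the abstract. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("coupling_bnp.csv")  # assumed: one row per patient (n=40)
df["log_bnp"] = np.log(df["bnp"])     # log transform assumed, not from the abstract

model = smf.ols(
    "log_bnp ~ ea_ees + mean_ap + bmi + lvmi + ef + creatinine", data=df
).fit()
# The coefficient on ea_ees plays the role of the reported independent
# association (t=2.1, p=0.04); the other terms are the adjustment covariates.
print(model.summary())
```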
Background: The impact of acute rejections (ARs) occurring during late post-transplant periods on graft function is largely unknown. To provide more information on this controversial issue, we investigated the influence of late ARs on both left ventricular (LV) function and the progression of transplant coronary arteriopathy (TxCA). Methods: In 552 patients (post-transplant times 2-16 years), we analyzed all cases of biopsy-proven late AR arising after the second post-transplant year. The late ARs were detected by both routine endomyocardial biopsies (EMBs) performed during annual cardiac catheterizations and diagnostic EMBs performed whenever AR was suspected clinically and/or echocardiographically. Late AR analysis included prevalence, severity (both histological and functional), and potential relation to new appearance or aggravation of TxCA. Results: During a mean observation time of 7.5±4.9 years, a total of 160 EMBs were graded as ISHLT ≥1A. In 33 patients, the positive EMBs were associated with severe LV dysfunction accompanied by hemodynamic alterations. These patients represented 60.6% of all patients with post-transplant times of >2 years in whom severe LV dysfunction was detected during the study period. The severity of graft dysfunction was not significantly related to the histological severity grade (ISHLT classification). All 33 patients with severe LV dysfunction had intense vascular reaction in addition to cellular rejection, and in 16 patients (48.5%) humoral (vascular) rejection was dominant. LV dysfunction was completely reversible in only 2 of these 33 patients. Seven (21.2%) died of AR-related graft failure (GF). In the other 24 patients, LV function remained moderately altered after AR. Shortly (19.8±11.2 months) after AR, 8 of them developed accelerated TxCA; another 11 showed aggravation of preexisting TxCA. Sudden cardiac death between 2 and 18 months after late AR occurred in 6 of these 24 patients. The total of 13 rejection-related deaths represented 59.1% of all deaths due to GF (n=22) that occurred beyond the 2nd post-transplant year during the study period. The mean number of late ARs/year/patient was significantly higher in those who developed angiographic TxCA after the 2nd post-transplant year than in those without TxCA at that time (p<0.01). Conclusions: AR beyond the second post-transplant year is an important cause of late acute and chronic allograft dysfunction. The new appearance or aggravation of TxCA is at least partially related to late ARs.
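The final comparison above (late ARs/year/patient in patients with vs. without subsequent angiographic TxCA) can be sketched as follows, under an assumed per-patient layout; the abstract does not state which test was used, so a rank-sum test is shown purely as one reasonable choice for a skewed rate variable.

```python
# Hypothetical per-patient layout: 'n_late_ar' (count of late ARs),
# 'followup_years' (observation time beyond year 2), 'txca' (0/1).
# Not the study's actual data or code.
import pandas as pd
from scipy import stats

df = pd.read_csv("late_ar.csv")  # assumed file
df["ar_per_year"] = df["n_late_ar"] / df["followup_years"]

with_txca = df.loc[df["txca"] == 1, "ar_per_year"]
without = df.loc[df["txca"] == 0, "ar_per_year"]
u, p = stats.mannwhitneyu(with_txca, without, alternative="two-sided")
print(f"ARs/year/patient: TxCA {with_txca.mean():.2f} vs. no TxCA {without.mean():.2f}")
print(f"Mann-Whitney U={u:.1f}, p={p:.4f}")
```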