40th International Symposium on Intensive Care & Emergency Medicine: Brussels, Belgium. 24-27 March 2020. Crit Care. Published: 2020-03-24. DOI: 10.1186/s13054-020-2772-3

Ventriculostomy-related infection (VRI) is a serious complication in patients with hemorrhagic stroke. In such patients, diagnosis of VRI is complicated by blood contamination of CSF following ventricular hemorrhage. We aimed to evaluate the diagnostic potential of white blood cell count (WBC), C-reactive protein (CRP) and procalcitonin (PCT) for identifying VRI in patients with hemorrhagic stroke while an external ventricular drain (EVD) was in situ. This retrospective study was conducted at the Neurosurgical ICU, University Hospital of Zurich. A total of 347 patients with hemorrhagic stroke and an EVD were admitted to the ICU over a 6-year period. Of those, 14 patients with VRI ("VRI"), defined by positive CSF bacterial culture and increased WBC in CSF (>250/μl), and 115 patients without VRI and with serial CSF sampling ("no-VRI") were analyzed. Patients with CSF contamination or suspected VRI (negative CSF cultures but antibiotic treatment) were excluded. WBC, CRP and PCT were measured daily. CSF was sampled routinely twice a week or in case of T>38°C. For the analysis, mean peak values of WBC, CRP and PCT during the time of EVD in situ were compared between groups (t-test). Data are expressed as means with 95% CI. Results: Between groups, WBC and CRP were similar (WBC: 15.13 G/L and 14.55 G/L, p=0.68; CRP: 115.93 mg/l and 129.44 mg/l, p=0.56 in the VRI and no-VRI groups, respectively) (Figure 1, panels A and B). In the VRI group, PCT was low and significantly lower than in the no-VRI group (0.16 μg/l and 2.61 μg/l, p=0.03, respectively) (panel C). WBC in CSF was similar between groups (710.14/μl and 675.16/μl, p=0.93 in the VRI and no-VRI groups, respectively). In this study, serum inflammatory markers were not able to screen patients for VRI. Their routine measurement should be carefully evaluated.

Introduction: Central nervous system (CNS) infections constitute a potentially life-threatening neurological emergency. Patients admitted to the intensive care unit (ICU) usually present with severe disease and organ failure, leading to high mortality and morbidity. We performed a retrospective analysis over a 5-year period of patients admitted to a polyvalent ICU. Clinical, demographic and outcome data were collected to evaluate their impact on the outcome of patients with CNS infections. We identified 30 patients with a diagnosis of meningitis, meningoencephalitis or ventriculitis; the median age was 57.6 years (range 24-80). Upon clinical presentation, the most frequent signs were fever (70%), meningeal signs (40%), seizures (30%), and a Glasgow Coma Scale score <8 (66%). All needed ventilation support and 66% needed cardiovascular support. A definitive microbiological diagnosis was achieved in 22 patients and antibiotic therapy was adjusted in 18 of them. The most common microorganisms were Streptococcus pneumoniae (n=7), Listeria (n=5) and Pseudomonas aeruginosa (n=4) (Figure 1). Other gram-negative microorganisms were detected and led to more adverse outcomes.
Meningitis was the cause of admission in 26 patients; in a minority (n=4), meningitis was considered a secondary diagnosis in patients admitted for other causes (traumatic brain injury, subarachnoid or intraparenchymal hemorrhage, postoperative care after neurosurgical tumor resection). Patients who eventually died had at least one risk factor (age >65, immunocompromise due to diabetes, corticotherapy, HIV or heart transplantation). Patients admitted to the ICU were not especially aged, but had comorbidities and risk factors leading to more uncommon microorganisms, increasing the risk of adverse outcomes. This led to increased mortality: 23% in the ICU and 43% overall.

Study of selenium levels in unresponsive wakefulness (UWS) patients with systemic inflammatory response syndrome (SIRS). E Kondratyeva 1, S Kondratyev 2, N Dryagina 2.

The objective of this study was to evaluate the pharmacokinetics (PK) of levetiracetam (LEV) in critically ill patients with normal and augmented renal clearance (ARC), and to determine whether the recommended dosage regimen provides concentrations in the therapeutic range (12-46 mg/L) [1]. A prospective observational study was conducted in a tertiary hospital. Six blood samples were taken during a dose interval at steady state and LEV was quantified by HPLC. A population PK study was carried out. Statistical analysis was conducted to evaluate the differences in PK between patients with and without ARC. The suitability of drug concentrations was also assessed. Results: Seventeen patients were included, 13 with normal creatinine clearance (CrCL) (80-129 mL/min) and 4 with CrCL ≥130 mL/min (ARC). Ten patients received 500 mg q12h, one 1000 mg q12h and two 1500 mg q12h. The data were best fitted to a two-compartment model (a numerical sketch follows below). Figure 1 shows LEV concentrations during the dosing interval. Mean clearance (CL) was 4 L/h and mean volume of distribution of the central compartment (V) was 44 L. Interindividual variability was 38% and 61% for CL and V, respectively. No differences were identified between the two groups (p>0.05) in PK parameters. No correlation was found between LEV CL and CrCL. Trough levels were below the minimum concentration (Cmin) of the therapeutic range, 12 mg/L, in all patients except one. Furthermore, between 3 and 5 h, 50% of samples were below the Cmin. Conclusions: The administered doses were not able to maintain LEV concentrations in the recommended therapeutic range. Other dosage strategies, such as extending the infusion time with higher doses, could be evaluated in order to obtain a more favourable profile. No correlation between LEV CL and CrCL was found.

The mechanical properties of muscles such as tone, elasticity and stiffness are often affected in chronically critically ill (CCI) patients. A hand-held device known as the MyotonPRO has demonstrated acceptable relative and absolute reliability in a ward setting for patients with acute stroke [1]. The technology works on the principle of applying multiple short impulses over the muscle bulk via the testing probe. The aim of our study was to assess the feasibility of objective measurement of muscle tone in CCI patients in relation to neurological dynamics and serum biomarkers. The study included 23 CCI patients with neurological disorders (stroke, traumatic brain injury, neurosurgical intervention for brain tumors) with more than a 3-week stay in the ICU. Dynamic measurements of the muscle properties were taken on the deltoideus, brachioradialis, quadriceps femoris and gastrocnemius using the MyotonPRO.
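A minimal numerical sketch of the two-compartment model described in the levetiracetam study above, assuming IV bolus dosing of 500 mg q12h: the clearance (CL = 4 L/h) and central volume (V = 44 L) are taken from the abstract, while the intercompartmental clearance and peripheral volume are illustrative placeholders, not reported values.

```python
import numpy as np
from scipy.integrate import solve_ivp

CL, Vc = 4.0, 44.0   # clearance (L/h) and central volume (L), from the abstract
Q, Vp = 6.0, 30.0    # intercompartmental clearance (L/h) and peripheral volume (L): assumed

def rates(t, y):
    """Two-compartment disposition: y holds drug amounts (mg) in the central
    and peripheral compartments; elimination occurs from the central one."""
    c_c, c_p = y[0] / Vc, y[1] / Vp
    return [-CL * c_c - Q * (c_c - c_p), Q * (c_c - c_p)]

y = np.array([0.0, 0.0])
dose, tau, n_doses = 500.0, 12.0, 10   # mg, dosing interval (h), doses to approach steady state
for _ in range(n_doses):
    y[0] += dose                        # IV bolus into the central compartment
    y = solve_ivp(rates, (0.0, tau), y).y[:, -1]

print(f"predicted steady-state trough: {y[0] / Vc:.1f} mg/L (target Cmin 12 mg/L)")
```

Under these assumptions the predicted trough falls below the 12 mg/L Cmin, consistent with the abstract's finding that standard dosing left nearly all troughs subtherapeutic.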
To identify the leading factor in impaired muscle tone, neurological (S100, NSE), inflammatory (IL-6) and bacterial load (PCT) biomarkers were also measured using the Elecsys immunoassay, and serum levels of microbial metabolites were measured using GC-MS (Thermo Scientific). Results: All patients were divided into groups depending on positive or negative clinical dynamics. Significant differences were obtained in parameters characterizing changes in muscle tone of the lower limbs: F gastrocnemius (tone), 15.5 vs 18.5 Hz, and R quadriceps femoris (mechanical stress relaxation time), 16.5 vs 13.6 ms (p<0.01 for both). Some significant correlations between five parameters of muscle tone, biomarkers and microbial metabolites were revealed. The results of quantitative measurement of muscle tone objectively reflect the dynamics of neurological status, which may in the future be a promising technique for a personalized approach in CCI patients.

Introduction: Changes in hormonal status in patients with unresponsive wakefulness syndrome (UWS) remain poorly understood. Methods: 275 patients in UWS were examined in the period from 2007 to 2017: 152 patients (115 men) with TBI and 123 patients (63 men) after hypoxia. ACTH, cortisol, TSH, free T3 and T4, STH, prolactin and natriuretic peptide were studied in the period from 2 to 4 months of UWS. In men, levels of total testosterone, LH and FSH were additionally studied. The obtained data were compared with the UWS outcome at 6-12 months (CRS-R scale assessment). None of the studied hormones of the hypothalamic-pituitary-adrenal axis was a reliable criterion for predicting the outcome of UWS. Most often and consistently, a tendency toward disruption of the rhythm of cortisol secretion was revealed, with higher rates in the evening hours. The average value of STH was higher in men with consequences of head injury who had recovered consciousness than in those who remained in UWS. A significant decrease in testosterone levels, regardless of age, was found in patients with consequences of TBI. Mean levels of LH were higher in patients with TBI and hypoxia who remained unconscious than in patients who later recovered consciousness. The average level of FSH was higher in patients who had recovered consciousness. An increase in natriuretic peptide level was observed both in patients who remained in chronic UWS and in those who recovered consciousness. No particular endocrine background characterising this category of patients was found. Disruption of the secretion rhythms of some hormones, in particular cortisol, can be considered usual for UWS patients, especially those with TBI.

Before our research, therapeutic hypothermia had not been used in chronically critically ill (CCI) patients. A temperature decrease in neuronal cells is a strong signal that triggers endogenous cytoprotection programs via early-response gene expression. Our goal was to determine the influence of craniocerebral hypothermia (CCH) on the level of consciousness in CCI patients. We examined 98 patients with different types of brain injuries: 54 males and 44 females, mean age 45.56±16.03. Patients were divided into 2 groups: a main group of 47 patients (vegetative state (VS) 28, minimally conscious state (MCS) 19) and a comparison group of 51 patients (VS 32, MCS 19); the groups were equal on main parameters (severity, functional state, comorbidity).
Patients from the main group received courses of CCH (duration 180 minutes, scalp temperature 5-8°C, cerebral cortex cooling to 32-34°C); sessions ended without a slow rewarming period and were repeated until signs of consciousness recovery. Cortical temperature was checked noninvasively by detection of brain tissue electromagnetic emission in the SHF range. Consciousness recovery in VS and MCS patients was monitored using the CRS-R scale. Results: CCH sessions significantly increased the level of consciousness in the VS and MCS patient groups: VS patients improved to minimally conscious state and MCS+, and the MCS group to lucid consciousness (p<0.05) (Figure 1). Craniocerebral hypothermia was used in chronically critically ill patients for the first time. Our results demonstrated the effectiveness of CCH as an additive treatment tool in such patients. This lets us optimistically consider including the CCH method in the rehabilitation of chronically critically ill patients to increase the level of consciousness.

Despite the clinical benefit of endovascular treatment (EVT) for large vessel occlusion (LVO) in ischemic stroke, space-occupying brain edema (BE) represents a common complication during the course of disease. Routinely, CT imaging is used for monitoring of these patients, notably in the critical care setting, yet novel and easy bedside techniques with the potential to reliably predict BE without repetitive imaging would be valuable for time- and cost-effective patient care. We assessed the significance of automated pupillometry for the identification of BE patients after LVO-EVT. We enrolled 40 patients admitted to our neurocritical care unit who received EVT after anterior circulation large vessel occlusion. We monitored parameters of pupillary reactivity [light-reflex latency (Lat; s), constriction and re-dilation velocities (CV, DV; mm/s), and percentage change of apertures (per-change; %)] using a portable pupillometer (NeurOptics®) up to every 60 minutes during the first 72 hours of ICU stay. BE was defined as midline shift ≥5 mm on follow-up imaging within 3-5 days after EVT. We assessed differences in pupillary reactivity between patients with and without BE (U-test) and evaluated the prognostic performance of pupillometry for development of BE (ROC analysis). In 32 patients (19 women, 74.3±12.0 years) without BE, 1,224 assessments were compared to 207 assessments in 6 patients (3 women, 71.7±11.8 years) with BE. On day 1, day 2, and day 3 after EVT, patients with BE had significantly lower CVs and DVs, and smaller per-changes, than patients without BE, whereas Lat did not differ between the groups. ROC analyses revealed a significant negative association of CV, DV, and per-change with development of BE. Conclusions: Automated pupillometry seems to identify patients at risk for BE after EVT. A prospective study should validate whether automated pupillometry harbors the potential to reduce unnecessary follow-up CT imaging.

The aim of this preliminary analysis was to detect differences between the qualitative and quantitative evaluation of pupillary function carried out by doctors and nurses of an intensive care unit (ICU) of a tertiary-level hospital. A secondary purpose was to investigate new indications for the use of pupillometry in a population admitted to the ICU. Methods: The study has been conducted (currently in progress) at the intensive care unit and ECMO Referral Center at Careggi Teaching Hospital (Florence, Italy).
The enrolled patients are adult subjects (>18 years) with alteration of consciousness defined by a Glasgow Coma Scale (GCS) <9, following a primary brain injury and/or the use of sedative drugs. The studied parameters, obtained with the NeuroLight pupillometer® (ID-Med, Marseille, France), are analyzed and integrated. Visual/qualitative evaluation of pupil function shows lower reliability compared to automated pupillometry. The estimated error in the proper determination of the photomotor reflex is 34.9% (p<0.01). No significant difference is reported between quantitative and qualitative pupillometry in the detection of anisocoria. Our preliminary results are compatible with previously reported data [1-3], even if there was no difference in anisocoria determination. Interestingly, a longer latency period among patients treated with opioids has been observed. Other results are still in progress.

Introduction: Due to the dynamics of critical care disease, a rapid bedside, noninvasive and highly sensitive and specific method is required for diagnosis. In this study we set out our experience with transcranial color-coded duplex ultrasound (DXT) [1]. The DXT study identifies cerebral arteries as well as hemorrhagic phenomena, hydrocephalus, mass-occupying lesions and midline shift. This is the main difference between DXT and conventional transcranial Doppler (DTC), which is a blind study and does not provide any image. Descriptive, cross-sectional and observational study from December 2018 to June 2019; 21 patients were included. Inclusion criteria: neurocritical patients. Exclusion criteria: no acoustic window, presence of ultrasound artifacts. Data collection was performed using a low-frequency 1.5-3.5 MHz transducer with a transcranial duplex preset (Figure). The patterns were defined as normal, vasospasm, high resistance, hyperemia and cerebral circulatory arrest, depending on the cerebral flow velocity, Lindegaard Ratio (LR) and Pulsatility Index (PI) (a sketch of these indices follows below). Results: 12 men (57.1%) and 9 women (42.9%); average age 55.6 (range 20-79). Patient diseases: subarachnoid hemorrhage 6, traumatic brain injury 5, AV malformation 4, stroke 2, hemorrhagic cerebrovascular accident 2 and mass-occupying lesions 2. Normal pattern: 10 patients (rel. freq. 0.47). Vasospasm: 5 patients (rel. freq. 0.23). High resistance: 4 patients (rel. freq. 0.19). Hyperemia: 1 patient (rel. freq. 0.04). Cerebral circulatory arrest: 1 patient (rel. freq. 0.04). Conclusions: DXT should be part of routine neuromonitoring; it provides real-time images that are especially useful in unstable conditions. Although a larger number of patients will be needed for statistical significance, DXT is useful as a noninvasive, bedside study that allows early identification of different clinical conditions.

Introduction: Embolization of the draining vein during endovascular treatment of arteriovenous malformation (AVM) may result in venous outflow obstruction and hemorrhage. The anaesthesiologist can use deliberate hypotension to reduce blood flow through the AVM, which may help to prevent this scenario. Adenosine-induced cardiac arrest may facilitate the embolization too. The goal of our study was to improve the results of endovascular treatment of AVM using adenosine-induced cardiac arrest. Methods: After obtaining informed consent, 13 patients (8 male, 5 female) were selected for adenosine-induced cardiac arrest during endovascular AVM embolization. Mean age was 40.8±6 years.
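For orientation, a hedged sketch of the velocity-derived indices named in the transcranial duplex abstract above. The PI and LR formulas are standard; the classification thresholds (MFV > 120 cm/s, LR > 3 for vasospasm, LR > 6 for severe vasospasm) are common textbook cutoffs, not values reported by the authors.

```python
def pulsatility_index(psv: float, edv: float, mfv: float) -> float:
    """PI = (peak systolic velocity - end diastolic velocity) / mean flow velocity."""
    return (psv - edv) / mfv

def lindegaard_ratio(mfv_mca: float, mfv_ica: float) -> float:
    """LR = mean MCA velocity / mean extracranial ICA velocity."""
    return mfv_mca / mfv_ica

def classify(mfv_mca: float, lr: float) -> str:
    """Crude pattern label: high MCA velocity with high LR suggests vasospasm;
    high velocity with a normal LR suggests hyperemia (global flow increase)."""
    if mfv_mca > 120 and lr > 3:
        return "vasospasm (severe if LR > 6)"
    if mfv_mca > 120 and lr <= 3:
        return "hyperemia"
    return "normal/other"

# Example: MCA MFV 150 cm/s with ICA MFV 40 cm/s gives LR = 3.75 -> vasospasm
print(classify(mfv_mca=150, lr=lindegaard_ratio(150, 40)))
```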
Nine of them were evaluated as ASA class III and 4 as class IV. Endovascular treatment in all cases was performed under general anaesthesia. Propofol, fentanyl and rocuronium were used to induce anaesthesia; all patients were then intubated and ventilated with parameters to keep EtCO2 at 32-35 mmHg. Sevoflurane 2.1-2.6 vol% (12 cases) or desflurane 6 vol% (1 case) was used to maintain anaesthesia. Hemodynamic monitoring consisted of ECG, pulse oximetry and non-invasive blood pressure measurement. Onyx and/or Squid were used as embolic agents. CT was performed in every patient just after the procedure, as well as a neurological examination. Results: The adenosine dosage was 0.875-1.26 mg/kg. The duration of the consequent cardiac arrest was 10-40 sec. Adenosine was administered once in 10 cases, twice in one case, 3 times in one case, and 4 times in one more case. [Fig. 1 (abstract P014): Circle of Willis and pulsed-wave Doppler mode of the middle cerebral artery.] Hemodynamic parameters recovered without any particular treatment in all patients. Embolization was performed uneventfully in all cases. Postoperative CT showed no hemorrhage. No patient in the investigated group had neurological deterioration in the postoperative period. Our study shows that adenosine-induced cardiac arrest is not a difficult method to perform and can be useful during AVM embolization.

A major risk factor for stroke is atrial fibrillation (AF). To treat AF, anticoagulation is needed. There are now several anticoagulants available; however, a lack of head-to-head data as well as the absence of accurate techniques makes it difficult to compare them and determine their efficacy. Stroke is known to produce an abnormal clot microstructure, which is a common factor in many thrombotic diseases. This pilot study aimed to use a functional biomarker of clot microstructure (df) and clotting time (TGP) to investigate the therapeutic effects of different anticoagulants in stroke and AF. We recruited 114 patients (59 AF and 55 stroke & AF). Two samples of blood were taken: before anticoagulation (baseline) and post anticoagulation (6-10 weeks). Patients were given either warfarin (31%) or apixaban (69%). df and TGP were measured and compared before and after anticoagulation. Results: Warfarin increased TGP (267±56 s to 332±78 s, p<0.05) and decreased df (1.71±0.05 to 1.65±0.06, p<0.05). Apixaban increased TGP (235±66 s to 410±105 s, p<0.05) but did not change df (1.72±0.04 and 1.72±0.05). Interestingly, we found that in the apixaban group TGP significantly correlated (p=0.05) with blood drug concentration levels. In this study we show that df and TGP can quantify and differentiate between the therapeutic effects of two different oral anticoagulants: warfarin prolongs clotting and weakens the ability of the blood to form stable clots, whereas apixaban prolongs clotting time but does not affect the blood's ability to form stable clots. This shows the utility of the df and TGP biomarkers in comparing two different treatment options, something no other current marker has proven able to do. df and TGP may prove useful tools in a personalized approach to anticoagulation treatment and monitoring in an acute setting.
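A minimal sketch of the paired before/after comparison design used in the clot-microstructure study above; the data are synthetic placeholders shaped to echo the reported apixaban TGP change (235 to roughly 410 s), not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Each patient contributes a baseline and a post-anticoagulation TGP value (s)
tgp_baseline = rng.normal(235, 66, size=40)                 # pre-treatment clotting time
tgp_treated = tgp_baseline + rng.normal(175, 60, size=40)   # post-apixaban values

# Paired comparison: within-patient change in clotting time
t_stat, p_value = stats.ttest_rel(tgp_baseline, tgp_treated)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3g}")
```

The same paired structure would apply to df; the abstract does not state which paired test the authors used, so the t-test here is an assumption.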
Patients with poor-grade aneurysmal subarachnoid hemorrhage (aSAH), World Federation of Neurological Surgeons (WFNS) grades IV and V, have commonly been considered to have a poor prognosis (70-100% mortality). Though early intervention and aggressive treatment in the neuro-ICU have improved outcome in past years, this remains controversial because most patients leave hospital severely disabled. The objective of this study was to investigate the clinical and social outcomes in intracranial aneurysm patients with poor-grade aSAH who underwent different intervention therapies. Twenty-five consecutive poor-grade aSAH patients, defined as WFNS grades IV and V, from a single-center observational registry, treated at a tertiary Chilean referral center from December 2013 to March 2019, were enrolled in this study. The clinical data, including patient characteristics on admission and during the treatment course, treatment modality, aneurysm size and location, radiologic features, signs of cerebral herniation (dilated pupils), and functional neurologic outcome, were collected. Clinical outcomes were assessed via GOSE and socio-occupational outcome, both at discharge and at 6 months (Figure 1). The 20% mortality is lower than previously reported, and survivors had a favorable recovery, confirmed with neuropsychological testing. Poor-grade aSAH patients in our study show a more positive outcome than previously considered.

The prognosis of subarachnoid hemorrhage (SAH) is poor: indeed, almost half of patients die or become severely disabled after SAH. Outcome is related to the severity of the initial bleeding and delayed cerebral infarction (DCI). Infection, and more precisely pneumonia, has been associated with poor outcome in SAH. However, the interaction between the two pathologic events remains unclear. We therefore hypothesized that DCI may be associated with pneumonia in SAH patients. Thus the aim of our study was to analyze the association between delayed cerebral infarction and pneumonia in patients with SAH. In this retrospective, observational, monocentric cohort study, patients included in the analysis were admitted to the Neurosurgical Intensive Care Unit or Surgical Intensive Care Unit of the University Hospital of Brest (France) for non-traumatic SAH. The primary outcome was diagnosis of DCI on CT scan or MRI 3 months after SAH. Multivariate analysis was used to identify factors independently associated with DCI. A total of 224 patients were included in the analysis (female/male ratio 141/83, median age 57 [47-65] years). Multivariate analysis was adjusted for sedation, intracranial surgery, Fisher classification of SAH severity, pneumonia occurrence and non-pneumonia infectious event occurrence (Figure 1). Pneumonia occurred in 66 patients (29.5%) and other causes of infection in 45 patients (20.1%). DCI was found in 108 patients (48.2%). Factors independently associated with DCI were pneumonia (OR 3.10 [1.41-7.06]; p=0.006) and non-pneumonia infectious events (OR 2.50 [1.20-5.39]; p=0.016). [Table 1 (abstract P023): Correlation of safety and efficacy markers of thrombolysis and thrombolysis time with distance from stroke centre; results expressed as odds ratios with 95% confidence intervals.] Interestingly, severity of initial bleeding evaluated by the Fisher scale was not independently associated with DCI. DCI is independently associated with the occurrence of pneumonia or other causes of sepsis.
These results may highlight the need for a rigorous approach to prevention protocols and to early diagnosis and treatment of hospital-acquired infectious diseases in SAH patients.

Introduction: Traumatic brain injury (TBI) can have devastating neurological, psychological and social sequelae. Increased psychiatric morbidity after TBI has been shown in both the adult and the pediatric population. Critical illness as such is also a risk factor for psychiatric problems in youth. Our aim was to assess risk factors for later being prescribed psychiatric medication in survivors of intensive care unit (ICU)-treated pediatric TBI. We used the Finnish Intensive Care Consortium (FICC) database to identify patients 5-17 years of age treated for TBI in four ICUs in Finland during the years 2003-2013. We examined electronic health records and CT scans and collected data on drug prescriptions after discharge. We used multivariable logistic regression models to find statistically significant risk factors for psychiatric drug reimbursement. We identified 248 patients, of whom 46 (19%) received a psychiatric drug prescription during follow-up. The median time to prescription was 14 months after TBI (interquartile range [IQR] 5-31 months). 33 patients received antidepressants, 9 received stimulants and 18 received antipsychotics. Increasing age showed a positive association with all drug prescriptions except stimulants, where an inverse relationship was observed (Table 1). Using multivariable analyses, we could not find any admission- or treatment-related factors significantly associated with being prescribed psychiatric medication. Teenage survivors with moderate disability (Glasgow Outcome Scale [GOS] 4) showed high rates of psychotropic drug utilization (45% received any medication, 36% received antidepressants, 24% received antipsychotics). Our data suggest that the risk of psychotropic drug prescription after TBI depends on factors other than those related to injury severity or treatment measures. The incidence of drug prescription is especially high in patients with moderate disability.

The effects of 1-adamantylethyloxy-3-morpholino-2-propanol hydrochloride on the formation of steroid neurotoxicity in rats with brain injury. A. Semenenko 1, S. Semenenko 2, A. Solomonchuk 3, N. Semenenko 3. Depending on the nature of the brain injury and the severity of the victims, mortality in traumatic brain injury (TBI) ranges from 5 to 65% [1]. One of the targets for pathogenetic influence on the course of TBI is the use of pharmacological agents that are able to counteract the negative effects of excess concentrations of glucocorticoids on the brain. The therapeutic effect of the new pharmacological derivative 1-adamantylethyloxy-3-morpholino-2-propanol hydrochloride (ademol) in rats with TBI was evaluated over 8 days. The pseudo-operated animals and the control group received 0.9% NaCl solution, and the comparison group received amantadine sulfate. Cortisol levels were used to determine the efficacy of the test drugs in TBI. In rats treated with ademol, the level of cortisol in the blood ranged from 179 to 188 ng/ml (P5-P95) and was 2.58-fold lower (p<0.05) compared to the control pathology group on day 8 of therapy. In contrast, the effect of amantadine sulfate on the level of cortisol in the blood was significantly smaller than that of ademol.
The concentration of cortisol in the blood of rats given amantadine sulfate ranged from 271 to 280 ng/ml (P5-P95), was 1.73 times lower (p<0.05) than in the control pathology group, and exceeded the corresponding value in animals treated with ademol by 49.2% (p<0.05). Therapeutic treatment of rats with severe TBI with ademol solution protected the brain from the formation of cortisol-mediated steroid neurotoxicity better than 0.9% NaCl or amantadine sulfate (p<0.05).

Although cerebrovascular pressure reactivity (PRx) correlates well with patient outcome [1], its determination requires continuous monitoring and moving-average calculation. We therefore hypothesized that a simplified model of the covariation between mean arterial pressure (MAP) and intracranial pressure (ICP) over the first three days of admission would be able to predict patient outcome; we call this new parameter the Cerebrovascular Pressure Correlation index (CPC). We performed a retrospective observational study of all adult patients with severe TBI admitted to the ICU from January 2017 to April 2018 inclusive. All consecutive patients with a clinical need for ICP monitoring were included for analysis. For both ICP and MAP, data were mean values over 2-hour registration periods, for a total of 12 observations/day; CPC was therefore calculated as the Pearson correlation coefficient between ICP values (x axis) and MAP values (y axis), yielding one single value every 24 hours. Variables included in the model (i.e. CPC, CPP, ICP, systemic glucose, arterial lactate, PaCO2, and internal body temperature) were collected for the first 3 days after trauma. For the main outcome, only the minimum value of CPC fit the regression analysis (p=0.004). The corresponding ROC curve showed an AUC of 0.80. The associated Youden criterion was ≤0.26 (sensitivity 0.90; specificity 0.68). Of all the variables considered for the secondary outcome, only CPCmin fit the regression model (p=0.03). Table 1 reports the median and IQR for SG and NSG for all the variables considered in the model. This observational study suggests that CPC, a simplified model of the covariation between MAP and ICP over the first three days of admission, could predict patient outcome.

Introduction: Impaired cerebrovascular reactivity (CAR) after traumatic brain injury (TBI) is a marker of disease severity and poor outcome. It is unclear how dynamic changes in body temperature and fever impact CAR and outcome. We calculated the pressure reactivity index (PRx), using the CENTER-TBI high-resolution intensive care unit cohort, as a moving correlation coefficient between intracranial pressure (ICP) and mean arterial pressure (MAP) (a code sketch of this type of calculation appears below). Minute and hourly values of PRx and temperature were averaged in patients with simultaneous recording of ICP and ABP. Demographic data were based on the Core Registry (V2.0). Linear mixed models were calculated based on minute-by-minute data using R with lme4 V1.1-21 and ggeffects V0.9.0. Generalized estimating equation models were used to analyze changes during effervescence (increase of temperature of >1°C within 3 hours). We assessed high-frequency physiological data during 567 days in 102 patients admitted to the ICU with predominantly a closed injury type (n=94/102). Median age was 46 years (IQR 29-62), baseline GCS was 6 (IQR 3-9), and 27% had at least one unreactive pupil. The main measurement site for temperature was the urinary bladder (55/102, 54%).
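A hedged sketch of the two ICP-MAP correlation indices described above: the CPC (one Pearson coefficient per day over 2-hour means) follows the definition in the CPC abstract, while the PRx-style moving window (short-interval means, a ~30-sample correlation window) is a typical literature choice and an assumption here. The input signals are synthetic.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
t = pd.date_range("2020-01-01", periods=24 * 60, freq="min")  # one day, 1-min samples
icp = pd.Series(12 + rng.normal(0, 2, len(t)), index=t)        # ICP (mmHg)
# "map_" avoids shadowing the Python builtin; MAP loosely tracks ICP here
map_ = pd.Series(80 + 0.8 * (icp - 12) + rng.normal(0, 3, len(t)), index=t)

# CPC: one Pearson r per 24 h computed over 2-hour mean values (12 points/day)
icp_2h, map_2h = icp.resample("2h").mean(), map_.resample("2h").mean()
cpc = icp_2h.corr(map_2h)

# PRx-style index: moving correlation over consecutive short-window averages
icp_s, map_s = icp.rolling(5).mean(), map_.rolling(5).mean()   # stand-in for 10-s means
prx = icp_s.rolling(30).corr(map_s)                            # ~30-sample moving window

print(f"CPC (one value per day): {cpc:.2f}")
print(f"mean PRx over the day:   {prx.mean():.2f}")
```

Values near zero or negative suggest preserved reactivity; sustained positive values indicate impaired pressure reactivity, which is what both abstracts relate to outcome.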
Half of the patients (49/102) developed fever (>1 h with mean T ≥38.3°C), with a total of 834 h of fever and a median of 9 h of fever (IQR 4-24) per patient. Of 110 effervescence episodes, 30 (27%) reached the febrile threshold of 38.3°C, which was associated with an increase in PRx from 0.09 (±SD 0.25) at baseline (2 h before) to 0.26 (±SD 0.3) during the febrile peak (p=0.014) (Figure 1A). Linear mixed models showed a quadratic relationship between PRx and temperature (p<0.001), with an increase in predicted PRx at febrile and hypothermic temperatures (Figure 1B). The association of increasing body temperature with worsening of CAR supports prevention of fever in severe TBI. Prospective studies are needed to further differentiate between the mechanisms involved (i.e. inflammation) and central autonomic dysregulation.

[Fig. 1 (abstract P030): Patients with a good 6-month outcome (GOSE>3) after severe traumatic brain injury showed an increase in root mean square of successive differences between normal heartbeats (RMSSD), compared to baseline 30 minutes before tracheal suctioning.]

Acute kidney injury (AKI) is relatively common in patients with severe traumatic brain injury (sTBI) and can contribute to morbidity and mortality [1]. NephroCheck is a point-of-care urine test that flags two biomarkers indicating whether a critically ill patient is at risk for AKI. We investigated the incidence of subclinical AKI in patients with sTBI. We performed a prospective observational study of all adult patients with severe TBI admitted to the ICU from January 2017 to April 2018 inclusive. All consecutive patients with a clinical need for ICP monitoring were included for analysis. Urine samples were collected at ICU admission from 33 patients; the NephroCheck (NC) test ([TIMP-2]x[IGFBP7]) was performed using the NephroCheck® Astute140™ meter. Serum creatinine was collected at admission, during the first three days, at ICU discharge and at 60-day follow-up to assess renal recovery. The diagnosis of AKI was based on KDIGO criteria. Hemodynamics, electrolytes, PEEP, P/F, kind of fluid administered, fluid balance, % fluid overload, length of stay, Sequential Organ Failure Assessment score, injury severity scores and mortality were collected. A total of 15 patients (45%) presented higher NC values at ICU admission. One patient with a positive NC value experienced AKI at 24 hrs. The positive NC group received more plasma transfusions (p=0.03) and had a lower median hematocrit at 24 hrs (p=0.013), but similar hospital length of stay (p=0.17) and mortality rate (p=0.80). Conclusions: NC at ICU admission identifies subclinical AKI in TBI patients and might be used to predict clinical AKI. Hemodilution (but not fluid overload) seems to be associated with the development of subclinical AKI. Higher NC at ICU admission is not associated with worse long-term outcome in TBI patients.

Severe traumatic brain injury (TBI) is considered a serious public health problem in Europe. Partly because of the heterogeneity of TBI, considerable uncertainty may exist in the expected outcome of patients. The International Mission for Prognosis and Analysis of Clinical Trials in TBI (IMPACT) and the Corticosteroid Randomization After Significant Head Injury (CRASH) prediction models are considered the most widely validated prognostic models [1, 2]. However, studies using these prediction models for benchmarking of outcomes have been scarce.
We aimed to compare actual outcomes in a cohort of critically ill TBI patients with predicted outcomes, in a quality-of-care initiative at an academic hospital. In this retrospective cohort study, we included TBI patients consecutively admitted to the adult ICU of Erasmus MC, University Medical Center, Rotterdam, The Netherlands, between January 2018 and February 2019. We included 87 patients with TBI. 14-day mortality was 25%, six-month mortality was 36% and six-month unfavourable outcome was 50%. The IMPACT core+CT+lab model predicted 34% 6-month mortality (vs 35% actual, p=0.89) and 51% unfavourable outcome (vs 50% actual, p=0.9). The 14-day mortality prediction by the CRASH prognosis calculator was 43% versus an actual 14-day mortality of only 25% (p=0.01), whereas the 6-month unfavourable outcome prediction by CRASH was 67% (vs 50% actual, p=0.02) (Figure 1). The IMPACT model, although developed more than a decade ago, seemed appropriate for benchmarking purposes in this single-center cohort in the Netherlands, while CRASH predictions were less applicable to our setting.

Introduction: Out-of-hospital cardiac arrest (OHCA) continues to be associated with significant mortality and morbidity. Centralisation of care has considerably improved patient survival but has resulted in increased morbidity in the form of neurological deficit. Accurate neurological prognostication remains challenging, incorporating repeated clinical examination and ancillary investigations [1, 2]. Data were collected retrospectively and analysed for 96 patients admitted post OHCA from October 2018 to October 2019. Patient arrest demographics were collected in conjunction with extensive inpatient investigation findings including CT, traditional pupil assessment, pupillometry and EEG. Results: 50% of patients survived to hospital discharge. Patients presenting in a shockable rhythm continue to have higher survival rates (Table 1). 53% of patients who received immediate CPR survived to hospital discharge, in comparison to 41% of patients who did not receive immediate CPR. 73% of patients underwent non-contrast CT head. 74% of patients had traditional pupillary examination performed on arrival. Pupillometry was introduced in December 2018; 31 out of a possible 85 patients had pupillometry during their inpatient stay. EEG was undertaken in 11% of cases. Our data show that receiving immediate CPR and presenting with a shockable rhythm remain positive prognostic factors. CT head as a stand-alone prognostic modality is unreliable: 14% of patients who survived to discharge with intact neurology had an admission CT head reported as hypoxic brain injury. A new neuroprognostication strategy is required in our unit that adds further certainty to the likely clinical outcome. This includes increased use of tests such as EEG and pupillometry and the introduction of biomarkers such as neuron-specific enolase, somatosensory evoked potential testing and magnetic resonance imaging.

Introduction: Post-resuscitation care of patients following an out-of-hospital cardiac arrest (OOHCA) is set out by the UK Resuscitation Council [1], in line with the European Resuscitation Council guideline [2]. The aim of this audit was to review compliance with this guideline at the intensive care unit of the Bristol Royal Infirmary (BRI). A retrospective audit was performed over a six-month period in adults who were admitted to the intensive care unit at the BRI following an OOHCA and who later died during that admission (41 patients).
The focus was on whether the neuroprognostication and end-of-life (EOL) care received met the standards set by the UK Resuscitation Council. The main neurological examinations documented were pupillary reflex (100%), corneal reflex (75%) and motor response to pain (100%). 61.5% of patients received SSEP analysis >72 hours post-ROSC, 81.5% underwent an EEG and 66.7% had >2 serum neuron-specific enolase measurements recorded. All patients (100%) underwent a CT head during their admission. 5.6% of patients were referred to palliative care during their admission. 22% of patients were prescribed all EOL medications. The most common prescriptions included alfentanil (90.2%) and midazolam (58.5%). Finally, 100% of appropriate patients were referred as potential organ donors. The audit reflected our local practice and showed that some parameters were not being maintained as set by the UK Resuscitation guideline.

Introduction: The prognostication of neurological outcome in comatose out-of-hospital cardiac arrest (OHCA) patients is an integral part of post-cardiac-arrest care. Biochemical biomarkers released from cerebral cells after hypoxic-ischemic injury represent potential tools to increase accuracy in predicting outcome after OHCA. Currently, only neuron-specific enolase (NSE) is recommended in European prognostication guidelines. In this study, we present the release dynamics of GFAP and UCH-L1 after OHCA and evaluate their prognostic performance for long-term neurological outcome in OHCA patients. Serum GFAP and UCH-L1 were collected at 24, 48 and 72 h after OHCA. The primary outcome was neurological function at 6-month follow-up assessed by the cerebral performance category scale (CPC), dichotomized into good (CPC 1-2) and poor (CPC 3-5). Prognostic performance was investigated with receiver operating characteristics (ROC) by calculating the area under the receiver operating curve (AUROC) and compared to NSE (a minimal sketch of this type of analysis appears below). Results: 717 of 819 included patients had at least one serum GFAP or UCH-L1 value at 24, 48 or 72 h after OHCA. GFAP and UCH-L1 levels were significantly elevated in patients with poor outcome. GFAP and UCH-L1 discriminated excellently between good and poor neurological outcome at all time-points (AUROC GFAP 0.88-0.89; UCH-L1 0.86-0.87), and the overall predictive performance measured by AUROC of GFAP and UCH-L1 was superior to NSE (AUROC 0.76-0.85) (Figure 1). However, the ROC curves at the highest specificities of UCH-L1 and GFAP overlap those of NSE, and comparing the sensitivities of UCH-L1 and GFAP with those of NSE at the highest specificities (>95%) revealed higher sensitivities for NSE at 48 and 72 h. GFAP and UCH-L1 predict poor neurological outcome in patients after OHCA excellently and with a higher overall accuracy than NSE, but both biomarkers perform worse than NSE at specificities over 95% at 48 and 72 h, limiting their clinical use to guide decisions on prognosis.

Blood pressure after cardiac arrest and severity of hypoxic-ischemic encephalopathy. C Endisch 1, S Preuß 2, C Storm 3. Introduction: Blood pressure management in post-cardiac-arrest (CA) patients ensures sufficient cerebral perfusion to avoid secondary brain injury.

Local chain-of-survival improvements affect P-OHCA survival [1-5]. Also, initial rhythm in P-OHCA is an important predictor of survival [1, 4]. Little is known about the relationship between initial rhythm in P-OHCA and long-term outcome [6-8].
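A minimal sketch of the AUROC and specificity-anchored comparisons used in the biomarker abstract above; the outcome labels and biomarker values are synthetic, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
poor_outcome = rng.integers(0, 2, size=200)            # 1 = poor outcome (CPC 3-5)
biomarker = rng.normal(1.0 + 1.5 * poor_outcome, 1.0)  # higher values when outcome is poor

# Overall discrimination, as compared between GFAP/UCH-L1 and NSE in the abstract
auc = roc_auc_score(poor_outcome, biomarker)

# Youden-type optimal cutoff: maximize J = sensitivity + specificity - 1 = tpr - fpr
fpr, tpr, thresholds = roc_curve(poor_outcome, biomarker)
best = np.argmax(tpr - fpr)
print(f"AUROC = {auc:.2f}; Youden-optimal cutoff = {thresholds[best]:.2f} "
      f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")

# Guideline-style use instead fixes a high specificity (e.g. >95%) and reads
# off the corresponding sensitivity, as the abstract does at 48 and 72 h
spec_mask = (1 - fpr) > 0.95
print(f"sensitivity at spec > 95%: {tpr[spec_mask].max():.2f}")
```

The two print-outs illustrate why a marker can win on overall AUROC yet lose at the >95% specificity operating point used for prognostic decisions.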
Our aim was to establish the relation between shockable rhythm and favorable long-term outcome in P-OHCA. All children aged 1 day to 18 years who experienced non-traumatic OHCA between 2002 and 2017 and were admitted to the Sophia Children's Hospital in Rotterdam were included. Long-term outcome was determined using the Pediatric Cerebral Performance Category (PCPC) score at the longest available follow-up interval. The primary outcome measure was survival with favorable neurologic outcome, defined as PCPC 1-2 or no difference between pre- and post-arrest PCPC. The association between shockable rhythm and the primary outcome measure was calculated in a multivariable regression model, adjusted for the pre-defined variables. Of the 329 patients included in the 16-year study period, 126 (38%) survived to hospital discharge, of whom 99 patients (30%) had favorable neurologic outcome (median follow-up duration 24 months). The rate of favorable neurologic outcome rose from 17% in 2002 to 52% in 2017 (p<0.001 for trend) (Fig. 1). The odds of favorable neurologic outcome at the longest follow-up duration were significantly higher after a shockable initial and unknown rhythm. Secondly, trend analysis showed an increase in AED defibrillation and shorter CPR duration. This was followed, finally, by a rise in ROSC, survival to hospital discharge and favorable neurologic outcome rate.

Introduction: Low socioeconomic status is associated with worse outcome after cardiac arrest. This study aimed to investigate whether patients' socioeconomic status impacts the chance of receiving early coronary angiography after cardiac arrest. In this nationwide retrospective cohort study, 4011 patients admitted alive after out-of-hospital cardiac arrest (OHCA) and registered in the Swedish Registry for Cardiopulmonary Resuscitation were included. Individual data on income and educational level, prehospital parameters, coronary angiography results and comorbidity were linked from other national registers. In the unadjusted model there was a strong correlation between income level and rate of early coronary angiography: 35% of patients in the highest income quartile received early angiography compared to 15% in the lowest income quartile. When adjusting for confounders (educational level, sex, age, comorbidity and hospital type) there was still a higher chance of receiving early coronary angiography with increasing income, OR 1.31 (CI 1.01-1.68) and 1.67 (CI 1.29-2.16) for the two highest income quartiles respectively, compared to the lowest income quartile. When adding potential mediators to the model (initial rhythm, location, response time, bystander cardiopulmonary resuscitation and whether the arrest was witnessed), no difference in early angiography related to income level was found. The main mediator was initial rhythm (Figure 1). Higher income is strongly related to the rate of early coronary angiography after OHCA. This finding is consistent when adjusting for known confounders. However, the association between income and early angiography seems to be mediated by initial rhythm. Patients with low income more often present with non-shockable rhythms, which lowers the likelihood of undergoing early coronary angiography.

[Fig. 1: A. Total mortality as a stacked bar: in light red, the number of patients who died at the scene; in green, the number who died during admission; in red, patients who died after discharge. The grey line is the total number of inclusions.
B. The rate of bystander AED use, rate of initial shockable rhythm, rate of less than 15 minutes of CPR, and rate of favorable neurologic outcome over time. P for trend significant for bystander AED use, less than 15 minutes of CPR, and favorable neurologic outcome. Trend analysis performed using binary logistic regression for dichotomous data (and a Kruskal-Wallis test for non-normally distributed continuous data).]

Effect of simulation teaching of cardiopulmonary resuscitation for nursing. V Spatenkova 1. Introduction: Simulation teaching is a modern type of critical care (CC) education. The aim of this study was to assess the effect of simulation teaching of CC by comparing final-examination performance at different model levels of cardiopulmonary resuscitation (CPR) after the first (CC1) and the third, final course (CC3). The success rate of CPR was tested in a prospective study (2017-2018) on two groups with a total of 66 students in CC1 and CC3 at the Faculty of Health Studies. Three semesters of undergraduate nursing simulation education (lectures and training) used the Laerdal SimMan 3G. Quality of CPR was evaluated according to 4 parameters: compression depth, compression rate, chest release and time of correct frequency. We tested whether CPR quality differed between the two groups (a sketch of this test battery appears below). For the compression depth and compression rate parameters, conformity of variance was first verified and then a two-sample t-test applied. As chest release and time of correct frequency are recorded as percentages, the Wilcoxon rank-sum test was conducted for these parameters. To ensure good resuscitation, all recorded parameters must be properly performed during resuscitation; thus, pivot tables were used to generate statistics and test whether the number of correctly performed resuscitation parameters differed between CC1 and CC3. The compression depth parameter was statistically significantly higher for CC3 than for CC1 (p=0.016). There were no differences in compression rate (p=0.210), chest release (p=0.514) or time of correct frequency (p=0.586). It was also tested how many of the parameters were performed correctly by students at CPR. The chi-square test shows that the relative frequency of CPR success is higher for the CC3 group than for the CC1 group. At least 3 out of 4 parameters were correctly performed by 13% of CC1 students compared to 28% of CC3 students. The study showed a significant improvement of CPR in the final CC3 and supports the three-semester simulation education.

Changes in blood gases during intraoperative cardiac arrest. JJ Wang, R Borgstedt, S Rehberg, G Jansen. Protestant Hospital of the Bethel Foundation, Anaesthesiology, Intensive Care and Emergency Medicine, Transfusion Medicine and Pain Therapy, Bielefeld, Germany. Critical Care 2020, 24(Suppl 1):P049. Introduction: Blood gas analysis (BGA) is a common approach for monitoring homeostasis during surgery. While it is well known that cardiac arrest (CA) leads to circulatory collapse and disturbances in homeostasis, little is known about changes in blood gases during peri-operative CA. We retrospectively analysed patients ≥18 years who suffered peri-operative CA during non-cardiac surgery from 01/2014 to 12/2018. Peri-operative CA was defined as the need for cardiac compression during anaesthesia care. Collected data included pH, PaCO2, PaO2, return of spontaneous circulation (ROSC) and 30-day mortality after CA.
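The sketch referenced in the simulation-teaching abstract above: a variance check plus two-sample t-test for depth/rate, Wilcoxon rank-sum (Mann-Whitney U) for the percentage-scaled parameters, and a chi-square test for counts of correctly performed parameters. All data are synthetic; the 2x2 counts approximate the reported 13% vs 28% assuming 33 students per group (the split is an assumption).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
depth_cc1 = rng.normal(48, 6, 33)   # compression depth (mm), first course
depth_cc3 = rng.normal(52, 6, 33)   # final course

# Verify equality of variances before choosing the two-sample t-test variant
_, p_var = stats.levene(depth_cc1, depth_cc3)
t, p_depth = stats.ttest_ind(depth_cc1, depth_cc3, equal_var=p_var > 0.05)

release_cc1 = rng.uniform(60, 100, 33)  # chest release (%), percentage-scaled
release_cc3 = rng.uniform(60, 100, 33)
u, p_release = stats.mannwhitneyu(release_cc1, release_cc3)

# >=3 of 4 parameters correct: ~13% of CC1 vs ~28% of CC3 (from the abstract)
table = np.array([[4, 29],   # CC1: >=3 correct, <3 correct
                  [9, 24]])  # CC3
chi2, p_chi, _, _ = stats.chi2_contingency(table)

print(f"depth: p={p_depth:.3f}; release: p={p_release:.3f}; chi-square: p={p_chi:.3f}")
```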
Within the study period, we observed 56 peri-operative CA (m=35, f=21; age 69±16) during 62,742 anaesthesia procedures. ROSC occurred in 38 patients (68%). 30 days after CA, mortality was 45% (n=17), 23% (n=13) had been discharged, and 14% (n=8) were still in hospital. 87% (n=49) of CA patients had invasive blood pressure monitoring, 52% (n=29) had BGA before and 66% (n=37) during peri-operative CA. Prior to CA, the average values were: pH 7.3±0.1, PaCO2 39±8 and PaO2 225±107. During CA, the average values were pH 7.2±0.2, PaCO2 50±17 and PaO2 215±138. Table 1 shows the distributions of blood gases before and during CA. There were no statistical differences between the groups (pH: p=0.4; PaCO2: p=0.19; PaO2: p=0.21). Hypercapnia and respiratory acidosis are common in peri-operative CA. These data suggest inadequate ventilation during peri-operative resuscitation. Further studies should focus on its impact on the outcome.

Comparing cases with and without ROSC, significantly more diagnostics were done in the group without ROSC, but more therapeutic consequences were seen in the ROSC group (Table 1). ICU-CA is frequent. Diagnostics to detect reversible causes of CA were used rarely in ICU-CA (44%), even in patients without ROSC. Notably, diagnostics often had therapeutic consequences, particularly after ROSC. Further studies are required to define standardized diagnostic algorithms during ICU-CA.

Continuous monitoring of cardiac patients on general wards improved short-term survival of in-hospital cardiac arrest. UJ Go 1. Introduction: The importance of early detection of in-hospital cardiac arrest (IHCA) is emphasized. Previous studies have reported that clinical outcomes are improved if IHCA is witnessed, or if a patient is admitted to a monitored location [1, 2]. This study aimed to evaluate the association between continuous monitoring and survival of IHCA on general wards. A retrospective cohort study of IHCA in patients admitted to wards at an academic tertiary care hospital between January 2009 and December 2018 was performed. The primary outcome was return of spontaneous circulation (ROSC). The secondary outcomes were 72-hour survival and survival to hospital discharge (Table 1). Cardiac patients with continuous monitoring on general wards showed improved ROSC and 72-hour survival, but not survival to hospital discharge, in IHCA.

In-hospital cardiac arrest is associated with poor outcomes. Although steroids are frequently used in patients with septic shock, it is unclear whether they are beneficial during cardiac arrest and after return of spontaneous circulation (ROSC). Of 369 cardiac arrest patients evaluated, 100 were enrolled. Advanced life support was conducted according to the 2015 resuscitation guidelines. Forty-six patients were randomly assigned to receive methylprednisolone 40 mg during resuscitation, and 54 to receive saline (placebo). After resuscitation, steroid-treated patients received hydrocortisone 240 mg daily for up to 7 days, followed by tapering. There was no significant difference between the two groups in ScvO2 and all the secondary outcomes (p>0.05 for all comparisons). The present study found no significant physiologic benefit of corticosteroid administration during and after resuscitation in hospitalized patients with cardiac arrest.

The experiences of EMS providers taking part in a large randomized trial of airway management during out-of-hospital cardiac arrest, and the impact on their views and practice.
Results of a survey and telephone interviews. M Thomas 1. Introduction: The aim was to explore EMS experiences of participating in a large trial of airway management during out-of-hospital cardiac arrest (AIRWAYS-2), specifically: 1. Any changes in views and practice as a result of trial participation. 2. Experiences of trial training. 3. Experiences of enrolling critically unwell patients without consent. 4. Barriers and facilitators for out-of-hospital trial participation. An online questionnaire was distributed to 1523 EMS providers who participated in the trial. In-depth telephone interviews explored the responses to the online questionnaire. Quantitative data were collated and presented using simple descriptive statistics. Qualitative data collected during the online survey were analysed using content analysis. An Interpretive Phenomenological Analysis approach was used for analysis of qualitative interview data. Results: Responses to the online questionnaire were received from 33% of AIRWAYS-2 study paramedics, and 19 study paramedics were interviewed. Paramedics described barriers and facilitators to trial participation and changes in their views and practice. The results are presented in five distinct themes: research process; changes in views and practice regarding airway management; engagement with research; professional identity; professional competence. Conclusions: Participation in the AIRWAYS-2 trial was enjoyable and EMS providers valued the training and study support. There was enhanced confidence in airway management as a result of taking part in the trial. Study paramedics expressed a preference for the method of airway management to which they had been randomized. There was support for the stepwise approach to airway management, but also concern regarding the potential to lose tracheal intubation from 'standard' paramedic practice.

Causes of medical care-associated cardiac arrest on the intensive care unit. S Entz 1. Introduction: Cardiac arrest on the intensive care unit (ICUCA) following therapeutic interventions is of imminent importance, because the interventions are comparatively predictable and precautions can potentially be taken. This study investigates medical care-associated complications that led to ICUCA. The intensive care database was screened for patients ≥18 years who experienced ICUCA in a tertiary hospital with five ICUs (two medical, two surgical, one interdisciplinary; a total of 71 ICU beds) in Germany from 2016-2018. ICUCA was defined as receiving chest compressions and/or defibrillation after admission to the ICU, and classified as "medical care associated" if it was preceded by a therapeutic intervention (i.e. induced by medication, bedding procedures, iatrogenic injuries, or procedure-associated). Subgroups included patients with return of spontaneous circulation (ROSC) vs no ROSC, and patients with vs without vasopressor therapy before the intervention. There were 125 ICUCA in 114 patients among a total of 14,264 ICU patients. Medical care-associated complications leading to ICUCA were detected in 28 cases (22%) [incidence 19.6/10,000 (CI95 12.3-26.9)]. ICUCA following therapeutic interventions occurred because of circulatory insufficiency [n=20 (70%)], respiratory failure [n=5 (17%)] and airway-associated problems [n=3 (10%)]. Nine of the 28 patients (32%) with care-associated ICUCA died. Table 1 shows the therapeutic interventions followed by ICUCA. Care-associated complications were common causes of ICUCA.
Most events were induced by circulatory insufficiency due to induction of anaesthesia and bedding procedures. Further investigations should focus on preventive strategies, such as vasopressor infusion before therapeutic interventions.

In-hospital cardiac arrest (IHCA) is a lethal event; however, it has received less attention than out-of-hospital cardiac arrest (OHCA), and there is a lack of information on the evidence and clinical features of IHCA compared with OHCA. We therefore conducted this study to clarify important aspects of the epidemiology and prognosis of IHCA in patients with code blue activation. We carried out a retrospective observational study of patients with code blue events in our hospital during the period from January 2010 to October 2019. We obtained information on patient characteristics including age and gender; IHCA characteristics including the time of cardiac arrest, whether the event was witnessed, presence of bystander cardiopulmonary resuscitation (CPR), initial shockable rhythm, and vital signs 1 h or 6 h before cardiac arrest; survival to hospital discharge (SHD); and the cardiac arrest survival post-resuscitation in-hospital (CASPRI) score. The primary endpoint was SHD. We performed univariate and multivariate logistic regression analyses. A total of 293 code blue events were activated during the study period; 81 patients were finally included in this study. Overall, the SHD rate was 28.4%. The median duration of CPR was 14 min (interquartile range, 6-28 min). The rate of initial shockable rhythm was 19.8%. There were significant differences in CPR duration, shockable rhythm, and CASPRI score between the SHD group and the non-SHD group by univariate logistic regression analysis. The CASPRI score was found to be the most effective predictive factor for SHD (OR=0.98, p=0.006) by multivariate logistic regression analysis. Our results demonstrate that the CASPRI score is associated with SHD in CPA patients with in-hospital code blue events. The CASPRI score in IHCA patients could be a simple and useful adjunctive tool for management of post-cardiac arrest syndrome (PCAS).

Peri-operative cardiac arrest in prematurity - incidence and causes at a tertiary care hospital between 2008-2018. G Jansen, J Popp, E Lang, R Borgstedt, B Schmidt, S Rehberg. Protestant Hospital of the Bethel Foundation, Anaesthesiology, Intensive Care and Emergency Medicine, Bielefeld, Germany. Critical Care 2020, 24(Suppl 1):P058. The peri-operative care of premature pediatric patients requires special expertise and is therefore reserved for specialized centers. Although premature birth is described as a risk factor for peri-operative complications and cardiac arrest (POCA), there are no data on its incidence and causes in this particular population [1]. The present study investigates the incidence and causes of pediatric POCA at a tertiary care hospital and level I perinatal center in Germany. In the anesthesia database of the study center, all anaesthesiological procedures in patients <16 years of age between 2008 and 2018 were examined for POCA in preterm infants (gestational age <40 weeks). The peri-operative period was defined as from the beginning of anesthesiological care up to 60 minutes after anesthesia and/or sedation. We defined cardiac arrest as the necessity of chest compressions. The peri-operative phase and cause of the POCA, gestational age and birth weight were recorded.
Between 2008 and 2018, 308 (1.3%) of the 22,650 pediatric anesthesiological procedures were performed on 301 premature infants. In total, 10 POCA occurred in 9 of these patients (F=6, M=3; average gestational age 208±27 days; average birth weight 1510±747 g) (incidence 3.2%, CI95 1.3-5.2%). The time of occurrence and the causes of POCA are shown in Table 1. POCA in premature babies is rare, with an incidence of 3.2%, which is significantly higher than in non-premature babies. The main causes are problems or complications associated with the respiratory tract and its management, as well as massive hemorrhage.

Introduction: Peri-operative cardiac arrest (POCA) in children's anesthesia care is a dreaded event. Depending on the country and population, studies describe incidences between 2.9 and 20.6 per 10,000 children's anesthetics. There are no data on the current incidence of pediatric POCA in Germany. The present study investigates the incidence of POCA at a tertiary hospital and level I perinatal center in Germany. In the anesthesia database of the study center, all anaesthesiological procedures in patients <16 years were examined for POCA. The peri-operative period was defined as the interval from the beginning of anesthesia care up to 60 minutes after anesthesia or sedation. Cardiac arrest was defined as the necessity of chest compressions. Age, weight, ASA status, cause of death and survival after 30 days were recorded. Results: 18 POCA (median weight 2525 g [Q25 7151; Q75 14748]) were observed in 22,650 anaesthesiological procedures (incidence 7.9±4.2 per 10,000 [CI95 4.3-11.6]). Table 1 shows the distribution of the individual age groups, incidences and mortalities of POCA. Peri-operative 30-day mortality was 3 per 10,000 [CI95 1-5]. Three children died intraoperatively as a result of hemorrhagic shock, and one died on the PICU as a result of malignant hyperthermia. Thirty days after POCA, 4 more children had died on the ICU due to their underlying disease. POCA is a rare event. Risk factors are an age <28 days and an ASA status ≥ III. The main cause of peri-operative death in patients <16 years of age is massive hemorrhage; the 30-day mortality is determined by the underlying disease.

In-hospital cardiac arrest - predicting adverse outcomes
T Partington, J Borkowski, J Gross
Northwick Park Hospital, Anaesthesia/Critical Care, London, United Kingdom
Critical Care 2020, 24(Suppl 1):P060
Introduction: Cardiac arrest occurs in 1.2 per 1000 hospital admissions in the UK. Return of spontaneous circulation (ROSC) is achieved in approximately half of resuscitation attempts, but the rate of survival to hospital discharge is substantially lower [1]. In our centre, post-arrest care accounts for 6.4% of ICU admissions. Premorbid social function is purported to affect outcomes, but comorbidity scores are more often used for risk stratification. Using a novel Social Function Score alongside an existing comorbidity scale, we aimed to identify trends to inform management of patients at risk of deterioration. A six-month prospective observational study was conducted in a major UK hospital from October 2017 to April 2018. For all adult inpatient cardiac arrests, medical notes were reviewed and data collected on the following domains: patient demographics; comorbidities and functional status; admission details; and post-arrest events. Statistical analysis was performed using Student's unpaired t-test. Results: 54 cardiac arrests occurred. 85% were in medical patients, with the majority male (63%) and aged over 75 (63%).
89% were emergency admissions, with a mean duration of hospital stay pre-arrest of 9 days. In 17 cases (31%) sustained ROSC was achieved. However, seven of these (41%) were not subsequently admitted to the ICU. Only six patients (11%) survived to hospital discharge. Pre-admission function and comorbidity were worse in patients who did not survive to discharge (Fig. 1), but these differences were not statistically significant in view of the small survivor group size. In an increasingly frail inpatient population, a substantial proportion of patients in whom circulation is restored after cardiac arrest are subsequently considered unsuitable for ICU admission. Given our understanding of inferior outcomes in patients with poor physiological reserve, we encourage early discussion regarding the appropriateness of CPR in selected patients, guided by social function and comorbidity.
References: 1. National Cardiac Arrest Audit 2017/18

Introduction: There are studies that determine events related to poor outcome in cardiac arrest [1]. In our study, the following parameters were determined in OHCA patients: median age; origin (Asian/European/Syrian); bystander CPR; bystander AED; EMS defibrillation; initial cardiac rhythm; prehospital ROSC; corneal and pupillary light reflexes; and day survival. We determined poor prognostic signs in post-cardiac arrest patients. In this study, we identified the causes of poor outcome in patients with OHCA. This was a single-centre, retrospective study. We determined incidence and epidemiological factors including demographics and initial cardiac rhythm. Our study population was non-traumatic OHCA. In our ICU, all OHCA patients were evaluated with ECHO, and fluids, inotropes and vasopressors were added according to cardiac performance. Results: During our study, 5970 patients who were admitted to the intensive care unit between 2012-2019 were screened. 133 of these patients were out-of-hospital arrests and 41 of them were in-hospital arrests. Development of cerebral oedema during treatment in hospital remains a poor prognostic sign. The evaluation of the initial cardiac rhythm is useful to predict neurological outcome in post-cardiac arrest patients. Survival after OHCA remains low. The evaluation of the initial cardiac rhythm is useful to predict mortality and neurological outcome in post-cardiac arrest patients.

Basic life support (BLS) education and training for school children is active in Japan. However, BLS action by schoolchildren may be limited by school rules. This study aimed to analyse the time factors for basic life support performance and outcome in classmate-witnessed out-of-hospital cardiac arrest (OHCA) and to investigate how schoolchildren act when they detect OHCA. Methods: A nationwide database of 1,068 schoolchildren cases with OHCA and a local extended database of 5,478 EMS-unwitnessed OHCAs, both of which were prospectively collected during the period 2011-2016, were retrospectively analysed. The proportion of schoolchildren-detected OHCA was low in classmate cases (16.8%, 179/889) in the nationwide database and extremely low among all EMS-unwitnessed OHCAs (1.6%, 88/5,478) in the local database. Nationwide database analyses revealed that both the emergency call and bystander CPR were delayed when a classmate witnessed the OHCA: median, 1 vs. 0 min and 3 vs. 2 min, respectively. Classmate-witnessed cases were associated with higher incidences of shockable initial rhythm, AED use and traumatic causes.
The rate of neurologically favourable outcome was 19.6% in classmate-witnessed and 12.3% in other cases (adjusted OR 1.24; 99% CI 0.63-2.47). Of 88 cases detected by schoolchildren in our prefecture, 8 (34%) cases had a presumed cardiac aetiology and 12 (13.8%) cases were caused by suicide attempts (hanging and fall). Schoolchildren placed emergency 119 calls as the first action in only 32 (36.4%) cases. Emergency calls were largely delayed when schoolchildren dialled other numbers or left the scene to seek adult help. Schoolchildren were rarely involved in bystander CPR (21%) and AED placement (1%). Schoolchildren are rarely involved in the entire BLS sequence. Emergency calls and bystander CPR are delayed when schoolchildren act to seek help. Because schoolchildren detect suicide-related OHCAs, psychological care for schoolchildren involved in BLS may be necessary.

Prognostic value of the neutrophil/lymphocyte and platelet/lymphocyte ratios in predicting outcome of cardiopulmonary resuscitation with spontaneous circulation recovery
C Li
The Affiliated Suzhou Hospital of Nanjing Medical University, Suzhou, China
Critical Care 2020, 24(Suppl 1):P063
To investigate the predictive value of the peripheral blood neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) for in-hospital mortality in patients with spontaneous circulation recovery after cardiac arrest. A retrospective analysis was made of 30 patients who recovered from cardiac arrest in our hospital from April 2012 to November 2018 and were admitted to the intensive care unit for more than 24 hours. They were divided into a survival group and a death group according to the outcome at discharge. The dynamic changes and differences of NLR and PLR at 24 hours and 48-72 hours after admission to the ICU were analyzed and compared between the two groups. Multivariate analysis and ROC curves were used to explore the predictive value of NLR and PLR for in-patient mortality. Compared with the survival group, PLR in the death group was significantly lower within 24 hours of admission to the intensive care department (P < 0.05), while NLR at 48-72 hours was significantly higher (P < 0.05). The NLR of the survival group at 48-72 hours was significantly lower than that at 24 hours (P < 0.05), while the NLR and PLR of the death group were not significantly different from those at 24 hours. Multivariate logistic regression analysis and ROC curves showed that the NLR at 48-72 h in the ICU was an independent risk factor for predicting in-patient mortality, and had high sensitivity and specificity in predicting death outcomes. The neutrophil-to-lymphocyte ratio and platelet-to-lymphocyte ratio can help to judge the outcome of patients with cardiac arrest and recovery of spontaneous circulation after cardiopulmonary resuscitation.

[2, 3] Patients with a SOFA score > 12 (vs SOFA score ≤ 12) had a higher free iron level (35.5 μmol/l vs 16 μmol/l, p = 0.0333) (Figure 1). We found a positive correlation between the free iron level at H0 and the change of SOFA score between H0 and H48 (r = 0.56, CI95 [0.08; 0.76]). Out-of-hospital cardiac arrest is associated with a significant change of the plasma free iron level. The free iron level at admission is associated with short-term outcome. Further research is warranted to better determine the significance of such changes.

The optimal level of arterial oxygen in the post-resuscitation period is unknown. Recent studies show conflicting results in regard to hyperoxia and its association with survival after out-of-hospital cardiac arrest (OHCA) [1].
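A minimal sketch of the kind of multivariate logistic-regression analysis used in the NLR/PLR abstract above, run on synthetic data (all variable names, coefficients and values are illustrative assumptions, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic cohort: NLR at 48-72 h and in-hospital death (values are made up)
n = 30
nlr = rng.gamma(shape=4, scale=3, size=n)          # simulated NLR values
logit = -4 + 0.35 * nlr                            # assumed true association
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # simulated outcomes

X = sm.add_constant(nlr)                           # intercept + NLR predictor
fit = sm.Logit(death, X).fit(disp=False)

or_nlr = np.exp(fit.params[1])                     # odds ratio per NLR unit
ci = np.exp(fit.conf_int()[1])                     # 95% CI for that OR
print(f"OR per unit NLR = {or_nlr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```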
The aim of this trial is to study the association of early hyperoxia after OHCA with return of spontaneous circulation (ROSC) and 30-day survival. Observational study using data from three Swedish national registers (i.e. intensive care, cardiac arrest and national patient registries).

After a successful resuscitation, a systemic inflammatory response occurs, and the C-reactive protein (CRP) level represents the degree of inflammation [1] [2] [3]. This study examined the association between increased inflammation and early-onset pneumonia (EOP) in patients treated with extracorporeal cardiopulmonary resuscitation (ECPR) after out-of-hospital cardiac arrest (OHCA). This retrospective study included data of patients with OHCA treated with ECPR admitted to St. Luke's International Hospital between April 2006 and April 2019. The exclusion criteria were as follows: age < 18 years, therapeutic hypothermia withdrawal due to death or circulatory failure, or sepsis as a suspected cause of cardiac arrest. Patients were diagnosed with EOP according to clinical signs and symptoms acquired after a hospitalization period of >48 h and within 7 days of admission. CRP levels were measured daily from admission to day 3. We studied 55 patients with a median age of 55 years (interquartile range: 42-65 years). Furthermore, 52 (95%) patients were males, and the median time interval from collapse to adequate flow was 51 (42-63) min. All patients received prophylactic antibiotics, and 18 (33%) of them had favorable neurological outcomes (CPC, 1-2). EOP occurred in 32 (58%) patients, with a significantly higher CRP level on day 3 than that in those without EOP (13.9

Categorizing reasons for death after eCPR is important for comparing outcomes to other studies, assessing benefits of interventions, and to better define this heterogeneous patient collective. A categorization of death after cardiac arrest in both in-hospital (IHCA) and out-of-hospital (OHCA) arrests has been proposed in non-eCPR patients by Witten et al. Here, we adapt this categorization to eCPR patients. Single-center, retrospective cohort study of patients without ROSC after IHCA or OHCA and eCPR between 2010 and 2017. Patients with survival below 24 hours were excluded. Patients were allocated to one of five predefined reasons for death. Results: 231 va-ECMO patients were included (age 58.6±14.3, 29.9% female, 58% eCPR, 30-day survival 42.9%). Reasons for death for patients with va-ECMO for shock (survival 53%) and eCPR (36%) were: neurological withdrawal of care (10% vs 25%), comorbid withdrawal of care (18% vs 4%), refractory hemodynamic shock (16% vs 33%), respiratory failure (3% vs 2%), and withdrawal due to presumed patient will (0% vs 1%) (Figure 1). The differences in reasons for death between the two groups were significant (p <0.001), driven by withdrawal due to neuroprognostication, comorbidity and hemodynamic instability. Categorizing death after va-ECMO into five categories is feasible. There are significant differences between patients with va-ECMO for shock and eCPR. Interestingly, only a quarter of patients after eCPR died due to brain damage.

Introduction: Scarcity of potential brain-dead donors and the persistent mismatch between supply and demand of organs for transplantation have led the transplant community to reconsider donation after circulatory death (DCD) as a strategy to increase the donor pool.
Normothermic regional perfusion (nRP) by extracorporeal membrane oxygenation (ECMO) may be the most effective method for preserving abdominal organs in DCD, especially in liver transplantation [1, 2]. A pitfall of this method is its complexity and the unavailability of this resource in some hospitals, especially regional hospitals, where potential DCD donors may exist. The aim of this study is to report the use of a mobile ECMO team in controlled DCD. From June 2018 to November 2019 our group worked as a mobile ECMO team for cDCD outside our center. Portable equipment included cannulation material and the ECMO device. The transplant team consisted of 1 transplant coordinator (anesthesiologist-intensivist, ECMO operator and organ extraction supervisor), 1 cardiac surgeon (cannulation), 1 interventional radiologist (cannulation) and 1 cardiovascular perfusionist (ECMO operator). Twenty-five cDCD donations were performed. Characteristics of donors and organs retrieved are summarized in Figure 1. From 25 cDCD, 17 livers, 4 lungs and 45 kidneys were obtained. The evolution of grafts and recipients was favorable at day 30 post-transplant. Mobile ECMO teams may enable cDCD in hospitals without these resources, thereby increasing the pool of donors and optimizing graft outcomes.

What is the useful coagulation and fibrinolysis marker for predicting extracorporeal membrane oxygenation circuit exchange due to intra-circuit thrombus?
Y Izutani, K Hoshino, S Morimoto, K Muranishi, J Maruyama, Y Irie, Y Kawano, H Ishikura
Fukuoka University Hospital, Emergency and Critical Care Center, Fukuoka-shi, Japan
Critical Care 2020, 24(Suppl 1):P071
Thrombus formation is one of the most frequent and adverse complications during extracorporeal membrane oxygenation (ECMO) support. Previous studies have reported that increased D-dimer is a useful predictor of thrombus formation within the ECMO circuit. The purpose of this study was to identify coagulation/fibrinolysis markers for predicting the replacement of the ECMO circuit due to intra-circuit thrombus during ECMO support. Fourteen patients who underwent veno-venous ECMO for acute respiratory failure between January 2014 and December 2018 were enrolled. These patients received a total of 125 days of ECMO support. Of these, the 9 days (times) on which the ECMO circuits were replaced were regarded as the replacement group, while the remaining 116 days were considered the non-replacement group. Several coagulation/fibrinolysis markers were routinely measured every day during ECMO support. We compared the levels of these markers between the two groups to identify the most relevant marker for ECMO circuit replacement due to thrombus. The mean duration of ECMO support was 9±11 days, and the mean number of ECMO circuit replacements was 0.6±1.0 times per patient. D-dimer, thrombin-antithrombin complex (TAT), plasmin-α2 plasmin inhibitor complex (PIC), and soluble fibrin (SF) were significantly higher in the replacement group than in the non-replacement group (all P < 0.01). According to a multivariate analysis, SF was the only independent predictor of ECMO circuit replacement due to thrombus. The odds ratio (95% confidence interval) for SF (per 10 μg/mL) was 1.2 (1.1-1.3). The area under the curve and optimal cut-off value were 0.94 and 85 ng/mL for SF, respectively (sensitivity, 100%; specificity, 85%).
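The cut-off, sensitivity and specificity reported for SF are the kind of values obtained from a ROC curve via the Youden index. A minimal sketch on synthetic data (the distributions below are illustrative assumptions, not the study's measurements):

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)

# Synthetic soluble-fibrin values for replacement vs non-replacement days
sf_repl = rng.normal(120, 25, 9)        # days preceding circuit replacement
sf_none = rng.normal(50, 20, 116)       # uneventful support days
values = np.concatenate([sf_repl, sf_none])
labels = np.concatenate([np.ones(9), np.zeros(116)])

fpr, tpr, thresholds = roc_curve(labels, values)
print(f"AUC = {auc(fpr, tpr):.2f}")

# Youden index J = sensitivity + specificity - 1, maximised over thresholds
j = tpr - fpr
best = np.argmax(j)
print(f"optimal cut-off = {thresholds[best]:.0f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```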
From these results, we concluded that SF may be a more useful marker than D-dimer for predicting the replacement of the ECMO circuit due to intra-circuit thrombosis.

Inhomogeneity of lung elastance in patients who underwent veno-venous extracorporeal membrane oxygenation (V-V ECMO) - a computed tomography scan study
RD Di Mussi 1, RI Iannuzziello 2, FM Murgolo 2, FD De Carlo 2, E Caricola 2, NA Barrett 3, LC Camporota 3, SG Grasso 2
1 Università degli studi di Bari "Aldo Moro", Department of Emergencies and Organ Transplant, Bari, Italy; 2 Università degli studi di Bari "Aldo Moro", Bari, Italy; 3 Department of Adult Critical Care, Guy's and St Thomas' NHS Foundation Trust, King's Health Partners, London, UK
Critical Care 2020, 24(Suppl 1):P072
In patients with acute respiratory distress syndrome (ARDS), non-aerated, poorly aerated, and normally aerated regions coexist to variable degrees in the lung parenchyma. Recruitment maneuvers aim to reopen collapsed lung tissue. From a theoretical point of view, this strategy may also prevent hyperinflation of normally aerated lung tissue [1]. The objective of our study was to evaluate lung characteristics in terms of Hounsfield Units (HU), volume and elastance before and after a recruitment maneuver. In 37 patients with severe ARDS who underwent V-V ECMO, computed tomography scans (CT scans) at 5 cmH2O of continuous positive airway pressure (CPAP) and at 45 cmH2O were performed. The same CT image was selected at the two different levels of pressure. The distribution of lung opacities, in terms of HU, was classified using the "UCLA" colour coding table (OsiriX image processing software, Geneva, Switzerland). Corresponding lung regions of about 1020 voxels were selected. The quantitative analysis, in terms of air volume (Vair), was performed with Maluna software (Version 3.17; Maluna, Goettingen, Germany). Elastance was calculated as the pressure (cmH2O) / Vair (ml) ratio. Results: See Figure 1. Lung inhomogeneity persists even after recruitment maneuvers. Our data confirm that the elastance of recruited lung regions is higher than the elastance of the normally aerated lung regions at low positive end-expiratory pressure (PEEP) (baby lung). On the contrary, the "baby lung" frequently develops hyperinflation. The unpredictable pattern of volume distribution after recruitment maneuvers may explain the controversial role of PEEP during ARDS treatment.

Formal recommendations on target, timing, and rate of AT supplementation are lacking. We conceived this study to evaluate the effect of prolonged AT supplementation in adult patients requiring veno-venous ECMO for respiratory failure on heparin dose, adequacy of anticoagulation and safety. Methods: Before ECMO start, patients were randomized to either receive AT supplementation to maintain a functional AT level between 80 and 120% (AT supplementation group) or not (control group) for the entire ECMO course. Anticoagulation was provided with unfractionated heparin following a standardized protocol [1]. The primary outcome was the dose of heparin required to maintain the ratio of activated partial thromboplastin time between 1.5 and 2. Secondary outcomes were the adequacy of anticoagulation measured with anti-Factor Xa, the incidence of hemorrhagic and thrombotic complications, and the amount of blood products (Fig. 1B). Conclusions: This retrospective analysis was not able to show a survival benefit for additive prone positioning (PP) to ECMO support in general.
Early initiation of PP could be an important factor for improving survival in this setting and should be considered in a randomized controlled trial for further evaluation.

Cause-specific mortality during extracorporeal membrane oxygenation, a single center review of medical records
M Panigada, D Tubiolo, P Properzi, G Grasselli, A Pesenti
Fondazione IRCCS Ca' Granda Ospedale Maggiore Policlinico, Intensive Care Unit, Milano, Italy
Critical Care 2020, 24(Suppl 1):P075
Introduction: Mortality during extracorporeal membrane oxygenation (ECMO) settles around 35%, and the occurrence of bleeding during ECMO is associated with a high mortality rate. However, cause-specific mortality is rarely reported, probably due to the difficulty of its classification. The purpose of the study was to evaluate the agreement between two expert ICU physicians in the classification of the cause of death of patients supported with ECMO for either respiratory or cardiac support. Methods: Two Intensive Care Unit (ICU) expert staff physicians independently reviewed the entire medical records of all ECMO patients who died before ICU discharge from January 2011 to September 2019 at Fondazione IRCCS Ca' Granda, Milan. They were asked to choose the cause of the patient's death among six categories. In case of disagreement, a third expert adjudicated the case. The two reviewers were also asked whether, in their opinion, bleeding during the last 24 hours contributed to death. The ELSO definition of major bleeding [1] during the last 24 hours was also recorded for each patient. Results: Two hundred and two patients were supported with ECMO, of whom 70 (34.6%) died. Most of these patients (N=53, 75.7%) died during ECMO. Interrater agreement for cause-specific mortality between the two expert physicians was substantial (κ 0.71, SE 0.07, p<0.01). Of the 14 discordant cases, 6 were categorized as refractory respiratory failure and 4 as multiorgan failure and septic shock, respectively. The distribution of cause-specific mortality is shown in Figure 1. Major bleeding (ELSO) was present in 27 (38.6%) patients; in only 9 (33.3%) of them did bleeding contribute to death according to the reviewers.

Patients treated with early PP while on ECMO showed superior survival to patients treated with late PP or without PP while on ECMO. The optimal cut-off value for the time from ECMO initiation to first PP was calculated using ROC analysis (AUC = 0.789) and the Youden index. The highest sensitivity and specificity for a survival benefit were achieved when PP was begun within <0.71 days (log rank = 0.018). PP: prone positioning

P076 Non-invasive mechanical ventilation in veno-venous extracorporeal membrane oxygenation
J Rilinger, V Zotzmann, X Bemtgen, PM Biever, D Duerschmied, C Bode, DL Staudacher, T Wengenmayer
Heart Center Freiburg University, Department of Cardiology and Angiology I, Freiburg, Germany
Critical Care 2020, 24(Suppl 1):P076
Introduction: Veno-venous extracorporeal membrane oxygenation (ECMO) support can be combined with a variety of different non-invasive ways to deliver oxygen to the patient's lungs. Several positive effects might be linked to this so-called "awake ECMO". So far there is little evidence about the indications and outcome of this approach. We report retrospective registry data on all ARDS patients treated with ECMO support at a university hospital between 10/2010 and 04/2019.
In a systematic review of medical records, we distinguished between patients with invasive mechanical ventilation (IMV) from the initiation of ECMO therapy (IMV group) and patients who received any kind of non-invasive oxygen supply (non-IMV group). A total of 276 patients could be analysed. 16 (5.8%) patients received non-IMV ECMO support. Patients receiving non-IMV ECMO therapy showed severe underlying pulmonary disease and immunosuppression (Fig. 1). These patients had higher rates of lung fibrosis, long-term oxygen therapy, pulmonary hypertension, renal insufficiency and immunosuppression (p<0.05). 12 of 16 patients (75%) required IMV during the hospital stay, on average 5.3±5.0 [0.8-17.1] days after ECMO initiation. Reasons were hypoxia despite ECMO, insufficient ECMO flow, insufficient protective reflexes or patient agitation. Patients with initially non-IMV ECMO support showed numerically lower, but not significantly different, ICU and hospital survival (25.0% vs. 45.4%, p=0.111). Non-IMV ECMO support was applied in patients with severe underlying pulmonary disease and/or immunosuppression. In a high proportion of patients the ventilation regime had to be switched from non-invasive to invasive. Survival in this very selected cohort was low. In this retrospective analysis no evident benefit for a non-invasive ventilation strategy could be found. The high proportion of patients who were switched from non-IMV to IMV therapy underlines the need for rigorous patient selection.
Fig. 1 (abstract P076). Underlying pulmonary disease or status of immunosuppression in ECMO patients without invasive mechanical ventilation

Intra-hospital transportation on extracorporeal membrane oxygenation (ECMO) - a single centre experience in Ireland
Z Siddique, S O'Brien, E Carton, I Conrick-Martin
Mater Misericordiae University Hospital, Department of Critical Care Medicine, Dublin, Ireland
Critical Care 2020, 24(Suppl 1):P077
The objective of this study is to evaluate intra-hospital transportation of patients on extracorporeal membrane oxygenation (ECMO). It is a retrospective analysis of a prospectively collected database, performed as part of ongoing quality improvement initiatives. The setting of this study is an 18-bed, combined surgical and medical adult Intensive Care Unit (ICU) located in a 570-bed hospital that serves as the national referral centre for cardiothoracic surgery, heart & lung transplantation and ECMO in Ireland. We reviewed 33 months of data (from 2017 to 2019) regarding patients admitted to our critical care unit who required intra-hospital transfer for diagnostic and/or therapeutic interventions. We also compared the data to available local guidelines. Results: 23 patients were transported on ECMO on a total of 28 occasions, the most common indication being CT brain (Table 1). ECMO cannulation sites were peripheral in 20 patients; 3 patients were centrally cannulated. The median time from the start of the transfer until the patient was returned to the ICU was 50 minutes (range: 35-195). The ECMO console was placed on a dedicated ECMO trolley apart from two occasions when it was placed on the patient's bed. The number of staff required for transport was between 4 and 10, with an ICU consultant as team leader. ECMO specialist nurses were always present on the transport team. 27 transfers were during normal working hours, with 1 happening on a weekend. A total of 12 complications occurred during the transports, of which 1 was significant and 11 were not.
The significant complication encountered was ventricular tachycardia in a V-A ECMO patient, which required electrical defibrillation. No adverse events related to transport were seen following return to the ICU. In this single-centre study, we have demonstrated safe intra-hospital transport of ECMO patients. The use of local guidelines, appropriate personnel and performance during normal working hours is recommended.

A novel approach for flow simulation in ECMO rotary blood pumps
A Supady 1, C Benk 2, J Cornelis 3, C Bode 1, D Duerschmied 1
1 Heart Center Freiburg University, Cardiology and Angiology I, Freiburg, Germany; 2 Heart Center Freiburg University, Department of Cardiovascular Surgery, Freiburg, Germany; 3 FIFTY2 Technology GmbH, 79108 Freiburg, Germany
Critical Care 2020, 24(Suppl 1):P078
Introduction: Extracorporeal membrane oxygenation (ECMO) is used increasingly in critically ill patients suffering from acute respiratory failure, cardiogenic shock or cardiac arrest. However, this therapy can have deleterious side effects such as bleeding or clotting complications and hemolysis. These complications are particularly caused by physical stress acting upon the blood components while passing through the ECMO system, especially within the rotary pump. We here present a novel approach to simulate blood flow through rotary blood pumps used in current ECMO systems in order to better understand the genesis of these complications. Geometries of the Xenios DP3 (Xenios AG, Heilbronn, Germany) rotary pump were reconstructed by CT scans and manual measurements using computer-aided design (CAD). The computational fluid dynamics (CFD) simulation was performed using the software PreonLab (FIFTY2 Technology GmbH, Freiburg, Germany), which implements a mesh-free Lagrangian method requiring minimal preprocessing of the CAD data. The geometries are introduced to the simulation model as tessellated surfaces. Five operating points were specified by the rotation of the centrifugal fan and the corresponding inflow and outflow of blood. The blood is approximately modelled as a Newtonian fluid with a density of 1040 kg/m³. PreonLab allows detailed assessment of the blood flow while passing through the rotary pump, including analysis of local flow rates, pressure gradients and shear stress acting upon the blood. Dead zones in the fluid flow can be detected, which provides reference points for optimization of the pump design. For the first time, we demonstrate a novel approach for flow simulation in an ECMO rotary pump (Figure 1). This approach may help to better understand hemodynamics within the extracorporeal system, to define optimal operating points or to re-design components, aiming to limit hemolysis, coagulation disorders and bleeding in seriously ill patients.

One-year experience of bedside percutaneous VA-ECMO decannulation in a tertiary ECMO center in Hong Kong
KM Fong, SY Au, PW Leung, KC Shek, HJ Yuen, SK Yung, HL Wu, SO So, WY Ng, KH Leung
Queen Elizabeth Hospital, Intensive Care Unit, Hong Kong
Critical Care 2020, 24(Suppl 1):P079
When veno-arterial extracorporeal membrane oxygenation (VA-ECMO) support can be terminated, the arteriotomy wounds of the patients are traditionally closed by open repair in the operating theater. Considerable manpower is involved, and timeslots in operating theaters are scarce. Transport of the critically ill is risky.
Successful VA-ECMO decannulation using a percutaneous device called ProGlide has been reported, and our group has adopted and modified this approach [1]. Methods: This is a retrospective study analyzing the one-year experience of bedside VA-ECMO decannulation. Our institution is a 23-bed tertiary ECMO referral center in Hong Kong. Our first bedside decannulation was performed in November 2018, and since then this practice has replaced the traditional open repair, unless contraindicated. Data from November 2018 to October 2019 were analyzed. In the study period, 39 patients received VA-ECMO. 28 survived to decannulation and 25 received bedside percutaneous decannulation. Their median age was 59 (52-67). The default arterial catheter size was 17 Fr, with 15 Fr in 3 cases and 19 Fr in one. Five (20%) failed percutaneous closure and were subsequently surgically repaired.

Extracorporeal life support (ECLS) continues to be associated with high mortality rates. Our ability to predict outcome prior to initiation of ECLS remains limited. Here we take a single-cell RNA-Seq approach in an effort to identify novel immune cell types that are associated with - and may contribute to - survival on ECLS. Whole-genome transcriptomic profiles were generated from ~40,000 peripheral blood monocytes obtained from 38 patients at the time of cannulation for veno-arterial ECLS (VA-ECLS). Within each subpopulation, differential gene expression analysis was performed to identify new markers associated with survival. Findings were validated in additional cohorts by flow cytometry. Surviving patients had significantly higher proportions of CD8+ NKT cells (CD3+/CD8+/CD19-/CD56+) that were CD52+ (p = 0.001, FDR < 0.05) (Figure 1). To validate this observation, we performed FC analysis of a second cohort of 20 patients. For each patient, we quantified the proportion of CD8+ NKT cells that were CD52+. Using the median proportion as the cutoff, we again found that a high proportion of CD52+ cells among CD8+ NKT cells was predictive of 48-hour survival (p=0.024). We noted that while high levels of CD52+ cells among the CD8+ NKT cells were protective in this cohort of VA-ECLS patients, this relationship did not hold for patients with sepsis. As only a few of the VA-ECLS patients were septic, we analyzed a third cohort of septic ECLS patients. We observed that high levels of CD52+ cells among the CD8+ NKT populations were not protective in this population. The proportion of CD8+ NKT cells that are positive for CD52 is predictive of survival among patients undergoing VA-ECLS for non-infection-related indications.

Introduction: The use of calcium sensitizers has grown enormously in the last decade, probably due to their interesting pharmacodynamic properties. Levosimendan (LS) is frequently administered in patients under mechanical circulatory support. We performed a retrospective evaluation of patients treated with LS prior to weaning from mechanical support. This evaluation was combined with a review of the literature. A query of our ICU patient data management system revealed 22 patients receiving LS prior to or during VAD/ECLS support. Outcome data were obtained from the patients' medical records. Of our 22 patients, 78% were successfully weaned off ECLS. Fourteen patients (63%) died before being discharged, of whom 5 died while on ECLS support. Of the weaned patients, 9 died afterwards. 4 of the converted patients needed subsequent veno-venous ECLS support for right ventricular support after the implantation.
The survival-to-discharge rate for the whole group was 31%. More detailed demographic results can be found in Table 1. A PubMed search using the terms "(ECMO OR ECLS) AND LS AND weaning" resulted in 7 publications which dealt specifically with weaning from ECLS support. Several weaning approaches are available; however, poor outcome remains a problem. Some recent studies show a possible beneficial effect of LS infusion prior to weaning from ECLS. However, most of these studies are retrospective or observational at best. Because LS is primarily reserved for the most severe cases, outcome interpretation is difficult. Overall weaning success ranges from 82% to 92%, and the variation is very dependent on inclusion criteria. The calcium sensitizer LS can be used when weaning patients from ECLS, certainly given its low incidence of complications. Large randomized trials are, however, needed in order to confirm this strategy.

Cardiogenic shock is well described in newly diagnosed pheochromocytoma, and crisis may be precipitated by hemorrhage into the tumour. V-A ECMO represents a rescue therapy in a subset of these patients refractory to medical management, facilitating cardiac recovery and subsequent definitive surgery. Consent to publish: written informed consent for publication was obtained from the 2 patients.

During a spontaneous breathing trial, respiratory mechanics can worsen and respiratory muscle effort can increase, leading to respiratory muscle fatigue, pump failure, hypercapnia and unsuccessful weaning from mechanical ventilation. This case report discusses the possibility of applying extracorporeal CO2 removal (ECCO2R) to reduce respiratory muscle effort in a liver transplant recipient who had already failed three weaning attempts from mechanical ventilation. The ECCO2R membrane lung was integrated into a conventional renal replacement therapy circuit and blood flow was increased from 150 to 300 ml/min. Measurements of respiratory mechanics (including esophageal pressure, as shown in Fig. 1) were used to assess the reduction of respiratory effort before and during the application of ECCO2R.

ECCO2R was delivered through a 13 Fr double-lumen cannula; a blood flow of 350 ml/min with 10 L/min oxygen sweep-gas flow was used, and a baseline aPTT of 1.5-2 was maintained (IV heparin). In all cases respiratory and metabolic parameters improved without complications (Figure 1). ECCO2R-CRRT facilitated extubation (4 out of 9 IMV pts). In 4 out of 5 pts at risk of NIV failure, it avoided IMV. Mean treatment duration was 73±31 hours, and mean length of ICU stay was 6±4 days. All patients survived the treatment; nevertheless, 2 patients died due to irreversible multiple organ failure (MOF). In our aeCOPD series, PrismaLung®-Prismaflex® facilitated weaning from IMV and avoided intubation in patients at risk of NIV failure, without complications. These positive results may be related to the minimal invasiveness of the low-flow device used and may constitute the rationale for a larger randomized controlled trial. Consent: Written informed consent for data publication has been obtained.

The primary outcome findings from the SUPERNOVA trial [1] demonstrated that the use of extracorporeal carbon dioxide removal (ECCO2R) allows a reduction in tidal volume (TV) to ultraprotective levels (≈4 mL/kg predicted body weight or PBW) during mechanical ventilation in ARDS patients without significant increases in the arterial partial pressure of carbon dioxide (PaCO2).
Unfortunately, it was not feasible to directly measure ECCO2R rates during the trial. We used a mathematical model of whole-body oxygen (O2) and carbon dioxide (CO2) transport and biochemistry [2] to calculate ECCO2R rates that permit a fit to the data reported for the Hemolung (ALung Technologies) and iLA (Novalung)/Cardiohelp (Getinge) devices in the SUPERNOVA trial [3]. The mathematical model was calibrated under baseline conditions where patients were mechanically ventilated at a TV of 6 mL/kg PBW in the absence of an ECCO2R device; the O2 consumption rate, CO2 production rate and pulmonary shunt fraction were adjusted to match the measured baseline arterial partial pressure of O2 and PaCO2. Assuming all baseline parameters were fixed, TV was then reduced to 4.1 mL/kg PBW and the mathematical model predicted the ECCO2R rate required to match the reported change in the PaCO2 level. Model predictions for the devices are shown in Table 1. These predictions suggest that ECCO2R rates for iLA/Cardiohelp devices were approximately twice those for Hemolung devices during the SUPERNOVA trial. These results may be useful to evaluate the expected performance of novel ECCO2R devices.

Efficiency and safety of a system combining CRRT plus ECCO2R to allow an ultraprotective ventilation protocol in patients with acute renal failure
F Maldarelli 1
Despite renal replacement techniques (CRRT), a patient who develops acute renal failure (AKI) in the intensive care unit (ICU) has a mortality rate of 5-80%. This risk is partly due to the adverse effect of AKI on organs other than the kidney. Respiratory complications are frequently associated with the development of AKI. New machines combining CRRT with a carbon dioxide removal membrane (ECCO2R) allow the setting up of ultra-protective ventilation (4 ml/kg of predicted body weight (PBW)) to reduce any lung damage from mechanical ventilation (MV). The reduction in tidal volume (Vt) is associated with a decrease in lung damage partly triggered by AKI. We evaluated the efficacy of a combined CRRT+ECCO2R system to reduce the Vt to ultraprotective values in patients with acute respiratory failure and AKI.

ARDS is a syndrome with high morbidity and mortality. An emerging treatment option is ECCO2R, but its benefit remains unclear. We assessed the effect of different degrees of ECCO2R and varying dead space (DS) on the ventilator settings that minimize mechanical power. We calculated mechanical power as

(1) $\mathrm{Power} = RR \cdot \left\{ \Delta V_t^{2} \cdot \left[ \tfrac{1}{2} E_L + RR \cdot \tfrac{1 + I\!:\!E}{60 \cdot I\!:\!E} \cdot R \right] + \Delta V_t \cdot \mathrm{PEEP} \right\}$

(EL: respiratory system elastance, R: airway resistance, PEEP: positive end-expiratory pressure, I:E: inspiratory-to-expiratory ratio). We calculated the combination of respiratory rate (RR) and tidal volume (VT) ("optimal RR" and "optimal VT") leading to minimal applied power for a stable carbon dioxide elimination of 300 ml/min (VCO2) for two scenarios: 1) variation of physiological DS from 10 to 40% of VT at a fixed rate of ECCO2R; 2) variation of ECCO2R of either 80, 120, 160 or 200 ml/min at a fixed physiological DS of 20%. The alveolar ventilation (VA) necessary to eliminate the VCO2 was calculated as

(2) $V_A = \dfrac{-\dot{V}_{CO_2} \cdot \sigma_{CO_2} \cdot R \cdot T \cdot (1 + K_c)}{\dot{V}_{CO_2}/Q - P_{vCO_2} \cdot \sigma_{CO_2} \cdot R \cdot T \cdot (1 + K_c)/760}$

(σCO2: CO2 solubility in blood, R: gas constant, T: temperature, PvCO2: venous CO2 partial pressure, Kc: function of pH (12.5 for a pH of 7.2), Q: blood flow [5 l/min]). Increasing DS from 10 to 40% increases the minimal mechanical power from 5.9 to 10.8 J/min, primarily caused by an increase of the optimal Vt (495-672 ml).
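A minimal numerical sketch of the optimisation described by equations (1) and (2): power is evaluated over candidate tidal volumes, with RR constrained by the required alveolar ventilation (VA = RR × VT × (1 − DS)). Here VA is taken as a given input rather than computed from equation (2), and all parameter values (elastance, resistance, PEEP, I:E) are illustrative assumptions, not the authors':

```python
import numpy as np

# Assumed illustrative parameters: elastance [cmH2O/L], airway resistance
# [cmH2O/(L/s)], PEEP [cmH2O], inspiratory:expiratory ratio
EL, R_AW, PEEP, IE = 25.0, 10.0, 5.0, 0.5

def power(rr, vt):
    """Mechanical power per equation (1); vt in litres, rr in breaths/min.
    Output is in the units of equation (1) as printed."""
    return rr * (vt**2 * (0.5 * EL + rr * (1 + IE) / (60 * IE) * R_AW) + vt * PEEP)

def optimal_settings(va_req, ds_frac):
    """Grid-search the VT (and implied RR) minimising power while delivering
    the required alveolar ventilation va_req [L/min] at dead-space fraction ds_frac."""
    vts = np.linspace(0.2, 1.0, 801)          # candidate tidal volumes [L]
    rrs = va_req / (vts * (1 - ds_frac))      # RR needed so that VA = RR*VT*(1-DS)
    powers = power(rrs, vts)
    i = np.argmin(powers)
    return rrs[i], vts[i], powers[i]

# Example: a fixed 2.5 L/min alveolar ventilation at 20% vs 40% dead space
for ds in (0.2, 0.4):
    rr, vt, p = optimal_settings(2.5, ds)
    print(f"DS={ds:.0%}: optimal RR={rr:.1f}/min, VT={vt*1000:.0f} ml, power={p:.1f}")
```

With these assumed parameters the optimum shifts toward a larger VT as DS rises, mirroring the behaviour the authors report for scenario 1.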
The optimal RR was only slightly increased (6.4-7.5/min; Figure 1, Panel A). For varying ECCO2R removal, the necessary ventilation ranges from 1.6 to 3.6 L/min. This predicts a minimal power between 5.6 and 10.4 J/min with an unchanged optimal Vt (540-543 ml) and an increasing optimal RR (5.4 to 12.3/min; Figure 1, Panel B). In order to minimize mechanical power, increasing shunt or CO2 production should be met with increases in RR, while increases in DS should be met with increases in VT. Our results indicate that during ECCO2R, mechanical power, and thus the risk for lung injury, can be minimized with higher VT compared to conservative ventilation strategies.

Validity of empirical estimates of physiological dead space in acute respiratory distress syndrome
JD Dianti, EG Goligher, AS Slutsky
University of Toronto, Interdepartmental Division of Critical Care Medicine, Toronto, Canada
Critical Care 2020, 24(Suppl 1):P091
Increased physiological dead space fraction (VD/VT) is a hallmark of the acute respiratory distress syndrome (ARDS) and has been shown to predict ARDS mortality. VD/VT is also important in estimating the reduction in tidal volume (VT) and driving pressure (ΔP) with extracorporeal CO2 removal (ECCO2R). VD/VT can be measured with volumetric capnography, but empirical formulae using the patient's age, weight, height, gender and PaCO2 have been proposed to estimate VD/VT based on estimates of CO2 production (VCO2). The accuracy of this approach in critically ill patients, however, is not clear. Secondary analysis of a previously published trial [1] in which VD/VT and VCO2 were measured in ARDS patients. The estimated dead space fraction (VD,est/VT) was calculated using standard formulae. Agreement between methods was evaluated by Bland-Altman analysis. The predicted change in ΔP with ECCO2R was evaluated using both the measured and the estimated alveolar dead space fraction (VDalv/VT). Results: VD,est/VT was higher than measured VD/VT, with a low correlation between the 2 (R2 = 0.21). VCO2 was underestimated by the predictive approach (Table 1), accounting for 57% of the error in estimating VD/VT. The expected reduction in ΔP with ECCO2R using VDalv/VT was in reasonable agreement with the expected reduction using

Introduction: Acute respiratory distress syndrome (ARDS) is a common condition in critically ill patients. However, neuromuscular blockers (NMB) remain controversial in the early treatment of ARDS [1]. We aimed to search systematically and perform a meta-analysis on the matter. An electronic search was made of randomized clinical trials in adult patients treated with early neuromuscular blockers compared with no neuromuscular blockers in ARDS. The primary objective of the analysis was mortality at 21 to 28 days. Secondary endpoints included mechanical ventilation-free days, ICU-acquired weakness and barotrauma. The search obtained 6 studies for the analysis [1] [2] [3] [4] [5] [6] (Figure 1). The early use of neuromuscular blockers in ARDS showed no increase in mortality, but the results should be taken with caution. There were no differences in mechanical ventilation-free days. Barotrauma was less frequent with the use of NMB.

Ultrasound is fairly sensitive in the detection of lung infiltrates in patients with hematologic malignancies.

In patients with pneumonia requiring intensive care unit (ICU) admission, we hypothesise that abnormal right ventricular (RV) function is associated with an increased 90-day mortality.
RV dysfunction in critically ill patients has a well-known association with adverse outcomes [1]. However, its impact on mortality in patients with pneumonia has not been directly studied. Patients admitted to the Queen Elizabeth Hospital Birmingham ICU between April 2016 and July 2019 with a diagnosis of pneumonia who had a formal cardiologist TTE were included. Abnormal RV function was defined by either depressed function, dilated size or moderate to severe risk of pulmonary hypertension (pHTN). Abnormal LV function was defined by an LV ejection fraction ≤45% or grade II or more diastolic dysfunction. Patients with a clinical suspicion of pulmonary embolism were excluded. The primary outcome was 90-day mortality. Continuous data are presented as median (IQR). Categorical data are presented as % and analysed using a chi-squared test. Results: 942 patients were admitted to the ICU with pneumonia, of whom 347 (37%) had a TTE. Patients were 59% male, had a median age of 67 (46-88) and a 90-day mortality of 31%. Abnormal RV function was present in 30% (n=103), with 15% depressed, 15% dilated and 14% with moderate to severe risk of pHTN. RV dysfunction was associated with an increased 90-day mortality compared to normal RV patients (62% vs. 18%, p<0.0001). LV function was abnormal in 25% (n=88) and was not associated with a higher 90-day mortality compared to normal LV patients (38% vs 29%, p = 0.20). RV dysfunction was associated with a higher 90-day mortality than LV dysfunction (62% vs 38%, p = 0.001). Conclusions: This is one of the first studies to demonstrate that abnormal RV function is associated with an increased mortality in ICU patients with pneumonia. Interestingly, abnormal LV function was not associated with an increased mortality.

Rakuno Gakuen University, Anesthesiology, Hokkaido, Japan
Critical Care 2020, 24(Suppl 1):P097
We previously reported a simple correction method for estimating pleural pressure (Ppl) by using central venous pressure (CVP), and that it can be used to estimate Ppl and transpulmonary pressure in pediatric patients with respiratory failure. However, it remains unknown whether this method can be applied to patients with various levels of chest wall elastance and/or intravascular volume. The objective of this study was to investigate whether our method is accurate under various conditions of chest wall elastance and intravascular volume. The study was approved by the Animal Care and Use Committee of Rakuno Gakuen University. Ten anesthetized and paralyzed pigs (43.2 ± 1.8 kg) were mechanically ventilated and subjected to lung injury by saline lung lavage. Each pig was subjected to 3 different intravascular volumes and 2 different intra-abdominal pressures; in each condition, the accuracy of our method was tested. Specifically, airway flow, airway pressure (Paw), esophageal pressure (Pes), and CVP were recorded in each condition; then changes in Pes (ΔPes) and ΔPpl calculated using a corrected ΔCVP (cΔCVP-derived ΔPpl) were compared. The cΔCVP-derived ΔPpl was calculated as κ × ΔCVP, where κ was the ratio of ΔPaw to ΔCVP during the occlusion test. Means and standard deviations of the two variables that reflect ΔPpl (ΔPes and cΔCVP-derived ΔPpl) in all pigs under all conditions were 6.1 ± 4.1 and 6.4 ± 5.3 cmH2O, respectively. The Bland-Altman analysis for the agreement between ΔPes and cΔCVP-derived ΔPpl showed a bias of -0.3

The activity and functionality of the diaphragm are difficult to measure in patients ventilated in intensive care.
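A minimal sketch of the correction and agreement analysis just described: κ is taken from an occlusion test as ΔPaw/ΔCVP, the corrected estimate is κ × ΔCVP, and its agreement with ΔPes is summarised Bland-Altman style. All numbers are made-up illustrations, not the animal data:

```python
import numpy as np

# Hypothetical paired pressure swings [cmH2O]
d_pes = np.array([4.8, 6.2, 9.5, 3.1, 7.4])    # esophageal pressure swings
d_cvp = np.array([6.0, 8.1, 11.9, 4.2, 9.3])   # raw CVP swings

# kappa from an occlusion test: ratio of airway-pressure to CVP swing (assumed values)
d_paw_occl, d_cvp_occl = 8.0, 10.0
kappa = d_paw_occl / d_cvp_occl

d_ppl_est = kappa * d_cvp                      # cDCVP-derived dPpl

# Bland-Altman agreement between dPes and the corrected estimate
diff = d_ppl_est - d_pes
bias = diff.mean()
sd = diff.std(ddof=1)
lo, hi = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} cmH2O, 95% limits of agreement = {lo:.2f} to {hi:.2f}")
```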
Ultrasound can be a useful tool for monitoring diaphragm muscle activity during different ventilation modes. Few data currently exist on diaphragm muscle activity in critically ill ventilated patients [1]. Our goal was to evaluate the respiratory muscular work of the diaphragm under different respirator settings by means of an ultrasound scan. The ultrasound assessments of the diaphragm were performed with a 10 MHz linear probe at the zone of apposition. We measured the thickening of the diaphragm over the respiratory cycle through the thickening fraction (TF), defined as TF = [(Tdi,max − Tdi,min) / Tdi,min] %, where Tdi,max is the diaphragm thickness at the end of inspiration (maximum thickness) and Tdi,min is the diaphragm thickness at the end of expiration (minimum thickness). Ventilatory support was divided into 4 classes: 1 - spontaneous breathing (SB) or continuous positive airway pressure (CPAP); 2 - pressure support ventilation (PSV) with low pressure support (5-12 cmH2O); 3 - PSV with high pressure support (>12 cmH2O); 4 - controlled mechanical ventilation (CMV). A total of 223 assessments were performed in 70 patients. The evaluations were all possible at the right hemidiaphragm, while on the left they were not possible in 7% of the cases. The median TF (IQ range) of the 4 ventilation classes was respectively: 42% (25-62%) in SB/CPAP; 26% (17-31%) in low PSV; 17% (9-22%) in high PSV; and 5% (2-13%) in CMV. The Kruskal-Wallis test confirmed a significant difference between the groups (p<0.0001). Ultrasound of the diaphragm can be a valid tool for monitoring respiratory muscle activity during mechanical ventilation.

Introduction: Extubation failure is defined as reintubation within 48 hours of extubation in mechanically ventilated critically ill patients. It is associated with morbidity and mortality. The aim of our study was to assess reintubation rates in a busy district general hospital and evaluate the impact of high-flow nasal oxygen therapy (HFNO) on reintubation rates. We performed a retrospective observational study looking at patients admitted to our 7-bedded Level 3 critical care unit (370 patients a year) for a period of 5 years between 1st November 2014 and 31st October 2019. We included patients over 16 years of age who were mechanically ventilated and whose length of stay was greater than 48 hours. Exclusions were age <16 years, tracheostomy, and patients requiring ventilation for <48 hours. Data were collected from Ward Watcher, a SICSAG database, and electronic patient records. Our study failed to show any impact of HFNO on reducing extubation failure. Further work is needed to develop a standardized approach to weaning and to consider routine application of non-invasive ventilation to reduce reintubation rates [1].
Fig. 1 (abstract P097). The Bland-Altman analysis for the agreement between ΔPes and cΔCVP-derived ΔPpl in various conditions. low: low intravascular volume, normal: normal intravascular volume, high: high intravascular volume, abd-: without an abdominal compression band, abd+: with an abdominal compression band

Oral endotracheal intubation is common in critically ill patients in the intensive care unit. Oral care for an intubated patient is important to maintain the moisture of the oral mucosa. Also, the securement method for the oral endotracheal tube has developed from cloth tape to commercial tube holders. A training PowerPoint and video for microteaching were prepared to train 30 ICU nurses to perform the new practice.
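Returning to the diaphragm ultrasound method above: the thickening fraction is a simple ratio of end-inspiratory to end-expiratory thickness. A minimal sketch (the thickness values are illustrative, not study data):

```python
def thickening_fraction(tdi_max_mm: float, tdi_min_mm: float) -> float:
    """Diaphragm thickening fraction in %: (Tdi,max - Tdi,min) / Tdi,min * 100."""
    return (tdi_max_mm - tdi_min_mm) / tdi_min_mm * 100.0

# Hypothetical end-inspiratory and end-expiratory thicknesses [mm]
print(f"TF = {thickening_fraction(2.8, 2.0):.0f}%")   # -> TF = 40%
```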
Demonstration and re-demonstration were arranged to assess the skills of every nurse. Afterwards, each nurse answered a quiz to evaluate their understanding of the OETTH and the special techniques in its application. A questionnaire was also designed to collect feedback from all nurses. The results showed that 21 (70%) of the 30 nurses achieved full marks in the post-quiz, which demonstrated their full understanding of the use of the oral ETT holder and its nursing care. Regarding the feedback, 72% of nurses claimed that they were confident in using the new OETTH in the clinical setting after training. 96% of nurses agreed that the OETTH saved time in the nursing care routine. However, only 56% of nurses agreed that the OETTH is effective in the prevention of oral mucosa injuries, and another 24% of nursing staff disagreed on its function in improving the patient's oral care. In conclusion, some of the nurses did not agree that the new securement method with the OETTH prevents oral mucosa injuries, while some nurses welcomed the new OETTH as easier and more effective for oral care of intubated patients.

Execution of percutaneous dilatational tracheostomy using the standard laryngeal mask airway for ventilation: a prospective survey study
G Gagliardi 1, V Gagliardi 2, C Chiani 3, G Laccania 4, F Michielan 3
1 AULSS 5 - Veneto, Anesthesia and Intensive Care, Adria, Italy; 2 AULSS 5 - Veneto, University of Padua, Adria, Italy; 3 AULSS 5 - Veneto, Anaesthesia and Intensive Care, Adria, Italy; 4 AULSS 6 - Veneto, Anaesthesia and Intensive Care, Padua, Italy
Critical Care 2020, 24(Suppl 1):P101
We carried out a survey study dealing with bronchoscope-guided percutaneous dilatational tracheostomies (PDT), using the classic laryngeal mask airway (LMA) for airway management [1]. The aim was to verify the safety and effectiveness of the aforementioned procedure. Methods: We performed an observational prospective survey study enrolling 150 patients hospitalized in the Intensive Care Unit. Before performing the tracheostomy, the endotracheal tube was replaced by the laryngeal mask airway. Arterial blood gases, ventilation pressures and tidal volumes were monitored, recorded and compared. The median peak inspiratory pressure remained stable in all patients. Furthermore, during ventilation with the laryngeal mask, the difference between inspiratory and expiratory tidal volumes observed before and after bronchoscope positioning showed a statistically significant variation. Finally, in all cases ETCO2, SpO2, PaO2 and blood pH values persisted within the normal range. The standard LMA provides reliable airway management and allows effective ventilation while performing PDT. Once positioned in the supraglottic zone, the LMA does not need to be moved throughout the PDT, avoiding risks of displacement, glottic harm and airway device damage, and permitting easy handling of the bronchoscope, which gives appropriate visualization of the trachea and more efficient aspiration. Owing to the large internal diameter of the LMA tube, Ppeak remained stable in all patients, providing minor resistance and inspiratory work. Finally, no late complications, such as tracheal stenosis or infections, occurred.

Tracheostomies are the most common surgical procedure performed on critically ill patients. Randomized controlled trials comparing tracheostomy timing in intensive care patients have been equivocal.
In order to perform a non-urgent tracheostomy in our ICU, consent is required from the patient or a formal guardian appointed ad hoc by the courts. Since tracheostomies are practically the only elective surgery performed in the critically ill, ICU-requested guardianship almost always indicates a clinical decision to perform a tracheostomy. As appointing a guardian and arranging a tracheostomy takes about a week, the decision to appoint a guardian offers a unique "intention to treat" opportunity to evaluate outcomes in patients for whom tracheostomy is planned. We performed a retrospective analysis over 3 years of patients for whom guardianship was sought, excluding those requiring urgent tracheostomy and those with a do-not-resuscitate order. Patients were divided according to outcome (tracheostomy, extubation or death prior to tracheostomy) and compared. Guardianship was sought for 233 ventilated patients. A decision to withhold tracheostomy was made for 13 patients, who were excluded, leaving 220 patients for analysis. Tracheostomy was performed in 131/220 (60%) patients, 62/220 (28%) were extubated, and 27/220 (12%) died while waiting for tracheostomy (from non-airway-related causes). Tracheostomy was performed on mean ventilation day 16±1. Comparing extubated patients to those who had tracheostomy (Table) shows similar demographics, but significantly lower mortality and hospital length of stay. A significant proportion of patients initially planned for tracheostomy were successfully extubated. Despite demographic similarities, mortality in this group was significantly lower than for patients undergoing tracheostomy. For a selected subgroup of possibly difficult-to-characterize patients, delaying tracheostomy may be beneficial.

(Figure 1). PTIs were analysed by speciality and by outcome. Complications occurred in 6 cases (incidence 6.5%). There were 3 cases of subcutaneous emphysema, 1 pneumothorax (occurring on day 6 post-procedure) and 1 case each of stoma and suture site infection. There was 1 unplanned cannula change within 7 days of insertion. 24% of cases had the cuff inflated on discharge from the ICU. Handover of care was suboptimal; follow-up care plans were documented in 18% of cases. A supervising consultant was present for all PTIs. There was a trend of increased insertion by consultants and increased reliance on theatre, with a corresponding decrease in the number inserted by trainees. PTI in our training ICU appears safe, with a low incidence of complications and good senior support for tracheostomy insertion. Emphasis must continue on training junior intensivists in PTI. Transition of care beyond the ICU requires further work, as there is currently suboptimal handover of care and safety netting for non-ICU colleagues.

Supplemental oxygen administration is ubiquitous in the critical care environment, yet evidence is mounting for the deleterious effects of hyperoxia [1]. Concerns over the adverse effects of hypoxaemia often exceed those of hyperoxaemia in developing-world settings, and inconsistent availability of blood gas monitoring may limit judicious oxygen titration. The aim of this project was to audit oxygen delivery practice and introduce QI measures to avoid excess oxygen delivery in a tertiary ICU in Lusaka, Zambia. A prospective snapshot of ventilatory parameters was recorded for critically ill patients over a 5-week period, including positive end-expiratory pressure (PEEP), FiO2, and time-course SpO2.
Systematic education was provided through group and one-to-one tutorials to empower nursing and medical staff to titrate oxygen safely and appropriately. Repeat data collection was then performed over 4 weeks. Initially, 18/30 patients (60%) were over-oxygenated, as defined by FiO2 >0.5 and SpO2 consistently >95%. 12/18 patients (67%) with an FiO2 >0.5 had PEEP ≤5 cmH2O. No patient had a PaO2 recorded in the past 24 hours. Education was provided, and unit protocols documenting a stepwise approach to titration of PEEP and FiO2 were implemented above all patient beds. Post-intervention, fewer patients were over-oxygenated: 7/21 (33%) had FiO2 >0.5 and SpO2 consistently >95%, and 7/18 (39%) with an FiO2 >0.5 had a PEEP ≤5 cmH2O. In addition, 7/21 (33.3%) had a PaO2 recorded within 24 hours. This QI project has shown that nurse engagement and systematic education to titrate FiO2 and PEEP can be achieved in a resource-poor setting and may decrease the incidence of hyperoxia in critically ill patients. Availability of blood gas monitoring and knowledge of its interpretation were major barriers to oxygen titration. Tracheal intubation (TI) in adult burn patients might be unnecessary in 30 to 40% of cases [1, 2]. In pediatric burn patients, there is little data on both the rate of TI and the rate of early extubation [3]. It has been common practice for a child with a facial burn and/or a suspected airway injury to be intubated early due to the risk of losing airway patency. However, this risk should be weighed against the potential risks of TI and mechanical ventilation in children. Therefore, the aim of this study was to describe the airway status of child burn victims managed in our pediatric burn intensive care unit. Focusing on patients arriving with TI, we investigated the rate of early extubation. In addition, we compared non-intubated patients with those with prolonged TI. This retrospective study described a cohort of 1520 patients hospitalized between 2010 and 2018. Data were retrospectively recorded from the patients' paper clinical charts. The mean age of our patients was 2.8±3.1 years [mean±SD], with an average burn area of 14±11%. 86% had scald burns and 45% had facial burns. 4% of the children were admitted to the burn ICU with TI. In 36% of them, the tracheal tube was removed within the first 48 hours after admission. The probability of prolonged TI increased independently with the burned skin area (BSA) (p<0.0001), the presence of facial burns (p=0.001) and flame burns (p=0.007) (Figure 1). Among patients with more than 70% BSA, 85% were intubated for more than 48h. Among patients with less than 20% BSA, 0.5% were intubated for more than 48h. According to our retrospective data, it seems appropriate to intubate children with 70% or more BSA, while for patients with less than 70% BSA it might be relevant to seek guidance from a physician at the nearest Burn Center. Under 20% BSA, TI seems rarely required. An analysis of the predictive applicability of initial blood gas parameters for the need for intubation and the presence of inhalation injury in patients with suspected inhalation injury C Pirrone 1 , M Chotalia 2 , T Mangham 1 , R Mullhi 1 , K England 1 , T Introduction: We hypothesise that initial blood gas parameters have a good predictive applicability in detecting the need for intubation and the presence of inhalation injury in patients with suspected inhalation injury. To the best of our knowledge, this has not been directly studied in the literature.
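As a minimal illustration of the ROC analysis described in the Methods that follow, the sketch below (Python; not from the study, with invented values) scores a single blood gas parameter, pH, against the need for intubation. Because pH correlates inversely with intubation, it is negated before computing the AUC.

import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical admission pH values and outcomes (1 = intubated >48 h)
ph = np.array([7.10, 7.22, 7.31, 7.35, 7.38, 7.41, 7.19, 7.28, 7.36, 7.44])
intubated = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 0])
# pH shows an inverse correlation with intubation, so negate it as the score
auc = roc_auc_score(intubated, -ph)
print(f"AUC for pH (inverse): {auc:.2f}")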
Patients with suspected inhalation injury admitted to the ICU at Queen Elizabeth Hospital, Birmingham between April 2016 and May 2019 were included. The initial blood gas parameters analysed were PaO2 (kPa), PaCO2 (kPa), pH, carbon monoxide level (COHb; %) and PaO2/FiO2 (PF) ratio. Receiver operating characteristic (ROC) curves for these parameters were plotted against the need for intubation for more than 48 hours and the presence of inhalation injury as detected by bronchoscopy and laryngoscopy. The area under the curve (AUC) for each parameter was calculated. Results: 85 patients were admitted with suspected inhalation injury to the ICU. 68% were intubated for more than 48 hours. Of patients who were intubated, 69% had inhalation injury as indicated by bronchoscopy or laryngoscopy. Table 1 outlines the AUC for initial blood gas parameters in detecting the need for intubation for more than 48 hours and the presence of inhalation injury. pH was the parameter with the most prominent AUC, with an inverse correlation indicating fair accuracy. No clear inflection point was identified, although all patients with pH <7.25 required intubation and had inhalation injury. PaCO2 had fair predictive applicability in detecting the need for intubation. PF ratio, PaO2 and COHb had poor accuracy. Conclusions: Initial blood gas parameters had broadly poor predictive applicability for the need for intubation and the presence of inhalation injury in patients with suspected inhalation injury. Severe acidosis (pH <7.25) was the most useful blood gas parameter. Clinicians should be cautious in using blood gas parameters alone to inform intubation decisions. Lung cancer surgery is associated with a high rate of pulmonary complications, including ARDS, and mandates lung-protective ventilation strategies [1, 2]. Such strategies include non-intubated video-assisted thoracic surgery (NiVATS) with spontaneous breathing [3]. Currently, neither data on ventilator settings nor on gas exchange have been reported for this approach. Such data constitute a prerequisite for meaningfully evaluating the respiratory consequences of non-intubated spontaneous breathing during lung cancer surgery. The aim of this case series was to provide, for the first time, such data from lung cancer surgery, including pneumonectomy. During a 12-month period, 32 patients without contraindications [3] scheduled for video-assisted thoracic surgery (VATS) for non-anatomical and anatomical lung resection, including one pneumonectomy (Px), were offered non-intubated spontaneous breathing. All patients gave informed written consent to the procedure as well as to the analysis and publication of data. Anaesthetic management included target-controlled infusion of propofol and remifentanil, a laryngeal mask airway, and pressure-support ventilation. We present early data showing that trials of cuff deflation within 48 hours of tracheostomy insertion can be achieved using a standardized protocol. Its impact on length of stay, duration of ventilation and patient-centered outcomes needs to be investigated in larger multi-centre trials. Preventing underinflation of the endotracheal tube cuff with a portable elastomeric device.
A randomized controlled study JE Dauvergne 1 , AL Geffray 2 , K Asehnoune 2 , B Rozec 1 , K Lakhal 1 1 Hopital Laënnec -CHU de Nantes, Service d´anesthésie-réanimation, Nantes, France; 2 Hotel-Dieu -CHU de Nantes, Service d´anesthésie-réanimation, Nantes, France Critical Care 2020, 24(Suppl 1):P112 Management of the endotracheal tube cuff pressure (Pcuff) is routine practice for critical care nursing staff. Underinflation can lead to ventilator-associated pneumonia [1], whereas overinflation exposes the patient to tracheal damage [2]. Checking and adjusting Pcuff several times daily is recommended to ensure that it lies between 20 and 30 cmH2O [3]. Some devices exist to automate this task but may be inconvenient, bulky and/or ineffective, and their use is not supported by guidelines. A portable elastomeric device could be appealing for automated Pcuff regulation. This prospective randomized controlled study tested whether the Tracoe Smart Cuff Manager™ reduced the rate of patients undergoing ≥1 episode of underinflation (Pcuff <20 cmH2O), as compared with routine manual Pcuff adjustment. This was a monocentric, randomized controlled study. Patients with acute brain injury receiving mechanical ventilation were prospectively allocated to one of two arms: manual reading and adjustment of Pcuff at least every 8h (routine care) or the addition of the Smart Cuff Manager™ (intervention). This study was approved by an institutional review board. Among 60 randomized patients (routine care in 32, Smart Cuff Manager™ in 28), 506 measurements were performed over 48h. With routine care, a higher rate of patients experienced at least one episode of underinflation (62.5 vs. 17.8%; p<0.001). Episodes of underinflation (15% vs. 2%; p<0.001) and manual adjustments (77% vs. 56%; p<0.001) were more frequent with routine care. For overinflation, there was no between-arm difference (p>0.99). The addition of continuous Pcuff control with the Tracoe Smart Cuff Manager™ reduced the incidence of Pcuff underinflation as compared with manual intermittent adjustments. Overinflation was not promoted by this device. Direct laryngoscopy as a technique for tracheal intubation is a potentially lifesaving procedure that healthcare professionals in a variety of fields are taught. However, this skill is challenging to acquire and difficult to maintain, and poorly performed intubation technique can lead to potentially serious complications [1]. The Intersurgical iView video laryngoscope is a new intubation tool which may have advantages over direct laryngoscopes, such as the Macintosh, in the hands of novice personnel. A prospective randomized counterbalanced trial of 30 medical students without previous airway management experience was conducted. Each student received brief didactic teaching; following this, participants were directly supervised performing laryngoscopy and intubation using the Macintosh and iView devices in an alternating pattern. Students were permitted up to three attempts to intubate successfully under four conditions: three laryngoscopy conditions using a Laerdal Intubation Trainer and one using a Laerdal SimMan manikin. There was no significant difference in the success rate of intubation or time to intubation between the two devices. The iView outperformed the Macintosh in time to intubation in the normal airway in the final scenario, once students had gained experience with both devices.
No significant difference was found in the number of optimisation manoeuvres or intubation attempts between groups. Areas where the iView outperformed the Macintosh included severity of dental trauma and participants' perception of the ease of use of the device. The iView may prove to be a useful teaching tool for novice personnel who are acquiring the skills of tracheal intubation. Patients with a primary pulmonary pathology were more likely to respond to APRV. This association has not been described before and warrants further multi-centre exploration in a larger patient group. Introduction: Airway suctioning is common during mechanical ventilation, using either open endotracheal suctioning or closed endotracheal suctioning (CES). Closed circuits were developed to prevent the arterial desaturation and atelectasis associated with ventilator disconnection. However, CES may cause substantial loss of lung volume. The purpose of this study was to investigate the effects of a compensation method to prevent the loss of aeration during CES. The suctioning technique was performed for 15 seconds, with negative pressure limited to 150 mmHg. 14 Fr closed suction catheters (Halyard Health, Georgia, USA) were used. Electrical impedance tomography (EIT) monitoring and arterial blood gases were collected. A Nihon Kohden mechanical ventilator (NKV550, California, USA) was used, featuring a newly developed suctioning algorithm that compensates for pressure loss during suctioning (Inline Suction APP). When activated, the APP delivers PCV ventilation, adding 2 cmH2O of end-expiratory pressure above PEEP and delivering driving pressures of 15 cmH2O. Results: Pigs (30±5.4 kg) with injured lungs were mechanically ventilated. We tested the suctioning procedures using low PEEP (5 cmH2O) or high PEEP (±12.3 cmH2O). Compliance fell with suctioning when the APP was OFF, whereas maintenance of compliance was observed when the APP was ON (from 12.2±1.4 ml/cmH2O to 12.5±4.5 ml/cmH2O). Blood gases in a representative animal showed a drop in PaO2 when the APP was off (from 247 mmHg to 149 mmHg after 2 min, and 176 mmHg after 10 min) (Figure 1). With the APP ON, the PaO2 changed from 259 (pre-suction) to 223 (2 min) and 253 mmHg (10 min). The new NK software, delivering PCV ventilation during suctioning, could prevent the atelectasis and functional loss associated with the procedure. Tyrosine kinase inhibitor: an effective tool against lung cancer involvement responsible for acute respiratory failure in ICU Y Tandjaoui-lambiotte 1 Patients with advanced-stage non-small-cell lung cancer have high mortality rates in the intensive care unit (ICU). In the last two decades, targeted therapies have changed the prognosis of patients with lung cancer outside the ICU. The rapid efficacy of targeted therapies has led some intensivists to use them as rescue therapy for ICU patients. We performed a national multicentric retrospective study with the participation of the GRRROH (Groupe de Recherche en Réanimation Respiratoire en Onco-Hématologie). All patients with non-small-cell lung cancer admitted to the ICU for acute respiratory failure between 2009 and 2019 were included in the study if a tyrosine kinase inhibitor was initiated during the ICU stay. Cases were identified using hospital pharmacy records. The primary outcome was overall survival 90 days after ICU admission. Results: Thirty patients (age 60±14 years) admitted to a total of 14 ICUs throughout France were included. Seventeen patients (59%) were non-smokers.
Adenocarcinoma was the most frequent histological type (n=21, 70%). Most patients had metastatic cancer (n=21, 70%). Epidermal Growth Factor Receptor mutation was the most common oncogenic driver identified (n=16, 53%). During the ICU stay, 17 (57%) patients required invasive mechanical ventilation, 13 (43%) catecholamine infusion, 3 (10%) renal replacement therapy and one (3%) extracorporeal membrane oxygenation. Eighteen patients (60%) were discharged alive from the ICU and 11 (37%) were still alive after 90 days (see Figure). Moreover, 6 patients (20%) were alive one year after ICU discharge. Despite the small sample size, this study showed that, in the context of lung cancer involvement responsible for acute respiratory failure, tyrosine kinase inhibitors should not be withheld from severely ill ICU patients. Burned patients are among the most complex patients, with very high mortality. Those with inhalation injury have a worse prognosis, typically associated with respiratory complications. The aim of our study was to evaluate the mortality of burn patients with inhalation injury in a critical burns unit. A prospective, observational and descriptive study was conducted over a period of 3 years. Inhalation injury was defined by the presence of at least two of the following criteria: history of injury in an enclosed space, facial burns with singed nasal hair, carbonaceous sputum, and stridor. In intubated patients it was diagnosed by bronchoscopy. Demographic data, TBSA, ABSI, Baux score, APACHE II, SOFA, mechanical ventilation (MV), complications, length of stay, hospital course and mortality data were collected. Results: 362 burn patients were admitted, of whom 24% (84 patients) had inhalation injury. Mortality among patients with inhalation injury was 28.6% (24 patients). Most patients were men, and those who died were older and had higher severity scores (Fig. 1). We found no significant differences between groups in the need for MV (95% vs. 85%) or in the percentage of tracheostomies performed (33.3% vs. 28.3%). However, patients who died had more respiratory complications such as ARDS, as well as shock, renal failure and need for renal replacement therapy, although infectious complications were similar in both groups. There was no statistically significant difference between groups in the volume used during initial resuscitation. Patients with inhalation injury who died had higher severity scores from the beginning. Although there were no differences in the need for MV, patients who died had more respiratory complications as well as shock, renal failure and need for RRT, but not more infectious complications. The volume used during initial resuscitation, which has always been related to prognosis, was similar in both groups. Further studies are needed to see whether this greater initial severity corresponds to the degree of inhalation. Aerogen, Medical Affairs, Galway, Ireland; 2 Aerogen, Science, Galway, Ireland Critical Care 2020, 24(Suppl 1):P120 Patients with acute exacerbations of conditions such as asthma are prescribed aerosol therapy from presentation in the Emergency Department through to the Intensive Care Unit. However, the variability in dose delivery to the lung across the possible patient interventions is not well characterized. Here, we assess the predicted lung dose of a bronchodilator in a simulated spontaneously breathing adult patient via both facemask and nasal cannula, and via tracheostomy during mechanical ventilation.
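It may help to sketch the quantification step described in the Methods that follow: interpolating UV absorbance at 276 nm on a salbutamol standard curve to estimate the drug mass captured on the tracheal filter. The sketch (Python) is illustrative only; all concentrations, absorbances and the eluate volume are hypothetical, not the study's data.

import numpy as np

std_conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])  # standards, ug/mL
std_abs = np.array([0.00, 0.11, 0.22, 0.45, 0.89])    # measured absorbance
slope, intercept = np.polyfit(std_abs, std_conc, 1)   # linear standard curve

sample_abs = 0.37                      # absorbance of the filter eluate
eluate_volume_ml = 10.0                # hypothetical wash volume
conc = slope * sample_abs + intercept  # ug/mL read off the standard curve
mass_ug = conc * eluate_volume_ml      # total captured drug mass
print(f"Estimated captured dose: {mass_ug:.0f} ug")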
A standard dose of 2.5 mg in 2.5 mL salbutamol was aerosolized using the Aerogen Solo nebulizer (Aerogen, Ireland). For facemask testing, the nebulizer was used in combination with the Aerogen Ultra with 2 LPM supplemental oxygen flow. For nasal cannula testing, the nebulizer was used in combination with the Airvo 2 system (Fisher and Paykel, NZ) at gas flow rates of both 10 and 50 LPM. Tracheostomy-mediated ventilation was assessed in combination with an HME, with the nebulizer placed between the HME and the tracheostomy tube. International Standard ISO 27427 adult breath settings (Vt 500 mL, BPM 15, I:E 1:1) were used across all tests and generated using a breathing simulator (ASL5000, Ingmar Medical, USA) or mechanical ventilator (Servo-U, Maquet, Sweden). The dose delivered to the lung was assessed using a capture filter at the level of the trachea, with drug mass determined using UV spectrophotometry at 276 nm and interpolation on a standard curve. The results of testing are illustrated in Figure 1. The bronchodilator dose delivered to the simulated patient was relatively consistent between progressive interventions, except during high-flow therapy, with the more clinically relevant 50 LPM gas flow rate having a profound effect on the dose. These results may go some way towards explaining how different patient interventions can affect aerosol dose. Mechanical ventilation (MV) has been identified as an independent factor indicating a worse prognosis for lung cancer patients [1]. This study was conducted to assess the results of noninvasive mechanical ventilation (NIV) and/or invasive mechanical ventilation (IMV) in lung cancer patients admitted to the ICU with acute respiratory failure (ARF). Lung cancer patients with respiratory failure who were admitted to the ICU between January 2017 and December 2018 were evaluated retrospectively. Results: 93 patients were included in the study. The mortality rate was 18.3%. 83 patients had NIV; IMV was applied to 10 patients. In the first 24 hours, 39 of the 83 patients who were initially treated with NIV were switched to IMV. The duration of hospital stay, diagnosis of pneumonia and mortality rate were significantly lower in patients treated with NIV alone (p≤0.001, p=0.004, p=0.025), but the Glasgow Coma Scale (GCS) score was significantly higher in this group (p≤0.001). The mortality rate was similar between the patients who were initially treated with IMV and those who were switched to IMV in the first 24 hours. The Charlson Comorbidity Index (CCI) and MV duration were significantly higher in patients who died (p=0.01, p=0.021), but GCS was significantly lower in this group (p=0.032). In the regression model for the likelihood of mortality, CCI≥9 and unsuccessful NIV increased the mortality rate by 3.4 (1.1-10.5) and 5.2 times (12-23.6), respectively (p=0.036, p=0.032). NIV was an effective modality for respiratory support in most lung cancer patients presenting with ARF. However, failed NIV seems to be a factor for increased mortality. Therefore, the choice of respiratory support modality in this patient group should be decided by considering the GCS, CCI and etiology of ARF. The interaction between ventilator settings and the occurrence of acute kidney injury is not fully elucidated.
This study aimed to investigate the effect of a stepwise increase in PEEP level on the risk of acute kidney injury as evaluated with the renal resistivity index (RRI). The primary outcome was to investigate whether increased levels of PEEP lead to increased RRI and whether RRI can predict the occurrence of AKI. Methods: Patients mechanically ventilated for at least 48 hours and without AKI at admission were included in the study. RRI was calculated at ICU admission. A posterolateral approach was used for kidney ultrasound. The peak systolic velocity (Vmax) and the minimal diastolic velocity (Vmin) were determined by pulse-wave Doppler, and the RRI was calculated as (Vmax - Vmin)/Vmax. The examination was performed at PEEP levels of 5, 10 and 15 cmH2O, applied in random order for 15 minutes each. Occurrence of AKI was defined within 7 days according to KDIGO criteria. Sixty-four patients were enrolled in the study, and the incidence of AKI was 14/64 (22%). Demographic and clinical characteristics are reported in Table 1. Increasing PEEP produced a significant increase in RRI from PEEP 5 to PEEP 10 (p<0.001) and from PEEP 10 to PEEP 15 (p=0.001) (Figure 1). The area under the ROC curve of RRI to predict AKI was 0.845 at PEEP 5, 0.898 at PEEP 10 and 0.894 at PEEP 15 (all p<0.001). Youden index analysis showed an RRI >0.70 to be the best cutoff for AKI, with a sensitivity of 65% and a specificity of 96%. Patients with RRI >0.70 numbered 11/64 (17%), 13/64 (20%) and 22/64 (34%) at PEEP 5, PEEP 10 and PEEP 15, respectively. Patients ventilated with a PEEP value associated with RRI >0.70 had a higher incidence of AKI (11/14 vs 6/50, p<0.001). The application of PEEP can increase intrarenal vascular resistance, which is associated with the occurrence of AKI; the PEEP level should therefore be balanced taking the RRI into account. The RRI seems able to predict the occurrence of AKI in mechanically ventilated patients. Alveolar and respiratory mechanics modifications produced by different concentrations of oxygen in healthy rats subjected to mechanical ventilation with a protective ventilatory strategy D Dominguez Garcia 1 , R Hernandez Bisshopp 1 , JL Martin Barrasa 2 , D Viera Camacho 1 , A Rodriguez Gil 1 , J Arias Marzan 1 , S Garcia Hernandez 3 High oxygen concentrations can damage tissues [1]. In this study, we analyze the histological and pulmonary mechanics modifications that can occur when applying different inspiratory oxygen fractions (FiO2) to the lungs of healthy rats during protective mechanical ventilation. We used Sprague-Dawley rats. Four groups of 6 animals each were established (FiO2 0.21, 0.4, 0.6 and 1); tidal volume (6 ml/kg), PEEP (3 cmH2O) and respiratory rate (90 rpm) were kept constant, with only the FiO2 changing between groups. After 4 hours, the lungs were removed for histological study and determination of the wet/dry index. The histological modifications studied were: alveolar septa (AS), alveolar hemorrhages (AH), intra-alveolar fibrin (IF) and inflammatory infiltrates (II). Each parameter was rated from 0 to 3 [2]. Peak pressure (Pp) and pulmonary compliance were monitored every 60 minutes. Appropriate statistical tests were used to analyze the data. Results: Damage to the AS, AH, IF, II and the global histological pattern was greater in the groups with the highest FiO2 (p<0.00001) (Figure 1). The wet/dry index rose significantly as the oxygen concentration increased (p=0.001).
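Returning to the PEEP/RRI study above, the index and its reported cutoff can be made concrete with one worked example (the Doppler velocities here are hypothetical):

\[
\mathrm{RRI} = \frac{V_{\max} - V_{\min}}{V_{\max}}, \qquad
\text{e.g. } \frac{(80 - 20)\ \mathrm{cm/s}}{80\ \mathrm{cm/s}} = 0.75,
\]

a value above the reported Youden-optimal cutoff of 0.70, which would therefore flag a higher risk of AKI.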
In the groups receiving an FiO2 of 0.6 and 1, Pp changed significantly with respect to baseline from the first 60 minutes, an effect that was not seen in the other groups (p<0.0001). Regarding pulmonary compliance, in the FiO2 0.6 and 1 groups it decreased from the first 60 minutes, differing from the other groups (p<0.0001). Conclusions: Mechanical ventilation applied for 4 hours in healthy animals produces disorders that are more pronounced as the oxygen concentration increases. An FiO2 of 0.6 or greater should be avoided without clinical justification. Introduction: Patients requiring prolonged acute mechanical ventilation (PAMV, defined as 4+ days on MV) are sicker and incur disproportionate morbidity and costs relative to patients on short-term MV (STMV, <4 days of MV). We quantified specific clinical outcomes among patients requiring PAMV vs. STMV in a contemporary database. We conducted a multicenter retrospective cohort study within ~700 hospitals in the Premier database, 2014-2018. Using ICD-9-CM and ICD-10 codes, we identified PAMV and STMV patients and compared their baseline characteristics and hospital events. Because of the large sample size, we omitted hypothesis testing. A total of 691,961 patients met the enrollment criteria, of whom 266,374 (38.5%) received PAMV. At baseline, patients on PAMV were similar to those on STMV with regard to age (years: 62.0±15.8 PAMV vs. 61.7±17.2 STMV), gender (males: 55.6% PAMV vs. 53.9% STMV) and race (white: 69.1% PAMV vs. 72.4% STMV). The PAMV group had a higher comorbidity burden than the STMV group (mean Charlson score 3.5±2.7 vs. 3.1±2.7). The prevalence of each of the indicators of acute illness severity (vasopressors 50.3% vs. 36.9%, dialysis 19.4% vs. 10.3%, severe sepsis 20.3% vs. 10.3%, and septic shock 33.5% vs. 15.9%) was higher in PAMV than STMV, as were hospital mortality and combined mortality or discharge to hospice (Figure 1), extubation failure (12.3% vs. 6.1%), tracheostomy (21.6% vs. 4.5%), development of C. difficile infection (4.5% vs. 1.7%), and the incidence density of ventilator-associated pneumonia (2.4/1,000 patient-days vs. 0.6/1,000 patient-days). Conclusions: Over one third of all hospitalized patients on MV require it for 4 days or longer. PAMV patients exhibit a higher burden of both chronic and acute illness than those on STMV. Commensurately, all clinical outcomes examined are substantially worse in association with PAMV than STMV. Identifying the readiness of patients recovering from critical illness for liberation from invasive mechanical ventilation (IMV) is not always straightforward [1]. The Scottish Intensive Care Society (SICS) trainee audit 2018 conducted a Scotland-wide study to understand current practices relating to liberation from IMV. Data were prospectively collected on patient demographics, indication for intubation, spontaneous breathing trial (SBT) practices, physiological markers, ICU outcome and ICU LOS. All patients >18 years ventilated with IMV for >24 hrs from 1 Nov 2018 to 30 Nov 2018 were eligible for inclusion. Exclusion criteria included extubation for end-of-life care, death whilst intubated and presence of a tracheostomy. Logistic regression was performed to detect factors associated with extubation failure (EF), as sketched below. Results were analysed in Excel 2010 and Stata v.14.1. Patient Benefit and Privacy Panel approval was granted.
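The regression step named in the Methods above can be sketched as follows (Python; not the audit's actual analysis, which was run in Stata). The predictors mirror the two factors reported in the Results, but the data and effect sizes are simulated.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 172  # cohort size reported in the Results
df = pd.DataFrame({
    "no_sbt": rng.integers(0, 2, n),      # 1 = no SBT before extubation
    "vent_lt_3d": rng.integers(0, 2, n),  # 1 = ventilated < 3 days
})
# Simulate extubation failure with assumed (hypothetical) effect sizes
logit = -2.2 + 0.9 * df["no_sbt"] + 1.2 * df["vent_lt_3d"]
df["ef"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = sm.Logit(df["ef"], sm.add_constant(df[["no_sbt", "vent_lt_3d"]])).fit(disp=0)
print(np.exp(model.params))       # odds ratios
print(np.exp(model.conf_int()))   # 95% confidence intervals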
A total of 172 patients were included: 108 (63%) male, with a median APACHE II score of 19 (IQR 13-23). EF at the first attempt occurred on 27 occasions (15.7%), with a median ICU LOS of 10 days (IQR 7-12) and a mortality rate of 22.2%. The cohort successfully extubated at the first attempt had a median ICU length of stay of 5 days (IQR 3-9) and a mortality rate of 1.4%. Methods of SBT and extubation outcomes are detailed in Table 1. No SBT prior to extubation carried higher odds of EF (OR 2.52, CI 1.09-5.84, p=0.03); ventilation for <3 days carried three times higher odds of EF (OR 3.31, CI 1.09-10.1, p=0.03). Both were independently associated with EF on multivariate analysis. Conclusions: We found a reintubation rate of 15.7% in Scottish ICUs. The type of SBT most commonly used diverges from the methods advocated in the literature. The lack of an SBT and an early extubation attempt were associated with failure, which in turn was associated with longer ICU LOS and higher mortality. In patients undergoing prolonged invasive ventilation, we hypothesise that abnormal right ventricular (RV) and left ventricular (LV) function are associated with increased 90-day mortality. Whether changes in LV or RV function could aid the prognostication of these patients has not been directly studied. Patients admitted to the Queen Elizabeth Hospital Birmingham ICU between April 2016 and July 2019 who were intubated and ventilated for more than 7 days and had a formal transthoracic echocardiogram (TTE) whilst in ICU were included. Abnormal RV function was defined by the presence of depressed function, dilated size or moderate-to-severe risk of pulmonary hypertension. Abnormal LV function was defined by the presence of LV depression (LV ejection fraction ≤45% or grade II or greater diastolic dysfunction) or a hyperdynamic LV (formally mentioned in the TTE report). Patients who had a neurological cause for prolonged ventilation were excluded. The primary outcome was 90-day mortality. Categorical data are presented as % and analysed using a chi-squared test. Continuous data are presented as median (IQR). Results: 871 patients required prolonged ventilation, of whom 350 (40%) had a TTE. Patients were aged 62 (49-75), 61% were male, and 90-day mortality was 36%. The median number of ventilator days was 13 (6-20), and 77% required a tracheostomy. Abnormal RV function was present in 26% (n=90) and was associated with increased 90-day mortality compared to normal RV function (68% vs. 25%, RR 2.71 [2.10-3.50], p<0.0001). LV function was abnormal in 27% (n=95) and was associated with increased 90-day mortality compared to normal LV function (54% vs 28%, RR 1.91 [1.47-2.49], p<0.0001). Abnormal RV function showed a trend towards increased mortality compared to abnormal LV function (68% vs 54%, RR 1.26 [1.00-1.60], p=0.07). In this study, abnormal RV and LV function were each present in about a quarter of patients undergoing prolonged ventilation and were associated with increased mortality. Introduction: Tidal volume delivered by mechanical ventilation (MV) in sedated patients is distributed preferentially to ventral alveoli, causing overdistention there and associated collapse in dorsal alveoli, driving volutrauma, atelectrauma and ventilator-induced lung injury [1]. Temporary transvenous diaphragm neurostimulation (TTDN) stimulates diaphragm contraction [2]. When used in synchrony with MV, TTDN encourages increased dorsal ventilation due to the change in pressure gradients with diaphragm contraction, mimicking a more normal physiological pattern.
This may improve gas exchange and reduce injury. A pilot study was conducted using 50 kg pigs undergoing MV in a mock ICU. Deeply sedated subjects were provided lung-protective volume-control ventilation at 8 ml/kg. TTDN diaphragm contractions were delivered in synchrony with inspiration on every second breath, reducing the ventilator pressure-time product by 15-20% during MV+TTDN breaths. Tidal volume distribution was recorded in each condition using electrical impedance tomography and compared to never-ventilated, spontaneously breathing subjects (NV). Results: Dorsal ventilation changed from 49% during MV breaths to 54% during MV+TTDN breaths, compared to 60% in the NV group (p=0.035). Ventral ventilation changed from 51% during MV breaths to 46% during MV+TTDN breaths, compared to 40% in the NV group (p=0.042, Figure 1). Conclusions: TTDN diaphragm contraction used as an adjunct to MV yields a more physiological pattern of volume distribution. This translates into less overdistension in the ventral areas and less atelectrauma in the dorsal areas, and reduces ventilator-induced lung injury. Introduction: By measuring the esophageal pressure (Pes) and its derivatives, we can assess the relationship between diaphragmatic excursion and the oscillation of the esophageal pressure curve, the Pswing (PS); we therefore infer that, as with the Pes, its variations might be related to weaning failure [1, 2]. However, no reference value exists in the literature to predict the test result. Patients who met the inclusion criteria started the weaning process with a 30-minute spontaneous breathing trial on a T-tube (TT); the respiratory rate (RR) and tidal volume (TV) were also recorded. From this analysis, an average PS (APS) was determined for each phase of the test (APS1, initial; APS2, final), and a quotient relating the changes in these variables was calculated (DTV/DPS x 100). A total of 13 patients were included (n=13). Regarding the evolution during the TT, 9 (69%) were successful, while 4 (30.76%) failed. Analyzing the quotient relating TV and PS ((DTV/DPS)/100) in patients who succeeded and those who failed, successful patients presented a value of 18.75 while the failure group presented a value of 45.83 (OR 1.2-3, p=0.082) (Table 1). When the relationship between TV and PS is expressed through the quotient (DTV/DPS)/100, a tendency toward a higher quotient is observed among patients who failed versus those who did not. The process of weaning from mechanical ventilation imposes an additional workload on the cardiovascular system, which may result in impaired myocardial function, an increase in left ventricular filling pressure and respiratory distress. Among surgical patients, those undergoing heart surgery are particularly susceptible to cardiac dysfunction induced by weaning because of inadequate cardiovascular reserve. The aim of our study was to depict the pathophysiological changes assessed by echocardiography during the steps of weaning and to identify possible predictors of weaning failure (WF). We enrolled 34 consecutive patients undergoing isolated coronary artery bypass grafting in our institution.
Data were obtained by intraoperative transesophageal echocardiography before sternotomy (T0) and by transthoracic echocardiography at the beginning of weaning (T1) and at the time of extubation (T2). WF was defined as deferral of planned extubation, or respiratory failure needing reintubation or non-invasive mechanical ventilation within 48 hours. Results: WF occurred in 7 patients (20.6%) and involved manifestations of respiratory distress in 5 (14.7%). We found a significant association between the left ventricular outflow tract velocity-time integral (LVOT-VTI) and ventricular-arterial coupling measured at T1 and WF, with LVOT-VTI emerging as the best predictor of WF, with an area under the ROC curve of 0.8669 (Figure 1); an optimal cutoff value of 15 cm provided 100% sensitivity and 71% specificity. A significant increase in E/e' measured at T2 (13.44 vs 9.96, p=0.02) suggested a cardiac etiology of respiratory distress in patients who failed the weaning trial. Our study showed that serial assessment of hemodynamic parameters by means of echocardiography is feasible in cardiac surgical patients and can provide insight into pathophysiological changes during weaning. Although these preliminary data need to be confirmed in a larger population sample, LVOT-VTI emerged as a promising predictor of subsequent WF. Compliance with guidelines for respiratory therapy in preclinical emergency medicine G Jansen, N Kappelhoff, S Rehberg Protestant Hospital of the Bethel Foundation, Anaesthesiology, Intensive Care and Emergency Medicine, Bielefeld, Germany Critical Care 2020, 24(Suppl 1):P131 Introduction: Current guidelines on pre-hospital emergency ventilation are based on the guidelines for lung-protective ventilation in the intensive care unit. The present survey was designed to determine the accordance of actual pre-hospital emergency ventilation by German emergency physicians (GEP) with these recommendations. Recommendations include a respiratory rate (RR) between 10-16/min, a tidal volume (Vt) between 6-8 ml/kg, a maximum pressure (Pmax) <30 mbar and a positive end-expiratory pressure (PEEP) of 5 mbar. An anonymous web-based questionnaire encompassing 7 questions was sent to GEP from September to December 2018. GEP were asked to specify their level of education, their preferred ventilation settings and the parameters usually chosen to guide mechanical ventilation. Statistical analysis was performed using the chi-squared test with a significance level of ≤0.05. 60% of the questionnaires were completed (159/261). 25% of the participants were trainees (Tr), 75% consultants (Co). As target parameters for the guidance of ventilation, 87% of Tr and 91% of Co use capnometry. 62% of Tr and 54% of Co adjusted Vt on the basis of body weight. 81% of Tr and 81% of Co reported controlling oxygenation using SpO2. Table 1 shows our analysis of the given answers. There were no statistically significant differences between the groups. Deviations from the guidelines for pre-hospital emergency ventilation settings are common and mainly concern the use of a guideline-compliant PEEP. In addition, recommended target parameters for the guidance of ventilation were not applied by a significant proportion of GEP. Prospective observational study including LTx recipients admitted to our ICU from February 2017 to January 2019, who underwent a spontaneous breathing trial (SBT) using a T-piece for 30 minutes.
Clinical variables and arterial blood gas samples were recorded before starting the SBT and after 20 minutes on the T-piece. Diaphragmatic excursion (DE) and thickening fraction (DTF) were also assessed using ultrasound (US) after 20 minutes on the T-piece. US-assessed diaphragmatic dysfunction (US-DD) was defined as DE <10 mm or DTF <0.3 in at least one hemidiaphragm. Patients who successfully completed an SBT, defined according to clinical criteria, were extubated. Extubation failure was defined as the need for reintubation within 48h. Results are expressed as medians (IQR) or frequencies (%). 193 LTx recipients were admitted to the ICU, 79 of whom underwent an SBT. 51 were male, and the median age was 58 years. The main indications for LTx were interstitial lung disease (43.0%), COPD and cystic fibrosis. 59 were bilateral LTx, and 13 and 7 were left and right unilateral LTx, respectively. 69 patients were extubated after the SBT and 6 required reintubation within 48h. 53 presented US-DD, though there were no differences between patients who succeeded and those needing reintubation. In contrast, patients who succeeded showed a higher PaO2/FiO2 after 20 minutes on the T-piece (Table 1). Similarly, greater reductions in PaO2/FiO2 (delta PaO2/FiO2) after 20 minutes on the T-piece were observed in patients who failed. Oxygenation after an SBT performed using a T-piece may predict extubation failure in LTx recipients with a successful SBT. US-DD was not associated with the need for reintubation. Descriptive study about the relationship between self-extubation episodes and patient-ventilator interaction S Nogales 1 , Introduction: To evaluate the relationship between self-extubation and patient-ventilator interaction, among other physiological variables, in order to predict and prevent these events. Self-extubation (SE) episodes are quality indicators in patients under invasive mechanical ventilation (IMV) and are related to mortality [1]. This was a planned secondary analysis of a prospective database of clinical and physiologic signals from patients receiving IMV. We included SE episodes (2012-2018) with continuous recording of ventilator and monitor signals (BCLink BetterCare®). We analysed demographic data, physiological parameters (peripheral oxygen saturation SpO2, heart rate HR, respiratory rate RR and mean arterial pressure MAP) and patient-ventilator interaction (asynchrony index AI, ineffective efforts during expiration IEE and double cycling DC). We studied the 12-hour period prior to each SE episode. We used the Wilcoxon non-parametric test and, for a proper analysis, a linear mixed-effects model. We included 21 episodes of SE: mean age 62±13 years, 76% men, APACHE II at admission 17±10, 4.6±3.8 days under IMV until the episode, reintubation rate 47.6%, ICU stay 20.9±17.6 days, ICU mortality 14%. At the time of the SE, 65% were under sedation, 65% had physical restraints and 67% were in the weaning phase. We observed a trend toward increases in SpO2, RR, HR, MAP and asynchronies in the 2-hour period prior to the SE episode. We compared these variables with those from the preceding 2-hour period and observed a statistically significant difference. The data presented in this study show that our results are in accordance with the literature, with favorable mortality and early postoperative complication rates, and support this procedure as an excellent alternative to surgery in elderly patients. It is reported that patients with pulmonary hypertension (PH; systolic pulmonary arterial pressure (sPAP) ≥35 mmHg) have frequent cardiac complications after transcatheter aortic valve implantation (TAVI).
PH often worsens in some patients despite normal cardiac function after TAVI. No studies have examined prognosis after TAVI in patients with or without worsening of PH. Therefore, we retrospectively examined the frequency of mid- to long-term heart failure and cardiac death in patients with and without deterioration of PH after TAVI. Among 113 patients who underwent TAVI at our hospital between February 2014 and March 2016, we analysed 27 patients with PH (sPAP ≥35 mmHg) before surgery. sPAP was measured by transthoracic echocardiography before and within 1 week after TAVI. Patients were divided into two groups according to whether sPAP worsened/did not change or improved after TAVI. We examined the frequency of admission due to heart failure or cardiac death (death caused by heart failure, angina, or myocardial infarction) during the 3 years after TAVI. PH worsened or did not change after TAVI in 9 patients, while it improved in 18 patients. The left ventricular ejection fraction measured within 1 week after TAVI showed no difference between the two groups (56.6±11.9% vs 58.4±10.0%, p=0.71). The worsened/no change group had a higher frequency of admission due to heart failure (log-rank p<0.05) and cardiac death (log-rank p<0.04). Despite successful treatment of AS by TAVI, the frequency of heart failure and cardiac death was higher in patients who did not show improvement of PH after TAVI, even in the absence of a decrease in cardiac function. Vigorous intervention for PH worsening after TAVI may be helpful to improve prognosis. There are several different antiplatelet drugs that can be used to treat acute cardiac events. Currently there are no effective markers that can assess how these drugs modify coagulation profile and quality. A new functional biomarker that measures fractal dimension (df) and clot formation time (TGP) has been developed [1]. df quantifies clot microstructure, whereas TGP is a real-time measure of clotting time. We aimed to validate df and TGP in ST-elevation myocardial infarction (STEMI) and assess the effect of two P2Y12 inhibitors which have different pharmacological mechanisms: clopidogrel and ticagrelor. We prospectively recruited 72 STEMI patients in the emergency setting. Venous blood samples were collected 12 hours after admission, following treatment with either ticagrelor or clopidogrel, in accordance with the local guidelines at the time. The blood samples were tested using the df and TGP biomarker, platelet aggregometry, clot contraction and standard markers of coagulation. Results: 36 patients received clopidogrel and 36 received ticagrelor. The df for clopidogrel was higher than for ticagrelor (1.75±0.05 vs 1.73±0.06, p=0.18, which corresponds to a 20% decrease in clot mass; Figure 1) and the TGP was reduced (205±91 sec vs 257±89 sec, p=0.06, a 20% reduction in time). The results of the study suggest that clopidogrel is less powerful in its effects on clotting characteristics than ticagrelor: blood from patients receiving clopidogrel formed quicker and denser clots. This would suggest that the risk of secondary events or stent occlusion is lower in patients on ticagrelor, highlighting that df and TGP may be important in identifying patients at risk of future thrombotic events. The study is ongoing and will investigate the long-term outcome in these patients. Introduction: New-onset atrial fibrillation (NOAF) during critical illness frequently resolves prior to discharge. However, the long-term risks of NOAF (i.e.
heart failure, ischemic stroke and death) remain high [1]. Previous studies noted that nearly half of NOAF cases did not have the diagnosis recorded [2]. Addressing this may reduce post-critical-illness mortality by increasing AF surveillance after intensive care unit (ICU) discharge. Retrospective data were collected from an electronic health record for ICU admissions over a 10-month period. A biomarker is defined as a measurable indicator of some biological state or condition. Combined with a good clinical evaluation, biomarkers can enable an early and safe diagnosis, and thus faster management for the patient. Cardiac biomarker testing is not routinely indicated in the emergency department (ED) because of low utility and a high possibility of false-positive results. However, current rates of testing are unknown. The aim of our study was to evaluate the importance of measuring cardiac biomarkers, especially troponins, D-dimer and B-type natriuretic peptide (BNP), in our daily practice, and to identify the latest recommendations for a better use of these biomarkers in diagnostic and therapeutic approaches. We conducted a prospective observational study over a 13-month period in the ED of the university hospital center Ibn Rochd, Casablanca, Morocco, including all patients admitted during our study period who had a blood test for at least one biological marker. The dataset was analyzed with SPSS Statistics 21.0. A total of 182 patients were enrolled. Troponins were tested in 85.3% of patients (high-sensitivity troponin in 49.5% and troponin I (TnI) in 35.8%), D-dimer in 30.9%, BNP in 19% and NT-proBNP in 9.5% of cases. The diagnostic impact was significant in 94.4% of cases for troponins, 84.6% for D-dimer and 87.5% for BNP. The therapeutic impact was considered important in 80.6% of cases for troponins, 69.2% for D-dimer and 87.5% for BNP. Cardiac biomarkers have an important role in the ED: not only do they confirm diagnoses (including the role of troponins in ACS), but they also rule out others (with the strong negative predictive value of D-dimer for thromboembolic disease) and establish the cardiopulmonary origin of acute dyspnea (the significant place of BNP in confirming the diagnosis of acute heart failure). A multicenter study on the comparison of inter-rater reliability of a new and the original HEART score among emergency physicians from three Italian emergency departments The HEART score (based on History, ECG, Age, Risk factors, Troponin) is a valid tool to stratify ACS risk in chest pain. However, some reports suggest that its reliability could be lowered by heterogeneity in score assignment due to the subjective interpretation of the History component. We used the Chest Pain Score for the "History" item. In this study we compared the reliability of the new HEARTCPS and the original HEART. This is a multicenter retrospective study conducted in 3 Italian EDs between July and October 2019 using clinical scenarios. Ten physicians were included after a course on the HEART and HEARTCPS scores. We used 53 scenarios which included clinical and demographic data. Each participant independently assigned scores to the scenarios using the HEART and HEARTCPS. We tested inter-rater agreement using the kappa statistic (k), with bias-corrected confidence intervals, using Stata/SE 14.2 statistical software. A p-value of <0.05 defined statistical significance.
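As a minimal illustration of the kappa statistic named in the Methods above, the sketch below (Python) computes Cohen's kappa for two raters; the study pooled ten raters with bias-corrected confidence intervals in Stata, so this is a simplified analogue on invented risk-class ratings (0 = low, 1 = medium, 2 = high).

from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]  # hypothetical HEART classes
rater_b = [0, 1, 2, 2, 0, 2, 1, 0, 0, 2]
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1 = perfect agreement, 0 = chance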
The overall inter-rater reliability was good for both HEART and HEARTCPS: kappa = 0.63 (95% CI 0.57-0.72) and 0.65 (95% CI 0.63-0.67), with good agreement across all risk classes for HEARTCPS but moderate agreement in the medium class for HEART. We found significant differences in inter-rater reliability between the senior and junior physicians who used the HEARTCPS: k=0.56 (95% CI 0.52-0.57) and 0.75 (95% CI 0.70-0.79). The HEARTCPS increased the inter-rater reliability of the History item, especially among junior physicians, from k=0.35 (95% CI 0.27-0.43) to k=0.69 (95% CI 0.62-0.71). Junior physicians appear to be more reliable than seniors with the HEARTCPS: k=0.75 (95% CI 0.71-0.79) vs k=0.56 (95% CI 0.52-0.57). The HEARTCPS showed better inter-rater reliability than the original HEART in the medium risk class and in the junior group. It could be proposed to young doctors for stratifying the ACS risk of chest pain. Limitation: we used scenarios rather than real patients. A hybrid approach as treatment for coronary artery disease: endo-CABG or PCI first, does it matter? Introduction: The aim of this study is to discuss the short-term results of a hybrid approach combining minimally invasive endoscopic CABG (Endo-CABG) with percutaneous coronary intervention (PCI). To bypass the disadvantages and potential complications of conventional CABG via median sternotomy, we developed the Endo-CABG technique to treat patients with single- and multi-vessel coronary artery disease (CAD). This procedure is performed with three 5-mm thoracic ports and a mini-thoracotomy utility port (3 cm) through the intercostal space. This technique can be combined with PCI: the hybrid approach. The sequence of the 2 procedures (Endo-CABG followed by PCI or vice versa) may result in different outcomes. From 02/2016 to 12/2017, data from 81 consecutive patients scheduled for a hybrid technique at Jessa, Belgium, were prospectively entered into a customized database. This database was retrospectively reviewed. Subgroup analysis was performed to compare outcomes of patients who first received Endo-CABG with those of patients who first received PCI. A p-value <0.05 was considered significant; a p-value <0.1 was considered a trend toward significance. Four patients underwent revision surgery and 2 patients died within the first 30 days. In 79 patients the left anterior descending artery (LAD) was grafted with the left internal mammary artery (LIMA); the right coronary artery (RCA) was the vessel most often stented by PCI. Patients first treated with PCI received more units of fresh frozen plasma after Endo-CABG compared with those who were first treated with Endo-CABG (p=0.03). There was also a trend toward significantly more transfusion of packed cells in this small subgroup (p=0.07). The hybrid approach is a feasible technique as a treatment option for patients with multi-vessel CAD. If CABG follows the PCI, patients are more likely to receive transfusion. A possible explanation could be the need for dual antiplatelet therapy prior to surgery in this group, but this needs further investigation. Prognostic difference between troponin elevation meeting the MI criteria and troponin elevation due to myocardial injury in septic patients. Troponin T (cTnT) elevation in critically ill patients is common and is associated with poor outcome. Using common assays, 40-50% of patients in the ICU will have an elevated troponin level.
Our aim was to determine whether there is any prognostic difference between troponin elevation meeting the MI criteria (a rise and fall of more than 20%, together with new echo and ECG abnormalities) and troponin elevation due to myocardial injury in septic patients. We enrolled 101 patients with sepsis (mean SOFA score 5.2) in whom the cTnT level was measured more than once, and analyzed their ECG and echo findings. Patients were classified into three groups: definite MI (cTnT rise and fall ≥20% and contemporaneous changes on ECG and/or echo), possible MI (cTnT rise and fall ≥20% and no other findings), and myocardial injury (cTnT rise less than 20%). Results: Data from 101 patients were analyzed (49% female; mean age 61.9 (SD 16.9)). All 101 patients had at least one cTnT value elevated above 0.03 μg/l. In 71 patients (70%), the cTnT level rose more than 20% from the first elevated measurement. 64 patients (63%) met MI criteria considering new ECG and echo findings. The overall mortality rate was 53.9%. The mortality rate did not differ significantly among the three groups: 62.4% in the definite MI group, 52% in the possible MI group and 56.4% in the non-MI cTnT elevation group (p=0.6). Coronary angiography was performed in 46 patients (73%) from the definite MI group; PCI was performed in 18 patients (39%). The mortality rate in the invasive group was not significantly lower compared with the non-invasive group (29% vs 37.8%, p=0.06). Bleeding complications were significantly more frequent in the definite MI group (13% vs 7% and 8%, respectively). Conclusions: cTnT elevation is associated with poor outcome regardless of coronary or non-coronary injury. Myocardial revascularization may be beneficial in patients with sepsis and definite MI, but it is also associated with increased bleeding risk. Diagnostic interest of the "Marburg Heart Score" in patients consulting the emergency department for acute chest pain Chest pain is a common reason for emergency department visits; although it primarily raises concern for Acute Coronary Syndrome (ACS), this symptom may frequently be related to other, non-ischemic etiologies. The aim was to validate the Marburg Heart Score as a tool to exclude coronary artery disease in emergency department patients with non-traumatic acute chest pain. Methods: A prospective, observational, descriptive and analytic cohort study was conducted in the emergency department from February 1st to March 31st, 2019, enrolling patients consulting for non-traumatic acute chest pain; the Marburg Heart Score was calculated for all of these patients. Telephone contact was made after 6 weeks to look for an ischemic cardiovascular event. We included 171 patients. The mean age was 57±13 years and the sex ratio was 0.86. The majority of patients (78.9%) consulted the emergency department directly; 21.1% were referred by a primary care physician. The median time to consultation after the onset of chest pain was 24 hours. High blood pressure was the most common risk factor (43.9%), followed by smoking (31%), diabetes (24.8%) and dyslipidemia (23.4%). Thirty-five patients (20.5%) had known coronary heart disease; the ECG was pathological in 19.3% of patients; 8 patients had ACS with ST-segment elevation. At six weeks, 20.6% of the patients had had an acute coronary event. According to the patients' answers to the 5 questions of the Marburg Heart Score,
the area under the ROC curve of this score was 0.78, with a negative predictive value of 87.2%. The "Marburg Heart Score" is a simple, valid and reproducible clinical score with the discriminatory power to rule out the diagnosis of coronary artery disease from the first contact with patients presenting with chest pain to the emergency department. Abdominal aortic aneurysm (AAA) surgery is a complex procedure in elderly patients with high cardiovascular risk. Anesthesiological techniques should pay special attention to volume status during cross-clamping, as well as to blood loss. Goal-directed fluid therapy (GDT) in AAA surgery in elderly patients decreases perioperative morbidity and mortality [1]. The aim of this study was to investigate fluid administration based on either a GDT approach or a control method (fluid administered based on static preload parameters and traditional hemodynamics) in all phases of AAA surgery, especially during clamping and de-clamping. A total of 30 ASA III patients scheduled for elective open AAA surgery were included in this clinical trial. They were randomly assigned to two groups: I, GDT targeting stroke volume variation (SVV); and II, a control group in which fluids were administered at the discretion of the attending anaesthesiologist. In both groups, hemodynamic parameters, central venous pressure (CVP), temperature, blood loss and diuresis were recorded during the operation and for 48 hours postoperatively. Each group was assessed for postoperative complications. The GDT group received less fluid and had a higher cardiac index (CI) (3.9±0.6 vs. 2.9±0.8 l/minute per m2, p<0.01) and stroke volume index (55.1±5.4 vs. 35.1±5.8 ml/m2, p<0.01) than the control group. There were significantly fewer complications in the intervention group than in the control group (3 vs. 9, p=0.02). GDT fluid administration enables the use of less fluid, improved hemodynamics and fewer postoperative complications in elderly patients undergoing AAA surgery. Ultrasonography is a valid diagnostic tool used to measure changes in muscle mass. The aim of this study was to investigate the clinical value of ultrasound-assessed muscle mass in patients undergoing cardiothoracic surgery who present with muscle weakness postoperatively. For this study, 221 consecutive patients were enrolled following their admission to the Cardiac Surgery Intensive Care Unit (ICU) within 24 hours of cardiac surgery. Ultrasound scans for the assessment of quadriceps muscle thickness were performed every 48 hours for 7 days. Muscle strength was also evaluated in parallel using the Medical Research Council (MRC) scale. Of the 221 patients enrolled, ultrasound scans and muscle strength assessment were performed in 165 patients. The muscle thickness of the rectus femoris (RF) decreased slightly, by 2.18% (95% CI -0.21 to 0.15; n=9; p=0.729), and the combined muscle thickness of the vastus intermedius (VI) and RF decreased by 3.5% (95% CI -0.4 to 0.22; n=9; p=0.530). Patients whose combined VI and RF muscle thickness was below the recorded median value (2.5 cm) on day 1 (n=78) stayed longer in the ICU (47±74 vs 28±45 hours, p=0.015). Patients with an MRC score ≤48 on day 3 (n=7) required prolonged mechanical ventilation support compared to patients with an MRC score ≥49 (n=33) (44±14 vs 19±9 hours, p=0.006). Muscle ultrasound seems to be a valuable tool for assessing skeletal muscle mass in critically ill patients after cardiothoracic surgery.
Moreover, the results of this pilot study showed that muscle wasting in patients after cardiothoracic surgery is of clinical importance, affecting their ICU stay. Prediction of cardiac risk after major abdominal surgery S Musaeva, I Tarovatov, A Vorona, I Zabolotskikh, N Doinov Kuban State Medical University, Anesthesiology and Intensive Care, Krasnodar, Russia Critical Care 2020, 24(Suppl 1):P147 The aim was to assess the incidence of cardiovascular incidents in major abdominal surgery [1] using the revised Lee index. A study was conducted of 144 elderly patients who underwent major abdominal surgery in the Krasnodar Regional Clinical Hospital No. 2 under combined anesthesia. In the preoperative period, the risk of cardiovascular incidents was assessed using the revised Lee index, and functional status was assessed by MET. Depending on the Lee index, 3 groups were identified: group 1 (n = 69) - low risk (index value 1); group 2 (n = 52) - intermediate risk (index value 2); group 3 (n = 23) - high risk (index value ≥ 3). We estimated the incidence of critical incidents in the groups: hypo- and hypertension, arrhythmias, and bradycardia. In the general population, cardiac risk was 2.2 ± 0.7 points and functional status 7.7 ± 1 MET. The greatest number of critical incidents was recorded in patients at high risk (58.4%), the smallest in patients at low risk (9.1%), and 26.5% in patients at intermediate risk (p < 0.05 between groups according to the Chi-square criterion). In the structure of critical incidents, hypotension was encountered most often, in 62 (43%) patients, while some patients had several incidents from the circulatory system (n = 116). Overall, the Lee scale showed good prognostic ability (AUROC = 0.81) in predicting hemodynamic incidents. The revised Lee index is a useful tool to help assess the risk of cardiovascular incidents and determine patient management tactics in the perioperative period. Postoperative cognitive dysfunction (POCD) remains an unresolved problem due to a lack of consensus on its etiology and pathogenesis. Some believe that POCD is the result of a direct toxic effect of general anesthetics on the nervous system. Others claim that surgical trauma activates proinflammatory factors that induce neuroinflammation. Wistar rats were allocated into 2 groups: 1 - minor surgery (n=20), 2 - major surgery (n=20). After 5 days of handling and habituation, rats underwent surgery under isoflurane general anesthesia (2 vol.%). Group 1 rats underwent laparotomy with gentle gut massage followed by wound closure. Group 2 rats underwent left-sided nephrectomy. Starting from the 4th postoperative day, spatial memory was studied in the Morris Water Maze, a cylindrical metal pool 1.5 m in diameter and 0.5 m high, half filled with water (temperature 26 ± 1 °C). It contains a platform 12 cm in diameter positioned 1 cm below the water level. Testing was preceded by a training stage of 8 sessions daily for 4 days, during which the rats developed spatial memory for the location of the platform. On the 5th day of the study, a test stage was conducted to assess spatial memory: rats were launched from 3 points into the maze without the platform, and data were recorded for 60 seconds in each session. Time spent in the target quadrant (TTQ) and the number of target area crossings (TAC) were registered. A second test was conducted 14 days after the first to evaluate long-term spatial memory.
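For readers unfamiliar with Morris Water Maze read-outs, here is a minimal sketch of how TTQ and TAC can be derived from tracked positions during the 60-s probe trial. The tracking format, quadrant assignment and former platform coordinates below are illustrative assumptions, not the authors' actual setup.

```python
import numpy as np

# Hypothetical tracking: rows of (t_seconds, x_m, y_m), pool centred at (0, 0)
PLATFORM_XY = (0.375, 0.375)   # assumed former platform position (target quadrant)
PLATFORM_R = 0.06              # 12 cm diameter platform -> 6 cm radius

def probe_metrics(track: np.ndarray) -> tuple[float, int]:
    """Return (TTQ seconds, TAC count) for one probe-trial track."""
    t, x, y = track[:, 0], track[:, 1], track[:, 2]
    dt = np.diff(t, append=t[-1])              # dwell time per sample
    in_quadrant = (x > 0) & (y > 0)            # target quadrant: upper right here
    ttq = float(np.sum(dt[in_quadrant]))       # Time in Target Quadrant
    inside = np.hypot(x - PLATFORM_XY[0], y - PLATFORM_XY[1]) <= PLATFORM_R
    tac = int(np.sum(~inside[:-1] & inside[1:]))  # entries = Target Area Crossings
    return ttq, tac
```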
The duration of surgery and anesthesia did not differ significantly between groups. There was a significant difference between groups in average TTQ and TAC in test 1 (Table 1). In test 2 the minor surgery group showed better results, but these were less significant. Major surgery is associated with a more pronounced deterioration of spatial memory in rats in the early postoperative period compared to minor surgery. Cardiac inflammatory markers in ICU patients with myocardial ischemia after non-cardiac surgery (a pilot study) P Manthou 1, G Lioliousis 2, P Vasileiou 3, G Fildissis 1; 1 National Kapodistrian University of Athens, Athens, Greece; 2 National Kapodistrian University of Athens, General Thoracic Hospital "Sotiria", Athens, Greece; 3 National Kapodistrian University of Athens, University of Athens, Athens, Greece Critical Care 2020, 24(Suppl 1):P149 Introduction: Patients with known coronary artery disease have a higher perioperative risk of myocardial ischemia [1, 2]. Mortality is frequent following cardiac ischemia in the intensive care unit (ICU) after non-cardiac surgery. The first group included patients admitted to the Intensive Care Unit for postoperative follow-up without myocardial ischemia in the first 24 hours. The second group included patients with postoperative myocardial ischemia needing intensive care monitoring. Cardiac risk assessment was made with the Lee index, hemorrhagic risk assessment with the HAS-BLED bleeding score, and thrombotic risk assessment with the CHA2DS2-VASc score. Postoperatively, pathological test values such as BNP, troponin, CRP and calcitonin were estimated. The Sequential Organ Failure Assessment (SOFA) system was used to assess sepsis. The Nursing Activity Score (NAS) scale was used to measure the workload of various nursing activities in the ICU. In this pilot study, the sample consisted of 35 patients, of whom 31.4% had myocardial ischemia. The Lee index was significantly higher in patients with myocardial ischemia. The duration of hospitalization, the dose of vasoconstrictive drugs, the length of stay in the ICU, the duration of mechanical ventilation and the nursing workload were higher in patients with myocardial ischemia. CK-MB and troponin levels differed significantly between the two groups. Creatinine, bilirubin and BNP during the first 24 hours were significantly higher. Patients with myocardial ischemia had significantly higher mortality. Cardiac risk assessment, the HAS-BLED score and the CHA2DS2-VASc score, in combination with cardiac enzymes such as troponin, could predict myocardial ischemia in severely ill ICU patients. Introduction: According to the literature, airway complications following thyroid gland surgery include difficult tracheal intubation, tracheomalacia, postextubation stridor and bleeding [1, 2]. The most common cause of death was respiratory problems and airway obstruction [3]. Subsequent hypoxia could require an emergency airway and even tracheostomy [3]. The aim of our study was to determine the most common airway complications and their association with the type of surgery in our region. The retrospective cohort study included 400 patients (369 women, 31 men) and was performed in Odessa Regional Hospital and the Oncology Centre, Odessa. There were three types of patients: with euthyroid goiter - 170 (43%), polynodous goiter - 125 (31%) and thyroid cancer - 105 (26%) (Table 1). Airway complications were diagnosed after tracheal extubation based on indirect laryngoscopy, the presence of stridor, and desaturation. Pearson's criterion (contingency coefficient) was calculated.
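The "moderate" association values reported in the results below (0.271 and 0.203) are consistent with Pearson's contingency coefficient. Here is a minimal sketch, with made-up counts, of how such a coefficient is obtained from a chi-square test; the table values are not the study data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x3 table: airway complication (rows) x diagnosis type (columns)
table = np.array([[12, 14, 13],      # complication
                  [158, 111, 92]])   # no complication

chi2, p, dof, _ = chi2_contingency(table)
n = table.sum()
C = np.sqrt(chi2 / (chi2 + n))       # Pearson contingency coefficient
print(f"chi2={chi2:.2f}, p={p:.3f}, C={C:.3f}")
```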
The rate of airway complications after thyroid surgery was 9.7% (39 patients). The main airway complications in thyroid surgery were: laryngeal edema - 22 patients (5.5%); recurrent laryngeal nerve injury - 12 patients (3.0%); and postoperative bleeding - 5 patients (1.2%). Thyroid gland cancer and polynodous goiter were associated with laryngeal edema and recurrent laryngeal nerve injury (Pearson contingency coefficients 0.271, moderate, and 0.203, respectively). This may require more attention from the anesthetists after extubation and readiness for an urgent airway. Serum iron level and development of multiple organ dysfunction syndrome in patients in the perioperative period S Tachyla Mogilev Regional Hospital, Department of Anesthesiology and Intensive Care, Mogilev, Belarus Critical Care 2020, 24(Suppl 1):P151 Recently, researchers have paid attention to the problem of perioperative anemia, which has been found to increase the risk of death and postoperative complications. A threatening complication is multiple organ dysfunction syndrome (MODS). The objective was to determine the level of serum iron in the perioperative period in patients undergoing endoprosthetics of large joints, and in patients with MODS in abdominal surgery. A prospective cohort study was conducted in 77 patients, including 18 men and 59 women, aged 61.9 ± 15.1 years. Two groups were identified: 1st (control) - patients after endoprosthetics of large joints (n = 40); 2nd (main) - patients in abdominal surgery with MODS (n = 37). The presence of MODS was established based on the criteria of the 2016 SCCM/ACCP Conference. Serum iron was monitored using an AU 680 analyzer (USA). The study comprised several stages: 1st - before surgery, 2nd - 1st day after surgery, 3rd - 3rd day, 4th - 7th day, 5th - 10th day. A significant decrease in serum iron (p < 0.05) in the postoperative period was established. In the 1st group: 1st stage - 15.2 (10-19.4) μmol/L, 2nd stage - 5.2 (3.9-7.6) μmol/L, 3rd stage - 6.6 (5-8.7) μmol/L, 4th stage - 9.7 (8.6-12.1) μmol/L, 5th stage - 9.4 (7.8-11.9) μmol/L. In the 2nd group: 1st stage - 11.9 (10-15) μmol/L, 2nd stage - 3.7 (1.7-4.1) μmol/L, 3rd stage - 3.6 (2.4-4.5) μmol/L, 4th stage - 6.5 (4.4-8.2) μmol/L, 5th stage - 7.6 (6.5-9.4) μmol/L. Moreover, in both groups iron increased at the 4th stage compared with the 2nd stage (p < 0.05). When comparing iron levels between the groups, significant differences were found (p < 0.05) at the 2nd, 3rd and 4th stages. In patients in the postoperative period, a decrease in serum iron is observed; the level rises by the 7th day but does not reach the initial values. This decrease is more pronounced in patients with MODS after abdominal surgery. Kidney and pancreatic graft thrombosis occurred in 5.6% and 18.9% of patients, respectively, and bleeding in 21.1%. Forty-one patients (45.6%) developed at least one infection during the hospital stay. Infection during the ICU stay was found in 13.3%, and the main pathogens were gram-negative bacilli sensitive to beta-lactams. After ICU discharge, the incidence of multi-drug resistant pathogens was 13.5%, predominantly gram-negative bacilli. Fungal infection was less frequent (4%). The all-cause hospital mortality rate was 5.6%. Infectious complications are the main cause of morbidity and mortality following SPK transplantation. The administration of broad-spectrum prophylactic antibiotics is leading to the appearance of multi-drug resistant pathogens.
Knowing the local microbiological flora may be helpful, allowing more adequate antibiotic prophylaxis. Introduction: Cardiopulmonary bypass (CPB) is associated with thrombotic complications. Thrombosis occurs after 12% of CPB procedures, making it the third most frequent CPB-associated complication. Our study determined preoperative predictors of thrombosis in children with congenital heart defects. 138 patients with congenital heart disease aged up to 11 months 29 days (median age 4.7 months, youngest 2 days after birth, oldest 11 months 29 days), who underwent cardiac surgery with CPB, were enrolled in this study. All patients were divided into two groups: 1st - without thrombosis, 2nd - with thrombosis. Protein C, D-dimer, von Willebrand factor and plasminogen plasma levels were assessed directly before surgery. Thrombotic cases were proven by doppler ultrasound or MRI. Thrombotic complications were diagnosed in 30 children (21%). Among all thrombotic complications, ischemic strokes were diagnosed in 73% (22 cases), arterial thrombosis in 17% (5 cases), intracardiac thrombus in 7% (2 cases) and mechanical mitral prosthetic valve thrombosis in 3% (1 case). Receiver operating characteristic (ROC) curves were created for the listed indicators. The area under the curve (AUC) was 0.64 for Protein C (sensitivity (Sn) 65%, specificity (Sp) 50%), 0.65 for D-dimer (Sn 65%, Sp 50%), 0.62 for plasminogen activity (Sn 60%, Sp 40%) and 0.64 for von Willebrand factor level (Sn 80%, Sp 55%). An ROC curve combining the indicators gave an AUC of 0.7 (Sn 80%, Sp 40%). These parameters can be recommended as predictors of thrombosis in children after cardiac surgery. CPB is related to a large number of life-threatening complications. In our work, preoperative predictors of thrombosis were identified. Based on these data, it is possible to create a thrombosis risk scale, change the tactics of the anaesthetic approach, and improve the prevention of thrombosis in the postoperative period. Further studies are needed to identify other possible predictors of thrombosis. Introduction: Abdominal ischemia occurs in 9% of patients submitted to aortic aneurysm repair. Its early diagnosis requires an elevated index of suspicion, particularly in more severe patients. We hypothesized that an earlier increase and higher levels of C-reactive protein (CRP) may help to predict intra-abdominal ischemia. We performed a retrospective study of patients admitted to the intensive care department (ICD) after abdominal aortic aneurysm surgery. We included all patients admitted during a two-year period who survived for more than 48 hours. The primary outcome was splanchnic ischemia assessed by abdominal CT scan. We also evaluated the presence of bacteremia, abdominal compartment syndrome and ICD mortality. The association between inflammatory parameters and ischemia was evaluated by multivariate logistic regression. Introduction: CRP (C-reactive protein) has been shown to be a useful biomarker for identifying complications after major abdominal surgery. Gastrectomy is a high-risk surgical procedure that requires postoperative critical care support to monitor for complications, which are predominantly infective in nature. The aim of this study was to determine whether there is a relationship between postoperative CRP levels and the development of postoperative infective complications. A retrospective analysis was performed on patients undergoing elective gastrectomy for gastric cancer at a single centre between September 2011 and July 2016.
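Both the thrombosis-predictor analysis above and the CRP analysis reported next rest on ROC curves and a chosen cut-off. A minimal sketch using synthetic data follows; Youden's J is one common criterion for the "optimal" cut-off, though the abstracts do not state which criterion was actually used.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)                    # outcome: 1 = complication
marker = rng.normal(180, 60, 200) + 40 * y     # synthetic biomarker values

auc = roc_auc_score(y, marker)                 # discrimination
fpr, tpr, thresholds = roc_curve(y, marker)
j = np.argmax(tpr - fpr)                       # Youden's J = Sn + Sp - 1
print(f"AUC={auc:.2f}, cut-off={thresholds[j]:.0f}, "
      f"Sn={tpr[j]:.0%}, Sp={1 - fpr[j]:.0%}")
```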
Postoperative CRP levels for each day following resection were analysed for all patients. ROC curve analysis was used to determine which postoperative day (POD) gave the optimal cut-off. Of 144 patients included, the majority were male (61.8%), the mean age was 68.5 years, and 53.5% had node-negative disease. A total of 84 patients (58.3%) had an infective complication, including those who experienced an anastomotic leak. CRP levels on postoperative day 3 gave the greatest AUC for the gastrectomy group (0.765). A CRP cut-off of 220 mg/L was significantly associated with infective complications (OR 7.29, 95% CI 3.42-15.58, p < 0.001) and gave a sensitivity of 70% and specificity of 76% (PPV 67%, NPV 78%). More patients with a CRP > 220 mg/L on postoperative day 3 experienced an infective complication (67% vs 21%, p < 0.001) or, in particular, a leak (17% vs 6%, p = 0.059). A CRP level of less than 220 mg/L on POD3 may be useful to predict the development, or exclude the likelihood, of such infective complications in this group of patients before clinical signs appear (PPV 67%, NPV 78%). This may prompt and facilitate decision-making regarding early investigation and intervention, or prevent inappropriate early discharge from critical care, while providing more assurance in identifying those who could be stepped down to ward-level care. Vasoplegia is commonly observed after cardiopulmonary bypass surgery (CPB) and is associated with high mortality. Chronic use of renin-angiotensin-aldosterone system inhibitors (RAASi) is associated with its incidence and the ensuing need for vasopressor support after CPB. Renin serves as a marker of tissue perfusion [1]. We examined the role of renin in the setting of RAASi exposure and vasopressor needs in the peri-CPB period. Prospective observational study of 31 adult patients undergoing CPB, aged 66.0 ± 10.5 years (22 men, 9 women). Blood was collected 1) post induction, pre-CPB; 2) 30 min post cardioplegia; and 3) immediately post bypass. Vital signs and perioperative medications were recorded. As controls, blood was collected from 5 men and 4 women aged 53.5 ± 10.7 years, not diagnosed with lung disease and not prescribed any RAASi. Baseline plasma renin in CPB patients tended to be higher than in control subjects (mean 38.5 ± 9.2 pg/ml vs. 17.1 ± 3.5 pg/ml, respectively, p=0.670). Thirty minutes into CPB, mean renin was increased from baseline (77.6 ± 17.5 pg/ml, p=0.211) and remained elevated immediately post CPB (74.6 ± 19.1 pg/ml). Patients using RAASi prior to CPB tended to have a larger increase in renin post CPB (delta 55.3 ± 30.9 pg/ml) vs. those not previously on RAASi (26.9 ± 9.5 pg/ml, p=0.092). Renin was elevated in patients requiring vasopressor support in the 24 hours post CPB vs. those not requiring pressors (41.4 ± 15.7 pg/ml vs. 25.1 ± 16.1 pg/ml, p=0.0246). In those prescribed RAASi and requiring pressors post CPB, there was a tendency toward a greater renin increase than in those not requiring pressors postoperatively (49.9 ± 49.7 pg/ml vs. 8.5 ± 3.9 pg/ml, p=0.036). This study suggests a trend toward higher renin levels, particularly during CPB, in patients prescribed RAASi, and a positive association between renin and postoperative vasopressor needs. We speculate that increased renin levels may predict postoperative vasoplegia. Cardiac surgery is associated with perioperative blood loss and a high risk of allogenic blood transfusion. It has been recognized that a high blood product transfusion requirement is associated with adverse clinical outcomes
[1]. Guidelines on patient blood management therefore aim at reducing blood loss and transfusion requirements in cardiac surgery [2]. As there remains controversy about the advantage of minimally invasive techniques with respect to blood loss and transfusion requirements [2], we wanted to investigate whether the average blood loss and transfusion requirement in minimally invasive endoscopic coronary artery bypass graft surgery (endo-CABG) differ from the conventional technique. We also assessed the influence of preoperative anticoagulant medication on blood loss. The estimated average blood loss after conventional CABG is 400 ml (± 200) [3] and the transfusion requirement 3.4 units of packed red blood cells [4]. We performed a retrospective cohort study of our cardiac surgical database. From 01/01/2016 to 31/12/2017, we collected data from 423 patients undergoing endo-CABG. We analyzed blood loss and transfusion, as well as preoperative use of anticoagulants as a risk factor for blood loss. We found that mean total blood loss in endo-CABG does not differ from conventional CABG; nonetheless, the mean transfusion requirement was lower in our cohort. Use of direct oral anticoagulants is associated with increased blood loss and transfusion requirements (Table 1). Total blood loss is not influenced by the minimally invasive technique for CABG (endo-CABG). An explanation for the lower transfusion requirements is the use of a minimal extracorporeal circulation, which is known to reduce the risk of transfusion. Another important factor is the implementation of a standardized transfusion protocol based on available evidence. Reducing transfusion requirements is an important component of improving patient outcome after cardiac surgery and is related to multiple factors in the perioperative care of our patients. Retinal microvascular damage associated with mean arterial pressure during cardiopulmonary bypass surgery V Shipulin 1 Retinal perfusion corresponds to cerebral perfusion and is very sensitive to hemodynamic disturbances [1, 2]. We investigated the association between retinal microvascular damage and hemodynamic characteristics in patients undergoing coronary artery bypass grafting surgery (CABG) with cardiopulmonary bypass (CPB). Methods: 10 patients with coronary artery disease and systemic hypertension were examined. Ophthalmoscopy and optical coherence tomography were performed before and 10-14 days after CABG. The hemodynamic parameters during CPB were analyzed. Results: 4 (40%) patients had changes in the retinal vessels and in the ganglionic fiber structure 10-14 days after surgery: in 30% of patients foci of ischemic retinal oedema appeared, and in 10% a decrease in the thickness of the ganglionic fiber was observed. These changes may be associated with intraoperative ischemia of the central retinal artery. In 6 (60%) patients the mean arterial pressure (MAP) during CPB increased up to 90 mmHg. In 4 (67%) of them an association between MAP and foci of ischemic retinal oedema was revealed. Ischemic retinal changes were observed significantly more often when the delta of MAP during CPB was more than 20 mmHg compared with patients in whom the delta of MAP was less than 20 mmHg (p=0.035). This is probably due to an intraoperative disorder of the myogenic mechanism of blood flow autoregulation in the retinal microvasculature in patients with coronary artery disease [3]. A MAP rising to 90 mmHg during CPB is associated with retinal blood flow impairment and foci of ischemic retinal oedema.
A delta of MAP of more than 20 mmHg was associated with foci of ischemic retinal oedema and decreased ganglionic fiber thickness in 67% of cases. Atrial fibrillation after cardiac surgery: implementation of a prevention care bundle on the intensive care unit improves adherence to current perioperative guidelines and reduces incidence Introduction: Atrial fibrillation after cardiac surgery (AFACS) is a very frequent complication, affecting 30-50% of all patients. It is associated with an increase in morbidity, mortality and hospital and intensive care unit (ICU) length of stay. We aimed to implement an AFACS prevention care bundle based on a recently published practice advisory [1], focusing on early postoperative (re)introduction of β-blockers. Baseline AFACS incidence and β-blocker administration practices in our centre were audited for all patients undergoing valve surgery or coronary artery bypass graft (CABG) during a 6-week period. The AFACS prevention care bundle, an easy-to-follow graphical tool, was subsequently introduced to the cardiac ICU by a multidisciplinary team and audited following a Model for Improvement approach. After exclusion of patients with preoperative AF, differences between pre- and post-implementation groups were compared with Chi-square and Fisher's exact tests for categorical, and one-way ANOVA for continuous, variables, using SPSS. A total of 384 patients were analysed. Patient and surgery characteristics did not differ between groups. Significantly more patients received postoperative β-blockers after bundle implementation (82.7% pre- vs 91.3% post-bundle, p=0.019), with a higher proportion on day 1 (36.7% pre- vs 67% post-bundle, p<0.001, Figure 1). The incidence of AFACS was significantly reduced from 35.4% to 23.3% (p=0.009), with a particularly marked reduction in the age group 65-75 years and for isolated aortic valve and CABG surgery. There was no significant reduction in hospital length of stay for this cohort. Introduction of an AFACS prevention care bundle using a graphical tool improved adherence to current guidelines with regard to early β-blocker administration and significantly reduced AFACS incidence. Future care bundles should include preoperative interventions and might reduce hospital length of stay. In neonates with univentricular physiology, there is a delicate balance between the pulmonary and systemic circulations, with a tendency towards generous pulmonary blood flow and a risk of systemic underperfusion. Preoperatively, the use of a hypoxic gas mixture (HM) has been advocated as a therapy to increase PVR, with the aim of improving systemic oxygen delivery. It is a therapy which has been routinely initiated in our institution when there are signs of pulmonary overcirculation. We performed a retrospective analysis of all patients in our institution who underwent a Norwood procedure and received HM preoperatively. We compared peripheral saturations, arterial blood gas analysis, serum lactate, regional cerebral and renal saturations and invasive blood pressure prior to, and then 4, 8 and 24 hours after, HM commencement. Between 2014 and 2018 (inclusive), 49 patients underwent the Norwood procedure; 18 received preoperative HM. Average FiO2 was 17% during administration of HM. Average peripheral saturations were 96.1% prior to HM, and dropped to 87.4% at 4 hours, and 88% at 8 and 24 hours after initiation (p < 0.05).
There was no change in any of the measured markers of systemic oxygen delivery, including regional cerebral and renal saturations, lactate, urine output and blood pressure. There was an association between an extended period of HM (> 48 hours) and the need for pulmonary vasodilator therapy after the Norwood procedure. A hypoxic gas mixture in patients with parallel systemic and pulmonary circulations causes desaturation and hypoxia. It does not lead to an increase in systemic perfusion and thus to an improvement in systemic oxygen delivery. Its ongoing use in this fragile population should be reconsidered. Introduction: Analgesia in the critical patient, and especially in the neurocritical patient, is a basic goal in all therapeutic practices. Patients in the ICU are frequently administered prolonged and/or high doses of opioids. Multiple serious complications due to infusion of opioids at large doses have been described. To reduce high doses of intravenous opioids, multimodal forms of analgesia can be used. Prospective observational study of the use of enteral tapentadol and transdermal buprenorphine patches, at low doses, for the control of pain and their effect on reducing the use of high-dose fentanyl infusion, in 84 patients admitted to the Neuro ICU of INDISA Clinic during 2 consecutive years (2018-19). Enteral tapentadol (through NG tube), 50 mg/6 hours, was considered in patients who required intravenous fentanyl in continuous administration. Buprenorphine was also added at low doses (5 μg/hr) as a weekly transdermal patch in neurosurgical spine patients, fractures and long-term neuropathic pain. Pain was assessed with Behavioral Pain Scale (BPS) and Visual Analogue Scale (VAS) scores, according to the condition of each patient. Hemodynamic and gastrointestinal complications and the appearance of delirium episodes according to the CAM-ICU scale were recorded. Results: 84 patients received tapentadol; 32 of them also received transdermal buprenorphine. All managed to maintain an adequate level of analgesia, not requiring fentanyl at doses greater than 0.5 μg/kg/hr. Distribution by diagnosis: neurotrauma 21 patients, Guillain-Barré 12, spine surgery 15, subarachnoid hemorrhage 18, intracerebral hemorrhage 10, malignant ischemic stroke 8. Complications: gastric retention 12 patients (7%), hypotension 1 (1%), acute hypoactive delirium 3 (3.5%), acute hyperactive delirium 8 (9%). No drug interactions were found. The introduction of enteral tapentadol and buprenorphine patches in neurocritical patients was safe and resulted in a decrease in the use of intravenous opioids and their adverse effects. We hypothesized that changing the pain management for our post cardiac surgical patients to an assessment-driven, protocol-based approach using fast-acting and easily titratable agents would significantly improve patient satisfaction by reducing pain intensity in the first 24 h after surgery, as suggested by the Society of Critical Care Medicine guideline [1]. We prospectively assessed 101 and 99 (05.2018 vs 06.2019) consecutive patients before and after introducing our pain management protocol. The nursing and medical team received rigorous training on the guideline as well as on correct assessment using appropriate pain scores measured at least hourly (Numeric Pain Score ≥ 2, or Critical Care Observation Tool > 2, is moderate to severe). (Fig. 1, AFACS abstract above: Timing of beta-blocker (re)initiation versus incidence of AFACS before and after prevention care bundle implementation, per postoperative day and for postoperative days 1-5 (insets).)
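A minimal sketch of the outcome measure described next (hours per patient spent in moderate-to-severe pain during the first 24 h), assuming one pain score per hour per patient; the data are synthetic and the threshold follows the Numeric Pain Score rule quoted above.

```python
import numpy as np

def hours_in_pain(nrs_hourly: np.ndarray, threshold: int = 2) -> np.ndarray:
    """Per patient, count hours with a score at or above the threshold."""
    return (nrs_hourly >= threshold).sum(axis=1)

# Synthetic example: 10 patients x 24 hourly Numeric Pain Scores (0-7)
scores = np.random.default_rng(1).integers(0, 8, size=(10, 24))
h = hours_in_pain(scores)
print("proportion of patients with <5 h in moderate-severe pain:", np.mean(h < 5))
```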
We introduced a multimodal approach with a combination of fast-acting IV and long-acting oral opiates, regular paracetamol and rescue IV boluses for difficult-to-control situations, and we created a prescription bundle in our electronic prescribing record. Among other variables, we assessed hours spent in moderate to severe pain in the first 24 h after surgery and compared them to the data collected before the guideline was introduced. We analysed 101 patients from 2018 and 99 from 2019. Baseline characteristics were similar between the two groups. In 2018 only 41.6% of the patients spent less than 5 hours, and 29.4% spent more than 10 hours, in moderate to severe pain. The 2019 data showed significant improvement: 79.5% of patients spent less than 5 hours, and only 5% more than 10 hours, in moderate or severe pain (p < 0.0001, Chi-square) (Figure 1). Only 9% of the patients needed rescue medications. The protocol was inadequate, necessitating another approach, 3% of the time. Introducing an assessment-driven, stepwise, protocolized pain management significantly improved patient satisfaction by reducing pain intensity in the first 24 h on our Cardiothoracic Intensive Care Unit. Introduction: Proximal femur fractures are among the most common fractures in the elderly and are associated with significant mortality and morbidity, with high economic and social impact. Perioperative pain management influences outcomes and mortality after surgery, with early mobilization being possible [1, 2]. The goal of the study was to compare the efficacy and safety of the compartment psoas block for perioperative analgesia in elderly patients with proximal femur fractures. The randomized controlled study was held in the medical center "Into-Sana" (Odesa, Ukraine) from January 2018 till July 2019. Patients with proximal femur fractures who were older than 60 years were included in the study. They were randomly allocated to 2 groups: a compartment psoas block group (bupivacaine analgesia started as soon as possible before surgery and prolonged during and after surgery, with an additional sciatic nerve block before surgery), and a group receiving general (inhalational) anesthesia with systemic analgesia perioperatively. Results: 60 patients were included in this study. Perioperative compartment psoas block was associated with better pain control, decreased opioid consumption, better sleep quality, earlier mobilization after surgery, and a decreased incidence of opioid-associated vomiting/nausea and myocardial injury. There was no difference in the incidence of hospital-acquired pneumonia and delirium. Perioperative compartment psoas block is effective and safe for perioperative analgesia in elderly patients with proximal femur fractures, and is associated with better pain control and a decreased incidence of complications. Parenteral olanzapine is frequently used in combination with parenteral benzodiazepines for hospitalized patients with severe agitation. The FDA issued a warning about an increased risk of excessive sedation and cardiorespiratory depression with this combination, based on post-marketing case reports with overall limited quality of evidence [1]. The purpose of this study is to evaluate the safety and efficacy of concomitant parenteral olanzapine and benzodiazepine for agitation. This retrospective chart review evaluated agitated patients who received concomitant parenteral olanzapine and benzodiazepine within 60 minutes, from 1/31/2016 to 9/1/2019.
The primary end points were the rates of respiratory depression requiring mechanical ventilation and of hypotension requiring vasopressors. The secondary end points were the percentage of patients requiring additional sedatives for agitation during the same time frame, the cumulative dose of olanzapine and benzodiazepine (midazolam equivalents) received, and the rates of cardiac arrest and death. A total of 208 patients were included, with notable baseline characteristics: median age of 35 years, 59% with a history of substance abuse, and 40% with a history of psychiatric illness. For the primary outcomes, 2.9% of patients required mechanical ventilation and 0% required vasopressors. Additionally, 27.9% of patients received additional sedating agents to control agitation. Refer to Table 1 for more details. No cardiac arrests or deaths were observed. Concomitant use of parenteral olanzapine and benzodiazepine within 60 minutes for the treatment of agitation appears to carry a small risk of respiratory depression without significant hypotension. Hip fracture is very common in the elderly; it causes moderate to severe pain that is often undertreated. The fascia iliaca compartment block (FICB) is a simple, safe method, easy to learn and use. The aim of our study was to assess the efficacy and safety of preoperative FICB compared with intravenous analgesia for elderly patients with femoral fracture and hip surgery, in terms of opioid consumption and perioperative morbidity. Methods: After informed consent was obtained, 54 patients aged 50-98 years, ASA I-III, with hip fracture were randomized to receive either a US-guided FICB (40 ml of ropivacaine 0.35%) or a sham injection with normal saline 30 min before surgery. Both groups were operated under general anesthesia. Postoperative analgesia was given according to VAS: VAS 0-30 mm, paracetamol 1 g IV at 8 h; VAS 30-60 mm, ketoprofen 100 mg IV at 8 h; VAS > 60 mm, morphine 0.1 mg/kg BW IV. The primary outcome was the comparison of VAS scores at rest over the first 30 min following the procedure, at the end of surgery, and at 6-h intervals for 24 h. The secondary outcomes were the incidence of cardiovascular events, PONV and confusion episodes, and the amount of morphine consumed over 24 h. Results: At baseline, the FICB group (A) had a lower mean pain score than the sham injection group (B). The same difference was observed over 24 h of follow-up (p<0.05). There was a significant difference between the two groups in total cumulative IV morphine consumption at 24 h and in the incidence of PONV and confusion episodes (Figure 1). FICB provides effective analgesia for elderly patients suffering from hip fractures, with lower morbidity and lower opioid consumption compared with intravenous analgesia. Pain assessment in chronic disorders of consciousness patients with ANI monitoring E Kondratyeva, M Aybazova, N Dryagina Almazov National Medical Research Centre, Minimally conscious research group, St Petersburg, Russia Critical Care 2020, 24(Suppl 1):P166 Pain and suffering in DOC continue to be debated by the scientific, legal and medical ethics communities. Methods: An ANI (Analgesia Nociception Index) monitor was used to assess pain in patients with chronic disorders of consciousness (DOC), age range 22 to 56 years: 9 in vegetative state/unresponsive wakefulness syndrome (VS/UWS) and 20 in minimally conscious state (MCS). Average age: 31.8 ± 11.29 in the MCS group and 42.1 ± 8.46 in the VS/UWS group. Neurological status was assessed using the CRS-R scale. The average score on the CRS-R scale was 5 ± 1.4 in VS/UWS and 10.45 ± 4.5 in MCS.
Pressure on the nail phalanx was used as the pain stimulus. ANI and the Nociception Coma Scale were evaluated before application of the pain stimulus, immediately after, and 30 minutes later. Prolactin level was measured before application of the pain stimulus and 10 minutes after. An ANI below 50 indicates pain, 70-100 hypoalgesia, and below 30 severe pain. The mean ANI values in MCS patients were: 66.25 ± 14.11 before the pain stimulus, 45 ± 16.12 after application of the pain stimulus, and 66.55 ± 18.13 30 minutes later. Prolactin level in MCS patients was 13.01 ± 9.06 ng/ml before pain and 13.75 ± 8.73 ng/ml after pain (p>0.05). Prolactin in VS/UWS patients was 10.79 ± 7.2 ng/ml before pain and 14.5 ± 8.88 ng/ml after pain (p>0.05). Conclusions: ANI monitoring revealed that VS/UWS and MCS patients react equally to the pain stimulus. Prolactin dynamics showed poor statistical significance and cannot be considered a marker of nociception in this group of patients. It is possible that the level of the pain stimulus was insufficient for activation of a neuroendocrine response, or that the increase in prolactin level occurs over a longer term (more than 10 minutes). Total hip arthroplasty (THA) is one of the most common major surgical procedures and is associated with significant postoperative pain that can adversely affect patient recovery and could increase morbidity. Effective perioperative pain management allows accelerated rehabilitation and improves the functional status of these patients. Multimodal analgesia (MMA) combines analgesics with different mechanisms of action which, by synergistic and additive effects, enhance postoperative pain management and reduce complications. The aim of our study was to assess whether the perioperative association of a very low dose of ketamine, a potent NMDA antagonist, and dexamethasone, with its antiemetic and anti-inflammatory properties, could decrease opioid consumption and postoperative morbidity in patients undergoing THA. After informed consent, 58 patients scheduled for primary hip joint replacement surgery, aged 55-91 years, ASA I-III, were prospectively randomized into two groups. Both groups were operated under general anesthesia with fentanyl/sevoflurane. In addition, patients in group A received dexamethasone 12 mg IV (with 8 mg at 12 h) and ketamine as a 10 mg IV bolus at induction followed by 10 mg/h IV during surgery. Postoperative analgesia was given according to VAS: 0-30 mm, paracetamol 1 g IV at 8 h; 30-60 mm, ketoprofen 100 mg IV at 12 h; VAS > 60 mm, morphine 0.1 mg/kg BW IV. We recorded perioperative opioid consumption, the number of intraoperative cardiac events, VAS scores at the end of surgery and at 24 h, the incidence of PONV, and the persistence of chronic pain at 3 months. We obtained a significantly lower pain score at the end of surgery (p<0.05) in group A, no significant difference at 24 h, significantly less chronic pain at 3 months, and fewer PONV and cardiovascular events in group A (p<0.05) (Figure 1). A multimodal approach with very low doses of ketamine and dexamethasone could be efficient in the treatment of pain in elderly patients undergoing hip arthroplasty, decreasing postoperative side effects and reducing the persistence of chronic pain. Introduction: Treatment in an Intensive Care Unit (ICU) often necessitates uncomfortable and painful procedures for patients. Chronic pain is increasingly recognized as a long-term problem for patients following an ICU admission [1]. Throughout their admission, patients are often exposed to high levels of opioids; however, there is limited information available regarding analgesic prescribing in the post-ICU period.
This study sought to examine the analgesic usage of ICU survivors before and after ICU admission. Methods: 183 patients enrolled in a post-intensive care programme between September 2016 and June 2018. Intensive Care Syndrome: Promoting Independence and Return to Employment (InS:PIRE) is a 5-week multicentre, multidisciplinary rehabilitation programme for ICU survivors and their caregivers. Patients' level of analgesia was recorded pre-admission and upon attending InS:PIRE; their level of prescribed analgesia was categorized using the World Health Organization (WHO) analgesic ladder [2]. Results: 33.3% of patients (n=61) were prescribed regular analgesia pre-admission; this increased to 60.7% (n=111) post-admission, representing a significant absolute increase of 27.4% (95% CI: 20.2%-34.4%, p<0.001) in the proportion of patients who were prescribed regular analgesia pre and post ICU. In addition, pre-admission, 22.4% (n=41) of patients were prescribed a regular opioid (steps 2 and 3 of the WHO ladder) compared to 38.7% (n=71) post-admission, representing an absolute increase of 16.3% (95% CI: 9.8%-22.8%, p<0.001). This study found a significant increase in analgesic usage, including opioids, in ICU survivors. Follow-up of this patient group is essential to review analgesic prescribing and to ensure a long-term plan for pain management is in place. Introduction: Pain, agitation, and delirium (PAD) are commonly encountered by patients in the Intensive Care Unit (ICU). Delirium is associated with adverse outcomes, including increased mortality and morbidity. Clinical guidelines suggest that routine assessment, treatment and prevention of PAD are essential to improving patient outcomes. Despite the well-established improvements in patient outcomes, adherence to clinical guidelines is poor in community hospitals. The aim of this quality improvement project was to evaluate the impact of a multifaceted and multidisciplinary intervention on PAD management in a Canadian community ICU. A PAD advisory committee was formed and involved in the development and implementation of the intervention. The 4-week intervention targeted nurses (educational modules, visual reminders), family members (interviews, educational pamphlet, educational video), physicians (multidisciplinary round script), and the multidisciplinary team (poster). An uncontrolled before-and-after study methodology was used. Adherence to PAD guidelines in the assessment of PAD by nurses was measured 6 weeks pre-intervention and 6 weeks post-intervention. Data on 430 patient-days (PD) and 406 PD were available for analysis during the pre- and post-intervention periods, respectively. The intervention significantly improved the proportion of PD with assessment of pain and agitation at least 4 times per 12-hour shift, from 68.0% to 87.5% and from 69.7% to 82.2%, respectively (Figure 1). The proportion of PD with delirium assessment at least once per 12-hour shift did not significantly improve. A multifaceted and multidisciplinary PAD intervention is feasible and can improve adherence to PAD assessment guidelines in community ICUs. Quality improvement methods that involve front-line staff can be an effective way to engage staff with PAD. Oversedation Introduction: Sedation is a significant part of medical treatment in ICU patients.
Excessively deep sedation is associated with a longer time on mechanical ventilation, lung injury, infections, neuromuscular disease and delirium, which can lead to a longer ICU stay as well as increased morbidity and mortality. Many patients spend a considerable amount of time at a non-optimal sedation level. A continuous monitoring system for the sedation level is therefore necessary to improve clinical evaluation. Our goal was to evaluate the incidence of non-optimal sedation (under- and oversedation), comparing the parameters provided by NG SedLine with clinical evaluations, and to correlate oversedation with the incidence of delirium. We studied a cohort of patients admitted to the ICU of the Spedali Civili of Brescia University Hospital requiring continuous sedation for more than 12 hours. In addition to standard monitoring, the patients were studied using Next Generation SedLine (Masimo). Sedation depth was evaluated with the RASS scale and the presence of delirium with the CAM-ICU scale. We collected data from 50 adult patients. Our data showed a high incidence of oversedation: 45 of our 50 patients had an SR > 2 and 48 had a PSI level < 30. A logistic regression analysis was performed and showed a statistically significant association between the incidence of delirium and patient age (p=0.009). The association between delirium incidence and Suppression Rate time was at the limit of statistical significance (p=0.059), and was statistically significant for non-neurocritical patients (p=0.049). Our study did not show an association between delirium and the total time of sedation. Non-optimal sedation is an unsolved problem in the ICU, affecting many patients, with a greater incidence of oversedation compared to undersedation. Our study shows an association between SR levels and the incidence of delirium. Predictors of delirium after myocardial infarction, insights from a retrospective registry M Jäckel, V Zotzmann, T Wengenmayer, D Dürschmied, C Von zur Mühlen, P Stachon, C Bode, DL Staudacher Heart Center Freiburg University, Department of Cardiology and Angiology I, Freiburg, Germany Critical Care 2020, 24(Suppl 1):P172 Delirium is a common complication in intensive care units. Data on the incidence, and especially on predictors, of delirium in patients after acute myocardial infarction (MI) are rare. By analyzing all patients after acute MI, we aimed to identify the incidence of, and potential risk factors for, delirium. In this retrospective study, all patients hospitalized for acute MI treated with coronary angiography in a university hospital in 2018 were included and analyzed. Delirium within the first 5 days of care was attributed to the MI and was defined by a NuDesc score ≥ 2, assessed as part of daily care three times a day by specially trained nurses. This research was authorized by the ethics committee (file number 387/19). Results: 626 patients with acute MI (age 68.5 ± 13.3 years, 260 STEMI, mortality 3.4%) were analyzed. Delirium occurred in 70 (11.2%) patients and was associated with a longer hospital stay (12 ± 15.9 vs 5.5 ± 4.3 days, p<0.001). Patients with delirium were significantly older than patients without (76.3 ± 11.14 vs. 67.5 ± 13.10 years, p<0.001) and more often had preexisting neurological diseases (24.3% vs. 10.8%, p<0.001) and dementia (11.4% vs. 1.4%, p<0.001).
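A minimal sketch of how odds ratios with 95% confidence intervals, like those reported next, can be extracted from a multivariable logistic regression; the data frame is synthetic and only mirrors the kind of predictors analysed here (resuscitation, dementia, alcohol abuse, peak lactate).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({
    "resuscitation": rng.integers(0, 2, n),
    "dementia": rng.integers(0, 2, n),
    "alcohol_abuse": rng.integers(0, 2, n),
    "max_lactate": rng.gamma(2.0, 1.5, n),
})
# Synthetic outcome with a built-in dementia and lactate effect
logit = -2.2 + 1.2 * df["dementia"] + 0.3 * df["max_lactate"]
df["delirium"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["resuscitation", "dementia", "alcohol_abuse", "max_lactate"]])
fit = sm.Logit(df["delirium"], X).fit(disp=0)
report = pd.concat([np.exp(fit.params).rename("OR"),
                    np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                   axis=1)
print(report)
```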
Multivariate logistic regression analysis suggested that the odds ratio for delirium was higher in patients after resuscitation (OR 7.5, 95% CI 2.1-26.7), with preexisting dementia (OR 28.9, CI 3.1-268) and in patients with alcohol abuse (OR 18, CI 2.7-120). While maximum lactate was also associated with delirium (OR 1.4, CI 1.1-1.9), infarct size or type had no effect on the incidence of delirium. In patients with MI, delirium is frequent. Its incidence is associated with clinical instability and preexisting neurological diseases rather than with infarct size. Incidence and risk factors of delirium in surgical intensive care unit MA Ali, B Saleem Aga Khan University, Anaesthesia, Karachi, Pakistan Critical Care 2020, 24(Suppl 1):P173 Introduction: Delirium in critically ill patients is common and distressing. The incidence of delirium in the ICU ranges from 45% to 87%. Although delirium is highly common among intensive care patients, it is mostly underreported. To date, limited data have been available on the prevalence of delirium in surgical patients. In a study published in 2008, the risk observed in surgical and trauma patients was 73% [1]. The purpose of this study was to determine the incidence and associated risk factors of delirium in the surgical ICU (SICU) of a tertiary care hospital. We conducted a prospective observational study in patients older than 18 years who were admitted to the surgical ICU for more than 24 hours at Aga Khan University Hospital from January 2016 to December 2016. Patients who had preexisting cognitive dysfunction or were admitted to the ICU for less than 24 hours were excluded. Delirium was assessed with the Intensive Care Delirium Screening Checklist (ICDSC). The incidence of delirium was computed, and univariate and multivariable analyses were performed to examine the relationship between outcome and associated factors. Delirium was observed in 19 of 87 patients, an incidence rate of 21.8%. Multivariable analysis showed that COPD and a pain score > 4 were among the strongest independent predictors of delirium, while analgesic exposure was not a statistically significant predictor of delirium in the multivariable analysis. Delirium is a significant risk factor for poor outcome in the surgical intensive care unit. There was an independent association of pain, sedation, COPD, hypernatremia and fever with the development of delirium. Delirium is an acute mental syndrome which may have negative consequences if it is misdiagnosed [1, 2]. The aim of this study was to determine the incidence of delirium in different intensive care units and to reveal its risk factors. The study was performed on 212 patients hospitalized in the intensive care units of the anesthesia, neurology and general surgery departments. Written informed consent was obtained from patients or their relatives. Delirium screening was performed twice daily with the CAM-ICU (Confusion Assessment Method for the ICU). Patients who met the study criteria were evaluated for possible risk factors of delirium, and the data were recorded daily. Patients were re-evaluated after treatment. The incidence of delirium was 32.5%. Delirium was found to increase with length of stay (p<0.001). The mean age of the patients with delirium was 67.46 years, higher than that of the patients without delirium (52.48 years) (p<0.001).
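The univariate screening reported around here (t-tests for continuous variables such as age, and exact or chi-square tests for binary exposures) can be sketched as follows; the data are synthetic placeholders, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
delirium = rng.integers(0, 2, 212).astype(bool)
age = np.where(delirium, rng.normal(67, 12, 212), rng.normal(52, 15, 212))
mech_vent = rng.integers(0, 2, 212).astype(bool)

_, p_age = stats.ttest_ind(age[delirium], age[~delirium])    # continuous covariate
table = [[int(np.sum(mech_vent & delirium)), int(np.sum(mech_vent & ~delirium))],
         [int(np.sum(~mech_vent & delirium)), int(np.sum(~mech_vent & ~delirium))]]
_, p_mv = stats.fisher_exact(table)                          # binary covariate
print(f"age: p={p_age:.3f}; mechanical ventilation: p={p_mv:.3f}")
```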
Visual impairment (p<0.001), hearing impairment (p=0.001), educational status (p=0.024), hypertension (p=0.002), mechanical ventilation (p=0.001), oxygen demand (p=0.002), midazolam infusion (p=0.025), propofol infusion (p=0.042), infection (p<0.001), SOFA (p=0.001), APACHE II (p<0.001), nasogastric catheter (p=0.012), aspiration (p<0.001), number of aspirations (p<0.001), enteral nutrition (p=0.025), albumin (p=0.025), steroids (p=0.024), hypercarbia (p=0.039), hypoxia (p=0.039) and sleep disturbance (p<0.001) were found to be risk factors for delirium. Oral nutrition (p<0.001) and mobilization (p=0.003) were found to protect against the development of delirium. Various factors are important in the development of delirium. These risk factors should be considered when seeking to reduce the incidence of delirium in intensive care units. An unplanned and abrupt cessation of alcohol consumption, as can occur during ICU admission, may lead to an alcohol withdrawal syndrome (AWS). The most severe clinical manifestation of AWS is delirium tremens (DT). There are no current guidelines available for AWS treatment in the ICU. The study's aim was to describe clinicians' practices for DT treatment and the outcome of DT in ICU patients. Observational retrospective cohort study in two ICUs of a university-affiliated community hospital in France. Patients diagnosed with DT during their ICU stay, as defined by the DSM-V classification, were enrolled in the study. Results: 61 patients with DT were included between 2014 and 2017. Benzodiazepines were administered to 23% of the patients to prevent AWS. As associated measures, vitamin therapy was administered to 83% of the patients, and 59% had an increased fluid intake (mean 2.5 ± 0.7 L). Concerning the curative approach to AWS, the heterogeneity of treatments was notable. Treatment combinations were frequent (66% of the patients); every patient received benzodiazepines, and the use of second-line treatments such as neuroleptics, alpha-2 agonists and propofol was variable (Figure 1). Complications of DT were the following: need for mechanical ventilation due to unmanageable agitation or acute respiratory distress (33% of the patients); self-inflicted injuries such as pulling out central lines, tubes or surgical drains (46%); falls (7%); and seizures (33%). Delirium tremens is a severe complication of untreated AWS, which can lead to serious adverse events in the ICU. The current lack of evidence concerning the management of AWS in the ICU probably explains the heterogeneity of treatments. Given the potential severity of AWS in the ICU, further evidence is required to optimize the care of AWS in ICU patients. The incidence and related risk factors of delirium in a surgical step-down unit S Yoon 1, S Yang 2, G Cho 2, H Park 2, K Park 2, J Ok 2, Y Jung 2; 1 Asan Medical Center, Nursing department, Seoul, South Korea; 2 Asan Medical Center, Seoul, South Korea Critical Care 2020, 24(Suppl 1):P177 Step-down units (SDUs) provide an intermediate level of care between the ICU and the general medical-surgical wards. Critically ill patients who are recovering after long-term intensive care, or who require monitoring after acute abdominal surgery, are admitted to SDUs. Delirium in critically ill patients is common and leads to poor clinical outcomes. It is, however, preventable if its risk factors are identified and modified accordingly. The aim was to determine risk factors associated with delirium in critically ill patients admitted to the surgical SDU at Asan Medical Center.
This is a retrospective study of critically ill patients who were admitted to the SDU from September 2018 to April 2019 and were able to express themselves verbally. Delirium status was determined using the Short-CAM tool. Data were analyzed with SPSS 13.0 software, using the t-test, Fisher's exact test and logistic regression. The incidence of delirium was 32.1% (25 of 78 patients); hypoactive delirium (12 cases, 48.0%) was the most commonly assessed, followed by hyperactive delirium (9 cases, 36.0%) and mixed type (4 cases, 5.1%). Risk factors associated with developing delirium identified in univariate analysis were age (p=0.048), admission via the ICU (p=0.041), tracheostomy (p=0.037), chronic heart failure (CHF) (p=0.017), invasive hemodynamic monitoring (p=0.041) and heart rate (p=0.037). After adjustment in multivariate analysis, the factor that remained statistically significant was old age (RR ...). We identified risk factors consistently associated with the incidence of delirium following admission to the surgical SDU. These factors help to focus on patients at risk of developing delirium, and to develop preventive interventions that are suitable for those patients. Patients with sepsis frequently develop delirium during their intensive care unit (ICU) stay, which is associated with increased morbidity and mortality. The prediction model for delirium in ICU patients (PRE-DELIRIC model) was developed to facilitate an effective preventive strategy for delirium [1]. However, the PRE-DELIRIC model has not yet been sufficiently validated outside Europe and Australia. The aim of this study was to examine the external validity of the PRE-DELIRIC model for predicting delirium using a Japanese cohort. This study is a post hoc subanalysis using the dataset from a previous study in nine Japanese ICUs, which evaluated sedative strategies with and without dexmedetomidine in adult mechanically ventilated patients with sepsis [2]. These patients were assessed daily throughout the ICU stay using the Confusion Assessment Method-ICU. We excluded patients who were delirious on the first day in the ICU, were in sustained coma throughout the ICU stay, or stayed in the ICU less than 24 h. We evaluated the predictive ability of the PRE-DELIRIC model by measuring the area under the receiver operating characteristic curve. Calibration was assessed graphically. Of the 201 patients enrolled in the original study, we analyzed 158 patients in this study. The mean age was 69.4 ± 14.0 years and 99 patients (63%) were male. Delirium occurred at least once during the ICU stay in 63 patients (40%). For the prediction of delirium, the area under the receiver operating characteristic curve of the PRE-DELIRIC model was 0.60 (0.50 to 0.69). Graphically, the prediction model was not well calibrated (Figure 1). We could not demonstrate good discrimination or calibration of the PRE-DELIRIC model for predicting delirium in mechanically ventilated patients with sepsis in Japanese ICUs. Introduction: Delirium is a serious and common complication whose treatment is in some cases difficult. The aim of the study was to evaluate the prevalence and structure of delirium and the efficacy of dexmedetomidine and haloperidol sedation in geriatric patients after femur fracture. After local ethics committee approval, 207 case records of geriatric patients with femur fracture treated from 2017 to 2018 in the Institute of Traumatology and Orthopedics in Astana were analyzed.
Patients were divided into 2 groups: in group D, patients with delirium were treated with IV dexmedetomidine (0.2-1.4 μg/kg per hour); in group G, patients with delirium were treated with IV haloperidol (0.10-0.15 μg/kg). Delirium was assessed by RASS on the day of admission and every day at 8 a.m. The prevalence and structure of delirium and the efficacy of sedation were analysed. Results: The groups did not differ in anthropometric and gender characteristics. The average age in group D (with delirium) was 81.8 ± 0.9 years, which was comparable to group G (79.7 ± 0.7 years, p=0.06). All study participants had similar comorbidities. Delirium in all patients debuted at 2.0 ± 1.4 days, with an average duration of 2.2 ± 1.2 days. The effect of dexmedetomidine was better, expressed as a 52% decrease in the duration of delirium compared to haloperidol (p<0.05). Dexmedetomidine provided more controlled and safer sedation than haloperidol. The average consumption of narcotic analgesics in the dexmedetomidine subgroup was half that in the haloperidol subgroup: the average consumption of trimeperidine hydrochloride in group D was 6.9 mg versus 14.1 mg in group G (p=0.004). In geriatric patients treated for femur fracture, treatment of delirium with dexmedetomidine was more effective than with haloperidol. With dexmedetomidine, the consumption of narcotic analgesics in the postoperative period was 50% less than with haloperidol. Live music therapy in intensive care unit MC Soccorsi 1, C Tiberi 1, G Melegari 1, J Maccieri 2, F Pellegrini 2, E Guerra 1 Intensive care units (ICU) are not comfortable for patients, relatives or next of kin. In recent years many new approaches have been described to implement the humanization of medical treatment. The positive effect of music therapy in the ICU is well described, especially in reducing delirium risk [1]. The aim of this paper is to describe the effect on patients and their families of live music performances in the ICU. After Ethical Committee approval (Procedure AOU 0018356/18, Italy), for three months (November 2018-January 2019) patients in the ICU were treated twice a week with live music therapy performed by the Coral Vecchi-Tonelli of Modena, Italy (Fig. 1). Data were collected for all awake and conscious patients. Vital parameters, GCS, RASS and CAM-ICU were recorded before, during and after the treatment, at every performance. After the treatment a feedback questionnaire was given to patients and to next of kin. Results: 18 subjects were enrolled, with a mean age of 66.66 years; the delirium rate was 16.38% before the treatment and 15.38% afterwards, and RASS did not show any difference. Over 85% of patients were satisfied, and relatives felt less anxiety. We also recorded satisfaction in relatives of patients not enrolled. The study does not demonstrate a reduction in delirium risk, given the small sample and treatment duration; nevertheless, a low delirium rate was recorded. The safety and the potential effect of music therapy are well known, and the research underlines the feelings of patients and their next of kin: the ICU is a highly stressful setting for admitted patients, and its humanization is a current topic in the medical literature. Live performances can be a moment of entertainment and probably create an interaction among patients, their families, and the medical and nursing staff: the ICU becomes more human. The high level of satisfaction pushes us to continue this experience.
Introduction: Patients undergoing medical procedures benefit from distraction techniques that reduce the need for drugs alleviating pain and anxiety. This study investigated whether medical hypnosis or virtual reality glasses (VR glasses), as an adjuvant method, reduce the need for additional drugs. In a prospective, randomized, interventional trial, patients undergoing procedures were stratified into four age groups and randomly assigned to three arms by means of a closed-envelope system. All patients received standard care for pain before the procedure; the control group received further drugs for pain and stress as indicated by the Visual Analog Scale (VAS; threshold 3/10) and Comfort score (threshold 14/30), while the two index groups received either medical hypnosis or VR glasses as an adjunct before and during the procedure. VAS and Comfort were scored continuously and analysed with the Kruskal-Wallis test. Patients, parents and healthcare providers scored their satisfaction at the end. Of 104 included patients aged 6 to 86 years, 47% were female. Regardless of age, pain and comfort scores were similar before and at the start of the procedure (VAS 3.7-4.2; Comfort 16-16.7), but from one minute after the start of the procedure onward, both VAS and Comfort decreased significantly more in the two index groups than in the control group (p<0.001), remaining far below the thresholds for both pain and stress (Figure 1). There was no advantage of one index group over the other (p=0.43). There were no adverse effects. Patients in the VR group were more satisfied than those in the standard group (p=0.02) or the hypnosis group (p=0.04). There was no significant difference in the satisfaction of parents or healthcare providers. From the very start of the intervention, the application of either medical hypnosis or VR glasses significantly reduced pain and anxiety in patients undergoing medical procedures. More studies are needed, but both are promising, safe adjuvant tools to standard pharmacological treatment. Music to reduce pain and distress due to emergency care: a randomized clinical trial NE Nouira, I Boussaid, D Chtourou, S Sfaxi, W Bahria, D Hamdi, M Boussen, M Ben Cheikh Mongi Slim Academic Hospital, Emergency Department, Tunis, Tunisia Critical Care 2020, 24(Suppl 1):P182 Recent clinical studies have confirmed the benefits of music therapy in managing pain and improving quality of care in the emergency department. The aim was to evaluate the impact of receptive music therapy on pain and anxiety induced by emergency care. Methods: A randomized controlled study in patients consulting the emergency department. Two groups were formed: a music therapy group of patients requiring venous sampling, a peripheral venous catheter or an arterial catheter, who received ten minutes of music therapy by headphones, and a control group of patients receiving the same care without music therapy. Consent was requested from all participants. The level of pain caused by the act of care was assessed by visual analog scale. Heart rate, blood pressure and the mood of the patient were assessed before and after emergency care. We also assessed patient satisfaction and adverse events. Patients admitted to the emergency room, patients with communication difficulties and non-consenting patients were not included. Results: Two hundred and forty patients were included and randomized into the two groups, 123 with music therapy and 117 without; the two groups had comparable characteristics in terms of demographic data, pathological history, and initial clinical presentation.
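The hypnosis/VR trial above compares three arms with the Kruskal-Wallis test; a minimal sketch of that comparison, on fabricated VAS scores, looks like this.

```python
# Minimal sketch of a three-arm Kruskal-Wallis comparison as in the
# hypnosis/VR trial above. Scores are fabricated stand-ins, not study data.
from scipy.stats import kruskal

vas_control  = [4.0, 5.5, 3.5, 6.0, 4.5, 5.0]
vas_hypnosis = [2.0, 1.5, 3.0, 2.5, 1.0, 2.0]
vas_vr       = [1.5, 2.0, 2.5, 1.0, 2.0, 1.5]

stat, p = kruskal(vas_control, vas_hypnosis, vas_vr)
print(f"H = {stat:.2f}, p = {p:.4f}")   # small p -> at least one arm differs
```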
After the music therapy session, a difference was noted in the mean VAS, which was 2.87 ± 1.82 in the music group versus 3.60 ± 2.06 in the control group (p<0.004; 95% CI [-1.23; -0.23]), and in mean diastolic blood pressure, which was 71.07 mmHg in the music group versus 74.27 mmHg in the control group (p=0.048; 95% CI [-6.37; -0.25]). As for mood, patients in the music therapy group were more smiling after the act of care. All patients were satisfied with their experience, and 93% would recommend this therapy to their relatives. Music therapy may reduce pain and anxiety in patients during emergency care. Music therapy is the use of music and/or its elements to achieve individual goals within a therapeutic relationship; music has been shown to have positive physiological and psychological effects on patients [1]. Patients admitted to the Intensive Care Unit (ICU) experience anxiety and stress even when sedated, negatively influencing recovery [2]. Methods: Two groups were established, a music therapy group (MG) and a control group (CG). The first underwent music therapy interventions consisting of 10-minute sessions of live music. Patients in the CG received the usual treatment established by the service protocol for weaning management, and data were collected during the same time interval. Data collection included mean arterial pressure (MAP), heart rate (HR), respiratory rate (RR), oxygen saturation (SaO2) and temperature (T). A total of 28 patients were recruited, of whom 6 had to be excluded for meeting exclusion criteria (n=22); 13 were randomized to the MG and the remaining 9 to the CG. Regarding delirium, 9 patients (40.9%) in the MG presented a positive CAM-ICU, versus 13 (59.9%) in the CG (p=0.09). When analyzing the variables in the CG and MG, no differences were observed with respect to HR, RR and MAP (Figure 1). According to these results, music therapy, as a non-pharmacological strategy for the management of anxiety and delirium in critical care patients, might be a useful tool for the management of patients being weaned from mechanical ventilation. Introduction: Coagulopathy and basopenia are common features of anaphylaxis, but the role of coagulopathy in anaphylaxis remains uncertain. The aim of this study was to evaluate the association between coagulopathy and clinical severity or basopenia in patients with anaphylaxis. We conducted a single-center, retrospective study of coagulopathy in patients with anaphylaxis. Levels of fibrin degradation products (FDP) and D-dimer were analyzed in relation to the cause of anaphylaxis, clinical symptoms, medications and outcomes. We also studied the levels of intracellular histamine in peripheral blood, as a biomarker of basophil degranulation, in relation to FDP and D-dimer. In total, sixty-nine patients were enrolled in the study, and the levels of intracellular histamine were analyzed in 14 patients. The symptoms included respiratory failure (n=47), shock (n=56), abdominal impairment (n=20), and consciousness disturbance (n=37). Thirty-two patients needed continuous intravenous vasopressors for refractory shock. The increase of FDP was significantly associated with consciousness disturbance (p=0.029) and refractory shock (p<0.0001). The increase of D-dimer was also significantly associated with refractory shock (p=0.0066).
There was no correlation between the levels of intracellular histamine and either FDP or D-dimer (p=0.13 and p=0.16, respectively). Increases in FDP and D-dimer were associated with severe symptoms of anaphylaxis but were not correlated with intracellular histamine. These results suggest that anaphylaxis is closely associated with coagulopathy through a mechanism distinct from basophil degranulation. Cardiac manifestations of H1N1 infection in a Greek ICU population E Nanou 1 , P Vasiliou 1 , E Tsigou 2 , V Psallida 1 , E Boutzouka 2 , V Zidianakis 1 , G Fildissis 1 1 Agioi Anargiroi Hospital, Attiki, Greece; 2 Agioi Anargiroi Hospital, ICU, Attiki, Greece Critical Care 2020, 24(Suppl 1):P185 Introduction: Cardiovascular involvement in influenza infection occurs through direct effects on the myocardium or through exacerbation of pre-existing cardiovascular disease [1]. The aim was to study cardiac manifestations in all patients admitted to the ICU with severe influenza infection. Clinical, laboratory, electrocardiographic, echocardiographic and hemodynamic data were retrospectively recorded in all patients admitted to the ICU due to influenza infection (winter 2018-spring 2019). Diagnosis was established by PCR on bronchial aspirates within the first 7 days after admission. Myocardial injury was defined by troponin levels >116 pg/ml (10-fold ULN). Left ventricular systolic dysfunction was defined as EF <50% and was characterized as either global or regional. Hemodynamic monitoring by the transpulmonary thermodilution method (PiCCO) was recorded in patients with shock (norepinephrine >0.1 μg/kg/min). Values are expressed as mean±SD or as median (IQR). [Fig. 1 (abstract P183): Comparison between MG and CG.] Results: Nine patients (5 males) with a mean age of 49.78±17.01 years, APACHE II 19±5.29 and SOFA score 10.50±2.93 were assessed. ICU admission was due to ARDS (7) and COPD exacerbation (2). ICU LOS was 24.44±14.19 days and the mortality rate was 18%. No history of vaccination or coronary heart disease was reported. Results are shown in Table 1. Levosimendan was administered in 2 patients with severe cardiogenic shock. In all survivors, shock and indices of myocardial dysfunction subsided by discharge. Coronary angiography was performed in 1 patient, showing no abnormalities. Mortality was attributed to septic shock and multi-organ failure. Myocardial involvement, though common in influenza patients admitted to the ICU, did not contribute to a dismal prognosis. The cardioprotective effects of levosimendan could be related to the modulation of oxidative balance. We aimed to examine the effects of levosimendan on cardiac systolic-diastolic function and plasma oxidants/antioxidants (glutathione, GSH; thiobarbituric acid reactive substances, TBARS) in patients with cardiogenic shock or with an ejection fraction (EF) lower than 30%. In 4 patients who had undergone coronary artery bypass grafting or angioplasty, cardiovascular parameters were measured at T0 (before the start of levosimendan, 0.1 mcg/kg/min), T1 (1 h after reaching the therapeutic dosage of levosimendan), T2 (at the end of the levosimendan infusion), T3 (72 h after the end of the infusion) and T4 (at the resolution of cardiogenic shock). The same time course was followed for plasma GSH and TBARS measurements. We found an improvement in cardiac output, cardiac index and systolic arterial blood pressure. EF increased from a mean of 25% to 45%. A reduction of central venous pressure and wedge pressure was also observed.
Moreover, indices of diastolic function were improved by levosimendan administration (E/E' from 14 to 6; E/A from >1 to <1) as early as T2. Of note, an improvement in GSH and TBARS was also observed early after levosimendan administration (T1) (Figure 1). The results show that levosimendan administration can regulate the oxidant/antioxidant balance as an early effect in low-cardiac-output patients. This modulation of the oxidative state may play a role in the cardioprotection exerted by levosimendan in these patients. Early administration of vasopressors and their use in the emergency department were associated with survival in septic shock; this seemed to be independent of the median MAP recorded in the ED (Table 1). We excluded all traumatic or post-myocardial infarction forms. Out of 83 patients, a tuberculous etiology was identified in 15 cases (18.1%); the mean age was 34 years and 57.8% were men. Nine patients reported a TB contact in their environment, and 5 had a medical history of pulmonary TB. On pericardiocentesis, the fluid was citrine yellow in 6 cases and hemorrhagic in 5 patients; no patient underwent surgical drainage in our series. Mycobacterium tuberculosis was found in sputum in 4 cases and ADA was positive in 4 patients. HIV serology was negative in all our patients. A 6-month antitubercular regimen of isoniazid, rifampin, pyrazinamide, and ethambutol was initiated in all our patients, with a good evolution in 7 cases, 2 deaths, 1 chronic constrictive pericarditis, 2 small pericardial effusions and 3 patients lost to follow-up. Although cardiac tamponade is rarely caused by tuberculosis, the condition remains common in endemic countries such as Morocco and affects a younger population; hence the importance of better knowledge of its prevalence, of multidisciplinary management and, more importantly, of treatment of the underlying cause with combined antitubercular medication, which has shown satisfactory results. The main perceived limiting factor is the absence of a standardized didactic program, followed by mentor availability in the residents' perception and by mentor experience in the consultants'. PoCUS teaching is present, although not optimal and not homogeneous, in Italian ACC residency schools. Standardization of the residents' ultrasound curriculum is suggested to improve ultrasound teaching. The study included a convenience sample of critically ill patients with supradiaphragmatic CVCs and a CXR for confirmation. US was used for direct confirmation of the guidewire in the internal jugular (IJV) or subclavian (SCV) vein and for visualizing the guidewire in the right atrium. To evaluate for pneumothorax, the "sliding sign" of the pleura was sought on US of the anterior chest. Results: 34 patients were included; 35% of the catheters were placed in the SCV and 65% in the IJV. It was possible to confirm the position of the CVC tip in 70.6% (23 correct, 1 incorrect on CXR) of the 34 patients (Figure 1). Overall, it was not possible to identify the guidewire in the right atrium in 11 cases (10 false negatives, 2 of them due to the presence of defibrillator leads). In the 1 case where an incorrect position was seen on CXR, it was also detected on ultrasound: US of the inserted vein plus a negative TTE confirmation. In all cases it was possible to exclude a pneumothorax by US. These results show that bedside ultrasound might be a feasible technique to confirm CVC positioning.
It is important to note that the level of the operator's expertise is significant when assessing the feasibility of this method. We had only a limited sample size, and only one misplaced catheter occurred. These preliminary results need to be confirmed on a larger scale. Central venous catheter (CVC) misplacement occurs more frequently after cannulation of the right subclavian vein compared to the other sites for central venous access. Misplacement can be avoided with ultrasound guidance by using the right supraclavicular fossa view to confirm a correct guidewire J-tip position in the lower part of the superior vena cava. However, retraction of the guidewire prior to CVC insertion may dislocate the J-tip from its desired position, thereby increasing the risk of CVC misplacement. The aim of this study was to determine the minimal guidewire length needed to maintain a correct guidewire J-tip position throughout an US-guided infraclavicular CVC placement in the right subclavian vein. Methods: 100 adult intensive care patients with a computed tomography scan of the chest were retrospectively and consecutively included in the study. The distance from the most plausible distal puncture site of the right subclavian/axillary vein to the junction of the right and left brachiocephalic veins (= vessel length) was measured using multiplanar reconstructions. In addition, measurements of the equipment provided in commonly used 15-16 cm CVC kits were performed. The minimal guidewire length was calculated for each CVC kit. The guidewires were up to 90 mm too short to maintain a correct J-tip position throughout the CVC insertion procedure in seven of nine commercial CVC kits. Four of these are shown in Table 1. When US guidance is used to confirm a correct guidewire J-tip position, retraction of the guidewire prior to CVC insertion must be avoided to ensure correct CVC-tip positioning. This study shows that most of the commonly used 15-16 cm CVC kits contain guidewires that are too short for CVC placement in the right subclavian vein. The reliability of lung B-lines to assess fluid status in patients with a long period of supine position Introduction: Ultrasound-guided cannulation is usually done using either the longitudinal or the transverse approach. The oblique approach combines the advantages of both, allowing visualization of the entire course of the needle, including the tip, and lateral discrimination of the artery from the vein [1]. The reported incidence of complete overlap of the femoral vein by the femoral artery is 8-10% [2, 3]. We describe the use of the oblique approach for successful cannulation of such a femoral vein, which is not possible with the usual approaches (Figure 1). Endothelial cells play a pivotal role in the atherogenic process. Endothelial cell dysfunction (ED) is the main risk factor for cardiovascular diseases such as hypertension, coronary heart disease (CHD) and peripheral occlusive disease (POD). These diseases significantly increase the risk of perioperative complications. Therefore, identifying patients with ED is important and should influence our prospective perioperative strategy. However, sensitive tools to diagnose ED are still missing and do not belong to our standard of care. The aim of this study was the validation of a new non-invasive method to detect ED and its correlation with a set of established and new endothelial biomarkers.
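The guidewire study above computes a minimal guidewire length per CVC kit. A hedged sketch of that arithmetic follows; the decomposition (intravascular course plus catheter length plus a handling margin) is an illustrative assumption, not the authors' published formula.

```python
# Minimal sketch of guidewire-length arithmetic in the spirit of the CT study
# above. The decomposition and the 100 mm handling margin are assumptions.
def minimal_guidewire_length_mm(vessel_length_mm: float,
                                catheter_length_mm: float,
                                handling_margin_mm: float = 100.0) -> float:
    """The wire must reach the target J-tip position AND still protrude far
    enough from the catheter hub for the operator to hold it."""
    return vessel_length_mm + catheter_length_mm + handling_margin_mm

# Example: 120 mm intravascular course, 160 mm (16 cm) CVC.
print(minimal_guidewire_length_mm(120.0, 160.0), "mm")
```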
The cohort included 20 preoperative patients without a relevant cardiovascular history and 20 patients with known peripheral occlusive disease (POD). We used the non-invasive EndoPAT® technology from Itamar Medical to measure ED by changes in vascular tone before and after occlusion of the brachial artery and to calculate a reactive hyperemia index (RHI). In addition, we measured established markers and alternative biomarkers that potentially indicate vascular disease, such as substrates and products of NO metabolism (L-arginine, asymmetric/symmetric dimethylarginine (ADMA/SDMA)), von Willebrand factor (vWF) and sphingosine-1-phosphate (S1P). RHI was able to identify patients with POD. RHI was significantly lower in patients with clinical signs and symptoms of POD (p<0.05). Among the other markers, ADMA was significantly higher in POD patients compared to controls and correlated with RHI. The PAT technology is a helpful non-invasive functional test to measure ED and seems able to identify patients with vascular disease. In the future, a combination of anamnesis, new diagnostic tools and biomarkers may further increase our sensitivity in identifying at-risk patients. Single-lumen 5Fr and triple-lumen 6Fr peripherally inserted central catheters (PICCs) for cardiac output assessment by transpulmonary thermodilution S D´Arrigo 1 Achieving effective critical care in low- and middle-income countries is a global health goal [1], which includes the provision of effective point-of-care ultrasound [2]. We sought to establish Zambia's first focused critical care echocardiography training programme in a 16-bedded ICU at University Teaching Hospital, Lusaka. The programme was accredited by the UK Intensive Care Society FICE programme, with teaching adapted for local disease patterns such as tuberculous pericardial effusions. Parasternal, apical and subcostal windows were used to assess ventricular dysfunction, hypovolaemia, pleural effusion, alveolar interstitial syndrome and pneumothorax. Zambian doctors working with critically ill patients received an intensive one-day course, followed by mentored scanning at the bedside. Teaching was delivered by visiting fellows from the UK who are accredited in echocardiography and experienced ultrasound educators. Patients with an abnormal mean CI or HR suffered increased hospital mortality. Abnormality of the mean SVI was not associated with mortality. These data support accurate measurement of CI as a hemodynamic target and the normal range defined for CI. Since CI also carries the HR information, CI seems to be a more important target than SVI. Our data cannot necessarily be extrapolated to less invasive and less precise measurements of CI. An evaluative study of a novel device with auto-aspirating and pressure-indicating functions for safe central venous catheterization LY Lin, WF Luo, CY Tsao National Taiwan University Hospital, Taipei, Taiwan Critical Care 2020, 24(Suppl 1):P203 Previous studies have shown that 0.8% of CVC attempts result in arterial punctures that are not recognized by blood color. To overcome this problem, our team developed the concept of a pressure-detecting syringe that can indicate arterial puncture [1]. Building on that research, different springs, the actuator of the design, were evaluated to optimize the proposed device and reduce the risk of the CVC procedure. Tested devices: the inner spring is set between the pressure indicator and the plunger (Fig. 1A). Three springs were tested.
Test conditions: blood samples were simulated by glucose solutions with absolute viscosities of 2 and 6 mPa·s. Different blood pressures were applied to simulate the artery and the vein (Fig. 1B). The response time (RT) is defined as the time required to show the indicating signal (IS), i.e. the movement of the piston from position A2-1 to A2-2 in Fig. 1B. The RT is strongly influenced by the spring (Fig. 1B), but every design can show the IS when the pressure is higher than 50 mmHg, the assumed minimum arterial pressure. The RT of S1, the strongest spring design, is about 10 s under the 50-mmHg, high-viscosity condition. During our tests we found that the user can recognize the IS before the piston has fully moved from A2-1 to A2-2 (Fig. 1B). Thus, we believe the 10-s RT, the worst case, is still acceptable. We also found that a weak spring force may make it difficult to empty the syringe, because the spring must overcome the blood pressure and the friction between the piston and the barrel. As a result, it was difficult for S3 to empty the syringe completely even when the blood pressure was only 30 mmHg: the spring is compressed as in Fig. 1B (A2-2) and fails to push the piston when the plunger is pushed forward, which is not acceptable for clinical use. The results indicate the feasibility of using the device to facilitate CVC, and we believe the S1 or S2 springs are more suitable for future application. Introduction: Models using standard statistical features of hemodynamic vital sign waveforms (VS) enable rapid detection of covert hemorrhage at a predetermined bleed rate [1]. By featurizing interactions between VS, we can train powerful hemorrhage detectors robust to unknown bleed rates. Waveforms (arterial, central venous and pulmonary arterial pressures; peripheral and mixed venous oxygen saturation; photoplethysmograph; ECG) of healthy pigs were monitored for 20 min prior to and during a controlled hemorrhage at 20 mL/min (N=38) and 5 mL/min (N=13). Two sets of VS features were extracted: statistical features [1], and maximal pairwise cross-correlations between pairs of VS within a 5-s lag over various time window sizes (30 s, 60 s, 180 s, 300 s); both were normalized with the pre-bleed data of each given animal. For each feature set, a tree-based (ERT) model [2] was trained and tested in a one-animal-out setting to mitigate overfitting on the 20 mL/min cohort, and another was trained on the 5 mL/min cohort and tested on the 20 mL/min cohort. We evaluated models with Activity Monitoring Operating Characteristic curves [3], which measure the false alert rate as a function of time to detect bleeding. Models using cross-correlations showed no significant deterioration of performance when applied to detect bleeding at rates different from those they were trained on, while standard models required 64 s longer on average to detect hemorrhage at a 1% false alert rate in the previously unseen setting (Figure 1). Correlations between VS data encode physiologic responses to hemorrhage in a way that is independent of the actual bleed rate. This enables training effective hemorrhage detectors using only limited experimental data, and using them in practice to detect bleeding that occurs at rates other than those used in training. We validated a dataset of 634 data lines containing hemodynamic variables and treatment options. We selected nine hemodynamic variables as inputs. Furthermore, data were collected regarding underlying conditions: heart failure, septic shock, renal failure or respiratory failure, or a combination.
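A minimal sketch of the pairwise cross-correlation featurization described in the hemorrhage-detection study above: for each pair of vital-sign waveforms, take the maximal correlation within a ±5-s lag over a sliding window. The 10-Hz sampling rate and the synthetic signals are assumptions for illustration.

```python
# Minimal sketch of maximal pairwise cross-correlation features over a window,
# as in the hemorrhage-detection abstract above. Signals are synthetic.
import numpy as np
from itertools import combinations

def max_xcorr(x: np.ndarray, y: np.ndarray, max_lag: int) -> float:
    """Maximal Pearson correlation of x vs y over lags in [-max_lag, max_lag]."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:lag], y[-lag:]
        best = max(best, np.corrcoef(a, b)[0, 1])
    return best

fs = 10                                 # assumed sampling rate, Hz
max_lag, window = 5 * fs, 60 * fs       # 5-s lag, 60-s window
rng = np.random.default_rng(1)
t = np.arange(120 * fs) / fs
signals = {                             # stand-ins for ART/CVP/PPG waveforms
    "ART": np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size),
    "CVP": np.sin(2 * np.pi * 1.2 * t + 0.5) + 0.1 * rng.standard_normal(t.size),
    "PPG": rng.standard_normal(t.size),
}
seg = {k: v[-window:] for k, v in signals.items()}      # most recent window
features = {f"{a}~{b}": max_xcorr(seg[a], seg[b], max_lag)
            for a, b in combinations(seg, 2)}
print(features)
```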
We applied DataStories regression to the dataset (Turnhout, Belgium, www.datastories.com). Six different interventions were analyzed as KPIs: administration or removal of fluids, increasing or decreasing inotropes, and increasing or decreasing vasopressors. Finally, we elaborated and challenged 63,537 predictive models to generate a decision algorithm to predict each KPI. We first looked at how each hemodynamic parameter impacts the prediction of each KPI individually, and performed a standard correlation analysis as well as a more involved analysis of the mutual information content between each KPI and all other hemodynamic parameters individually. A confusion matrix and variable importances were obtained for each KPI. The baseline hemodynamic parameters were: GEDVi 738±218 ml/m2, EVLWi 11.2±5.5 ml/kg PBW, SVV 14.8±8%, MBP 77.8±16.5 mmHg, HR 94.2±23.5 bpm, CI 3.3±1.2 L/min·m2. The regression analysis identified the variables of importance for each of the different interventions (Fig 1A). Based on these results, the hemodynamic variables (HR, MBP, GEDVi, EVLWi, CI, SVV) were used to develop the final HemoGuide prediction model (Fig 1B). The HemoGuide app can be used to advise physicians on basic therapeutic decisions at the bedside or as an educational tool for students. With the collection of new data, the accuracy of the system may grow over time. The next step of the project is to develop a more sophisticated suite: the ICU cockpit. Feedback function contributes to accurate measurement of capillary refill time R Kawaguchi 1 , TA Nakada 2 , M Shinozaki 2 , T Nakaguchi 2 , H Haneishi 2 , S Oda 2 1 Chiba University, Department of Emergency and Critical Care Medicine, Chiba, Japan; 2 Chiba University, Chiba, Japan Critical Care 2020, 24(Suppl 1):P206 Capillary refill time (CRT) is well known as an indicator of peripheral perfusion. However, it has been reported to show intra-observer variance, partly because of manual compression and naked-eye measurement of the nailbed color change. We hypothesized that a feedback function could standardize the measurement. We developed a novel portable CRT measurement device with an OLED display that gives feedback on whether the strength of the nailbed compression is sufficient and counts the time. We set the target strength and time at 5 N and 5 seconds according to the study we reported previously [1]. Twenty examiners measured CRT with and without the feedback function. The pressing strength and time during the measurement were evaluated. There was a significant difference in pressing strength and time between CRT measurements with and without the feedback function (strength: p<0.001; time: p<0.001). Furthermore, intra-examiner variance was significantly reduced with the feedback function (strength: p<0.001; time: p<0.001). Of all measurements without the feedback function, 41% were outside the optimal strength range, while with the feedback function 100% achieved the target range. Without the feedback function, 12% did not reach the optimal time, while with the feedback function 100% did. In total, 49% of the measurements without feedback did not achieve the optimal pressing strength and time. The feedback function for CRT measurement, guiding examiners to an optimal pressing strength and time, fulfilled the required measurement conditions and reduced intra-examiner variance. Our novel portable device should assist accurate CRT measurement regardless of personal work experience.
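The CRT device above gives feedback on whether the 5 N / 5 s compression target is met; a minimal sketch of such feedback logic, with an assumed tolerance band and a hypothetical force-sample stream, could look like this.

```python
# Minimal sketch of compression-feedback logic in the spirit of the CRT
# device above. Targets follow the abstract (5 N, 5 s); the tolerance band
# and the force-sensor input are illustrative assumptions.
from typing import Sequence

TARGET_FORCE_N = 5.0     # target compression strength (abstract: 5 N)
TOLERANCE_N = 0.5        # assumed acceptance band around the target
TARGET_TIME_S = 5.0      # target compression time (abstract: 5 s)

def compression_ok(force_samples: Sequence[float], fs_hz: float) -> bool:
    """True if force stayed within the target band for >= TARGET_TIME_S."""
    needed = int(TARGET_TIME_S * fs_hz)
    run = 0
    for f in force_samples:
        run = run + 1 if abs(f - TARGET_FORCE_N) <= TOLERANCE_N else 0
        if run >= needed:
            return True
    return False

# 6 s of steady 5.1 N pressure sampled at 50 Hz -> target met.
print(compression_ok([5.1] * 300, fs_hz=50.0))
```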
Introduction: The aim of the study was to detect differences in conjunctival microcirculation between septic patients and healthy subjects, and to evaluate the course of conjunctival microcirculatory changes in survivors and non-survivors over a 24-hour period. This single-centre prospective observational study was performed in a mixed ICU in a tertiary teaching hospital. We included patients with sepsis or septic shock within the first 24 hours after ICU admission. Conjunctival imaging using an IDF videomicroscope, as well as systemic hemodynamic measurements, was performed at three time points: at baseline, and 6 hours and 24 hours later. Baseline conjunctival microcirculatory parameters were compared with healthy controls. A total of 48 patients were included in the final assessment and analysis. Median APACHE II and SOFA scores were 16 (12-21) and 10 (7-12), respectively. Forty-four patients (92%) were in septic shock, and 48 (100%) required mechanical ventilation. Nineteen patients were discharged alive from the intensive care unit. We found significant reductions in all conjunctival microcirculatory parameters when comparing septic and healthy subjects. We found a significantly lower proportion of perfused vessels and microvascular flow index (MFI) of small vessels at all three time points in non-survivors compared with survivors. In non-survivors we observed no significant changes in conjunctival microcirculatory parameters over time. However, survivors had a significantly improved MFI of small vessels at the second and third time points compared to the first. Microcirculatory perfusion in the conjunctiva was altered in septic patients. Over the 24-hour evaluation, survivors had better microcirculatory flow than non-survivors, with incremental improvement of the microvascular flow index. Sixteen healthy pigs were centrally cannulated for veno-arterial ECMO, and precision flow probes were placed on the main trunk of the pulmonary artery for reference. 10-ml boluses of iced 0.9% saline solution were injected into the ECMO circuit and the right atrium at different ECMO flow settings (4, 3, 2, 1 L/min). Rapid-response thermistors of standard PA catheters in the ECMO circuit and the pulmonary artery recorded the temperature change. After calibration of the catheter constants for different injection volumes in the ECMO circuit, the distribution of injection volumes passing each circuit was assessed and enabled calculation of pulmonary blood flow. Analysis of the exponential decay of the signals allowed assessment of right ventricular function. Calculated blood flow correlated well with true blood flow (r2 = 0.74, p < 0.001; Figure 1, Panel A, individual measurements). Organ congestion is a plausible mediator of adverse outcomes in critically ill patients. Point-of-care ultrasound (POCUS) is widely available and could enable clinicians to detect signs of venous congestion at the bedside. The aim of this study was to develop prototypes of congestion scores and to determine their respective ability to predict acute kidney injury (AKI) after cardiac surgery. This is a post-hoc analysis of a prospective study in 145 patients in which repeated daily measurements of hepatic, portal and intra-renal vein Doppler and inferior vena cava (IVC) ultrasound were performed before surgery and during the first 72 hours after cardiac surgery [1]. Five prototypes of venous excess ultrasound (VExUS) scores combining multiple ultrasound markers were developed (Figure 1).
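A hedged sketch of a VExUS-style grading rule in the spirit of the score prototypes above; only the idea that grade 3 combines a dilated IVC (>2 cm) with severe flow abnormalities in multiple Doppler patterns is taken from the abstract, and the remaining thresholds are illustrative assumptions.

```python
# Minimal sketch of a VExUS-style grading function. Only the grade-3 rule
# (dilated IVC plus multiple severe Doppler patterns) follows the abstract;
# the rest of the grading is an illustrative assumption.
def vexus_grade(ivc_cm: float, severe_hepatic: bool, severe_portal: bool,
                severe_intrarenal: bool) -> int:
    n_severe = sum([severe_hepatic, severe_portal, severe_intrarenal])
    if ivc_cm < 2.0:
        return 0          # no significant congestion without IVC dilation
    if n_severe >= 2:
        return 3          # severe congestion: multiple severe patterns
    if n_severe == 1:
        return 2
    return 1

print(vexus_grade(2.3, severe_hepatic=True, severe_portal=True,
                  severe_intrarenal=False))   # -> 3
```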
The association between each score and AKI was assessed using time-dependent Cox models as well as conventional performance measures of diagnostic testing. A total of 706 ultrasound assessments were analyzed. We found that defining severe congestion as the presence of severe flow abnormalities in multiple Doppler patterns together with a dilated IVC (>2 cm), corresponding to grade 3 of the VExUS C score, showed the strongest association with the development of subsequent AKI compared with other combinations of ultrasonographic features (HR: 3. …). There is increasing awareness of the consequences of fluid administration, which has led to the development of methods that evaluate the effects of fluid loading on the cardiocirculatory system. However, most methods used in clinical practice investigate the effects of fluids on cardiac function rather than on the determinants of venous return. Besides the volume of fluid, the determinants of fluid loading are the distribution of blood volume and the availability of the vascular bed. In this study we aimed to test non-invasively the effects of fluid administration on the venular compartment of skeletal muscle. In addition to the mean systemic filling pressure (msfp), we calculated changes in the stressed and unstressed volumes (Vs, Vu) and the availability of the venular bed. We enrolled 10 critically ill patients in our Intensive Care Unit. We assessed volumes and pressures by near-infrared spectroscopy on the forearm, using graded venous occlusions in steps of 5 mmHg from 50 to 0 mmHg. The msfp, Vu and Vs were measured as previously reported (Microcirculation 2014; 21:606-614). Vascular bed availability was measured by changes in the volume recruited by the occlusion maneuvers. All measurements were made at baseline and after a fluid load ranging from 250 to 500 ml. Values are expressed as median and interquartile range. The Wilcoxon test was used to compare data, and p<0.05 was considered significant. Introduction: Hypotension is a common side effect of general anesthesia (GA) and is associated with organ hypoperfusion and poor perioperative outcome [1]. Post-induction hypotension (PIH) is caused by the depressant cardiovascular effect of anesthetic drugs and can be amplified by hypovolemia. The aim of this study was to assess the ability of two echocardiographic fluid responsiveness markers to predict PIH: the inferior vena cava collapsibility index (IVC-CI) and the velocity-time integral change (ΔVTI) after passive leg raising. Sixty patients >50 years of age scheduled for elective surgery were included. IVC-CI and ΔVTI were measured before GA induction. The anesthesia protocol, fluid infusion and vasopressor administration were standardized in all patients. PIH was defined as a mean arterial pressure (MAP) <65 mmHg or a relative decline from the pre-induction value of at least 30% within 12 minutes of GA induction. Receiver operating characteristic (ROC) curve analysis was used. The optimal cutoff was selected to maximize the Youden index (sensitivity + specificity − 1). The measurement of IVC-CI and/or ΔVTI was unsuccessful in seven patients (11.6%). PIH occurred in 32 patients (incidence 53%). The areas under the ROC curves are shown in Figure 1. Preload responsiveness might be detected by the change in cardiac index (ΔCImini) induced by a "mini-fluid challenge" (mini-FC) of 100 mL, or even by the change (ΔCImicro) in response to a "micro-fluid challenge" (micro-FC) of 50 mL.
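The PIH study above selects its cutoff by maximizing the Youden index on a ROC curve; a minimal sketch, on fabricated marker values, follows.

```python
# Minimal sketch of ROC analysis with Youden-index cutoff selection, as in
# the PIH abstract above. Marker values and outcomes are fabricated.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
marker = np.r_[rng.normal(40, 10, 30), rng.normal(55, 10, 30)]  # e.g. IVC-CI, %
outcome = np.r_[np.zeros(30, int), np.ones(30, int)]            # 1 = PIH

fpr, tpr, thresholds = roc_curve(outcome, marker)
youden = tpr - fpr                      # sensitivity + specificity - 1
best = np.argmax(youden)
print(f"optimal cutoff = {thresholds[best]:.1f}, "
      f"sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f}")
```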
However, the smaller the fluid challenge, the larger the "grey zone" of diagnostic uncertainty. We tested whether (1) micro- and mini-FC monitored by calibrated pulse contour analysis detect preload responsiveness and (2) adding 50 mL when the result of a micro-FC is within the grey zone improves diagnostic accuracy. In 30 patients with circulatory failure, we infused 50 mL of saline over 30 s followed by 50 mL over 60 s. We measured ΔCImicro and ΔCImini by pulse contour analysis (PiCCO2). Preload responsiveness was defined by an increase in CI (ΔCIPLR) ≥10% during a passive leg raising test. Diagnostic uncertainty was described by calculating the grey zone after bootstrapping. ΔCImicro was larger in responders than in non-responders (5. …). For the micro-FC, the area under the receiver operating characteristic curve was 0.975±0.03 (threshold 1%), while it was 0.955±0.03 for the mini-FC (threshold 4%). For the micro-FC, the grey zone ranged from 0.82% to 3.47% and included 9 (30%) patients. For the mini-FC, it ranged from 2.8% to 6.8% and included 9 (33%) patients, among whom 6 were already in the grey zone of the micro-FC. When evaluated by pulse contour analysis, micro- and mini-FC reliably detect preload responsiveness but with a large diagnostic uncertainty. It seems that adding 50 mL more fluid to a micro-FC when its result is within the grey zone does not improve the diagnostic accuracy. The study is ongoing. The Starling-SV bioreactance device (Cheetah Medical) reliably detects passive leg raising (PLR)-induced changes in cardiac index (ΔCI). We tested whether it can also track the small and short-lived ΔCI induced by the end-expiratory occlusion (EEXPO) test, and whether shortening the time over which it averages cardiac output (24 s in the commercial version) improves the detection. In 42 mechanically ventilated patients, during a 15-s EEXPO, we measured ΔCI (in absolute value and in percentage) through calibrated pulse contour analysis (CIpulse, PiCCO2 device) and Starling-SV. For the latter, we considered both CIStarling-24, provided by the commercial version, and CIStarling-8, obtained by averaging the raw data over 8 s. We calculated the correlation between ΔCIpulse and both ΔCIStarling-24 and ΔCIStarling-8, and the area under the receiver operating characteristic curve (AUROC) for detecting preload responsiveness, defined by a PLR test. When considering absolute values, the correlation coefficient r between ΔCIpulse and ΔCIStarling-24 was 0.362 (p=0.02), which was lower than that between ΔCIpulse and ΔCIStarling-8 (… for r comparison). When considering percentage changes, no correlation was observed between ΔCIpulse and ΔCIStarling-24. Conversely, the correlation coefficient between ΔCIpulse and ΔCIStarling-8 was 0.402 (p=0.01), but it was lower than the one obtained for absolute values (p=0.04 for r comparison). EEXPO-induced ΔCIStarling-8, both in absolute value and in percentage, detected preload responsiveness with AUROCs of 0.90 (sensitivity 83%, specificity 87%) and 0.89 (sensitivity 83%, specificity 79%), respectively. Shortening the averaging time of the bioreactance signal increases the reliability of the Starling-SV device in detecting EEXPO-induced ΔCI. Moreover, the accuracy of the method is increased when absolute rather than percentage changes of CI are considered. Fluids are among the most frequently prescribed drugs in intensive care, particularly in patients with circulatory failure.
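A minimal sketch of one way to obtain a grey zone by bootstrapping, as in the micro-/mini-fluid-challenge study above: resample patients, recompute the Youden-optimal cutoff each time, and take a central interval of the resampled cutoffs. This is one of several published grey-zone definitions, and the data are fabricated.

```python
# Minimal sketch of grey-zone estimation via bootstrap of the Youden-optimal
# cutoff. One of several grey-zone definitions; data are fabricated.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
delta_ci = np.r_[rng.normal(1.0, 1.0, 15), rng.normal(5.0, 2.0, 15)]  # % change
responder = np.r_[np.zeros(15, int), np.ones(15, int)]

cutoffs = []
for _ in range(1000):
    idx = rng.integers(0, len(delta_ci), len(delta_ci))   # resample patients
    fpr, tpr, thr = roc_curve(responder[idx], delta_ci[idx])
    cutoffs.append(thr[np.argmax(tpr - fpr)])             # Youden-optimal cutoff
lo, hi = np.percentile(cutoffs, [5, 95])
print(f"grey zone: {lo:.2f}% to {hi:.2f}%")
```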
Yet very little is known about their pharmacodynamic properties, and this topic has been left largely unexplored. There is a lack of strong scientific evidence in current guidelines for fluid administration in shock. Several factors may affect the hemodynamic efficacy of fluids, among which is the infusion rate. The aim of this study was to assess the influence of the fluid administration rate on fluid pharmacodynamics, in particular the mean systemic pressure (Pms). We conducted a prospective observational study in 17 patients with circulatory failure to compare two volume expansion strategies. When a patient required a fluid bolus, 500 mL of normal saline was administered and several hemodynamic parameters were recorded continuously: cardiac output (CO), arterial pressure (AP) and mean systemic pressure (Pms). The infusion rate was left to the discretion of the attending physician, and "slow" and "fast" groups were defined based on the median infusion time. The fluid effect was measured by the area under the curve (AUC), the maximal effect (Emax) and the time to maximal effect (tmax) for each hemodynamic variable. Results: The Pms AUC was higher in the "fast" group than in the "slow" group (p=0.043). We observed a shorter tmax and a higher Emax for Pms in the "fast" group compared to the "slow" group (p=0.039 and p=0.02, respectively). Regarding CO, tmax was also shorter in the "fast" group (p=0.041); AUC and Emax were similar between the two groups. The fluid effect dissipated within 60 minutes after the end of the fluid infusion in every patient in both groups. The decreasing slope from the maximal effect was comparable between the groups, for Pms and CO alike. The effect of a 500 mL fluid bolus in septic shock patients vanished within one hour. A faster infusion rate increased the maximal effect and shortened the delay to reach it. The study is ongoing. Fluid management in the control arm of sepsis trials AA Anparasan, AC Gordon, MK Komorowski Imperial College London, Department of Surgery and Cancer, London, United Kingdom Critical Care 2020, 24(Suppl 1):P219 In the past, high-volume intravenous fluid resuscitation in severe sepsis and septic shock was common. More recently, concerns over the harmful effects of this practice have led some clinicians to adopt less liberal fluid strategies. We sought to analyse temporal trends in fluid administration in the control arms of recent adult sepsis trials and to assess any correlation with patient severity and mortality. A literature search was conducted to identify relevant randomized controlled trials reporting fluid administration, published after 2000. We recorded 4 outcomes: the total amount of IV fluid administered in the control arms of these trials between hospital admission and hour 6 and hour 72 following trial enrolment, the mortality rate at the latest reported time point, and the APACHE II score at admission. We computed the Pearson correlation coefficient and linear regression between study dates and the 4 outcomes. We identified 9 relevant trials [1-9], which recruited a total of 2,444 patients in their control arms, from 1997 to 2018. The temporal analysis revealed no obvious trend in the total volume of IV fluid given by hour 6 following trial enrolment (correlation p=0.94) (Figure 1). However, the total volume of fluid given by hour 72 decreased significantly over the period of interest (R=-0.78, p=0.02).
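The temporal trend analysis above boils down to a Pearson correlation and a linear fit of fluid volume against trial year; a minimal sketch, on fabricated values, follows.

```python
# Minimal sketch of the trend analysis in the sepsis-trials abstract above:
# Pearson correlation and linear regression of control-arm fluid volume
# against trial year. Numbers are fabricated stand-ins.
import numpy as np
from scipy.stats import pearsonr, linregress

year = np.array([1997, 2001, 2004, 2008, 2011, 2014, 2016, 2017, 2018])
fluid_72h_ml = np.array([11000, 10500, 9800, 9000, 8200, 7500, 7000, 6800, 6500])

r, p = pearsonr(year, fluid_72h_ml)
fit = linregress(year, fluid_72h_ml)
print(f"R = {r:.2f}, p = {p:.3f}, slope = {fit.slope:.0f} mL/year")
```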
In parallel, we observed a decrease in mortality (R=-0.6, p=0.08), but there was no evidence of a decrease in illness severity over time (p=0.84). We found that in RCTs published over the last two decades, the amount of intravenous fluid given to patients with sepsis in the initial 6 hours did not appear to change; however, less intravenous fluid was given over the first three days. Upcoming large RCTs will test the safety and efficacy of restrictive fluid administration approaches in sepsis. Clinical practice guidelines recommend prompt intravenous (IV) fluid resuscitation for pediatric sepsis, including an initial fluid bolus of 20 mL/kg [1]. However, recent evidence is conflicting as to the effectiveness, volume, and consequences of aggressive fluid resuscitation in septic children. Therefore, we sought to determine the epidemiology of early IV fluid resuscitation in an integrated health system, specifically at community hospital emergency departments (EDs). We studied a retrospective cohort of pediatric patients (ages >1 month to <18 years) with sepsis identified in electronic health record data at 11 community EDs in southwestern Pennsylvania from 2010 to 2014. Sepsis was defined as 1) suspected infection (a combination of fluid culture collection and administration of antibiotics) and 2) organ dysfunction (pediatric SOFA score ≥1) within 24 hours of suspected infection. Fluid bolus therapy was defined as electronic documentation of administration of a 0.9% normal saline IV bolus within 1 hour of the time of sepsis onset. Results: Among 1,247 patients with pediatric sepsis, 513 (41%) received IV fluid bolus therapy within 1 hour of the time of sepsis onset. The volume of fluid administered ranged from 2 mL/kg to 67 mL/kg (Figure 1, Panel A), corresponding to a median volume of 20 mL/kg (IQR 17-22 mL/kg). Patients who received ≥20 mL/kg of fluids (n=258, 50%) were younger (mean age 5 years, SD 5, vs. 9 years, SD 6; p<0.001), more often had blood cultures collected during evaluation (86% vs. 76%, p=0.003), and were more often transferred to another facility (48% vs. 33%, p<0.001) compared to patients who received <20 mL/kg of fluids (n=255, 50%). The mean fluid bolus volume within 1 hour of sepsis onset by hospital ranged from 12 mL/kg to 24 mL/kg (Figure 1, Panel B). In a cohort of community emergency departments, 41% of septic children received intravenous fluid boluses within one hour, and of those, only one half received volumes concordant with the guidelines (Figure 1). A wide range of fluid balance exists in septic shock patients cared for in the ICU. Trends of serum albumin in septic and non-septic critically ill Introduction: The link between hypoalbuminaemia and poor outcomes in critical care is well established [1]. Limited data are available on serum albumin trends during critical illness [2]. In this study we assessed trends in serum albumin for up to 7 days in both septic and non-septic critically ill patients. We retrospectively examined the records of 1107 adult patients admitted to critical care at the Royal Liverpool University Hospital between 2008 and 2014. We then excluded patients who did not have albumin data available for the first 7 days, leaving 758 patients. 506 patients (66.8%) had sepsis, and of these 116 had died by day 28. Of the 252 non-septic patients (33.2%), 40 had died by day 28. Albumin levels were collected for 7 days from admission to critical care, in addition to other demographic and biochemical data.
Statistical analysis was performed using repeated-measures analysis. Septic patients had lower serum albumin than non-septic patients throughout the 7-day period (p<0.001). We observed a decrease in albumin by day 2 in all groups, with levels increasing over the subsequent days. There was no difference in daily serum albumin between non-septic patients who survived and those who died. This is, to our knowledge, the first study to compare albumin trends in septic and non-septic critically ill patients over 7 days. Further research is needed to elucidate the optimal recipients and timing of albumin therapy. Introduction: Burn injury is characterized by marked inflammation, capillary leakage, and profound hemodynamic alterations. Early albumin resuscitation is avoided for fear of a paradoxical fluid escape into the interstitium. On the other hand, administration of crystalloids in massive amounts causes tissue edema and fluid extravasation, which deteriorate tissue perfusion by increasing the oxygen diffusion distance. Albumin administration could reduce the fluid volume required to maintain hemodynamic stability in this population. We investigated whether albumin improves tissue perfusion and microcirculation by reducing tissue edema. This is an observational study conducted in the Burn Unit of Maasstad Hospital, Rotterdam. Patients with burns of more than 15% of total body surface area (TBSA) were included in the study. Sublingual microcirculation was measured at admission (T0) and at 4 (T4) and 12 (T12) hours after burn injury. Total vessel density (TVD) and functional capillary density (FCD) were analyzed. Fluid management was calculated according to the modified Parkland formula. Albumin (20%) infusion was started 12 hours after the burn insult. A total of nine patients were recruited between January and December 2019. Patients were included in the study 5.7±2.3 hours after the insult, with a mean TBSA of 36±22%. The amount of crystalloid infused was 2718±3348 ml and 8501±5230 ml at T0 and T12, respectively. Within the first 12 h (T12), 502±386 ml of albumin was given. TVD decreased from 23.6±2.2 at T0 to 20±1.3 at T4 (p<0.05) (Figure 1). Introduction: Spontaneous bacterial peritonitis (SBP) accounts for ≥24% of the bacterial infections that occur in patients with cirrhosis, and SBP has a high mortality rate (20% to 50%). Albumin infusion has been shown to improve the outcome of SBP. The aim of this study was to examine the impact of albumin infusion on hospital length of stay (LOS) in cirrhotic patients with SBP. We utilized a nationwide electronic health record data set (Cerner Health Facts®) to extract real-world data on adult patients (≥18 years old) with cirrhosis and SBP who received antibiotics and were admitted between January 1, 2009, and April 30, 2018. International Classification of Diseases (ICD-9/10) codes were used to identify cirrhosis and SBP. We used laboratory data to calculate the Model for End-stage Liver Disease-Sodium (MELD-Na) score and vital signs data to calculate the quick Sepsis-related Organ Failure Assessment (qSOFA) score at baseline for each encounter. A generalized linear model was used to assess the relationship between albumin infusion and hospital LOS. Results: There were 2,131 encounters identifying patients with SBP and cirrhosis, of which 1,661 survived hospitalization. Albumin was infused within 24 hours of admission ('early albumin') in 43% (n=718), after 24 hours in 31% ('late albumin', n=517), and not administered in 26% ('no albumin', n=426).
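The SBP study above relates early albumin to hospital LOS with a generalized linear model (as does the septic-shock study that follows). A minimal sketch of such a model follows; the gamma family with log link is an assumption (the abstracts do not state the family), and the data are fabricated.

```python
# Minimal sketch of a GLM relating early albumin to hospital LOS, in the
# spirit of the abstracts here. Gamma family / log link is an assumption;
# data are fabricated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({
    "early_albumin": rng.integers(0, 2, n),    # 1 = infused within 24 h
    "meld_na": rng.normal(22, 5, n),           # severity adjuster
})
# Synthetic LOS: early albumin shortens expected LOS by ~15% here.
mu = np.exp(2.3 - 0.16 * df.early_albumin + 0.01 * df.meld_na)
df["los_days"] = rng.gamma(shape=2.0, scale=mu / 2.0)

model = smf.glm("los_days ~ early_albumin + meld_na", data=df,
                family=sm.families.Gamma(link=sm.families.links.Log())).fit()
# exp(coef) - 1 = proportional change in expected LOS per unit of predictor.
print((np.exp(model.params["early_albumin"]) - 1) * 100, "% change in LOS")
```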
MELD-Na was higher at presentation in early-albumin cases than in late- or no-albumin cases (means 24.0 and 19.5). Unadjusted LOS was lower in patients receiving early albumin (8.7 days versus 10.4 days). Risk-adjusted analysis demonstrated that early albumin was associated with a 17.5% reduction in LOS (95% CI 12.6%-22.2%, p<0.0001). In these real-world data, albumin infusion within 24 hours of admission in patients with cirrhosis and SBP was associated with a shorter hospital stay despite more severe illness. Early albumin may not only improve clinical outcomes but may also reduce the costs of hospitalization in cirrhotic patients with SBP. Early albumin use in patients with septic shock is associated with a shorter hospital stay: real-world evidence in the United States. Introduction: Septic shock is among the most common critical care illnesses and its incidence is rising, with mortality in excess of 35%. Septic shock predisposes patients to multiple organ failure. While albumin is effective in the management of circulatory dysfunction in septic shock, its utilization in this population is understudied in the US. We evaluated the impact of albumin utilization on hospital length of stay (LOS) among septic shock patients. We used a nationwide electronic health record data set (Cerner Health Facts®) to extract real-world data on adult patients (≥18 years old) with severe sepsis or septic shock, admitted between January 1, 2013, and April 30, 2018, identified by International Classification of Diseases (ICD-9/10) codes and receipt of antibiotics and vasopressors. We calculated the Charlson Comorbidity Index (CCI) and the Acute Physiology Score (APS) at baseline. A generalized linear model was used to examine the association between albumin and hospital LOS, specifically accounting for the timing of albumin infusion. We identified 3,156 unique visits of septic shock patients who survived to discharge. Albumin was infused within 24 hours of admission ('early albumin') in 15%, after 24 hours ('late albumin') in 20%, and not administered in 65%. Both CCI and APS were higher at presentation in early-albumin cases than in late- or no-albumin cases (means: 7.49 and 7.17, and 51.50 and 43.23, respectively). Unadjusted LOS was slightly lower in patients receiving early albumin (11.81 days versus 11.84 days). A risk-adjusted analysis demonstrated that early albumin was associated with a 4.92% shorter LOS (95% CI 0.43%-9.22%, p=0.0322). Albumin infusion within 24 hours of admission was associated with a shorter length of hospital stay. Early albumin infusion may lead to better outcomes and reduced costs in patients with septic shock. Further research is being conducted to assess other potential benefits of early albumin administration in this patient population. Every new septic event followed by hemodynamic instability may lead sequentially to decreased organ perfusion and multiple organ failure. Acute renal failure is a recognized clinical feature of sepsis (up to 40-50% of all cases). Close monitoring of urine output is therefore a cornerstone diagnostic tool in every septic critically ill patient. In the present study, we analyzed the dynamic minute-to-minute changes in the urine flow rate (UFR), and the changes in its minute-to-minute variability (UFRV), during a new septic event in critically ill patients. Demographic and clinical data of 50 critically ill patients who were admitted to the ICU and developed a new septic event (accompanied by fever and leukocytosis) were extracted and analyzed.
A Foley catheter was inserted into the urinary bladder of each study patient. The catheter was then connected to an electronic urinometer, a collection and measurement system that employs an optical drop detector to measure urine flow. The urine flow rate variability (UFRV) was defined and calculated as the change in UFR from minute to minute. Results: UFR and UFRV both decreased significantly immediately after the new septic episode and until the beginning of fluid resuscitation (p values <0.001) (Figure 1). Statistical analysis by the Pearson method demonstrated a strong direct correlation between the decreases in UFR and UFRV and the decrease in MAP (R=0.03, p=0.003; R=0.03, p=0.004) (Figure 1) and in heart rate (R=0.12, p<0.001) from the time systemic pressure started to drop. UFRV and UFR demonstrated a good clinical response to fluid administration even though systemic blood pressure did not improve (Figure 1). We consider that dynamic changes in UFRV and UFR could potentially serve as more sensitive signals of clinical deterioration during a new septic event in critically ill patients. We also suggest that these parameters might be able to identify the optimal end-point of fluid resuscitation in septic critically ill patients. Diminished urinary output (UO) is widely used as a marker of acute kidney injury (AKI) in critically ill patients. We aimed to explore the role of urinary output in the incidence and mortality of AKI developing during ICU admission. The study population consisted of all patients admitted between 2007 and 2018 to one of the Dutch ICUs included in the NICE database, with an ICU length of stay of at least 48 hours and daily measurements of creatinine and UO. Only patients without renal replacement therapy who had a serum creatinine lower than 1.1 mg/dl (97.5 μmol/L) or a UO above 0.5 ml/kg/h on the day of the index ICU admission were considered at risk for AKI. Patients were followed during their ICU stay and classified according to the highest KDIGO stage reached, based on creatinine alone (model 1) or creatinine plus UO (model 2), using the ICU admission serum creatinine as baseline. In both models, patients were classified as: no AKI, renal impairment on the first day of ICU admission, AKI stage 1, AKI stage 2, or AKI stage 3. We identified 52,863 patients (60% male, mean age 63 years, median ICU LOS 4 days). Of those, 51.2% had renal impairment on the first day of ICU admission. Among the remaining patients, 44.4% in model 1 and 29.9% in model 2 were classified as having no AKI, 2.6% and 1.4% as AKI stage 1, 0.7% and 9.3% as AKI stage 2, and 1.1% and 8.2% as AKI stage 3, respectively. Thirty-day survival differed markedly according to the AKI classification model used (Figure 1). Similarly, adjusted HRs for 30-day mortality differed between patients with and without AKI compared to patients with renal impairment on the first day of ICU admission (Figure 1). Among patients admitted to the ICU, 50% had renal impairment on the first day of ICU admission. Our findings suggest that UO plays an important role in both AKI incidence and mortality and should be interpreted carefully in the clinical setting, especially for AKI stage 2 classification. Introduction: Acute kidney injury (AKI), mostly attributed to renal tubular damage, carries high morbidity and mortality [1], so a sensitive tool to assess the degree of tubular involvement is needed for early detection and management of this condition.
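A minimal sketch of KDIGO-style staging from creatinine and urine output, in the spirit of model 1 (creatinine only) versus model 2 (creatinine plus UO) in the urinary-output study above. The duration criteria for oliguria are compressed into a single worst-window value, so this is illustrative only.

```python
# Minimal sketch of KDIGO-style AKI staging. Simplified: anuria and RRT
# criteria are omitted and oliguria duration is a single worst-window value.
def stage_creatinine(scr: float, baseline: float) -> int:
    """Stage from serum creatinine (mg/dl) versus baseline."""
    ratio = scr / baseline
    if ratio >= 3.0 or scr >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or scr - baseline >= 0.3:
        return 1
    return 0

def stage_urine_output(uo_ml_kg_h: float, hours: float) -> int:
    """Stage from the worst sustained urine output window."""
    if uo_ml_kg_h < 0.3 and hours >= 24:
        return 3
    if uo_ml_kg_h < 0.5 and hours >= 12:
        return 2
    if uo_ml_kg_h < 0.5 and hours >= 6:
        return 1
    return 0

scr_stage = stage_creatinine(scr=2.1, baseline=0.9)
uo_stage = stage_urine_output(uo_ml_kg_h=0.4, hours=14)
print("model 1:", scr_stage, "| model 2:", max(scr_stage, uo_stage))
```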
We investigated the ability of the furosemide stress test (FST; a one-time bolus dose of 1 mg/kg, or 1.5 mg/kg in case of prior furosemide intake) to predict progression to AKIN stage III in critically ill subjects with early AKI. We studied 80 subjects: 40 consecutive patients in group I receiving the FST and 40 consecutive patients in group II receiving standard medical management for AKI; 15 patients (37.5%) and 20 patients (50%) met the primary endpoint of progression to AKIN III in groups I and II, respectively. Patients with progressive AKI had significantly lower urine output following FST in the first 6 hours (p<0.033). The area under the ROC curve for the total urine output over the first 2 hours following FST to predict progression to AKIN III was 0.87 (p=0.001). The ideal cutoff for predicting AKI progression during the first 2 hours was a urine volume of less than 325 milliliters, with a sensitivity of 87.1% and a specificity of 84.1% in the group receiving the FST. On the other hand, statistically significant hypotension, hypokalemia, hypophosphatemia and hypomagnesemia occurred in group I. The FST in patients with early AKI can predict the likelihood of AKI progression; however, it should be performed under adequate monitoring. [Fig. 1 (abstract P227): Thirty-day survival according to AKI classification model 1 and model 2. Hazard ratios (HRs) for 30-day mortality adjusted for sex, age, type of admission, APACHE IV score and SOFA score at day of admission (excluding the renal SOFA score) for patients with AKI classified with model 1 and model 2.] [Fig. 1 (abstract P226): Clinical correlation of urine flow rate variability (UFRV) and UFR with mean arterial blood pressure over a new septic event (black arrows) and after initial fluid resuscitation (red arrows). Note: UFRV and UFR decreased progressively in parallel with the falling mean arterial blood pressure and then rose again after the administration of fluids.] Introduction: Ischemia-reperfusion (IR) causes renal dysfunction and damage. IR induces renal tubular injury triggered by hypoxia and hyperoxia, mediated by oxidative stress and inflammation. Furosemide inhibits the Na+-K+-2Cl− cotransporter in the thick ascending limb of the renal medulla to decrease Na+ reabsorption, reducing oxygen consumption. We investigated whether furosemide could improve renal oxygenation, function and damage by reducing O2 consumption and oxidative stress after IR. Methods: 24 Wistar albino rats were divided into 4 groups of 6: sham-operated control (C), control + furosemide (C+F), IR, and IR+F. After anaesthesia (BL), 45 min of supra-aortic occlusion was applied to the IR and IR+F groups, followed by 15 min (T1) and 2 hours (T2) of reperfusion. A furosemide infusion of 50 μg/kg/h was administered to C+F and IR+F after ischemia. Systemic hemodynamics, renal blood flow (RBF), renal vascular resistance (RVR), renal oxygen delivery (DO2ren), renal oxygen consumption (VO2ren), creatinine clearance (Ccr), sodium handling, urine output (UO), and cortical (CμO2) and medullary (MμO2) microvascular oxygenation were measured. Results: RBF was reduced in IR (2.1±1) and IR+F (2.3±1) at T1 (p<0.05), and was further reduced in IR+F (1.9±1) at T2 (p<0.05) compared to C and C+F. RVR was increased in IR (5338±2860) and IR+F (5123±2517) at T1 compared to C. RVR normalized in IR (2198±879) but not in IR+F (4232±2636) at T2 compared to C (p<0.05). CμO2 and MμO2 did not differ between groups after the IR insult (Figure 1).
Tissue O2 was reduced at the medulla, but not at the cortex, in the IR+F group compared to IR. DO2ren and VO2ren were reduced in IR (56±17 and 26±12 ml/min) and IR+F (34±20 and 21±14) at T2 (p<0.05). PC was higher in IR+F (37.33±4.27) compared to IR (29.67±3.39) (p<0.05). VO2/TNa+ was increased in IR+F compared to IR. No change in Ccr or UO was observed. Furosemide after IR causes further impairment of renal perfusion, energy utilization and renal oxygenation, resulting in renal damage.

Acute renal failure induced by hypoxemia: incidence and correlation study. A Trifi 1, H Fazzeni 2, A Mehdi 2, C Abdennebi 2, F Daly 2, Y Touil 2, S Abdellatif 2, S Ben Lakhal 2. 1 La Rabta hopital, Medical intensive care unit, Tunis, Tunisia; 2 La Rabta hopital, Tunis, Tunisia. Critical Care 2020, 24(Suppl 1):P230

Introduction: Acute renal failure (ARF) is a common complication in ICUs and is usually caused by hypoperfusion. ARF induced by hypoxemia is a concept rarely reported in the ICU; its incidence and pathogenesis are not well understood. We aimed to study the relationship between hypoxemia and the occurrence of ARF. Retrospective cohort study including patients with hypoxemia, whatever its etiology, between January 2016 and August 2019. Patients with chronic renal failure were excluded. ARF was defined and staged according to the 2012 KDIGO criteria. Arterial blood gases, urea, creatinine and clearance were recorded on the first, third and seventh days of evolution. Results: 50 patients were included, yielding 2 groups: hypoxemic patients with ARF (ARF+, n=30) versus hypoxemic patients without ARF (ARF-, n=20). The incidence of hypoxemia-induced ARF was therefore 60%. Clinical characteristics were comparable in both groups, with a mean age of 47 ± 16 and a sex ratio of 1.77. The comparative analysis showed a lower pH in the ARF+ group (p = 0.023). The most significant correlation was observed between MDRD clearance at day 3 and the P/F ratio at day 1 (Rho = 0.338, p = 0.038). Multivariate analysis found that septic shock and non-invasive ventilation in hypoxemic patients were the factors related to ARF, with OR=11.08 (95% CI 1.56-83.84), p=0.016 and OR=6.18 (95% CI 1.16-34.07), p=0.033, respectively. Overall mortality was 68% (n=34) and ARF was an independent factor of mortality: OR=6.0 (95% CI 1.35-26.64), p=0.017. Hypoxemia-induced ARF is a common complication associated with excess mortality. Our study suggests that renal function is correlated with the degree of hypoxemia and that this correlation becomes distinct about 48 hours after the onset of hypoxemia.

In preclinical models of sepsis, we have previously demonstrated that activation of AMP-activated protein kinase (AMPK) using metformin improves survival and organ function. Thus, AMPK activation is a potential therapeutic target in sepsis, and we hypothesized that exposure to metformin during sepsis is associated with decreased AKI and mortality. Methods: Retrospective analysis of a 13-hospital cohort of adult ICU patients with type 2 diabetes mellitus (T2DM) who presented with sepsis. We investigated whether exposure to metformin during the hospitalization was associated with reduced 90-day mortality and AKI. We used 1:4 propensity score matching (PSM), propensity score stratification (PSS) and propensity score weighting (PSW) based on the probability of being exposed to metformin, using 55 covariates. For PSM, exact matching on insulin, amputation, cardiovascular disease, retinopathy, Charlson index, eGFR, HbA1c and APACHE III was used.
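A minimal sketch of the 1:4 propensity-score matching step described above, under stated assumptions: this is not the registry's analysis code, the column names and caliper are hypothetical, and the exact-matching constraints are omitted for brevity.

    # Greedy 1:4 nearest-neighbour matching on the propensity of metformin
    # exposure, matching without replacement within a caliper.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def psm_1_to_4(df, covariates, treat_col="metformin", caliper=0.05):
        model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
        df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])
        treated = df[df[treat_col] == 1]
        controls = df[df[treat_col] == 0].copy()
        matches = []
        for idx, t in treated.iterrows():
            d = (controls["ps"] - t["ps"]).abs()
            near = d[d <= caliper].nsmallest(4).index  # up to 4 controls
            matches.append((idx, list(near)))
            controls = controls.drop(near)             # without replacement
        return matches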
Sepsis was defined using Sepsis-3 criteria, and AKI as KDIGO stage 2 or 3. From 164,910 patients, we found 673 diabetic adults exposed to metformin during hospitalization and 14,174 who were not. Metformin exposure during hospitalization was associated with decreased 90-day mortality and AKI in septic adult patients with T2DM. These findings suggest that metformin may constitute a potential therapeutic strategy in sepsis and support a potential role of AMPK activation as a protective mechanism. However, studies are needed to confirm this association and the specific mechanisms of action.

Introduction: Acute kidney injury (AKI) occurs in up to 50% of patients in the intensive care unit (ICU). Predicting AKI recovery may allow risk stratification of patients, patient and family counseling, and early post-discharge renal care planning. However, predicting AKI recovery at an early stage remains a challenge. Methods: This is a retrospective study of the EPaNIC multicenter randomized controlled trial database [1], which was split into development (n=2194) and validation (n=2446) cohorts; patients experiencing AKI stage 3 and/or renal replacement therapy (RRT) in the ICU were included [2]. AKI recovery was defined as being alive, without any stage of AKI, and without need of RRT at hospital discharge. A logistic regression model with backward feature elimination was developed. Model performance was assessed by discrimination, calibration and net benefit analysis, and internally validated with ten-fold cross-validation. Only the results in the development cohort are reported. Of the 229 patients who developed AKI stage 3, 86 (37.55%) recovered from AKI. The multivariable model selected age, bilirubin, heart rate, mean arterial blood pressure, surgical diagnostic group on ICU admission, mechanical hemodynamic support on ICU admission, and suspected sepsis on ICU admission as AKI recovery predictors. The model had a mean area under the receiver operating characteristic curve (AUROC) of 0.75 (standard deviation (SD) 0.01), a mean calibration slope of 1.02 (SD 0.04), and a mean calibration-in-the-large of <0.01 (SD 0.01) (Figure 1). At the classification threshold that maximized sensitivity and specificity, the mean net benefit with respect to treat-none was 0.16 (SD 0.01) and with respect to treat-all was 0.11 (SD 0.01). Using routinely collected clinical data, the developed prediction model can fairly identify patients with a higher chance of AKI recovery at hospital discharge.

Fig. 1 (abstract P232). Internally validated model performance: (top row) ROC curve; (middle row) calibration curve; (bottom row) decision curve.

Introduction: Acute kidney injury (AKI) is a frequent complication in critically ill patients and is associated with increased morbidity and mortality. Sepsis is one of the most common causes of AKI. A prospective study was conducted over 6 months (January 1-June 30, 2018). We included patients with septic shock at admission or at any time during hospitalization. AKI staging was based on KDIGO criteria. Patients were divided into two groups, with AKI (AKI+) and without AKI (AKI-). We then compared baseline characteristics and laboratory and physiologic data. Patients with AKI were subdivided according to their prognosis. Seventy-five patients were enrolled. The mean (SD) age was 56.43 (±18) years and the sex ratio was 1.91.
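A minimal sketch of the model-building strategy used in the EPaNIC AKI-recovery study above: logistic regression with backward feature elimination, scored by ten-fold cross-validated AUROC. This is not the study's code; the stopping rule and all names are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def auroc(X, y, cols):
        # Mean ten-fold cross-validated AUROC for a candidate feature set
        return cross_val_score(LogisticRegression(max_iter=1000),
                               X[:, cols], y, cv=10, scoring="roc_auc").mean()

    def backward_select(X, y, names, min_features=3):
        keep = list(range(X.shape[1]))
        best = auroc(X, y, keep)
        improved = True
        while improved and len(keep) > min_features:
            improved = False
            for j in list(keep):
                trial = [k for k in keep if k != j]
                score = auroc(X, y, trial)
                if score >= best:          # drop features that do not help
                    best, keep, improved = score, trial, True
                    break
        return [names[k] for k in keep], best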
Fifty-two patients (70%) developed AKI. SAPS II and SOFA scores at admission were higher in patients with kidney injury (59 vs 44 points, p=0.002, and 6.5 vs 4 points, p=0.003, respectively). The serum lactate level was significantly higher in the AKI+ group during the first day of septic shock (6.12±1.38 mmol/l (AKI+) vs 4.11±0.79 mmol/l (AKI-); p=0.002) and its clearance was lower (32±10.99% (AKI+) vs 61±13% (AKI-); p=0.001). A significant difference was observed in C-reactive protein level (224±114 mg/l (AKI+) vs 124±77 mg/l (AKI-); p=0.004). Among AKI+ patients, KDIGO stage III was observed in 59.6% of cases. Nineteen patients (36.5%) received hemodialysis. Normal kidney function was recovered in 40.4% of cases. AKI+ patients had a higher occurrence of disseminated intravascular coagulation (32 vs 3 patients, p=0.002), acute respiratory distress syndrome (18 vs 2 patients, p=0.023) and cardiac dysfunction (20 vs 1 patient, p=0.001). Mortality was higher in the AKI group (67% vs 9%; p=0.001). The development of septic AKI was associated with poor outcomes and prognosis. A better understanding of the sepsis-induced AKI pathway will enable us to develop targeted therapeutic protocols. Newer tools permitting early AKI detection may make these therapies more fruitful.

This study aims to show that contrast procedures do not significantly increase the risk of renal injury and should not be deferred. Traditionally, CIAKI is the most important cause of in-hospital renal failure after nephrotoxic drugs and shock. A further problem is the non-uniform definition of CIAKI proposed by three different initiatives (AKIN, ESUR and KDIGO). AKIN, being the most rigorous, defines CIAKI as an increase in serum creatinine >0.3 mg/dL or >50% of baseline within 48 hours. A retrospective observational single-centre cohort study analyzed 82 patients who underwent a contrast procedure with Iomeron 350. The first group underwent CT pulmonary angiography (CTPA), and the second coronary angiography with PCI. No patient was prepared beforehand (RAAS blockade withdrawal, crystalloid administration, etc). We studied demographics, history of CKD and comorbidities, and their impact on CIAKI by the AKIN criteria. A total of 82 patients were divided into two groups (CTPA and PCI). The CTPA group (20M, 21F) all had acute PE, and the PCI group (28M, 13F) were treated for ACS. The mean age was 69 and 65 years, respectively. CKD was more prevalent in the PCI group (8 pt vs. 3 pt), possibly explained by more advanced atherosclerotic disease. Advanced CHD (NYHA III/IV) was found in 3 pt (PCI) vs. 2 pt (CTPA), while diabetes and shock were equally distributed (11 pt and 5 pt) in both groups. The mean amount of contrast was significantly higher in the PCI group (242.3 mL vs. 60 mL). The mean creatinine/eGFR measured before and after contrast in the CTPA group was 87.

The goal of this study was to determine whether replacing the body mass (BM) with fat-free mass (FFM) in the Cockcroft-Gault (CG) formula could provide a more accurate prediction of AKI in obese patients undergoing cardiac surgery. In this retrospective study, we reviewed institutional data of patients who underwent elective cardiac surgery in a tertiary referral university hospital. The baseline patient creatinine value was collected and GFR was estimated using the MDRD, CKD-EPI and CG formulas.
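A minimal sketch of the Cockcroft-Gault estimate used here, together with the fat-free-mass substitution described next. The formula is the standard CG equation; all numeric inputs are hypothetical.

    def cockcroft_gault(age_y, weight_kg, scr_mg_dl, female):
        """Cockcroft-Gault creatinine clearance estimate (ml/min)."""
        crcl = (140 - age_y) * weight_kg / (72.0 * scr_mg_dl)
        return crcl * 0.85 if female else crcl

    # Standard CG uses actual body mass; the modified CG (mCG) swaps in FFM,
    # e.g. derived from bioelectrical impedance analysis (values hypothetical).
    print(cockcroft_gault(66, 112.0, 1.1, female=False))  # BM-based
    print(cockcroft_gault(66, 68.0, 1.1, female=False))   # FFM-based (mCG)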
The CG formula was further modified by replacing the BM with FFM derived from bioelectrical impedance analysis. Postoperative AKI was defined by the KDIGO creatinine-change definitions. The accuracy of the eGFR values in predicting AKI was assessed with ROC-AUC analysis. All calculations were performed in different categories of BMI (Figure 1). The eGFR is a poor predictor of AKI in obese patients undergoing cardiac surgery. The FFM-modified Cockcroft-Gault formula yielded more accuracy in this specific group.

Fig. 1 (abstract P236). When comparing AUC in different categories of BMI, the mCG appeared to be the only statistically accurate formula in patients with BMI 30-34.9.

RetroAKI: a ten-year retrospective study of acute kidney injury in intensive and progressive care units. Introduction: Acute kidney injury (AKI) is a frequent condition in intensive care units (ICU) and progressive care units (PCU), affecting 15% to 70% of patients depending on the studied population and the AKI definition. AKI has been identified as an independent risk factor for ICU mortality and the development of chronic kidney disease. The objective of this study was to describe the incidence of each AKI stage as defined by the KDIGO definition (with evaluation of urine output, serum creatinine and initiation of renal replacement therapy (RRT)) in a mixed medical and surgical population of patients hospitalized in the ICU and PCU over a 10-year period (2008-2018). We included all patients who stayed more than 12 hours in the ICU or PCU of Edouard Herriot Hospital from May 2008 to January 2019. The data used to classify the patients were the urine output over a six-hour period, serum creatinine and the need for RRT, according to the KDIGO classification. Results: 18,882 hospital stays were analyzed. Median ICU/PCU length of stay was 3 days [IQR 1.5-6.6]. Among ICU patients, 74% had at least one AKI episode graded 1, 2 or 3 and 49% had at least one severe episode (stage 2 or 3). Among PCU patients, 44% had at least one episode of AKI and 20% a severe episode. Patients had an average of 1.9 episodes of AKI per stay. Table 1 shows the incidence of the maximal AKI stage during one stay. We found that urine output was the most frequent criterion for diagnosing AKI stage 1 or 2, whereas RRT was more frequent for AKI stage 3. This retrospective study reports a higher AKI incidence in our ICU/PCU than in previous studies. The difference could be explained by the difficulty of collecting urine output from conventional databases. Serum creatinine and the use of RRT are often the only two criteria used to define and classify AKI. These results confirm the high incidence of AKI in the ICU and PCU and the importance of early AKI screening of patients for whom preventive nephroprotective actions are needed.

Introduction: ICU patients with acute kidney injury (AKI) requiring renal replacement therapy (RRT) are at risk for infections [1, 2]. In this study we evaluated the incidence of infection in ICU patients with and without less severe AKI; finally, the impact on outcomes was explored. This is a retrospective study of the PDMS (patient data management system) of the 4 adult ICUs of a university hospital. AKI was assessed by KDIGO criteria (creatinine (Scr) and urine output) during the first 7 days of ICU stay. Infection was validated in the PDMS by a team of ICU specialists. Results: During a 4-year period, a total of 7485 subjects were enrolled. AKI was diagnosed in 64.7% of patients during the ICU stay. AKI patients were older (63 vs
59 y, p=0.001), had higher SAPS 2 scores (57 vs 41, p<0.001) and more often had urgent ICU admission (64% vs 48%, p<0.001). More AKI patients received mechanical ventilation (55% vs 41%, p<0.001) and vasopressors on day 1 (47% vs 23%, p<0.001). AKI stages 1, 2 and 3 were present in 25.5%, 28.0% and 11.1% of patients, respectively. More AKI patients had infection (57% vs 28%, p<0.001) and increasing AKI stages were associated with higher infection rates (AKI-0: 28%; AKI-1: 55%; AKI-2: 55%; AKI-3: 69%; p<0.001) (Figure 1). We observed 2-3 times higher mortality in AKI patients with infection, and a stepwise increase of mortality with increasing AKI stage. After correction for infection and other confounders, we found that all AKI stages were associated with in-hospital mortality (ORs: AKI-1 1.7, AKI-2 2.0, AKI-3 3.6; all p<0.001). Over half of AKI patients experienced an episode of infection, and increasing AKI severity was associated with a higher infection rate. AKI patients with infection had markedly higher mortality, suggesting that infection was an important driver of outcome. However, even after adjustment, AKI stages retained a strong association with hospital mortality.

Several new biomarkers have been introduced to improve early diagnosis of acute kidney injury (AKI). "NephroCheck" (NC; Astute Medical, USA) is a bedside test calculating the "AKIRisk" (the product of the urinary concentrations of the cell-cycle arrest markers TIMP-2 and IGFBP7). Several studies suggest the usefulness of NC in selected populations; however, the value of early routine measurement of NC is unclear. Methods: We therefore compared the prediction of a combined endpoint (CEP: death <60 days and/or requirement of renal replacement therapy (RRT)) by NC within 12 h of ICU admission (NC1) and 24 h later (NC2) with admission values of serum creatinine, BUN, cystatin C, urinary NGAL, APACHE II and SOFA (ROC analysis). As a secondary endpoint, we investigated the additional value of pathological measurements of NC1 ≥ 0.3.

Critically ill patients showed increased relative UCE in the first days of ICU admission, which may be attributed to higher protein catabolism. Increased relative UCE was associated with ARC, and neither had an effect on 90-day mortality.

Introduction: This study compared the epidemiology and short- and long-term outcomes of patients with community-acquired (CA) and hospital-acquired (HA) acute kidney injury (AKI). We retrospectively analyzed all episodes of AKI over a period of 3.5 years (2014-2017) on the basis of routinely obtained serum creatinine measurements in 103,161 patients whose creatinine had been measured at least twice and who had been in the hospital for at least two days. We used the "Kidney Disease: Improving Global Outcomes" (KDIGO) criteria for AKI and analyzed the first hospital admission. A total of 103,161 patients were admitted to hospital and fulfilled the inclusion criteria. The average observation period per patient was 248 days. The incidence of CA-AKI among included hospital admissions was 9.7%, compared with an incidence of 8.6% for HA-AKI, giving an overall AKI incidence of 18.3%. Patients with CA-AKI were younger than patients with HA-AKI (64 vs 66.2 y) and had significantly fewer comorbidities, including preexisting cardiac failure, ischemic heart disease, hypertension and diabetes. Patients with CA-AKI were more likely to have stage 1 AKI (69.3 vs 58.4%, p<0.001) and had significantly shorter lengths of hospital stay than patients with HA-AKI (14 vs 24 d, p<0.001).
Those with CA-AKI had better survival than patients with HA-AKI (Figure 1; p<0...).

Fig. 1 (abstract P242). Long-term survival.

The evidence base for the management of fluid removal during renal replacement therapy (RRT) is limited. A recent international survey revealed the extent of practice variation worldwide [1]. Our aim was to summarise the responses of Europe-based healthcare professionals who participated in the survey. The international self-administered, cross-sectional, internet-assisted, open survey was disseminated between January 2018 and January 2019 via website links and emails to members of different critical care societies. Results: 485 participants from 31 European countries completed the survey, of whom 365 (75%) were intensivists and 306 (63%) worked in university-based hospitals. Persistent oliguria/anuria was the most common indication for fluid removal (51% of responders). The parameters that guided fluid removal included hemodynamic status (47% of responders), cumulative fluid balance since admission (23%) and 24-hour fluid balance (17%). 90% of participants reported using CRRT, with a median net ultrafiltration rate of 98 mL/h (IQR 51-108 mL/h) for hemodynamically unstable and 300 mL/h (IQR 201-352 mL/h) for hemodynamically stable patients. Only 26% of practitioners checked the net fluid balance hourly (70% of nurses, 16% of physicians). New hemodynamic instability, defined as new-onset or worsening tachycardia, hypotension, or the need to start or increase the dose of vasopressors, was reported to occur in 20% of patients (IQR 10.0-30.0). Different strategies to regain hemodynamic stability were used (Figure 1). The main barriers to fluid removal were patient intolerance (72% of physicians, 85% of nurses) and interruptions in fluid removal (43% of physicians, 64% of nurses). The majority of participants agreed that guidelines and protocols would be beneficial. The practice of fluid removal during RRT is very variable across European countries. Nurses and doctors identified a need for evidence-based protocols and clear guidelines.

Introduction: Kidney Disease: Improving Global Outcomes (KDIGO) guidelines suggest the use of anticoagulation in continuous renal replacement therapy (CRRT) [1]. The effectiveness of the anticoagulation is important because replacing the hemofilter and tubing interrupts CRRT and increases total therapy time. Regional citrate anticoagulation (RCA) and unfractionated heparin (UFH) are the most commonly used methods for CRRT anticoagulation [2]. The aim of this study was to investigate the efficacy, safety and metabolic differences in ICU patients who underwent CRRT and whose anticoagulation method was changed from UFH to RCA for various reasons. After ethics committee approval (2019-14/9), 100 patients who underwent CRRT between 2018 and 2019 at Bursa Uludag University Hospital ICU were reviewed, and 11 patients who underwent CRRT with both RCA and UFH were included in the study. We divided the patients into two groups (RCA, UFH); demographic data (sex, age), SOFA score, creatinine, urea, mean filter life time (FLT) and ultrafiltration flow (UF), platelets, electrolytes (Na, K, Ca, Mg), lactate, NaHCO3 and pH of the groups at the beginning and end of the first RCA and UFH hemodialysis were collected. We used t-tests and 1000-bootstrap statistical tests. In agreement with other studies [3, 4], FLT and UF were significantly lower in the UFH group (Table 1).
There was no statistically significant difference in efficiency (urea and creatinine decrease), pH, lactate, NaHCO3 level, platelet count or electrolytes between the two groups. To our knowledge, there are no studies comparing these two anticoagulation methods in the same patients. The small number of patients and the retrospective evaluation are limitations of the study. Our results suggest that the RCA method is as safe and effective as the UFH method, with longer FLT and higher UF.

Regional citrate anticoagulation during CRRT in liver failure. MJ Jain, PK Kumar G, DG Govil, JK KN, SP Patel, MS Shafi, RH Harne, DP Pal, SM Monanga. Medanta the Medicity, Critical Care, Gurugram, India. Critical Care 2020, 24(Suppl 1):P245

Continuous renal replacement therapy (CRRT) with regional citrate anticoagulation (RCA) is increasingly being used as a treatment modality in critically ill patients. There is limited experience with citrate anticoagulation in patients with acute liver failure and acute-on-chronic liver failure, who pose a tough challenge as they are at higher risk of bleeding. An institutional protocol was formulated for the use of commercially available citrate solutions, and it was studied to assess filter life and the safety of citrate in liver disease. The primary objective was to assess the safety of citrate anticoagulation in liver disease. This was a single-centre, prospective, non-randomized, single-arm, observational study. All adult patients with acute liver failure or acute-on-chronic liver failure requiring CRRT were included. Blood ionized calcium levels of 0.9 to 1.1 mmol/l were targeted throughout the therapy, and a total-to-ionized calcium ratio of less than 2.4 was maintained. RCA was stopped if the ratio was more than 2.4 on 2 consecutive assessments. The incidence of citrate accumulation and toxicity was assessed, as was average filter life. Metabolic parameters, electrolytes and the strong ion gap were followed until 24 hours after completion of CRRT. A total of 25 patients were included in the study: 19 patients with acute-on-chronic liver failure and 6 with acute liver failure underwent CRRT with RCA. Baseline average serum bilirubin, lactate and INR were 11.8 mg/dL, 6.4 mmol/L and 2.1, respectively. The average filter life was 50 hours 3 minutes. Citrate accumulation occurred in 13 patients (n=13), and RCA had to be stopped in 6 (n=6) for this reason. None of the patients had evidence of citrate toxicity. Citrate anticoagulation was well tolerated in patients with acute liver failure, with or without pre-existing chronic liver disease, on CRRT.

Introduction: The intention of this study was to highlight the levels of citrate load in the general population that increase the risk of citrate complications (insufficient trisodium citrate delivery, net citrate overload and citrate accumulation) [1]. This was a prospective data collection between February and March 2019 in a fourteen-bedded critical care unit. Eleven consecutive episodes of CRRT were collected (a new episode was defined if CRRT had been discontinued for 48 hours or more). One episode was excluded due to short duration (less than 4 hours). Patients undergoing RCA-CRRT received either a fixed 25 or 35 ml/kg/h effluent dose protocol. Median patient age was 59; 100% were male. Average time on CRRT was 4.1 days (2-9). 70% of the patients had complications, although 60% were minor (Figure 1). All of the patients with net citrate overload had citrate loads of 13.8 mmol/h or above.
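A minimal sketch of the stopping rule in the liver-failure RCA protocol above (P245): flag citrate accumulation when the total-to-ionized calcium ratio exceeds 2.4 on two consecutive assessments. This is not the institutional protocol's code, and all values are hypothetical.

    def should_stop_rca(total_ca, ionized_ca, threshold=2.4, consecutive=2):
        """total_ca, ionized_ca: paired serial measurements (mmol/l)."""
        streak = 0
        for tot, ion in zip(total_ca, ionized_ca):
            streak = streak + 1 if tot / ion > threshold else 0
            if streak >= consecutive:
                return True
        return False

    # Ratios here are 2.20, 2.53 and 2.89: two consecutive breaches -> stop.
    print(should_stop_rca([2.2, 2.4, 2.6], [1.0, 0.95, 0.9]))  # True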
The main risk factors were found to be shock and liver impairment, which occurred in 60% of cases, of which 40% developed complications. A fixed-dose effluent protocol to standardise practice can potentially lead to a higher risk of minor complications; in our experience this is likely due to a lack of appropriate monitoring for RCA-CRRT complications. Despite this, our rate of citrate accumulation is in line with that reported in the literature. Citrate loads in our 25 ml/kg/h protocol were 22.6% higher than in our 35 ml/kg/h protocol and were strongly related to a higher complication rate, which worsened in patients with risk factors for poor citrate metabolism.

Fig. 1 (abstract P246). Incidence of complications.

Introduction: There is no optimal timing of continuous renal replacement therapy (CRRT) in acute kidney injury (AKI); in practice it is based on volume overload, azotemia, hyperkalemia and severe metabolic acidosis [1]. An important cause of metabolic acidosis in AKI is an increase in unmeasured anions (UA) [2]. Delta-pH-UA (ΔpH-UA) reflects the degree of metabolic acidosis caused by UA and is calculated using 'The Partitioned pH Model' [3]. In this study, we investigated whether ΔpH-UA was a predictor of the need to start CRRT in patients with AKI. The study was designed as a multicentric, prospective, observational study in 2019. Patients who were ≥18 years old and diagnosed with AKI [1] were included. At the moment AKI was diagnosed, arterial blood gas, albumin, magnesium, inorganic phosphorus, urea, creatinine and ΔpH-UA values were recorded. All patients were divided into two groups, CRRT(-) and CRRT(+), the latter consisting of patients who underwent CRRT according to traditional criteria.

Introduction: Continuous renal replacement therapy (CRRT) is labor intensive and requires advanced nursing knowledge and skills. However, 40% of the registered nurses (RNs) in our unit have less than 2 years of post-registration experience. There is also an increasing demand for CRRT, from 185 CRRT days in 2017 to 248 CRRT days in 2018. The obstacles to CRRT in our department include variation of regimens, complicated workflow and insufficient training of nurses. A continuous quality improvement project was carried out to standardize the regimen, enhance the workflow and provide structured training to nurses in the intensive care unit, in order to enhance nursing competence. Methods:

Introduction: Sepsis and septic shock are a leading cause of mortality in the intensive care unit. We evaluated a novel hemoperfusion cartridge through a retrospective review of patient data in our centre, where it was used as an adjuvant therapy in patients with sepsis and septic shock due to varied causes. The aim of this study was to evaluate the efficacy of a therapeutic hemoperfusion cartridge (HC; Foshan Biosun Medical®) in the management of patients with sepsis. We retrospectively analysed data from Group 1 (n=30, sepsis) and Group 2 (n=30, sepsis treated with the hemoperfusion cartridge) admitted between 2015 and 2018. Group 2 received the hemoperfusion cartridge as adjuvant therapy along with standard of care. Demographic data, procalcitonin [1] and leukocyte levels before and after therapeutic cytokine removal, and the duration of HC were recorded. While the mean duration of CVVHDF was 96.4 hours, the duration of hemoperfusion cartridge application was 32.1±16.4 hours. Among the 30 patients who survived, 25 had been administered the hemoperfusion cartridge within 12 hours of ICU admission.
There was a significant reduction in APACHE and SOFA scores after hemoperfusion cartridge therapy, and procalcitonin and leucocyte levels after therapeutic hemoperfusion were significantly lower than pretreatment values (p=0.001 and p=0.001, respectively). Retrospective analysis showed a significant reduction of vasopressors and an improvement in MAP in Group 2. A therapeutic hemoperfusion cartridge with cytokine removal, applied with CVVHDF in septic patients, contributes positively to a survival advantage.

Removal of activated leukocytes and endotoxin from the blood is a complex therapeutic effect of the endotoxin removal device. In the main group (16 patients with abdominal septic shock), the traditional treatment after surgery was supplemented with two sessions of endotoxin removal (2 hours each, with an interval of 24 hours) using the "Alteco LPS adsorber" (Sweden). The control group consisted of 8 patients with a similar diagnosis who received only traditional treatment. Results: 28% of white blood cells were adsorbed in the LPS adsorber. Among them, granulocytes (35%) were maximally extracted, followed by CD14+ monocytes (CD14+Mo) (33%), HLA-DR+ mononuclear cells (6%) and monocytes (2%). IL-6, IL-10 and procalcitonin (PCT) were not adsorbed. The 28-day mortality rate in the main group was 50%, lower than in the control group (75%). During monitoring, 24 hours after the first endotoxin removal, the main group showed a 2.2-fold decrease in the initially increased number of activated CD14+Mo, as well as a 1.6-fold decrease in functionally mature defensin+ granulocytes (def+Gran). IL-6, IL-10 and PCT decreased by 1.9-, 17.8- and 1.2-fold, respectively. During this period, the control group showed an increase in CD14+Mo and def+Gran, while IL-6 and IL-10 did not change and PCT increased 1.9-fold. A day after the second endotoxin removal, and again 5 days later, IL-6, IL-10 and PCT continued to decline in the main group. In the control group, only IL-10 had decreased after 3 days; the rest continued to rise. The cellular adsorption of endotoxin-bound CD14+Mo and mature def+Gran is an important part of the mechanism of action of the endotoxin removal device.

Does the endotoxin adsorption of PMX column saturate in 2 hours? Preliminary study. C Yamashita. In the EUPHRATES trial, polymyxin B-immobilized fiber column (PMX) hemoperfusion (HP) had no significant effect on 28-day mortality. The endotoxin (LPS) burden in patients with an endotoxin activity assay >0.90 may exceed 50 μg [1], so the dose and duration of PMX-HP could be insufficient to lower the LPS burden. To confirm this issue, we previously experimented in a closed circuit with 24-h continuous LPS addition and showed that PMX can adsorb >50 μg [2]. Furthermore, the LPS concentration became constant within 2 h in a single-LPS-spike test used for determining PMX-HP duration [3]. To test our hypothesis that the single-spike test reflects adsorption equilibrium, and not saturation, we added LPS intermittently to the reaction. Methods: LPS (10 ng/mL) was mixed with 125 mL of deactivated fetal calf serum as a reflux solution, as previously described [2]; this concentration is much higher than that observed in septic patients. We created a closed circuit incorporating PMX-01R, at 1/14th the size of an adult PMX column, and performed PMX-HP at 10 mL/min for 5 h. LPS was added in two shots (post 2 h: 1250 ng, 10 ng/mL; post 4 h: 3750 ng, 30 ng/mL). LPS was measured using the Limulus amebocyte lysate test at 0, 0.5, 1, 2, 3, 4 and 5 h.
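A toy kinetic sketch motivated by the interpretation tested above: in a reversible first-order adsorption model of a closed circuit, the LPS concentration plateaus at an adsorption equilibrium even though the column is far from saturated, and a re-spike shifts the system to a new equilibrium. The rate constants are hypothetical and not fitted to the experiment.

    def simulate(c0, k_on=1.2, k_off=0.25, dt=0.01, hours=5.0, spikes=None):
        """Return (time, concentration) pairs; spikes: {hour: added ng/mL}."""
        spikes = spikes or {}
        c, bound, out = c0, 0.0, []
        for step in range(int(hours / dt)):
            t = round(step * dt, 2)
            c += spikes.get(t, 0.0)          # intermittent LPS addition
            flux = k_on * c - k_off * bound  # net adsorption onto the column
            c -= flux * dt
            bound += flux * dt
            out.append((t, c))
        return out

    # Two pulse additions, echoing the protocol above (post 2 h and post 4 h)
    trace = simulate(c0=10.0, spikes={2.0: 10.0, 4.0: 30.0})
    print(trace[-1])  # concentration settles toward equilibrium, not zero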
After an initial decrease between 0 and 1 h, the LPS concentration did not decrease between 1 and 2 h after PMX-HP initiation. After the LPS pulse addition at 2 h, it increased and then decreased until 3 h. Further, it did not decrease between 3 and 4 h, but increased and then decreased again after the LPS pulse addition post 4 h (Figure 1). LPS adsorption rates were 76.2%, 43.4% and 40.7% at 2, 4 and 5 h, respectively. Conclusions: The LPS adsorption capacity of PMX-01R was maintained even after two additional shots of LPS, suggesting that the constant LPS concentration in the previously reported LPS spike test might be indicative of adsorption equilibrium rather than saturation.

Fig. 1 (abstract 251). LPS concentration in the LPS pulse addition test.

A cohort study included 65 patients admitted to three intensive care units with sepsis/septic shock (Sepsis-3 criteria) and AKI (AKIN score). All patients underwent CVVHDF with the oXiris filter (Baxter, USA). The main clinical data, IL-6, procalcitonin, endotoxin (EAA) and SOFA score were evaluated at baseline (T0) and at the end of the treatment (T1). All data are expressed as mean ± SD or median and IQR. ANOVA was used to compare changes over time. Results: 60 patients underwent RRT with the oXiris filter for 46 ± 12 hours. 21 patients had AKI stage 3, 13 patients AKI stage 2 and 25 patients AKI stage 1. At T0, all groups had high vasopressor support to maintain MAP ≥ 70 mmHg. IL-6, procalcitonin, EAA and total SOFA were also elevated, with no difference between the groups. At T1, creatinine improved more in AKI 2 (p<0.001 vs. T0) and AKI 1 (p<0.0001 vs. T0) than in the AKI 3 group. MAP increased in AKI 2 (p<0.01 vs. T0) and AKI 1 (p<0.01 vs. T0), but not in the AKI 3 group. IL-6 and procalcitonin decreased more in AKI 1 (p<0.0001 vs. T0) than in AKI 3. At T2, total SOFA was higher in AKI 3 than in AKI 1 (p<0.001) and AKI 2 (p<0.01). Conclusions: 1) AKI stage 1 and 2 patients submitted to blood purification with the oXiris filter respond better than AKI stage 3 patients. 2) This translates into a better clinical course. 3) CRRT with the oXiris filter is useful in septic patients with AKI, but AKI stage 3 septic patients represent a high-risk group.

A non-interventional, multicenter, non-randomized patient registry for multiple organ dialysis with the ADVOS system. Multiple organ failure is a challenging problem in the ICU. As an advanced dialysis system, the ADVOS procedure can eliminate water-soluble and protein-bound substances and regulate the acid-base balance as well as fluid status and temperature. In 2017, a national registry was established to collect data under "real-life" conditions on patients treated with ADVOS, without any trial-specific interventions (DRKS ID: DRKS00017068). Methods: Data from 01/2017 to 02/2019 from 4 German hospitals (the university hospitals in Hamburg-Eppendorf, Mainz and Essen, and Klinikum Weiden) were analyzed. Clinical parameters, treatment settings and adverse events were documented. The 28- and 90-day mortality rates were compared with extrapolated rates based on the SOFA score. Results: 118 patients with a median age of 60 years (IQR 45-69), of whom 70 (59%) were male, were evaluated. Patients had a median SOFA score of 14 (IQR 11-17) before the first ADVOS treatment, which is associated with an expected mortality of 80%. The number of failing organs was 3 (IQR 2-4): cardiovascular (74%), lungs (57%), liver (47%), kidneys (74%), coagulation (69%) and CNS (29%).
429 treatments with a median duration of 16 hours (IQR 10-20) were evaluated. 87 were discontinued, of which 25 (6%) were due to a device error. 79 adverse events were documented, 13 of which were related to the device (all due to clotting; all recovered without sequelae). There was significant removal of protein-bound toxins (bilirubin: 11.2 vs 9.2 mg/dl) and water-soluble toxins (BUN 32 vs 20 and creatinine 1.9 vs 1.4 mg/dl). In addition, an improvement in acid-base balance was observed: pH (7.33 vs 7.40), bicarbonate (21.3 vs 25.5 mmol/l) and base excess (-4.5 vs 1.0 mmol/l) (Table 1). The 28- and 90-day mortality rates were 60% and 65%, respectively. In a cohort of patients with multiple organ failure, we observed an improvement over the expected mortality rate, especially when the ADVOS procedure was applied early. Adverse events were comparable to those of other dialysis therapies in intensive care patients.

Introduction: Acute kidney injury (AKI) due to ischemia-reperfusion affects one-third of patients undergoing cardiac surgery. We investigated the potential role of cyclosporine (CsA) in preventing postoperative AKI and mitigating the inflammatory response to extracorporeal circulation (ECC). Methods: Double-blind, randomized, placebo-controlled, single-center study. Patients (n=67) scheduled for elective cardiac surgery were randomized to 2.5 mg/kg CsA or placebo before surgery. The primary objective was to assess the role of CsA in reducing the incidence of postoperative AKI. The secondary objective was to study CsA-induced changes in the inflammatory response to ECC. Results: All enrolled patients were analyzed. Postoperative AKI was more pronounced in the cyclosporine group than with placebo (OR 5.03, 95% CI 1.76-15.74). Cytokine production in response to ECC was not affected by cyclosporine (Figure 1). In patients undergoing cardiac surgery, a single preoperative dose of CsA does not prevent the postoperative decrease in renal function. CsA does not alter cytokine release in response to extracorporeal circulation. Elevated post-ECC levels of the pro-inflammatory cytokine IL-6 were associated with kidney dysfunction and may be predictive.

New-generation adsorbents such as oXiris® have been introduced as a novel technique of renal support for critically ill patients [1]. Septic shock patients require decatecholaminization strategies emphasizing blood purification to remove catecholamine-producing mediators and to evacuate fluid overloading the interstitium. Our 64-year-old female patient, with a history of ovarian cancer, was admitted to the ICU after surgery. Her septic shock was worsened by ARDS, a hypercoagulable state and AKI. Vasopressors were started. The patient was ventilated on SIMV 16, PS 12, TV 350 ml, PEEP 7, FiO2 70%. Renal support was implemented with a diuretic, and CVVH was started on the second day. At first, a regular adsorbent was used, post-filter mode was set, and the periodic fluid removal target was 50 ml/h. After 24 hours, no significant changes were observed. oXiris® was then added, and after 12 hours the vasopressor requirements were reduced, tidal volume increased, hemodynamic parameters stabilized and urine production increased. This was continued for 2 days and the patient recovered. Our patient had fallen into an inadequate CARS stage, in which she was unable to counter the septic effects on vital organs (Figure 1). The kidney was the primary target for filtration and a monitoring tool. An adsorbent consisting of AN69 and polyethyleneimine was useful to purify the blood of endotoxins, combined with slower filtration.
Continuous yet cautious CVVH evacuates fluid and mediators while maintaining steady hemodynamics. Biomarkers could not be evaluated due to limited resources, but the improving parameters suggest that recovery had already taken place. Advanced hemofiltration is a privilege; implementing it and enhancing it with new-generation adsorbents may increase the number of survivors by extracting unnecessary fluid and eliminating catastrophic endotoxins and mediators. Consent to publish: written informed consent for publication was obtained from the patient.

An analysis of retrospective cohort study data of 120 patients (pt) treated for DKA in the ICU of Kaunas Clinics during 2014-2019 was carried out. Serum kalemia and glycemia; episodes of hypokalemia and hypoglycemia; the rate of insulin interruption for hypo- and normoglycemia during ketoacidosis; the use of NaHCO3 for ketoacidosis; and length of stay (LOS) in the ICU were analysed. SPSS 23.0 was used for statistical calculations. Differences were considered significant at p < 0.05. At the beginning of DKA treatment, hypokalemia (3.1 ± 0.3 mmol/l) was recorded in 64/120 pt (53.3%). Because blood pH (6.8-7.3 (7.0 ± 0.1)) was ignored, kalemia of 3.5-7.1 (5.1 ± 0.9) mmol/l was falsely interpreted as "normo-" or "hyper-" in 49/68 pt (72.1%) and thus disregarded, being complicated by overt hypokalemia in an additional 26/49 pt (53.1%). With hypokalemia, LOS in the ICU was 52.9 ± 29.7 vs 32.8 ± 18.6 h, p < 0.05. Insulin use caused hypoglycemia (1.2-3.3 (2.5 ± 0.7) mmol/l) in 22/120 pt (18.3%); LOS in the ICU was 63.2 ± 38.5 vs 38.9 ± 21.2 h, p < 0.05. Insulin was interrupted in cases of normo- and hypoglycemia with still-persisting ketoacidosis in 39/120 pt (32.5%); LOS in the ICU was 56.5 ± 30.7 vs 37.0 ± 22.5 h, p < 0.05. NaHCO3 was given as symptomatic treatment of ketoacidosis during the first 10 h of DKA in 33/120 pt (27.5%) with stable hemodynamics: the HCO3- buffer increased (4.8 ± 3.3 to 7.9 ± 3.1 mmol/l), p < 0.05, but this did not control the ketoacidosis, and LOS in the ICU was 55.2 ± 27.5 vs 39.1 ± 25.6 h, p < 0.05. Hypokalemia, hypoglycemia and premature interruption of insulin were recorded as complications of DKA treatment. All of them prolonged LOS in the ICU. Symptomatic treatment of ketoacidosis with NaHCO3 had no effect on the ketoacidosis and likewise prolonged LOS in the ICU.

There is growing interest in CO2-derived parameters in shock management. The central venous-arterial PCO2 difference (Pcv-aCO2) is strictly related to cardiac output; the ratio of the central venous-arterial PCO2 difference to the arterial-central venous O2 content difference, Pcv-aCO2/Ca-cvO2, has been proposed as a marker of anaerobic metabolism when it is >1.4 mmHg/ml [1]. To evaluate the reliability of Pcv-aCO2/Ca-cvO2 in detecting anaerobic metabolism, we analyzed it in 7 consecutive patients affected by MALA admitted to our ICU, considering these patients a model of predominant anaerobic metabolism. We calculated, by the Douglas formula, the ratio of the central venous-arterial CO2 content difference to the arterial-central venous O2 content difference, Ccv-aCO2/Ca-cvO2, as a respiratory quotient (RQ) surrogate. We performed arterial and central venous blood gas analyses simultaneously at admission, calculated Pcv-aCO2, Pcv-aCO2/Ca-cvO2 and Ccv-aCO2/Ca-cvO2, and recorded ScvO2. We assessed the relationships of Pcv-aCO2/Ca-cvO2 and ScvO2 with arterial pH, arterial lactate, SOFA score at admission and Ccv-aCO2/Ca-cvO2 by linear regression analysis.
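A minimal sketch of the ratio studied here, under stated assumptions: O2 content is computed with the standard 1.34 x Hb x SaO2 + 0.003 x PaO2 formula (the Douglas-formula CO2 contents used by the authors are not reproduced), and all input values are hypothetical.

    def o2_content(hb_g_dl, sat_frac, po2_mmhg):
        """O2 content (ml O2/dl): Hb-bound plus dissolved."""
        return 1.34 * hb_g_dl * sat_frac + 0.003 * po2_mmhg

    def pco2_gap_over_o2_gap(pcvco2, paco2, hb, sao2, pao2, scvo2, pcvo2):
        dpco2 = pcvco2 - paco2                                   # mmHg
        do2 = o2_content(hb, sao2, pao2) - o2_content(hb, scvo2, pcvo2)
        return dpco2 / do2                                       # mmHg per ml/dl

    ratio = pco2_gap_over_o2_gap(pcvco2=52, paco2=38, hb=12,
                                 sao2=0.98, pao2=90, scvo2=0.75, pcvo2=40)
    print(f"{ratio:.2f}")  # values > 1.4 have been proposed as anaerobic [1]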
Results: Pcv-aCO2/Ca-cvO2 was greatly increased in MALA (2.16 ± 0.84). Pcv-aCO2/Ca-cvO2 (Fig. 1) showed significant co-variation with pH (R2=0.618; p=0.003) and with the SOFA score at admission (R2=0.628; p=0.003). Pcv-aCO2/Ca-cvO2 had poor agreement with Ccv-aCO2/Ca-cvO2 (R2=0.008) and disagreed with it in identifying anaerobic metabolism; in our series, Ccv-aCO2/Ca-cvO2 was in fact <1 in 3 patients, like an aerobic RQ value. Pcv-aCO2/Ca-cvO2 showed better agreement with pH, SOFA score and lactate level than ScvO2 did. In our series, Pcv-aCO2/Ca-cvO2 was a good marker of illness and acidosis severity, but it seems to be affected by the pH value, in accordance with the Haldane effect [2]. Pcv-aCO2/Ca-cvO2, in our study, does not seem to be a reliable marker of anaerobic metabolism nor an RQ surrogate.

It is thought that early administration of basal insulin to patients with diabetic ketoacidosis (DKA) may improve outcomes. Small studies have shown trends towards decreases in the time to closure of the anion gap (TCAG), rates of rebound hyperglycemia following discontinuation of intravenous (IV) insulin, rates of hypoglycemia, intensive care unit (ICU) length of stay (LOS) and hospital LOS [1-4]. This was a single-center, retrospective chart review of our institution's DKA protocol between January 2010 and August 2019. Patients who received early basal insulin within 24 hours of initiation of IV insulin and before closure of the anion gap (AG) were compared to those who did not receive early basal insulin. The primary outcome was the median TCAG. Secondary efficacy outcomes included time on IV insulin infusion, time to de-escalation of the level of care, hospital LOS and re-elevation of the AG. Secondary safety outcomes included the incidences of hyperglycemia, hypoglycemia and hypokalemia. A total of 334 patients meeting the inclusion and exclusion criteria were identified. The median TCAG was longer in the experimental group (9 vs. 6 hours, p<0.01). The incidences of AG re-elevation and of hyperglycemia were lower in the experimental group. Other outcomes were similar (Figure 1). Early administration of basal insulin to patients with DKA resulted in a longer TCAG with a lower incidence of AG re-elevation and hyperglycemia. Early administration of basal insulin appears to be safe with respect to hypoglycemia and hypokalemia.

Glycaemic control continues to be a challenge in critically ill patients. Stress-induced hyperglycaemia has been associated with increased morbidity and mortality [1]. Conversely, patients receiving intensive glucose control have a higher risk of death [2]. A quality improvement project was designed to develop a comprehensive insulin protocol that recognized pre-existing diabetes and reduced hypoglycaemia. Data were collected prospectively on all adult patients admitted to the RAH intensive care unit (ICU) between October 2018 and August 2019, from the national ICU audit database and electronic patient records. Daily figures were collected for the numbers of hypoglycaemic episodes (<4 mmol/l), "in range" (4-10 mmol/l) blood sugar measurements, and patients with a pre-existing diagnosis of diabetes. Data were collated and analysed using Microsoft Excel. Results: 307 patients were identified; 56 patients (18.2%) had pre-existing diabetes. A total of 6908 blood sugar measurements were reviewed; 5268 (76.3%) were "in range" and 126 hypoglycaemic episodes (1.8%) occurred. There was no significant correlation between the number of diabetic patients and measurements within range.
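A minimal sketch of the TCAG endpoint used in the basal-insulin DKA study above: scan serial chemistries for the first time the anion gap falls below a closure threshold. The threshold of 12 mEq/L and all lab values are assumptions, not taken from the study.

    def anion_gap(na, cl, hco3):
        return na - (cl + hco3)

    def hours_to_ag_closure(serial_labs, threshold=12):
        """serial_labs: (hours since IV insulin start, Na, Cl, HCO3) tuples."""
        for hours, na, cl, hco3 in serial_labs:
            if anion_gap(na, cl, hco3) < threshold:
                return hours
        return None  # gap never closed in the observed window

    labs = [(0, 138, 100, 10), (4, 139, 104, 16), (8, 140, 106, 23)]
    print(hours_to_ag_closure(labs))  # AG 28 -> 19 -> 11, so TCAG = 8 h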
Of note, there was an increase in the number of measurements per patient in the second half of the time period (11 vs 32). The development of this protocol has improved glycaemic control in our ICU. There are considerably fewer episodes of hypoglycaemia and a large proportion of blood sugar measurements are in range. We hope to continue data collection and to further interrogate the prevalence of pre-existing diabetes in order to reduce glycaemic variability.

The optimal management of blood glucose levels in critically ill patients remains unclear. Hypoglycemia, hyperglycemia and glycemic variability are associated with mortality. The time in targeted blood glucose range (TIR) has been suggested to correlate with mortality depending on the status of antecedent glycemic control, but the optimal TIR, and whether there is an optimal disease-specific TIR, has not been verified. A retrospective observational study was performed at a single center. We enrolled all critically ill patients admitted to the intensive care unit from 1 January 2016 to 31 October. Patients with diabetic ketoacidosis or hyperosmolar hyperglycemic syndrome, and patients who had <10 blood glucose readings, were excluded. Gathered information included, in part, demographics, comorbidities, severity-of-illness scores, diagnosis at admission, length of ICU stay and hospital discharge status. The primary outcome was 28-day mortality. We analyzed the data to find the optimal TIR for critically ill patients; several TIRs were each tested for correlation with mortality. A total of 1,523 patients, 51.8% of whom had diabetes, were studied. TIR 70 to 139 mg/dL (OR 0.33; 95% CI 0.18-0.58), TIR 70 to 179 mg/dL (OR 0.33; 95% CI 0.23-0.47) and TIR 110 to 179 mg/dL (OR 0.28; 95% CI 0.17-0.44) >80% were each independently associated with mortality in critically ill patients. The optimal TIR did not differ depending on the diagnosis at admission. In this retrospective evaluation, a TIR of 110 to 179 mg/dL >80% was independently associated with mortality in critically ill patients, especially those with good antecedent glucose control. These findings have implications for the design of future trials of intensive insulin therapy.

The prevalence of chronic dysglycemia (diabetes and prediabetes) in patients admitted to Swedish intensive care units (ICUs) is unknown. We aimed to determine the prevalence of such chronic dysglycemia and assess its impact on blood glucose control and patient-centred outcomes in critically ill patients. In this retrospective, observational study, we obtained routine glycated hemoglobin A1c (HbA1c) measurements in patients admitted to four tertiary ICUs in Sweden between March and August 2016. Based on previous diabetes history and HbA1c, we determined the prevalence of chronic dysglycemia (prediabetes, undiagnosed diabetes and known diabetes). We compared indices of acute glycemic control in the ICU and explored the association between chronic dysglycemia and ICU-associated infections, mechanical ventilation, renal replacement therapy, vasopressor therapy and mortality within 90 days. Of 943 patients, 312 (33%) had chronic dysglycemia. Of these 312 patients, 127 (41%) had prediabetes or undiagnosed diabetes and 185 (59%) had a known diabetes diagnosis.

Fig. 1 (abstract P259). Results.
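A minimal sketch of the TIR metric from the glucose-range study above: the fraction of blood glucose readings inside a target band, here the 110-179 mg/dL band reported as optimal. The readings are hypothetical and are treated as equally spaced in time.

    def time_in_range(glucose_mg_dl, low=110, high=179):
        in_range = [g for g in glucose_mg_dl if low <= g <= high]
        return len(in_range) / len(glucose_mg_dl)

    readings = [95, 120, 150, 170, 185, 160, 140, 200, 130, 125]
    print(f"TIR: {time_in_range(readings):.0%}")  # 70%; >80% was the favourable cut-off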
During the ICU stay, patients with chronic dysglycemia had higher average blood glucose, spent less time in the target glucose range, had greater glucose variability, and were more likely to develop hypoglycemia than patients without chronic dysglycemia. Chronic dysglycemia was associated with a greater need for renal replacement therapy (odds ratio 2.10, 95% CI 1.35-3.27) and increased 90-day mortality (hazard ratio 1.33, 95% CI 1.01-1.77) after adjustment for the Simplified Acute Physiology Score 3. In contrast, chronic dysglycemia was not associated with mechanical ventilation, vasopressor therapy or ICU-associated infections. In four tertiary Swedish ICUs, measurement of HbA1c showed that one-third of patients had chronic dysglycemia (prediabetes or diabetes). Chronic dysglycemia was associated with marked derangements of glycemic control during the ICU stay, a greater need for renal replacement therapy, and increased mortality at 90 days.

Case report: modern antidiabetic therapy causes ketoacidosis. AM Heiden, M Emmerich. Krankenhaus Bad Oeynhausen, Institut für Anästhesie, Bad Oeynhausen, Germany. Critical Care 2020, 24(Suppl 1):P263

The modern antidiabetic class of SGLT2 inhibitors, which are known to reduce the risk of cardiac events [1], has been used increasingly in the last few years. A 68-year-old male patient with diabetes mellitus suffered from abdominal pain and nausea 10 days after colectomy surgery. The patient was on antidiabetic therapy with empagliflozin, which had been paused until day 5 after surgery (nutrition started on day 5, weaning on day 6). Methods: This is a case report of one male patient seen in the ICU setting. Daily blood values including arterial blood gases, vital parameters and the clinical status of the patient were observed and evaluated. The blood gases showed a metabolic acidosis: pH 7.38, PCO2 20.3 mmHg, bicarbonate 12 mmol/l, BE -11.63 mmol/l, lactate 1.6 mmol/l, glucose 7 mmol/l. Ketonuria despite normal blood glucose values was noticed, so the diagnosis of ketoacidosis was clear. After analyzing the possible causes, we found that empagliflozin, in times of catabolism and fasting, can cause this severe picture. We terminated the empagliflozin therapy, and under treatment with insulin the symptoms disappeared within 3 days; the patient could be discharged from the ICU on day 17 after surgery. After one episode of ketoacidosis, therapy with SGLT2 inhibitors should never be restarted. We recommend that intensivists be aware of the modern SGLT2 inhibitors because of these severe complications and the increasing use of this medication. Consent to publish: written informed consent for publication was obtained from the patient.

While obesity confers an increased risk of death in the general population, numerous studies have reported an association between obesity and improved survival among critically ill patients. This contrary finding has been referred to as the obesity paradox. This retrospective study uses two causal inference approaches to address whether the survival of non-obese critically ill patients would have been improved if they had been obese. The study cohort comprises 6,557 adult critically ill patients hospitalized at the intensive care unit of Ghent University Hospital between 2015 and 2017. Obesity is defined as a body mass index of ≥30 kg/m2.
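A minimal sketch of the naive estimator described next (regression adjustment only, complete cases), not the robust targeted-maximum-likelihood analysis: fit an outcome model among obese patients, predict the counterfactual risk for the non-obese, and average. Array and column names are hypothetical.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def naive_atu(X, obese, died):
        """ATU: mean counterfactual mortality risk if obese minus observed
        risk, among the non-obese (naive regression adjustment)."""
        model = LogisticRegression(max_iter=1000).fit(X[obese == 1], died[obese == 1])
        risk_if_obese = model.predict_proba(X[obese == 0])[:, 1]
        return risk_if_obese.mean() - died[obese == 0].mean()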
Two causal inference approaches are used to estimate the average treatment effect in the untreated (ATU): a naive approach that uses traditional regression adjustment for confounding and assumes missingness completely at random, and a robust approach that uses super learning within the targeted maximum likelihood estimation framework and multivariate imputation of missing values under the assumption of missingness at random. Obesity is present in 18.9% of patients. In-hospital mortality is 14.6% in non-obese patients and 13.5% in obese patients. The marginal associational risk difference for in-hospital mortality between obese and non-obese patients is -1.06% (95% confidence interval (CI) -3.23% to 1.11%, P=0.337). The naive approach results in an ATU of -2.48% (95% CI -4.80% to -0.15%, P=0.037), whereas the robust approach yields an ATU of -0.59% (95% CI -2.77% to 1.60%, P=0.599). A robust causal inference approach that can handle confounding bias due to model misspecification and selection bias due to missing data mitigates the obesity paradox, whereas a naive approach results in even more paradoxical findings. The robust approach does not provide evidence that the survival of non-obese critically ill patients would have been improved had they been obese.

Bowel management in an ICU environment is often difficult. Recent data collection from an intensive care unit at the RVI identified either loose stool or constipation on >50% of patient days. It was postulated that this could be improved with a more tightly controlled bowel management regimen. To test this hypothesis, a step-wise bowel protocol was created and introduced. Data were collected in the 4-month period following its implementation with the following aims: 1) to assess the effectiveness of the protocol; 2) to further observe the reasons for loose or constipated stool on an ICU.

Diarrhea is an important problem in critically ill patients [1]. We aimed to investigate the frequency and management of diarrhea in our ICU. In this study, 47 patients in our ICU between 01.01.2017 and 03.01.2018 were retrospectively reviewed. Patients were divided into two groups, diarrhea "positive" and "negative". Patients with diarrhea had fluid or loose stools 3 or more times a day. Each diarrhea period was examined separately and compared with the group without diarrhea. Nutritional status, enteral product formulation, leukocyte, neutrophil and albumin values, gastric sparing, antibacterial and antimycotic use, and LOS in hospital and in the ICU were compared. In the diarrhea-positive group, laxative and/or enema administration on the day of admission, stool toxin A, nitrogen balance before and after diarrhea, change of enteral product during diarrhea, and use of probiotics, metronidazole or oral vancomycin were examined. The incidence of diarrhea was 68.3%. The most common diagnosis at ICU admission was respiratory failure (60-85%) in both groups. Diarrhea occurred within two days after laxative and/or enema treatment. Enteral nutrition was frequent in both groups (≥90%). Nasogastric tube feeding was significantly more common in the diarrhea group (p=0.041). There was no association between the nutritional product formulation and the development of diarrhea (p>0.1). Antibacterial use was high in both groups (75%); however, teicoplanin use was significantly higher in the diarrhea-negative group (p=0.028). The LOS in the ICU and hospital was higher in the diarrhea group (p<0.001). There was no difference in mortality rates (p>0.5).
Many factors may cause diarrhea in the ICU, and diarrhea may adversely affect patient treatment and increase morbidity. We think that preventive measures are as important as the treatment of diarrhea.

The use of parenteral glutamine has been studied in a number of RCTs and systematic reviews (Heyland D 2013, Wischmeyer P 2014), while there is a lack of data on the use of enteral glutamine. The aim of our study was to determine the effect of enteral glutamine supplementation on the incidence of hospital infections and death. Design: retrospective cohort study. Inclusion criteria: males and females >18 years of age, TBSA burned 20%-80%, nasogastric intubation. Patients were divided into two groups: a glutamine group (n=25) and a control group (n=17). In the study group, enteral glutamine was administered for 5 days after admission to the ICU. Baseline characteristics were well balanced between the groups. No significant differences were found between the groups in age, sex, TBSA, need for mechanical ventilation or rate of inhalation injury. The primary outcome was all-cause mortality. The secondary outcome was the rate of nosocomial infections (skin and skin structure infections (SSSI), lower respiratory tract infections, urinary tract infections, bacteremia, sepsis). Mortality was 6 (24%) and 7 (41%) in the glutamine and control groups, respectively, p=0.40. The rate of nosocomial infections was 14 (56%) in the glutamine group and 11 (65%) in the control group, p=0.81. Rates of SSSI, lower respiratory tract infections, urinary tract infections and sepsis did not differ significantly between the groups: 11 (44%) vs 6 (35%), p=0.81; 6 (24%) vs 7 (41%), p=0.40; 1 (4%) vs 1 (6%), p=1.00; and 6 (24%) vs 5 (29%), p=0.97, respectively. The rate of bacteremia differed significantly between the groups: 1 (4%) in the glutamine group vs 5 (29%) in the control group, p=0.03. The retrospective design is a significant limitation of our study. Enteral glutamine supplementation may reduce the incidence of bacteremia in burn patients but had no influence on the incidence of other nosocomial infections or on mortality. Further large clinical trials are needed.

Associations of plasma 3HB (3-hydroxybutyrate) with outcomes were assessed with multivariable logistic regression and Cox proportional hazard analyses, adjusted for baseline risk factors and randomization. In sensitivity analyses, models were further adjusted for key regulators of ketogenesis to assess whether any effect was direct or indirect. Late PN increased plasma 3HB compared with early PN, with a maximal effect on day 2 (P<0.0001 for days 1 to 5 and for the "maximal effect" day in the 1142 patients). Adjusted for baseline risk and randomization, plasma 3HB was associated with a higher likelihood of earlier live weaning from mechanical ventilation (P=0.0002) and of earlier live PICU discharge (P=0.004). As plasma 3HB replaced the effect of the randomization, the 3HB effect statistically explained these benefits of the randomization. Further adjustment for key regulators of ketogenesis did not alter these findings. Plasma 3HB did not independently associate with the risk of infections or mortality. Withholding early PN increased ketogenesis in critically ill children, an effect that statistically mediated part of its clinical benefits.

Critically ill patients are prone to frequent feeding interruptions for various reasons, including feeding intolerance. These interruptions can lead to adverse outcomes.
The aim of the study was to determine the reasons for, and the duration of, interruptions of enteral nutrition (EN). Single-center, observational, cross-sectional study in a 19-bed mixed ICU of a tertiary hospital; duration 6 months. 50 patients aged 65.4 (±14.6) years who stayed in the ICU >48 hrs and were fed with EN were included. Anthropometric data, BMI, time of initiation of prescribed EN, type of EN formula, and daily calories delivered were recorded. Energy intake was calculated according to ESPEN guidelines (25 kcal/kg BW/day). The causes and duration of interruptions were reviewed from the patient's chart. APACHE II and mNUTRIC scores were calculated for all patients; an mNUTRIC score ≥5 was used to identify high risk of malnutrition. All patients included in the study were endotracheally intubated. APACHE II was 22.4 ±5.6. 58% of patients had an increased risk of malnutrition. ICU stay was 24.4 (8.0-32.0) days, and in-hospital mortality was 24%. There were 318 episodes of EN interruption over the median ICU stay of 24.4 days, a median of 2.5 interruptions/patient. The most common reason for EN interruption was gastric residual volume monitoring, followed by diagnostic and therapeutic procedures (Figure 1). Other reasons included surgery, intolerance and/or delayed feeding, and extubation. The median lost feeding time was 5.4 (3.7-7.4) hours/day for all causes, while the mean loss of energy intake was 790 (±321) kcal/day. The average body weight of the patients was 78 (±12) kg, giving a prescribed caloric goal of about 1950 kcal/day; the deficit therefore amounted to roughly 40% of the prescribed goal. The results of this study show that interruptions can lead to a substantial caloric deficit, malnutrition and adverse events. An interruption-minimizing protocol could be useful to reduce the missed hours and improve clinical outcomes.

Relationship of goal-directed nutritional adequacy with clinical outcomes in critically ill patients
PC Tah
There are controversies surrounding the effects of optimal nutritional intake on clinical outcomes in critically ill patients. This study aimed to investigate the relationship of goal-directed energy and protein adequacy with clinical outcomes, including mortality, intensive care unit (ICU) and hospital length of stay (LOS), and length of mechanical ventilation (LOMV). This was a single-centre prospective observational study. Nutritional requirements were guided by indirect calorimetry and 24-h urinary urea. Nutritional intake was recorded daily until death, discharge, or day 14 of ICU stay. Clinical outcomes were collected from the patients' hospital records. The relationship between the two groups (<80% and ≥80% of overall nutritional requirement) and mortality outcomes was examined using logistic regression with adjustment for potential confounders.

Terlipressin, despite being one of the main treatments for acute variceal bleeding, may lead to severe hyponatremia due to its antidiuretic activity. We aimed to identify risk factors for the development of hyponatremia during terlipressin treatment. Retrospective study of patients admitted to the Acute Intermediate Care Unit for portal-hypertensive upper gastrointestinal bleeding due to chronic liver disease who received terlipressin (December 2011 to December 2018). Hyponatremia was defined as a decrease in serum Na ≥5 mEq/L and severe hyponatremia as a decrease >10 mEq/L within 3 days of treatment. We studied 191 patients, 84.3% male, mean age 58.6 (SD 10.8) years. Alcohol-related liver disease was the most frequent etiology.
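The hyponatremia definition just given (a fall in serum Na of ≥5 mEq/L within 3 days of starting treatment, severe if the fall exceeds 10 mEq/L) amounts to a simple classification rule. A hypothetical helper illustrating it, with invented example values:

```python
# Hypothetical helper applying the definition above: hyponatremia is a fall
# in serum Na of >=5 mEq/L within 3 days of starting terlipressin, severe if
# the fall exceeds 10 mEq/L. sodium_by_day[0] is the pre-treatment value and
# at least one follow-up value is expected.
def classify_hyponatremia(sodium_by_day):
    baseline = sodium_by_day[0]
    # Largest decrease observed within the first 3 days of treatment
    max_fall = max(baseline - na for na in sodium_by_day[1:4])
    if max_fall > 10:
        return "severe hyponatremia"
    if max_fall >= 5:
        return "hyponatremia"
    return "no hyponatremia"

print(classify_hyponatremia([138, 135, 130, 126]))  # fall of 12 -> severe
print(classify_hyponatremia([140, 137, 134, 135]))  # fall of 6 -> hyponatremia
```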
Hyponatremia occurred in 53 patients (27.7%). A serum Na decrease of between 5 and 10 mEq/L and a decrease of >10 mEq/L occurred in 20.4% and 7.3%, respectively (Table 1). Severe hyponatremia occurred in 11 patients (5.8%), and symptoms were reported in two cases (status epilepticus and altered mental status). Patients with higher baseline Na levels were more susceptible to terlipressin-induced hyponatremia, and a longer length of stay was observed in patients with a serum Na decrease >10 mEq/L (6.3 vs 4.4 days, p=0.022). The prevalence of hyponatremia in our study was lower than previously reported. Higher serum Na at admission and AIH as the etiology of cirrhosis were predictors of terlipressin-induced hyponatremia. Neither the cumulative dose of terlipressin nor the duration of treatment appeared to be related to the development of hyponatremia.

A Δ48h-[Na] >5 mmol/L was associated with larger hazards of mortality (Figure 1). An increase in serum sodium in the first 48 hours of ICU admission is independently associated with higher mortality in patients admitted with mild hyponatremia, normonatremia, or hypernatremia. Based on our findings, it is possible that mild hyponatremia is a protective mechanism in critical illness, which questions the common practice of routinely correcting serum sodium when it is too low.

Introduction: Acute liver failure (ALF) represents a life-threatening organ dysfunction associated with increased mortality, and liver transplantation represents the only definitive treatment. The aim of this study was to assess the effects of renal replacement therapy in combination with hemoadsorption in ALF patients. Twenty-nine patients with ALF admitted to the intensive care unit (ICU) of Fundeni Clinical Institute were included in the study. After ICU admission, 3 consecutive sessions of hemoadsorption in combination with continuous veno-venous hemodiafiltration were applied. The number of organ dysfunctions and SIRS criteria were recorded at ICU admission. The following data were recorded before and after the 3 hemoadsorption therapies: Glasgow Coma Scale, PaO2/FiO2, creatinine, 24-hour urine output, bilirubin, leucocyte and platelet counts, heart rate, mean arterial pressure and vasopressor support, C-reactive protein and procalcitonin. The CLIF-SOFA score was calculated before and after therapy. ICU length of stay and 28-day outcome were noted. The mean age in the study group was 34±14 years. The median number of SIRS criteria was 3 [2-4] and the median number of organ dysfunctions was 3 [1-6]. The use of hemoadsorption was associated with a decrease in creatinine (from 1.9±1.4 to 1.2±0.8 mg/dL, p=0.02), bilirubin (from 14.2±12.6 to 9.2±9.1 mg/dL, p=0.05) and platelet count (from 96482±70913/μL to 51275±24393/μL, p=0.01). We also observed a decrease in CLIF-SOFA score from 12.0±2.1 to 10.0±2.6 (p=0.05). Overall mortality was 37.9% (n=11). Six patients (20.7%) underwent liver transplantation, with 100% 28-day survival. The use of hemoadsorption in patients with ALF is associated with improvement in liver and kidney function tests and may represent a new therapy for bridging these patients to liver transplantation.

Introduction: Impairment of intestinal mucosal barrier function is the initiating factor of sepsis. In order to explore the effect of lactic acid bacteria on intestinal barrier function impaired by sepsis, it is necessary to establish sepsis and lactic acid bacteria ecological models. However, how to construct these models is still unclear.
Co-cultures with a gradient of lactic acid bacteria and Caco-2 cells were constructed. The symbiotic state was observed under an inverted microscope, and lactate dehydrogenase (LDH) toxicity tests, transepithelial electrical resistance (TEER) tests and Western blots were used to determine effective concentrations of lactic acid bacteria in monolayer cell models. Lipopolysaccharide (LPS) was used to treat the cells, and Cell Counting Kit-8, quantitative reverse transcription PCR (RT-qPCR) and enzyme-linked immunosorbent assays (ELISA) were used to determine the appropriate concentration for sepsis models. The number of living cells decreased significantly when the MOI (number of lactic acid bacteria per cell) reached 8 (Figure 1, panels 1a, b). The release of LDH indicated that damage to cells began to increase when the MOI exceeded 10 (panels 2a, b). At an MOI of 0.5, resistance values began to increase over time, whereas resistance values began to decrease when the MOI reached 10 (panel 3). As the number of lactobacilli increased, the expression of tight junction proteins first increased and then decreased (panels 4a, b, c). In the sepsis model experiments, the cell survival rate began to decrease once the concentration of LPS exceeded 10^4 ng/ml (panel 5). RT-qPCR results showed that 10^2 ng/ml LPS significantly increased inflammatory cytokines (panel 6), and ELISA results consistently showed that TNF-α and IL-6 increased significantly when LPS concentrations reached 10^2 ng/ml (panels 7a, b). It is feasible to construct a cell monolayer model of lactic acid bacteria and LPS. The appropriate MOI of lactic acid bacteria is 0.5 and the optimal concentration of LPS is 10^2 ng/ml.

Introduction: Sepsis is associated with high mortality and morbidity. As severity increases, pH changes secondary to high lactate (metabolic acidosis) are among the most notable physiological features. Currently there is no point-of-care test other than blood gas measurement that can detect these pH changes, which is challenging especially in the prehospital environment. The aim of this study was to develop a novel rapid point-of-care test using a sensor to detect pH changes in blood. Sensors were produced by screen-printing graphene and silver electrodes and functionalizing the graphene working electrode with an active layer of melanin. A preclinical sensor model was produced by adding lactic acid to a citrated plasma sample, thus altering its pH over a clinically relevant range. The pH sensors were exposed to the modified plasma, recording any changes in voltage. The relationship between the voltage potential and plasma pH was established using weighted least squares regression. A pH-dependent change in the measured voltage with respect to the pH of the solution was observed, with a sensitivity of -111.27 ± 15.95 mV/pH over a physiologically relevant range between pH 7.1 and pH 7.6. In this first-phase proof-of-concept study, a low-cost pH sensor was fabricated and demonstrated to be effective in measuring the pH of plasma. This is the first time that such a sensor has been demonstrated and validated to work in this preclinical model of acidosis. The technology demonstrated here is a promising candidate for a point-of-care test whereby abnormal blood pH levels can be detected and monitored rapidly outside of a laboratory environment. Further studies are now underway to detect this change in whole blood.

[…] (Figure 1).
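The sensor abstract above establishes the voltage-pH relationship by weighted least squares regression. Below is a sketch on invented readings, with the slope chosen near the reported sensitivity of about -111 mV/pH; the data points, per-point uncertainties and helper name are all assumptions.

```python
# Sketch of the voltage-to-pH calibration described above, using weighted
# least squares. Readings and uncertainties are invented for illustration;
# the slope is chosen near the reported sensitivity of about -111 mV/pH.
import numpy as np

ph = np.array([7.10, 7.20, 7.30, 7.40, 7.50, 7.60])
voltage_mv = np.array([412.0, 401.5, 389.8, 378.6, 367.9, 356.4])  # hypothetical
sigma = np.array([2.0, 1.5, 1.0, 1.0, 1.5, 2.0])  # assumed per-point SD, mV

# Fit voltage = slope * pH + intercept; np.polyfit weights points by 1/sigma
slope, intercept = np.polyfit(ph, voltage_mv, deg=1, w=1 / sigma)
print(f"sensitivity: {slope:.1f} mV/pH, intercept: {intercept:.1f} mV")

# Invert the calibration to read pH from a new voltage measurement
def voltage_to_ph(v_mv):
    return (v_mv - intercept) / slope

print(f"pH at 385 mV: {voltage_to_ph(385.0):.2f}")
```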
Over one year, only a small proportion of patients (n=7, 4%) were classified as 'Intermediate High' risk and potential candidates for reperfusion therapies.

The revised National Early Warning Score (NEWS) with the modified Glasgow Prognostic Score (mGPS) is superior to the NEWS for predicting in-hospital mortality in elderly emergency patients
T Mitsunaga
Jikei University School of Medicine, Emergency Medicine, Tokyo, Japan
Critical Care 2020, 24(Suppl 1):P286
The National Early Warning Score (NEWS) was developed in the UK to identify the risk of death. A previous study showed that the modified Glasgow Prognostic Score (mGPS) correlates with frailty in elderly patients [1]. The aim of this study is to evaluate the predictive value of the revised NEWS with mGPS for in-hospital mortality (within 28 days) in elderly emergency patients. This study is a secondary analysis carried out at Jikei University Kashiwa Hospital, Japan, from 1 April 2017 to 31 March 2018. Acute medical patients aged 65 and older were included. The NEWS was derived from seven physiological vital signs. The mGPS was derived from C-reactive protein (CRP) and albumin. Discrimination was assessed by plotting the receiver operating characteristic (ROC) curve and calculating the area under the ROC curve (AUC). The AUCs for predicting 28-day in-hospital mortality were 0.818 for the revised NEWS with mGPS and 0.797 for the original NEWS. The AUC of the revised NEWS with mGPS was significantly higher than that of the original NEWS for predicting in-hospital mortality (p<0.001) (Figure 1). Our single-centre study demonstrated the utility of the revised NEWS with mGPS as a strong predictor of acute-phase in-hospital mortality in elderly emergency patients.

The diagnostic performance of the five main emergency department (ED) triage systems has been shown to be poor in distinguishing acute coronary syndromes (ACS) from mild-severity diseases in chest pain patients. These ED triage systems are either clinically based, and more sensitive, or ECG-based, and more specific [1]. The goal of the study was to evaluate whether incorporating cardiovascular risk factors (CVRF) into ECG-based triage could increase its diagnostic performance. CECIDOC is a prospective, observational, single-center study in an academic hospital. All consecutive adult patients admitted for acute chest pain were included. We compared the ECG-based FRENCH triage system [2] to a modified system upgrading patients with a normal ECG but significant cardiovascular risk from a low-acuity triage score (waiting period before medical assessment of max. 60 min) to a high-acuity triage score (waiting period before medical assessment of max. 20 min). The final diagnosis was determined after a 30-day follow-up. We predefined as adequate a high-acuity triage score (level 1 or 2) for ACS and a low-acuity score (level 3, 4 or 5) for mild-severity diseases. A total of 190 patients were enrolled over a 5-month period (age 56.8 ± 16.4 years; M/F ratio 1.7). Triage scores of 35 patients (18.4%) with ACS were compared with those of 103 patients (54.2%) with mild-severity diseases. Taking CVRF into account, the sensitivity of the triage system increased from 60% to 80%, whereas the specificity decreased from 74% to 61%. The area under the ROC curve (AUC) went from 0.69 to 0.72 (Fig. 1). For chest pain triage at the ED, the addition of cardiovascular risk factors to ECG-based triage increases its diagnostic performance.
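The triage abstracts above report sensitivity, specificity and AUC for score-based rules. A minimal sketch of how such figures are computed from ordinal triage levels, on synthetic labels (levels 1-2 treated as a high-acuity, "positive" call for ACS); nothing here reproduces the study data.

```python
# Sketch of the sensitivity/specificity and AUC computations reported above.
# Labels and triage levels are synthetic; a high-acuity score (level 1-2)
# is treated as a positive triage call for ACS.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
acs = rng.binomial(1, 0.25, size=200)          # 1 = acute coronary syndrome
# Hypothetical triage levels 1 (most urgent) to 5; ACS tends to score lower
level = np.clip(np.round(rng.normal(3 - 1.2 * acs, 1.0)), 1, 5)

high_acuity = level <= 2
sensitivity = np.mean(high_acuity[acs == 1])
specificity = np.mean(~high_acuity[acs == 0])
# AUC over the ordinal levels (negated so that more urgent = higher score)
auc = roc_auc_score(acs, -level)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, AUC={auc:.2f}")
```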
Approximately 20% of patients presenting to hospital with an intentional overdose require admission to an intensive care unit (ICU) [1]. There are currently no UK guidelines regarding the optimal use of CT head scans (CTH) in this patient cohort [2, 3]. This study aims to determine whether we should be performing CT head scans in obtunded patients with suspected overdose requiring admission to intensive care. We performed a retrospective search of the ICNARC database for Plymouth University Hospital Trust, looking for patients admitted to the ICU with overdose or self-poisoning as a primary diagnosis. 146 patients were identified, of whom 86 required intubation due to obtundation (GCS<8). There were 59 males and 27 females, with an average age of 38 years. The median length of stay on the unit was 1 day. 52 of the patients had a past medical history of mental illness, and 53 overdosed on prescribed medications. The average GCS recorded on admission was 5. 52 of the 86 (64%) patients had a CTH on admission, of which 5 were part of a trauma scan. 11 were known overdoses and 7 were suspected overdoses as per the CTH request form. The main rationale behind those requests was to exclude additional intracranial injury. None of those CTH showed any signs of acute pathology (Figure 1). In this retrospective study, obtunded patients with suspected or known overdose and no history of apparent trauma or injury did not benefit from CTH. In the absence of a history of trauma or focal neurological signs, our conclusion is that CTH provides limited value in the management of these patients.

The audit was carried out to objectively investigate the problems associated with the technique of Foley catheterization in the emergency department and 3 inpatient units of the internal medicine wards [1].

Introduction: The cellular and molecular mechanisms and epigenetic aspects of acute clozapine poisoning have been insufficiently studied. The aim of this study was to identify morphological and epigenetic alterations in brain neurons during acute exposure to clozapine combined with ethanol. The experiments were carried out on male Wistar rats weighing 200-250 g (n=21). Group I (control) received 0.9% NaCl solution enterally; group II, clozapine 150 mg/kg in 0.9% NaCl solution; group III, clozapine 150 mg/kg in 40% ethyl alcohol. After 4 hours, euthanasia was performed. Autopsy included withdrawal of brain samples for histological examination (n=21) and for determination of the global DNA methylation level (n=21). The global DNA methylation level (5-mC%) was determined by a fluorimetric method. Inter-group comparisons were made with the Kruskal-Wallis test. Histological examination of paraffin sections of brains stained with hematoxylin and eosin was performed by light microscopy. In acute clozapine poisoning and in its combination with ethanol, morphological changes in neurons of the cerebral cortex were detected. In acute clozapine-with-alcohol poisoning, an increase in the global DNA methylation level was observed. The identified changes probably have a common pathogenesis, which will be clarified in our further studies.

There is limited information available regarding the prevalence of adder bites and the complications of envenomation. NHS data suggest there are 100 adder bites annually in the UK, with the last fatality in 1975 [1].
We performed an audit of adder bites in South West Wales to identify the number attending our emergency departments, their management and clinical course, as well as any environmental factors that predict an increased likelihood of being bitten or the severity of the bite. A retrospective study of adder bites presenting to emergency departments in South West Wales was undertaken (Jan 2014 to Aug 2019). Measurements included patient demographics, clinical presentation, type of treatment (conservative vs anti-venom) and outcome. Results: 31 patients were included, age range 2-72 years (Figure 1). The majority of bites occurred in sand dunes (41.9%) and all bites were on extremities. Anti-venom was administered to 45.2% (14/31) of patients. There was a significant positive association between the use of anti-venom and the length of hospital stay (r=0.520; p=0.003) and a significant negative correlation between anti-venom use and both diastolic and systolic blood pressure (r=-0.501 and r=-0.487, respectively; p=0.01). All patients fully recovered. In this study, we demonstrated that with a full clinical assessment on presentation it is safe to decide whether anti-venom is required. The current guidelines are safe and effective in the treatment of adder bites.

[…] 98 μmol/L, for PaO2 <9.9 kPa and >12.3 kPa, platelets <165×10^9/L and >327×10^9/L, and bilirubin >12 μmol/L. In our population of adult ED patients, the thresholds of vital values associated with increased 7-day mortality were very close to routinely used values, and most of the thresholds were included in the lowest urgency level of triage and risk-stratification scoring systems.

The workload in the emergency room: direct assessment by the Therapeutic Intervention Scoring System-76 and indirect assessment by the NASA Task Load Index
Introduction: The number of emergency room admissions continues to increase each year, which increases the care workload of the emergency department staff, who must use their theoretical and practical knowledge to provide quality care in difficult working conditions. The aim of our study was to assess the emergency room staff workload, its impact on health workers and patients, and to suggest an improvement strategy to decrease this workload. A prospective, monocentric cohort study with a descriptive and analytic approach, conducted over one month (December 2018) at the emergency department of an academic hospital. The workload endured by the emergency room staff was evaluated with the NASA Task Load Index, and that generated by patients with the Therapeutic Intervention Scoring System-76 (TISS-76). There were 286 cumulative days of hospitalization in 67 consecutive patients admitted to the emergency room. The average age was 61 ±15 years. The average length of stay in the emergency room was about 103 ±48 h. The average TISS-76 score was 31.7 ±14.9. Factors associated with a high care workload were age ≥65 years, diabetes, more than 3 comorbidities, the use of intravenous antibiotics, the use of vasoactive drugs and the use of mechanical ventilation; a high TISS score was predictive of emergency room mortality. In the indirect assessment of the care workload, 41 medical and paramedical staff were interviewed, 73% of them under 40 years old, with a sex ratio of 0.58. A high level of mental and physical workload, with a considerable level of frustration, was expressed by the ED staff. The ED staff mainly suggested improving working conditions and communication and redefining tasks ("who does what").
Our study showed a significant workload in the emergency room; a process to reduce this workload is being implemented.

Medical simulation is a modern teaching tool increasingly used in specialties such as anesthesia, emergency medicine and obstetrics. However, it is not widely used in specialties like cardiology, although cardiovascular emergencies are very frequent. The purpose of our study was to assess the effectiveness of simulation-based medical education in the management of cardiovascular emergencies among Moroccan graduate students. We conducted a prospective, observational, multi-center study including students of three Moroccan universities, from the 5th to the 7th year of medicine, who underwent 6 phases: first a pre-test, then theoretical and practical training on cardiovascular emergencies, after which the students were separated into two groups, one undergoing medical simulation training (group 1) and one not (group 2), followed by a theoretical and then a practical post-test on Resusci Anne and SimMan®. Finally, the students were asked to answer a satisfaction survey.

The reform procedure in the Tunisian army consists of repairing the physical damage and deciding on the applicant's ability to continue working. Terrorism increases the impact of the resulting co-morbidity and its socio-economic consequences. The purpose of this work was to study the epidemiological, clinical and evolutionary profile of terrorist injuries, and to specify the rates of consequent Partial Permanent Disability (PPI) and the possibilities of returning to work. Descriptive retrospective cross-sectional study of 177 reform files of military personnel injured during anti-terrorist operations from January 2013 to September 2019. Data collection was carried out on the basis of a collection form.

[Fig. 1 (abstract 296): Changes in total BCPR rate in family- and friends-witnessed OHCA cases with dispatcher-assisted instruction during the 20-week period after the day of the disaster, over three years]

Our 177 wounded were all male, 96% of whom belonged to the army. The average age was 36 years and 3 months (±8.9). Half of the wounded were troopers; infantry and special forces were the most exposed military units. Half of the incidents were recorded in the Kasserine region (88 cases). Chronic post-traumatic stress disorder was found in 130 of the injured, followed by amputations in 18. The after-effects were psychological in 32%, physical in 26% and mixed in 39% of the injured. The PPI rate ranged from 36% to 75% in 23.7% of the injured. More than half of the injured had returned to their professional activity; 33% were put on reform for health reasons. Our results showed that chronic post-traumatic stress disorder was the most frequently recorded sequela and that the PPI rate was significant in a quarter of the injured. In our series, a third of the wounded were put on reform for health reasons. This underlines the importance of initial care and of adequate, rigorous follow-up to recover a greater number of war wounded.

Introduction: The rapid response system (RRS) has been shown to decrease hospital mortality [1]. The Japanese Coalition for Patient Safety has set a major goal for hospitals to more widely implement the RRS. However, the prevalence and actual circumstances of its use in acute care hospitals (including small-scale hospitals) in Japan are as yet not well known. Web-based questionnaires were sent to acute care hospitals (of 75 beds or larger) in 17 prefectures of western Japan.
Each participating hospital selected a department to answer the questionnaire. The RRS included the medical emergency team (MET), the rapid response team (RRT), and the critical care outreach team (CCOT). We investigated the presence and circumstances of in-hospital emergency calls, RRS and other systems, and then highlighted issues to be solved. Our study suggests that delays in patient transfer to the ICU after RRT activation in the wards were associated with slower physiological improvement. These findings support further and larger studies.

Blood and blood products use in the intensive care unit
M Akcivan, S Bozbay, O Demirkiran
Istanbul University Cerrahpasa, Anesthesiology and Intensive Care, Istanbul, Turkey
Critical Care 2020, 24(Suppl 1):P304
Blood and blood product (BP) transfusions are frequently used in intensive care units (ICU) [1]. It is important to know the epidemiology of transfusion and of adverse transfusion reactions, and their effect on mortality and morbidity. We aimed to investigate blood and BP transfusions in the ICU. Blood and BP transfusions in the ICU between 2013 and 2017 were reviewed retrospectively. We treated each transfusion as a data point and examined pre- and post-transfusion laboratory values, demographic data, cause of ICU admission and comorbidities. Results: 284 patients who underwent transfusion in the ICU, and 2188 transfusion episodes from these patients, were included. The most frequent causes of hospitalization were respiratory failure and sepsis. The rate of patients transfused over the five-year period decreased from 73.9% to 36.7%. The hemoglobin threshold before transfusion decreased from 8.34 g/dl to 7.91 g/dl. A total of 148 transfusion reactions were observed; the most common was a febrile non-hemolytic reaction. The most commonly transfused product was red blood cell suspension. Transfusion reactions were slightly more frequent in men than in women and in the younger age group (<65 y) (p=0.44 and p=0.021, respectively). Transfusion reactions were more frequent with emergency transfusions (p<0.01). The number of transfusions was significantly lower in patients with an APACHE II score <20 (p<0.01). The need for transfusion was higher in patients with hematological malignancy (p<0.01). As the mean number of transfusions increased, mortality also increased (p<0.01). Transfusion therapies are vital but carry serious mortality and morbidity risks; intensive care patients in particular should be considered in detail because of their specific features. Restrictive transfusion practices have positive results.

Association between anemia or red blood cell transfusion and outcome in oncologic surgical patients
[…] (Figure 1A). The association between RBC transfusion and adverse events also remained after adjustment (OR 4.3 [2.2-8.8]; p<0.001) (Figure 1B). In oncologic surgical critically ill patients, there was an independent association between anemia (even moderate anemia) or RBC transfusion and patient outcomes. Our findings highlight the need for further research to determine the optimal transfusion strategy in surgical oncologic patients.
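The oncologic abstract above reports an adjusted odds ratio with a confidence interval (OR 4.3 [2.2-8.8]). A sketch of how such an estimate is obtained by multivariable logistic regression, on synthetic data; the covariates and effect sizes are invented and do not correspond to the study.

```python
# Sketch of obtaining an adjusted OR (with 95% CI) by multivariable
# logistic regression. Data are synthetic; covariate names hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
age = rng.normal(60, 10, n)
severity = rng.normal(0, 1, n)
transfused = rng.binomial(1, 1 / (1 + np.exp(-severity)))
logit = -3 + 0.02 * (age - 60) + 0.8 * severity + 1.2 * transfused
adverse = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([transfused, age, severity]))
fit = sm.Logit(adverse, X).fit(disp=0)
or_est = np.exp(fit.params[1])            # coefficient of 'transfused'
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"adjusted OR {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```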
Transfusion impaired skin blood flow when initially high
E Cavalcante dos Santos, W Mongkolpun, P Bakos, AL Alves da Cunha, C Woitexen Campos, JL Vincent, J Creteur, FS Taccone
Erasme Hospital, Intensive Care Department, Brussels, Belgium
Critical Care 2020, 24(Suppl 1):P306
Red blood cell transfusion (RBCT) increases global oxygen delivery (DO2) and may improve the microcirculation; however, reported effects on blood flow have been conflicting. We studied ICU patients with stable hemodynamic status (mean arterial pressure (MAP) ≥65 mmHg for at least 6 hours) and without active bleeding who received an RBCT. Skin blood flow (SBF) was determined (Periflux System 5000, Perimed; index finger; perfusion units, PU) together with MAP, heart rate (HR), hemoglobin (Hb), lactate levels and ScvO2 before and after RBCT. SBF was measured for 3 min before RBCT (T0) and after it (T1). Based on previous data indicating that the lowest SBF value found in non-infected ICU patients was 151 PU, all patients were analyzed according to baseline SBF (<151 PU, low SBF, vs ≥151 PU, high SBF). The relative change in SBF (ΔSBF) after RBCT was calculated, and responders were defined by a ΔSBF >10%. Results: 63 ICU patients were studied. RBCT was associated with increases in MAP and ScvO2 but no change in SBF. At baseline, ScvO2 was lower in responders than in non-responders (p=0.04) and lower in patients with low SBF than in those with high SBF (p=0.04). There was no difference in Hb, MAP or lactate between patients with low and high SBF. After RBCT, MAP rose in responders (p<0.01) and in non-responders (p=0.03); SBF rose in patients with low SBF (p<0.01) and decreased in patients with high SBF (p=0.02). There was a negative correlation between baseline ScvO2 (r=-0.363, p<0.01) or baseline SBF (r=-0.560, p<0.01) and the relative increase in SBF after RBCT. RBCT increases skin blood flow only when it is impaired at baseline (a sketch of this responder analysis follows below).

Severe immune dysregulation is associated with adverse outcomes and is common in intensive care unit (ICU) patients [1]. Erythropoietin-stimulating agents (ESAs) have both anti-apoptotic and immune-modulating properties [2]. Despite their potential benefit, both the safety and the efficacy of these agents remain unclear [3]. Here we evaluate the impact of ESAs on mortality at hospital discharge in critically unwell adult patients admitted to the ICU. We conducted our search strategy in accordance with a predetermined protocol.

The use of FFP is associated with an increased incidence of complications such as acute respiratory distress and infections, and the rate of complications increases with the quantity of FFP transfused [1]. PCC contains several important coagulation factors, and it has been suggested that it could replace FFP. This has been shown mainly in case reports or series in which coagulation factor deficits were detected using POC viscoelastic tests in trauma [2] or traditional hemostatic tests in obstetric patients [3]. Multicenter observational study of the safety and efficacy of prothrombin complex concentrate: a survey of anesthetists was conducted in 19 maternity hospitals at various levels of care in the Russian Federation. Data were collected and processed; as a result, 251 patients were analyzed. PPH was defined as a blood loss of more than 500 ml during vaginal delivery or CS. The most significant risk factors for PPH were preeclampsia or arterial hypertension and a history of postpartum hemorrhage.
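A minimal sketch of the responder analysis described in the skin-blood-flow abstract above: percent change in SBF after transfusion, responders defined as a rise >10%, and the baseline-versus-change correlation. All values are synthetic, and the negative baseline-change relationship is deliberately built in to mirror the reported pattern, not to reproduce the study's numbers.

```python
# Sketch of the SBF responder analysis above on synthetic values.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
sbf_t0 = rng.uniform(60, 300, size=63)                # baseline SBF, PU
# Assume low-baseline patients tend to rise and high-baseline ones to fall
sbf_t1 = sbf_t0 + (151 - sbf_t0) * 0.3 + rng.normal(0, 15, 63)

delta_sbf = (sbf_t1 - sbf_t0) / sbf_t0 * 100          # percent change
responder = delta_sbf > 10                            # definition from the text
low_baseline = sbf_t0 < 151                           # threshold from the text

r, p = pearsonr(sbf_t0, delta_sbf)
print(f"responders: {responder.sum()}/{len(responder)}")
print(f"baseline SBF vs change: r={r:.2f}, p={p:.3g}")
print(f"mean change, low vs high baseline: "
      f"{delta_sbf[low_baseline].mean():.1f}% vs {delta_sbf[~low_baseline].mean():.1f}%")
```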
32.3% had no risk factors for PPH. The use of Prothromplex 600 IU decreased the number of patients requiring FFP transfusion of 12-15 ml/kg by 27.8% and increased the number of patients without transfusion by 25.9%, compared with patients managed without Prothromplex 600 IU (Figure 1). No complications were detected. PCC safely and effectively reduced the use of FFP during PPH.

The full analysis included 45 patients on either HFC (n=22) or cryoprecipitate (n=23). The intraoperative and postoperative changes in ETP and fibrinogen concentration are shown in Table 1. For FIBTEM A20 (intraoperatively) and fibrinogen concentration (intraoperatively and postoperatively), the mean numerical values appeared higher with HFC than with cryoprecipitate. FXIII (HFC: 121.86% and 66.85%; cryoprecipitate: 115.55% and 68.68%, at baseline and 4 h after surgery start, respectively), FVIII and VWF were maintained throughout surgery in both treatment groups. This was also the case for the laboratory tests activated partial thromboplastin time, prothrombin time and platelet count. The FORMA-05 coagulation parameter analyses showed broad overlap between HFC and cryoprecipitate, with satisfactory maintenance of the clot quality parameters, FXIII concentrations and thrombin generation parameters.

The study group included 118 men and 40 women, with a mean age of 43.0 vs. 40.3 years (p=0.639), admitted with a diagnosis of multiple trauma. We found a directly proportional and highly significant correlation between base excess and the fibrinogen level assessed with the MCF/FIBTEM parameter (r=0.6382, p<0.0001), and an inversely proportional correlation between lactate and fibrinogen levels (r=-0.2164, p=0.0065). In the ROC analysis using base excess as the variable and fibrinogen deficit (MCF/FIBTEM <12 mm) as the classification criterion, a value of BE <-7 mmol/l diagnoses a fibrinogen deficit with a sensitivity of 88.2% and a specificity of 80.6% (AUC=0.872, p<0.0001). Lactate appears inferior to base excess (Figure 1) but still has good diagnostic power: a value of 2.6 mmol/l has a sensitivity of 67.1% and a specificity of 75% (AUC=0.754, p<0.0001). The difference between the two ROC curves (0.118) is statistically significant (p=0.0028). Both base excess and serum lactate can be used to diagnose fibrinogen deficiency, with base excess showing the higher sensitivity and specificity (a worked sketch of this threshold analysis follows below).

[…]-based goal-directed algorithm. This approach requires further clinical validation. We conducted a retrospective study comparing transfusion strategies in patients with major trauma between 2013 and 2018. We retrieved demographic data and the blood products administered for patients with at least one red blood cell (RBC) transfusion. The primary outcome was a reduction in RBC administration; secondary outcomes were mortality, ICU length of stay and acute kidney injury. We included 141 patients admitted to the ICU for severe trauma (SAPS II 41.5 ±21.9), mainly after emergent surgery (68.8%). They had a mean age of 45.3 ±19.3 years, were predominantly male (76.6%), and 73% were in shock. In the first 24 hours of hospital admission, a mean of 3.6 ±4.5 RBC units was administered. Most patients received a fibrinogen-based protocol (FBP) (78%), with an average of 5 ±3 g of fibrinogen and 1 ±3 fresh frozen plasma (FFP) units, versus 3 ±4 g of fibrinogen and 6 ±4 FFP units in the FFP group.
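A worked sketch of the threshold analysis above: sensitivity and specificity of base excess < -7 mmol/l for fibrinogen deficit (MCF FIBTEM <12 mm). Patient values are synthetic and only illustrate the computation.

```python
# Worked sketch of the threshold analysis above: sensitivity and specificity
# of base excess < -7 mmol/l for fibrinogen deficit (MCF FIBTEM < 12 mm).
# Patient values are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 158
fib_deficit = rng.binomial(1, 0.45, n).astype(bool)   # MCF FIBTEM < 12 mm
# Assume deficit patients tend to have a more negative base excess
base_excess = np.where(fib_deficit,
                       rng.normal(-9, 3, n),
                       rng.normal(-3, 3, n))

flagged = base_excess < -7.0
sensitivity = flagged[fib_deficit].mean()
specificity = (~flagged)[~fib_deficit].mean()
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")
```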
The FBP was associated with decreased administration of RBCs in the first 24 hours (R=-2.6; p<0.004), even after adjustment for severity (p=0.003) and for tranexamic acid use (p=0.003). It was also associated with a decrease in platelet transfusion (p=0.004). The fibrinogen-based protocol was not associated with a decrease in mortality, acute kidney injury or noradrenaline dose. Treatment of TIC has in recent years progressively changed to a goal-directed, fibrinogen-based approach. In our population, the use of the FBP led to a reduction in RBC administration in severe trauma patients.

Prospective, multicenter, randomized study comparing administration of clotting factor concentrates with a standard massive hemorrhage protocol in severely bleeding trauma patients
[Fig. 1 (abstract 315): Study treatment plan]

The objective of this study was to assess the ability of the Quantra® QStat® System (HemoSonics) to detect coagulopathies in trauma patients. Many Level 1 trauma centers have adopted whole blood viscoelastic testing, such as rotational thromboelastometry (ROTEM®, Instrumentation Lab), for directing transfusion therapy in bleeding patients. The Quantra QStat System is a cartridge-based point-of-care (POC) device that uses ultrasound to measure the viscoelastic properties of whole blood and provides measures of clot time, clot stiffness and a test of fibrinolytic function. Methods: Adult subjects were enrolled at two Level 1 trauma centers that use a ROTEM-based protocol to guide transfusion decisions. Study protocols were approved by each site's ethics committee. For each subject, whole blood samples were drawn upon arrival to the emergency department and again, in some cases, after administration of blood products or antifibrinolytics. Samples were analyzed on the Quantra (at the POC) in parallel with the ROTEM delta (in the lab). A total of 54 patients were analyzed. Approximately 42% of samples had low Clot Stiffness (CS) values suggestive of a hypocoagulable state. The low stiffness values could be attributed to low platelet contribution (PCS), low fibrinogen contribution (FCS), or a combination of both (Figure 1). Additionally, 12% of samples showed evidence of hyperfibrinolysis based on the Quantra Clot Stability to Lysis parameter. Samples analyzed with standard ROTEM assays showed a lower prevalence of low clot stiffness and fibrinolysis based on EXTEM and FIBTEM results. The correlation of CS and FCS with the equivalent ROTEM parameters was strong, with r-values of 0.83 and 0.78, respectively. This first clinical experience with the Quantra in trauma patients showed that the QStat cartridge detected coagulopathies associated with critical bleeding and may be useful for directing blood product transfusions in these patients. The ability to perform testing at the POC may provide an additional clinical advantage.

The objective of the study was to describe the conditions of use of FIBRYGA® 1 g, a new, highly purified human fibrinogen (HF) recently granted a temporary import authorization for use in congenital and acquired fibrinogen deficiencies in France. Observational, non-interventional, non-comparative, retrospective study conducted in 5 French hospital centres using FIBRYGA®. Data from patients with fibrinogen deficiency who received FIBRYGA® from December 2017 to July 2019 were retrieved from their medical files. Indications, modalities, and efficacy and safety outcomes were recorded.
Indications encompassed non-surgical bleeding (NSB), either spontaneous or traumatic, including post-partum hemorrhage (PPH); bleeding during surgery (SB); and administration to prevent bleeding during planned surgery. Treatment success was defined as control of the bleeding or hemoglobin loss <20% for bleeding treatment, and as absence of major perioperative hemorrhage for pre-surgical prevention. This analysis included 110 patients aged 56.7 ± 17.7 years, 60% of whom were male. All presented an acquired fibrinogen deficiency requiring administration of HF. Indications were NSB (n=45, 40.9%), including 15 (13.6%) PPH; SB (n=31, 28.2%); and prevention of SB (n=34, 30.9%). Cardiac surgeries were the main procedures associated with treatment and prevention of SB. Mean total doses of HF were 2.95±1.66 g, 2.00±1.37 g and 2.21±1.23 g for NSB, SB and prevention of SB, respectively. Success rates were 88.4% (95% CI 78.8-98.0%), 96.8% (95% CI 90.6-100%) and 91.2% (95% CI 81.6-100%), respectively. For PPH, the mean dose of HF was 2.53±0.74 g, with a success rate of 86.7% (95% CI 69.5-100%). Overall, tolerance was good. The fibrinogen concentrate FIBRYGA® is mostly used for bleeding control. In one third of patients, HF was administered preventively to avoid bleeding during surgery. Use of FIBRYGA® was associated with favourable efficacy outcomes.

Functional testing for tranexamic acid effect duration using modified viscoelastometry
T Kammerer 1, P Groene 2, S Sappel 2, P Scheiermann 2, ST Schaefer 2
1 Ruhr-University Bochum, Institute of Anaesthesiology, Heart and Diabetes Center NRW, Bad Oeynhausen, Germany; 2 Ludwig-Maximilians University, Department of Anaesthesiology, Munich, Germany
Critical Care 2020, 24(Suppl 1):P318
Tranexamic acid (TXA) is the gold standard to prevent or treat hyperfibrinolysis [1]. Effective plasma concentrations are still under discussion [2]. In this prospective, observational trial using modified viscoelastometry, we evaluated the time course of the antifibrinolytic activity of TXA in patients undergoing cardiac surgery. Methods: 25 patients were included. Modified viscoelastometry (TPA test) was performed, and TXA, plasminogen activator inhibitor-1 (PAI-1) and PAI antigen plasma concentrations were measured over 96 h. Additionally, in vitro dose-effect curves were generated from the blood of healthy volunteers. Data are presented as median with interquartile range (Q1/Q3). Results: TXA plasma concentration was increased compared with baseline (T1: 0 μg/ml) at every time point, with a peak concentration 30 min (T2) after application (p<0.0001; see Fig. 1A). Lysis was inhibited from 30 min (LysisTime(TPA-test): p<0.01; LysisOnsetTime(TPA-test): p<0.0001). MaximumLysis(TPA-test) was decreased at T2 (T1: 97% (96/97) vs. T2: 9% (6/11); p<0.0001). Of note, after 24 h some patients (n=17) had normalized lysis, whereas others (n=8) had strong lysis inhibition (ML <30%; p<0.05) for up to 96 h. The high- and low-lysis groups differed regarding kidney function (cystatin C: 1.64 mg/l (1.42/2.02) vs. 1.28 mg/l (1.01/1.52); p=0.002) and active PAI-1 (93.05 ng/ml (33.15/9100.0) vs. 16.13 ng/ml (6.62/79.98); p=0.047). In vitro, TXA concentrations >10 μg/ml were effective in inhibiting fibrinolysis. In our trial, lysis was still completely blocked after 24 h in patients with moderate renal impairment. This could be critical with respect to postoperative thromboembolic events [3]. Here, modified viscoelastometry could be helpful to detect individual fibrinolytic capacity.
Introduction: Peri-operative coagulopathy correction based on viscoelastic hemostatic assays (VHAs) and single-factor coagulation products has changed the paradigm of bleeding management in cardiac surgery [1]. In a retrospective study, we analysed patients undergoing emergency surgery for thoracic acute aortic dissection (TAAD) before and after the introduction of fibrinogen concentrate into clinical practice. Data were collected from paper and electronic records. The study was approved by the institutional ethics committee. 60 patients were included in the analysis: 19 operated in 2012, before fibrinogen concentrate was approved for human use, and 41 in 2018-2019. Therapy was guided by a rotational thromboelastometry (ROTEM) algorithm. Exclusion criteria were non-compliance with the institutional protocol and intra-operative death. We investigated allogeneic blood transfusion (ABT), fibrinogen use, peri-operative bleeding (POB), surgical re-exploration and post-operative complications (POC). The groups were similar in gender, age, body weight, additive EuroSCORE and aortic cross-clamp time. Fresh frozen plasma, cryoprecipitate and red blood cell transfusion were lower in the fibrinogen group, but not platelet transfusion (Table). 48.7% of patients in the study group received fibrinogen concentrate; the median dose was 2 g (IQR 2-3). Day 1 postoperative chest tube drainage and surgical re-exploration were significantly lower. There were no differences in stroke, renal replacement therapy, mechanical ventilation time and ICU stay. In patients undergoing TAAD surgery, ROTEM-guided algorithms that include fibrinogen concentrate are associated with less POB, surgical re-exploration and ABT. Further research is needed to document the role of VHAs and concentrated factors in reducing POC.

Andexanet alfa (AA, Portola Pharmaceuticals, San Francisco, CA) is a modified factor Xa agent approved as an antidote for apixaban and rivaroxaban. Andexanet alfa may also neutralize the anti-Xa effects of betrixaban and edoxaban. This study aims to compare the relative neutralization of these four anti-Xa agents by andexanet alfa in different matrices. Andexanet alfa was diluted at 10 mg/ml. Apixaban (A), betrixaban (B), edoxaban (E) and rivaroxaban (R) were diluted in pH 8.4, 0.5 M Tris buffer (TB), blood bank plasma (BBP) and 5% albuminated buffer (AB) at 0.062-1.0 μg/ml. The anti-Xa activities of all four agents were measured in the three systems and the reversibility indices of AA were profiled. The reversibility index (RI50) of the anti-Xa effects of AA was determined at 25-100 μg/ml. Each of the four agents produced varying degrees of anti-Xa inhibition at 0.062-1.0 μg/ml; the IC50 ranged from 0.61-1.53 μg/ml in BBP, 0.47-1.28 μg/ml in AB and 0.49-1.4 μg/ml in TB. Andexanet alfa produced a concentration-dependent reversal of all four anti-Xa agents. In BBP, the RI50 values were A 192 μg/ml, B 32 μg/ml, E 152 μg/ml and R 85 μg/ml. In AB, the RI50 values were A 140 μg/ml, B 46 μg/ml, E 176 μg/ml and R 58 μg/ml. In TB, the RI50 values were A 154 μg/ml, B 79 μg/ml, E >400 μg/ml and R 110 μg/ml. Each of the four anti-Xa agents exhibits varying degrees of matrix-independent anti-Xa potency in the different systems; the collective order follows edoxaban > apixaban > betrixaban > rivaroxaban. Andexanet alfa produced matrix-dependent differential neutralization of the anti-Xa effects of these agents.
Individualized dosing of andexanet alfa may be required to obtain the desired clinical results.

The diagnostic and prognostic value of thromboelastography (TEG) in sepsis has not been determined. This study aimed to assess whether TEG is an early predictor of coagulopathy [1, 2] and is associated with mortality in patients with sepsis. In total, 518 patients with sepsis on intensive care unit admission were prospectively evaluated. We measured TEG and conventional coagulation tests (CCTs) on admission and serially thereafter (TEG on days 1 and 3; CCTs on days 1, 3 and 7). Multivariable logistic regression was used to determine the odds of ICU/hospital mortality. The TEG parameter ratio of maximum amplitude to reaction time (MA/R) was calculated to evaluate sepsis-induced coagulopathy. On admission, patients were divided into three groups: MA/R0 (MA/R 5-14 mm/min), MA/R1 (MA/R >14 mm/min) and MA/R2 (MA/R <5 mm/min) (a small helper illustrating this grouping appears below). In our cohort of patients with severe sepsis, coagulopathy defined by the MA/R ratio was associated with an increased risk of ICU/hospital mortality.

Introduction: Blood sampling for coagulation assessment is often carried out on either arterial or venous samples in the intensive care unit (ICU). There is controversy as to the accuracy of this practice, owing to the inherent differences in physicochemical properties, as well as the differing effects of individual diseases, between arterial and venous blood. Clot microstructure has been shown to be a new biomarker (fractal dimension, df) which encompasses the effects of disease on all aspects of the coagulation system [1, 2]. In this study, we compared the effect of all these factors in venous and arterial blood to see whether there is a difference in clot microstructure and quality. 45 patients admitted to a tertiary intensive care unit in a busy teaching hospital were recruited. Arterial and venous blood was sampled from an arterial line and a central venous catheter in situ in the same patient. Standard markers of coagulation (PT, aPTT, fibrinogen, full blood count), rotational thromboelastometry (ROTEM), whole blood impedance aggregometry and measured clot microstructure (df) were assessed on both arterial and venous samples. No significant difference was observed in standard laboratory markers, ROTEM or platelet aggregation between arterial and venous blood. There was no difference in fractal dimension (df) between the arterial and venous blood samples (df 1.658 ± 0.10 vs 1.654 ± 0.08, respectively; p=0.830). Samples from patients with critical illness give comparable results from either arterial or venous blood despite the underlying pathophysiological process or treatment. This confirms that blood for coagulation testing can be taken from arterial or venous blood.

Clinicians in the emergency setting use a wide range of hemostatic markers to diagnose and monitor disease and treatment. Current methods rely on the anticoagulant effect of citrate on whole blood prior to laboratory analysis. Despite the well-recognized modulatory effects of citrate on hemostasis, the use of anticoagulated blood has clear analytical advantages, including repeat sampling and storage. However, by altering the physiological state of the blood, the reproducibility and accuracy of the test are affected. Recent studies have shown the potential of a novel functional biomarker of clot formation, fractal dimension (df), that may give improved diagnostic accuracy.
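The MA/R grouping above is a simple cut-off rule. A hypothetical helper making it explicit (group labels only; the abstract does not give a clinical interpretation for each group):

```python
# Hypothetical helper reproducing the MA/R grouping described above:
# MA/R0 for 5-14 mm/min, MA/R1 above 14 mm/min, MA/R2 below 5 mm/min.
def ma_r_group(ma_mm, r_min):
    """MA in mm, R (reaction time) in minutes; returns the group label."""
    ratio = ma_mm / r_min
    if ratio > 14:
        return "MA/R1"
    if ratio < 5:
        return "MA/R2"
    return "MA/R0"

for ma, r in [(60.0, 4.0), (45.0, 12.0), (55.0, 6.0)]:
    print(f"MA={ma} mm, R={r} min -> MA/R={ma / r:.2f} mm/min, group {ma_r_group(ma, r)}")
```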
In this study we assessed the potential of this new biomarker to quantify the effects of recalcification of citrated samples. Methods: 35 healthy volunteers were included. Unadulterated and sodium-citrated blood samples were taken from each volunteer. Citrated samples were recalcified using 1 M CaCl2. We compared unadulterated whole blood df results with citrated df results, and repeated the citrated df experiments 5 times for each sample over a 2-hour period to ascertain reproducibility. The df of citrated blood was significantly lower than that of unadulterated blood (1.57 ± 0.04 vs 1.69 ± 0.04, p<0.001). The citrate samples tested 5 times over 2 hours gave a coefficient of variation of 1.16%. For the first time we show that a functional biomarker of clot microstructure, df, can precisely and accurately quantify the direct effect that the addition of the anticoagulant sodium citrate has on whole blood clot microstructure. The study also shows that the test is reproducible and has potential utility as a biomarker of acute disease in the emergency setting using citrated blood. This procedure now needs to be evaluated in a group of acute disease states.

In this study, we analyzed the hematological abnormalities of dengue patients by thromboelastography (TEG) at presentation and after 1 hour of fluid resuscitation. Methods: This is a cross-sectional study evaluating TEG readings of dengue patients of different severities presenting to the emergency department. Laboratory-confirmed dengue patients (positive NS1 antigen or IgG/IgM) were consecutively sampled. TEG readings were taken at presentation and after 1 hour of fluid resuscitation. Twenty dengue patients of varying severity had a median reaction time (R), α-angle, K time, maximum amplitude (MA) and lysis at 30 minutes (Ly30) of 0.495 min, 68.74°, 3.58 min, 44.64 mm and 0.54%, respectively. Mean fibrinogen was normal before and after fluid infusion. There was a non-significant reduction in MA, with prolongation of other TEG parameters, across dengue severities. There was a statistically significant reduction of α-angle and MA between pre- and post-1-hour fluid resuscitation (p=0.019 and p=0.040). Normal fibrinogen with low MA, which signifies weak clot strength, may indicate platelet reduction, platelet dysfunction or both. The reduction in MA and α-angle after fluid resuscitation is an alarming finding. This contrasts with previous TEG studies, although none of them used normal saline exclusively, studied initial fluid resuscitation in emergency department settings, or studied dengue subjects. A bigger study, especially in severe dengue, is needed to validate our findings.

Agreement between the thromboelastography reaction time parameter using fresh and citrated whole blood during extracorporeal membrane oxygenation with TEG®5000 and TEG®6s
M Panigada, S De Falco, N Bottino, P Properzi, G Grasselli, A Pesenti
Fondazione IRCCS Ca' Granda Ospedale Maggiore Policlinico, Intensive Care Unit, Milano, Italy
Critical Care 2020, 24(Suppl 1):P327
The R (reaction time) parameter of kaolin-activated thromboelastography (TEG) may be used to assess the degree of heparinization of blood during ECMO. TEG analysis is usually performed on two types of samples: fresh (F) or citrated-recalcified (C) whole blood. The TEG®5000 can perform the analysis on C and F whole blood; the new TEG®6s (Haemonetics Corp., MA, USA) only on C whole blood.
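The device-agreement study that follows uses Bland-Altman analysis. A generic sketch of that computation (bias and 95% limits of agreement) on synthetic paired R times; the numbers are invented and do not reproduce the study's results.

```python
# Generic sketch of a Bland-Altman agreement analysis: bias and 95% limits
# of agreement for paired R times from two devices. Values are synthetic.
import numpy as np

rng = np.random.default_rng(5)
r_device_a = rng.normal(20, 8, size=84)                # e.g. TEG 5000, min
r_device_b = r_device_a + rng.normal(-5, 10, size=84)  # e.g. TEG 6s, min

diff = r_device_b - r_device_a
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias={bias:.1f} min, 95% limits of agreement: "
      f"{bias - loa:.1f} to {bias + loa:.1f} min")
```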
The aim of the study was to compare the response of R to heparin using the two types of samples and the two TEG devices. Methods: During a three-month period at Fondazione IRCCS Ca' Granda - Policlinico of Milan, TEG was performed (using the TEG®5000 and TEG®6s, with and without heparinase, an enzyme that degrades heparin) on 13 consecutive ECMO patients (as part of the GATRA study, NCT03208270) and on 8 consecutive non-ECMO patients in whom a TEG was requested for clinical purposes. Bland-Altman analysis and Lin's concordance correlation coefficient were used to assess agreement. Results: A total of 84 paired samples were taken (74 on-ECMO and 10 off-ECMO). ECMO patients received 19.2 (12.6-25.8) IU/kg/h of heparin. Among the non-ECMO patients, 5 did not receive any heparin, two received a very low prophylactic dose (1.6 and 2.9 IU/kg/h, respectively), and one received 13.1 IU/kg/h. Using the TEG®5000, R was -21.0 (-65.5; 23.4) min shorter on C compared with F blood in patients receiving heparin (this difference disappeared using heparinase) and only -1.58 (-8.70; 5.54) min shorter in patients not receiving heparin. R was -26.6 (-77.3; 24.2) min shorter using the TEG®6s (which performs the analysis only on C blood) than the TEG®5000 on F blood (Figure 1: agreement between TEG®6s and TEG®5000 R on citrated-recalcified and fresh whole blood). When evaluating the effect of heparin using TEG, clinicians should be aware that results obtained using citrated-recalcified and fresh whole blood are not interchangeable. Using citrated-recalcified blood to perform TEG might lead to underestimation of the effect of heparin.

Trauma patients are at high risk for venous thromboembolism (VTE). The 2002 EAST guidelines recommend low molecular weight heparin (LMWH) for VTE prevention, and anti-Xa monitoring after initiation of the medication or after adjusting doses in certain populations [1]. Studies have shown that standard enoxaparin dosing of 30 mg every 12 hours may result in low anti-Xa levels [2]. This study aims to evaluate the efficacy of a pharmacist-led protocol for adjusting enoxaparin dosing based on anti-Xa levels in trauma patients. This single-center retrospective chart review included adult trauma patients admitted from 3/1/2018 to 9/20/2019. Per protocol, patients with body mass index (BMI) ≤40 kg/m2 were initiated on enoxaparin 30 mg twice daily, and patients with BMI >40 kg/m2 were initiated on enoxaparin 40 mg twice daily. Peak anti-Xa levels were drawn 4 to 6 hours after at least the third dose of enoxaparin, with a goal therapeutic range of 0.2-0.4 IU/mL. The primary objective was time in days to goal peak anti-Xa level. Secondary objectives included VTE occurrence, bleeding attributed to LMWH, and the dosing regimens utilized. Subgroups were analyzed based on BMI. Of 511 patients identified, 375 met inclusion criteria. Median time to therapeutic anti-Xa level was 3 days (IQR 2-6). Of 337 patients with BMI ≤40 kg/m2, 319 (94.7%) were dosed initially per protocol and 153/319 (48.0%) met the goal anti-Xa level at first check (Table 1). Of 38 patients with BMI >40 kg/m2, 24 (63.1%) were dosed initially per protocol and 8/24 (33.3%) met the goal anti-Xa level at first check. Our results indicate the protocol is safe, given the lack of bleeding attributed to enoxaparin, but fewer than 50% of patients achieved the goal anti-Xa level at first check. However, despite low rates of achieving the goal anti-Xa level, VTE rates also remained low.
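The pharmacist-led protocol above is essentially a small decision rule: a BMI-based starting dose plus interpretation of a peak anti-Xa level drawn 4-6 h after at least the third dose. A sketch follows; since the abstract does not give the titration increments, the adjustment messages are deliberately generic, and the function names are hypothetical.

```python
# Sketch of the pharmacist-led decision rule described above: BMI-based
# starting dose and interpretation of the peak anti-Xa level drawn 4-6 h
# after at least the third dose. Titration increments are not given in the
# abstract, so the adjustment messages below are deliberately generic.
def initial_enoxaparin_dose(bmi_kg_m2):
    return "40 mg twice daily" if bmi_kg_m2 > 40 else "30 mg twice daily"

def interpret_peak_anti_xa(level_iu_ml, doses_given, hours_after_dose):
    if doses_given < 3 or not 4 <= hours_after_dose <= 6:
        return "level not interpretable: redraw 4-6 h after the 3rd dose"
    if level_iu_ml < 0.2:
        return "subtherapeutic: increase dose per protocol and recheck"
    if level_iu_ml > 0.4:
        return "supratherapeutic: decrease dose per protocol and recheck"
    return "within goal range (0.2-0.4 IU/mL): continue current dose"

print(initial_enoxaparin_dose(bmi_kg_m2=32))
print(interpret_peak_anti_xa(level_iu_ml=0.15, doses_given=3, hours_after_dose=5))
```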
Introduction: Most patients in the ICU are given prophylactic anticoagulation with a fixed dose of enoxaparin (Clexane), 40 mg once daily if creatinine clearance (CCT) is normal and 20 mg if CCT is low. Studies in non-ICU patients have shown that anti-factor Xa (AFXa) activity is below the desired range for venous thromboembolism (VTE) prevention. In the ICU, many factors might influence AFXa levels, including weight, creatinine clearance, shock and other medication. AFXa activity has not yet been reported in a large mixed ICU population with variable morbidity. Our study hypothesis is that enoxaparin is underdosed in most cases and that AFXa activity should be routinely monitored in all ICU patients. Preventive enoxaparin (40 mg qd) was given to all patients unless a therapeutic dose was needed or a contraindication existed. AFXa activity levels were taken 4 hours after the 3rd dose. A therapeutic VTE-preventive effect was defined as an AFXa activity of 0.2-0.5. Patient data were collected from the medical files. The study is still ongoing; preliminary results were analyzed for 31 patients. 14 of 31 patients (45%) had AFXa activity below the therapeutic range (subtherapeutic). Weight and CCT were negatively correlated with AFXa activity (Figure 1). Mean weight in the subtherapeutic AFXa group was significantly higher than in the therapeutic group (83.2 vs. 72.6, respectively; p=0.031). CCT in the subtherapeutic group was significantly higher than in the therapeutic group (95.1 vs. 60.4, respectively; p=0.003). The normal-CCT group (>50) had significantly more patients with subtherapeutic AFXa (13 vs 8, p=0.017). In our ICU, 45% of patients receive insufficient VTE prophylaxis. Overweight patients and patients with normal CCT should probably receive a higher enoxaparin dose. AFXa activity should be routinely monitored in ICU patients.

In this study we used a new bedside biomarker to test its ability to measure anticoagulation effects in patients presenting with acute first-time deep vein thrombosis (DVT). DVT requires oral anticoagulants to prevent progression to potentially fatal pulmonary embolism and recurrence. Therapeutic efficacy monitoring of direct oral anticoagulants (DOACs), including rivaroxaban, is problematic, as no reliable test is currently available. Advances in hemorheological techniques have created a functional coagulation biomarker at the gel point (GP) which allows quantitative assessment of the time to the gel point (TGP), fractal dimension (df) and elasticity (G') [1, 2]. This prospective observational cohort study measured TGP, df, G', and standard coagulation and cellular markers in first-time DVT patients at three sample points: pre-treatment, and approximately 20 and 60 days after starting rivaroxaban 15 mg BD and 20 mg OD, respectively. Strict inclusion and exclusion criteria applied. Results: 40 DVT patients (mean age 64 years [SD ±14.8]; 23 male, 17 female) and 177 non-DVT patients were well matched for age, gender and co-morbidities. Mean TGP on admission was 267 s (SD ±63.3 s) and 262.9 s (SD ±73.5 s) for DVT and non-DVT patients, respectively. DOAC therapy significantly increased TGP to 392.3 s (SD ±135.7 s) after 20 days, which subsequently increased to 395.5 s (SD ±194.2 s) at 60 days, as shown in Table 1. df, G' and standard hemostatic markers all remained within the normal range. Conclusions: TGP demonstrates its utility in determining the anticoagulant effect of rivaroxaban. The significant difference in TGP between males and females needs further exploration. Localized stasis as a result of transient provoking factors appears not to generate a systemic strength […]

[Fig. 1 (abstract P329): Correlation of anti-factor Xa activity with patient CCT and weight; AFXa activity below 0.2 (red line) was considered "non-effective prevention"]
Fig. 1 (abstract P329). Correlation of anti-factor Xa activity with patient CCT and weight. Anti-FXa activity below 0.2 (red line) was considered "non-effective prevention".
Introduction: Trauma remains the leading cause of death all over the world. To make the best use of the trauma care system, precise diagnosis of the injury site and prompt control of bleeding are essential. Here, we created a nursing protocol for the initial medical care of trauma. The aim of this study was to evaluate the impact of protocoled nursing care for trauma on measures of quality performance. This was a retrospective historical control study of consecutive severe trauma patients (Injury Severity Score >16). Patients were divided into two groups: a protocoled group (April 2017 to March 2019) and a control group (April 2014 to March 2017). The primary endpoint was mortality from bleeding. The secondary endpoints included the time from arrival to the start of CT scanning and surgery, and the administration rates of several drugs (sedatives, analgesics, preoperative antibiotics, and tranexamic acid). For the statistical analysis, continuous variables were expressed as median (interquartile range) and compared by Wilcoxon rank sum tests, given the non-normal distribution of the data. We included 152 patients in the study: 84 in the control group before the introduction of the protocol and 68 in the protocoled group. For the primary endpoint, mortality from bleeding was similar between the two groups (5% in the control group and 5% in the protocoled group). For the secondary endpoints, the time to CT initiation [Group A 11 (7-16) min vs Group B 7 (5-10) min; p<0.001] and to emergency procedures [Group A 41 (33-56) min vs Group B (29-43) min; p<0.001] were shortened by the protocol introduction. Furthermore, the administration rates of sedatives, analgesics, preoperative antibiotics, and tranexamic acid were higher in the protocoled group than in the control group. Although mortality as a patient-oriented outcome was not affected, this analysis suggests that the introduction of the nursing protocol improved the quality of medical care.
This single-institution prospective study included 72 patients with unstable pelvic ring fractures (UPRF) who were admitted to the trauma surgical intensive care unit (TSICU) and survived until discharge home between 2012 and 2017. We evaluated activities of daily living after discharge using the physical and mental component scores of the SF-36®, and defined physical dysfunction (PD) as a physical function (PF-N) score of 40 or less. We divided the patients into PD (n=34) and control (without PD, n=38) groups and compared them. The patients had experienced blunt injuries, including falls (38%) and pedestrian injuries (33%). The mean age was 59.9 years (men: 65.3%); the median injury severity score was 22 (interquartile range: 13-29); and the mean length of TSICU stay was 3.3 days. The average period from injury until the survey was 34.1 months. There was no difference between the PD group and the control group in patient characteristics, fracture type, pelvic fixation, or complications. At the time of the survey, the PD group had significantly more pain complaints than the control group (PD: 67.6%, C: 23.6%, P<0.01) and more physical and mental problems. The SF-36® subscale scores showed significant positive correlations between physical function and both bodily pain and mental health.
The percentage of patients able to return to work did not differ between the groups (PD: 26.5%, C: 47.4%). In the multivariate analysis of PD, only age (odds ratio: 1.039, 95% CI: 1.00-1.07, P=0.03) was relevant. Long-term PD was observed in 47% of patients with UPRF. It was particularly prominent in the elderly, and there was an association between pain and mental health.
… red blood cells (RBC): this can lead to inhibition of the oxygen-transport function and the development of hypoxia. Currently used methods for analyzing the state of RBC either do not have sufficient accuracy or require lengthy analysis and expensive equipment. The use of a simpler and more informative electrochemical approach to assessing the state of RBC is very promising. Electrochemical measurements in RBC suspensions (~5 × 10⁹ cells/l) were carried out in a special electrochemical cell [1] in the potentiodynamic mode over the potential range from -0.5 to +1.2 V using the IPC Pro MF potentiostat (Kronas, Russia); optical measurements were performed using an Eclipse TS100 inverted microscope (Nikon, Japan) with a CFI S Plan Fluor ELWD 60x/0.70 lens (Nikon, Japan); RBC morphology was recorded in real time using a DS-Fi1 digital camera (Nikon, Japan). When examining the RBC of patients with severe multiple trauma, a decrease in the ability of RBC to change their shape during electrochemical exposure was observed, indicating decreased deformability, which can lead to disruption of the oxygen supply to tissues. With stabilization of the patient's condition, a restoration of the ability of RBC to change morphology was detected, which in turn could have a positive effect on the rheological characteristics of the blood (Fig. 1). Analysis of red blood cells via electrochemically induced changes in their morphology can be used as an additional method for the diagnosis of critical conditions.
Fig. 1 (abstract P333). The effect of therapy on the electrochemically induced change in the morphology of red blood cells in patients with combined trauma.
Severe trauma should be treated immediately. Whole-body CT (WBCT) is widely accepted to improve the accuracy of detecting injuries; however, it remains time-consuming. We therefore focused on the scout image taken in advance of WBCT. Detecting major traumatic injuries from a single scout image would reduce the time to the start of treatment. A previous study suggested that even specialists cannot easily find chest and pelvic injuries using the WBCT scout image alone. In this study, we aimed to develop and validate deep neural network (DNN) models detecting pneumo/hemothorax and pelvic fracture from WBCT scouts. We retrospectively collected 2088 anonymized WBCT scouts together with their clinical reports at the Osaka General Medical Center between January 1, 2013, and December 31, 2017. We excluded incomplete, postoperative, and poorly depicted images, and images from patients younger than 7 years. The part of this dataset from January 1, 2017, to December 31, 2017, was used for validation and the rest for training the DNN models. A pneumo/hemothorax detection model and a pelvic fracture detection model were trained separately. Accuracy and areas under the receiver operating characteristic curve (AUCs) were used to assess the models. The training dataset for pneumo/hemothorax contained 984 images (mean age 48 years; 30% female patients), and that for pelvic fracture 783 images (48 years; 28%). The validation dataset for the former contained 258 images (54 years; 30%), and for the latter 186 images (55 years; 24%).
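Before the results that follow, a minimal sketch of how the two reported validation metrics (accuracy and AUC) are computed for a binary detector on a held-out set, using scikit-learn. The labels and predicted probabilities below are synthetic stand-ins, not the study's data.

```python
# Minimal sketch of validation metrics (accuracy and AUC) for a binary
# detector such as the pneumo/hemothorax model; labels and predicted
# probabilities below are synthetic placeholders.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])                   # ground-truth labels
y_prob = np.array([0.2, 0.4, 0.6, 0.3, 0.1, 0.8, 0.5, 0.7])   # model output scores

y_pred = (y_prob >= 0.5).astype(int)                           # threshold at 0.5
print("accuracy:", accuracy_score(y_true, y_pred))
print("AUC:", roc_auc_score(y_true, y_prob))
```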
The models achieved 59% accuracy and an AUC of 0.57 for detecting pneumo/hemothorax, and 72% and 0.62 for pelvic fracture. Our results show that DNN models can potentially identify pneumo/hemothorax and pelvic fracture from WBCT scouts. With a larger number of samples, DNN models might accurately detect severe traumatic injuries using the WBCT scout image alone.
A Clinical Information System (CIS) is a computer system used for collecting, processing, and presenting data for patient care. It can reduce staff workload and errors; help in monitoring the quality of care; track staff compliance with care bundles; and provide data for research purposes. However, the transition from paper records to electronic records involves changes in every kind of workflow in the ICU. Therefore, an effective, efficient and evaluable rollout plan was required to minimize the risks that might arise from the new practice. Methods: 1. Small-group training was provided, and a working station with different case scenarios was set up for practice. 2. Individual tutorials were conducted to clarify questions; emphasis on patient care was always the top priority. 3. Contingency plans were available in case of server breakdown and power failure, and downtime drills were conducted to prepare staff for emergency situations. 4. A step-by-step transition from paper to electronic records was gradually carried out, with a plan discussed within the CIS team with clear dates and goals. 5. New items in the CIS were first reviewed and amended in team meetings until consensus was reached, and were then promulgated to all staff during handover before implementation. 6. Staff compliance and outcomes were then monitored; further review and amendment would follow if necessary. The CIS rollout plan went smoothly. All staff were able to integrate the CIS into their daily routine. The contingency plans were well acknowledged, and new items were introduced as planned. Ongoing enhancement of the CIS was put forward for nursing orders, handover summaries, and integration with the Inpatient Medication Order Entry (IPMOE) system. With the emerging benefits the CIS brings, our staff have more time to devote to direct patient care. Human input in data interpretation and clinical judgment, on top of the CIS, plays an irreplaceable role in patient care.
The daily request for laboratory tests in intensive care units is common practice. Although common, this strategy is not supported by evidence, since more than 50% of the exams requested on this basis may be within the normal range [1]. Misleading results may prompt inappropriate decisions, and repeated sampling contributes to anemia, delirium and unnecessary costs [2]. We developed a strategy to reduce laboratory tests requested without clinical rationale. Observational retrospective study, from July 2018 to June 2019. The number and type of laboratory orders, the epidemiological profile of hospitalized patients, the use of advanced support, the average length of ICU stay and the impact on outcomes such as mortality and hospital discharge at a private tertiary general hospital in the city of Rio de Janeiro, Brazil were analyzed. A strategy was implemented to reduce requests for exams considered unnecessary. Approximately 1,300 patients passed through the ICU during this period. The epidemiological profile and severity of patients admitted to the unit were similar to those observed historically.
There was a significant reduction (>50%) in laboratory test requests, with no negative impact on outcomes such as mortality or mean length of stay, and no greater use of invasive resources. Over the period evaluated, the estimated savings from reducing unnecessary exams were approximately $150,000 per year. The rational use of resources in the ICU should be increasingly prioritized and the routine request for laboratory tests reviewed. A strategy that avoids such waste, when properly implemented, enables proper care, reducing costs and ensuring quality without compromising safety.
Evaluating the medication reconciliation errors in 2 ICUs after implementing a hospital-wide integrated electronic health record system. A Rosillette, R Shulman, Y Jani. University College Hospital, Centre for Medicines Optimisation Research and Education, London, United Kingdom. Critical Care 2020, 24(Suppl 1):P337
Introduction: Medication errors in the Intensive Care Unit (ICU) are frequent [1] and can arise from a number of causes, including transitions of care. Our aim was to investigate the impact of an integrated Electronic Health Record System (EHRS) on medication reconciliation (MR) errors occurring at 2 critical steps: the transition from the ICU to the hospital ward, and from the ward to hospital discharge. The objective was to examine the influence of ICU admission on long-term medication. We performed a monocentric study in 2 ICUs of a university-affiliated hospital, using drug chart and medical note review to identify MR errors before, during and after ICU admission. Data were collected retrospectively from the EHRS for 50 consecutive patients discharged from the ICU between 1 June and 31 July 2019 who were newly initiated on specific drugs of interest. Results: 129 drugs of interest were initiated in the ICU. Many of these were continued after hospital discharge, as shown in Table 1. All antipsychotics newly initiated in the ICU were appropriately discontinued. Other than for anticoagulants, no reason was documented for continuation of the initiated drugs. Planned durations were documented more often after hospital discharge than after ICU discharge for the following drug classes (% of patients with a plan after ICU discharge to the ward; % after home discharge): antibiotics (47.6%; 52.6%) and steroids (45.4%; 57.1%), but less so for analgesics (36.3%; 14.3%), insomnia drugs (0.0%; 20.0%), and gastroprotective drugs (0.0%; 20.0%). Our study has shown that medications initiated in the ICU can be inadvertently continued at ICU and hospital discharge because the indication or duration is not documented. Systems are required to deprescribe ICU-only drugs at discharge or to communicate a plan for ongoing treatment.
Introduction: The Surviving Sepsis Campaign advocates the use of care bundles to guide the management of sepsis and septic shock [1]. Our aim was to assess compliance with a locally introduced sepsis pathway and to review intensive care unit admission outcomes. We carried out a prospective audit of patients admitted to the ICU at the Royal Surrey County Hospital with a diagnosis of sepsis between 19/3/19 and 19/11/19, assessing compliance with local sepsis bundle delivery, the outcome of ICU admission and the degree of associated organ dysfunction. Results: 119 patients were identified, 71 male (59.7%), with a mean age of 65.7 (range 18-96). The mean first-24-hour SOFA score on the ICU was 6.65 (2-15).
81% of patients required vasopressors, with 67% requiring noradrenaline >0.1 mcg/kg/min and 19% requiring an additional vasopressor/inotrope. 36% required NIV, 32% invasive ventilation and 15% RRT. ICU mortality was 15%, in-hospital mortality 24%, mean ICU stay 8 days (1-49), and mean length of hospital stay 28 days. In the presence of septic shock with post-resuscitation lactate >4, mortality was 47%, versus 21% in patients with no vasopressor requirement or lactate <2 (p<0.05). The sepsis bundle was delivered within one hour to 61 patients (51%). Where the bundle was not completed, antibiotics were delayed in 26% of cases and blood cultures were not taken in 66%. Where the bundle was fully delivered, unit mortality was 12% vs. 21% where it was not (p<0.05), but there was no significant difference in hospital mortality (26% vs. 30%, p>0.5) or in rates of vasopressor requirement, NIV, IPPV or RRT. There is room for improvement in the timely delivery of the sepsis bundle in our hospital, and various measures are being instituted. Though there was no significant difference in hospital mortality, ICU mortality was significantly lower when the bundle was fully delivered.
The Surviving Sepsis Campaign recommends 3-hour and 6-hour resuscitation bundles for sepsis. This study was done to assess the feasibility of the guideline and compliance with Sepsis-3 recommendations in an emergency department. A prospective interventional study was conducted over one year. All sepsis cases with a qSOFA ≥2 were included. A composite of six components was assessed: measurement of serum lactate, obtaining blood cultures before antibiotic administration, and provision of broad-spectrum antibiotics before the end of H3; provision of a fluid bolus in hypotension, attainment of the target central venous pressure assessed by cardiac ultrasonography, and return of lactate to normal before the end of H6. The time baseline was first medical contact at the triage zone. Secondary outcomes were the mortality rate and length of stay in the intensive care unit (ICU). 128 patients were included (mean age 54±17 years, sex ratio 2.3). Pulmonary infections were the main cause of sepsis (37%), followed by urinary tract infections (22%). At H3, the components were achieved in 79% of cases [lactate (100%), blood cultures (82%) and provision of antibiotics (79%)]. At H6, the components were executed in 64% of cases (fluid provision in 89%, ultrasonographic assessment in 64% and normal lactate target achieved in 71%) (Figure 1). The reliability-adjusted rate of completion of the 3-hour and 6-hour bundle was 68%. Patients compliant with the composite bundle had a mortality benefit (odds ratio 0.31, 95% confidence interval 0.11-0.72). The study, however, did not show any benefit in mean ICU length of stay. Feasibility of the 3-6h bundle was 68%. The study showed a significant improvement in adherence and a mortality benefit, without reducing mean hospital/ICU length of stay. More adapted procedures are needed to improve results, targeting full compliance with the 3-6h sepsis bundle management.
Fig. 1 (abstract P339). Sepsis 3-6h bundle components (% of goals achieved).
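A minimal sketch of scoring the composite 3h/6h bundle described above: a case counts as compliant only if all six components were achieved. The component names mirror the abstract; the field names and data structure are illustrative assumptions.

```python
# Minimal sketch of scoring the composite 3h/6h sepsis bundle described
# above; component names mirror the abstract, field names are assumptions.

H3_ITEMS = ("lactate_measured", "cultures_before_abx", "abx_within_3h")
H6_ITEMS = ("fluid_bolus_if_hypotensive", "cvp_assessed_by_echo", "lactate_normalised")

def bundle_complete(case: dict) -> bool:
    """A case is compliant only if every H3 and H6 component was achieved."""
    return all(case.get(item, False) for item in H3_ITEMS + H6_ITEMS)

cases = [
    {k: True for k in H3_ITEMS + H6_ITEMS},                       # fully compliant
    {**{k: True for k in H3_ITEMS}, "lactate_normalised": False}, # partial only
]
rate = sum(bundle_complete(c) for c in cases) / len(cases)
print(f"composite compliance: {rate:.0%}")   # -> 50%
```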
Patterns and outcome of critical care admissions with sepsis in a resource limited setting. M Edirisooriya Maddumage 1, Y Gunasekara 2, D Priyankara 1. 1 National Hospital of Sri Lanka, Medical Intensive Care Unit, Colombo 10, Sri Lanka; 2 Sri Jayawardenepura General Hospital, Department of Critical Care, Nugegoda, Sri Lanka. Critical Care 2020, 24(Suppl 1):P340
Introduction: The paucity of epidemiological data is a major barrier to the expansion of critical care services, especially in resource-limited settings. We evaluated the patterns and outcome of critically ill patients with sepsis admitted to a level 3 medical intensive care unit in Sri Lanka. A retrospective cohort study was performed to describe the characteristics and outcome of patients with sepsis admitted to a medical intensive care unit. Sepsis was defined according to the Sepsis-3 definition. We examined 360 critically ill patients admitted over a period of 6 months. Sepsis was the commonest presentation, accounting for 63.6% of all admissions. Mean age was 49.2 ± 16.1 years. Septic shock was present in 41.4% on admission. Pneumonia (35.4%) was the commonest cause, while leptospirosis (17.9%) and meningoencephalitis (12.2%) were the second and third commonest causes of sepsis, respectively. The SOFA score on admission (8.8 ± 5.0 vs 7.5 ± 4.9, P<0.025), the occurrence of AKI (62% vs 49.6%, p<0.02) and the length of ICU stay (8.8 days vs 6.2 days, p<0.001) were significantly higher in patients with sepsis than in those without. ICU mortality in sepsis (n=92) did not differ significantly from mortality (n=45) in those without sepsis (40% vs 35%, p=0.5). Patients with leptospirosis had a mean SOFA score of 13.3; however, their mortality (37.5% vs 40%, p=0.75) was similar to that of others with sepsis. In contrast, sepsis-related mortality was significantly higher (60%, p<0.02) in the background of immunosuppression (n=35). Respiratory failure secondary to pneumonia was the commonest cause of critical care admission with sepsis. Sepsis-related ICU mortality was high in the background of immunosuppression.
Introduction: At our trust, training in the placement, and the subsequent safe confirmation of position, of a nasogastric (NG) tube relies on clinicians completing an e-Learning module. Feeding through an incorrectly placed NG tube is a 'Never Event' associated with significant morbidity and mortality [1]. Analysis of these incidents reveals that misinterpretation of chest radiographs by medical staff who had not received competency-based training is the most frequent cause [2]. e-Learning has revolutionized the delivery of medical education [3]; however, there are barriers to its use [4]. We hypothesized that, by taking e-Learning content and delivering it face-to-face, we would improve training rates, and thus patient safety. A questionnaire was completed by 50 critical care doctors concerning their knowledge of the existence of the e-Learning module, whether they had completed formal training in NG tube placement, and how confident they were in confirming correct positioning, on a 5-point Likert scale. All clinicians then underwent training in the interpretation of NG placement using chest radiographs. After the session they were asked to re-appraise how confident they felt. Results were compared using paired t-tests. Confidence improved in all, rising from a pre-test average score of 3.74 (SD=0.92) to 4.76 (SD=0.48) post-session, p<0.0001.
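A minimal sketch of the paired t-test used for the pre- versus post-session comparison above, using SciPy. The Likert scores below are synthetic placeholders, not the audit's data.

```python
# Minimal sketch of the paired t-test used to compare pre- and post-session
# confidence scores; the Likert data below are synthetic placeholders.
from scipy import stats

pre  = [3, 4, 4, 3, 5, 3, 4, 4, 3, 4]   # pre-session confidence (1-5 Likert)
post = [5, 5, 5, 4, 5, 4, 5, 5, 4, 5]   # post-session confidence, same doctors

t, p = stats.ttest_rel(post, pre)        # paired samples: same subjects twice
print(f"t = {t:.2f}, p = {p:.4f}")
```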
Prior to the intervention, 63% of the doctors were aware of the trust guidelines, but only 17% had completed the training. After the session, 100% were aware of the guidelines and 100% had completed the training (Figure 1). Conclusions: e-Learning is a useful tool, but it has limitations. By delivering e-Learning content through more traditional teaching methods, we improved the number of appropriately trained clinicians, and thus the safe use of NG tubes in our unit.
A systematic review of anticoagulation strategies for patients with atrial fibrillation in critical care. A Nelson, B Johnston, A Waite, I Welters, G Lemma. University of Liverpool, Liverpool, United Kingdom. Critical Care 2020, 24(Suppl 1):P342
There is a paucity of data assessing the impact on clinical outcomes of anticoagulation strategies for atrial fibrillation (AF) in the critical care population. This review assesses the existing literature to evaluate the effectiveness of anticoagulation strategies used for atrial fibrillation in critical care. Only 4 studies contained analysable data. Anticoagulated patients had lower mortality at 30 days and 365 days after admission to critical care; however, there was an increased incidence of major bleeding events compared with the non-anticoagulated population. Thromboembolic events were comparable in both cohorts. Data from the current literature are scarce, and inferences regarding the effectiveness of anticoagulation in critical care patients with AF require further investigation and research.
Every new admission to the ICU prompts a handover from the referring department to the ICU staff. This step in the patient pathway provides an opportunity for information to be lost and for patient care to be compromised. Mortality rates in intensive care have fallen over the last twenty years; however, 20% of patients admitted to an ICU will die during their admission [1]. Communication errors contribute to approximately two-thirds of notable clinical incidents, and over half of these are related to a handover [2]. NICE has concluded that structured handovers can result in reduced mortality, reduced length of hospital stay and improvements in senior clinical staff and nurse satisfaction [3]. A checklist was created to review the information shared and to score the handover. It was created with doctors and nurses and is relevant for handovers between all staff members. Information was gathered prospectively by directly observing 17 handovers on the ICU. There is a notable discrepancy in the quality of handovers of new patients (Figure 1). This is true of handovers between doctors, nurses and a combination of the two, and of all staff grades. Whilst a doctor may have reviewed the patient prior to their arrival, 41% (n=7) of patients were not handed over to a doctor. The most commonly missed pieces of information were the patient's weight (96%, n=16), their height (100%, n=17), whether the patient had previously been admitted to an ICU (78%, n=15) and whether the patient had any allergies (71%, n=12). The handover of new patients to the ICU is often unstructured and important information is missed. This is true for all staff members and grades, and for handovers from all hospital departments.
Fig. 1 (abstract P343). Each handover was scored according to the information accurately given to ICU staff.
Post intensive care syndrome-family (PICS-F) describes new or worsening psychological distress in family members and caregivers after critical illness, but remains poorly studied within specialist groups [1].
We aimed to define the degree of PICS-F within our tertiary referral cardiothoracic centre and to map change over the course of 12 months. Caregivers attended a 5-week multi-professional clinic alongside patients. Peer support was facilitated through a café area, and a caregiver group psychology session was offered, with individual appointments if required. Caregiver surveys were completed, including the caregiver strain index, the hospital anxiety and depression scale (HADS) and the insomnia severity index. Patients also completed HADS questionnaires. Repeat surveys were completed at 3 and 12 months. Results: Over 5 cohorts, 23 caregivers attended, of whom 17 were spouses (74%), 3 children (13%) and 3 others (13%), with 13 caregivers completing surveys at 12 months. Patients' median APACHE 2 score was 17 (IQR 14-18.5) and median ICU length of stay was 11 days (IQR 8-23.5). Most admissions followed scheduled operations (56%). Severe caregiver strain was present in 4/23 (17%), with changes to personal plans (35%) the most common subcategory. HADS identified 13 caregivers (57%) with anxiety and 8 (35%) with depression. Caregiver anxiety exceeded that of patients, only reaching similar levels at 12 months, while depression remained static (Figure 1). The median number of nights with 'bothered' sleep was 5 (IQR 1-6.75), and 77% of caregivers reported problems with sleep. Conclusions: The significant psychological morbidity in caregivers at our tertiary cardiothoracic centre is in keeping with the general ICU population [2]. Caregiver strain was reduced, suggesting higher levels of resilience. Future work should address mental wellbeing, particularly anxiety, to minimise the effects of PICS-F.
Fig. 1 (abstract P344). Hospital anxiety and depression scale (HADS) scores for patients and caregivers at baseline, 3 months, and 12 months.
Burnout syndrome is an illness that increasingly affects health professionals. It is characterized by great emotional stress, physical and mental exhaustion and depersonalization of the individual. More serious cases can lead to job loss or even suicide. The work described here identifies the burnout level of the multidisciplinary team through a specific questionnaire. Methods: A questionnaire suited to the multidisciplinary group was administered in November 2019. It was answered by 85 professionals from the medical and nursing teams; respondents were not identified. Analysis of the results showed that 50% of the group had incipient burnout, 30% had the established syndrome, and about 5% had features of greater severity. The main factors found were mental and physical exhaustion during the working day, the level of responsibility involved in the activity, and the perception that remuneration is disproportionate to the work performed. All interviewees presented some degree of burnout or a high risk of developing it. The most severe cases should be followed up by occupational medicine, and anti-stress measures, with reorganization of work, should be discussed in order to reduce the prevalence of this syndrome.
Introduction: Burnout affecting the psychological and physical state of healthcare workers has been recognized over the last 10 years, and burnout has been shown to affect the quality of care. Whilst some risk factors have been identified, there are gaps in the literature relating to mental health and burnout. The aim of this study was to measure levels of burnout across 3 ICUs in the metropolitan setting. To determine the level of burnout we used 2 surveys: the Maslach Burnout Inventory human services survey (MBI-hss) and the Center for Epidemiologic Studies Depression Scale (CES-D). With the MBI-hss we analysed 3 different variables of burnout: exhaustion, cynicism and emotional exhaustion. Basic demographic data and information regarding work schedules were collected. We studied prevalence and contributing risk factors by analysing the outcomes of the 2 self-scoring questionnaires, using descriptive statistical analysis. There were 90 respondents; 36% met the threshold for depressive symptoms on the CES-D depression scale. Interestingly, 40% (CI 25.4-57.7%) of those meeting the threshold for depressive symptoms reported frequent restless sleep, compared with 11% (CI 4.6-21.8%) of those not meeting it. Gender did not affect depressive symptoms: 35% of females and 37% of males met the threshold. On the MBI-hss, the mean exhaustion score was 17.16 (SD 4.6), a high level; the mean cynicism score was 14.08 (SD 4.2), also considered high; and the mean emotional exhaustion score was 25.16 (SD 9.90), a moderate level. There was a high prevalence of burnout in the ICU across all categories, as well as of depressive symptoms. Age and gender had no effect on burnout. Interestingly, we identified that sleep and shift variables were linked to increased burnout.
Following the implementation of a fully integrated EHRS on 31 March 2019 at our university-affiliated hospital, we conducted a prospective study in 2 ICUs by analysing pharmacists' contributions during 4 data collection periods of 5 days at 10, 15, 19 and 21 weeks post implementation. A pharmacist's contribution was defined as contacting the physician to make a recommendation for a change in therapy/monitoring [1]. The 3 types of contribution were: a medication error (rectification of an error in the medication process); an optimization (a proactive contribution that sought to enhance patient care); and a consult (a reactive intervention in response to a request). A panel of experts composed of a senior pharmacist, a consultant, a nurse and a pharmacy student assessed the impact of each contribution, scoring it as low, moderate or high impact. There were 160 pharmacist contributions recorded over the 4 periods. Of these, 66 (41.2%) were medication errors, 93 (58.1%) were optimizations, and 1 (0.6%) was a consult (Table 1). 37% of the contributions were assessed as having medium impact, 36% high impact and 27% low impact. In general, the consultant assessed fewer contributions as having high impact than the other members of the panel, with 7 contributions assessed as high impact by the consultant versus 97 by the senior pharmacist. Implementing an EHRS in combination with the contributions of clinical pharmacists can prevent medication-related issues.
Interestingly, the types of incident did not change over time.
Introduction: Most ICUs are noisy, and this may adversely affect patient outcomes and staff performance [1]. The WHO recommends that the noise level in hospitals should not exceed 35 dB during the day and 30 dB at night. The aim of this study was to evaluate noise levels in an intensive care unit, to provide awareness training to intensive care staff regarding noise, and to compare noise levels before and after the training. Noise was measured at 17 points, including 12 patient bedsides, the nurse desk, the staff desk, the wareroom, the corridor and the entrance of the intensive care unit. Measurements were performed 14 times per day. After 10 days, awareness training on the harmful effects of noise was given to staff. After the training, noise measurements were repeated for a further 10 days; after a total of 20 days the measurements were terminated. Noise was measured with an incubator analyzer (FLUKE, model Bio-tek, serial no. 6050274). The mean noise values before and after the training were not statistically different (p>0.05). When measurement times were compared, noise levels between 10:00 and 16:00 were statistically higher than at other times, both before and after the training (p=0.001). When the 17 measurement areas were compared, there was no statistically significant difference in noise level (p>0.05). Differences were also examined at the same hours before and after training: contrary to expectations, noise levels were statistically higher after training (p<0.05). All noise measurements exceeded the thresholds recommended by the WHO. Increased noise levels in critical care units may lead to harmful health effects for both patients and staff. Our results suggest that much of the noise in the ICU is largely attributable to environmental factors, and that behavior modification through education did not have a meaningful effect.
Critical care medicine focuses on continuous, multidisciplinary care for patients with organ insufficiency in the face of life-threatening illness. Despite significant resource limitations, low-income countries carry a huge burden of critical illness. The available data are insufficient to clearly show the burden and outcomes of intensive care units in these developing countries [1]. The objective of our study was to evaluate the morbidity and outcomes of patients admitted to the intensive care unit of a tertiary university hospital in Hawassa, Ethiopia. This was a prospective observational study. Data were registered and analysed from patient admission to discharge over a 12-month period beginning September 2018. Data regarding demographics, sources of admission, diagnosis, length of stay and outcomes were analysed. The total number of patients admitted to the ICU was 218, with 71 patients dying over the one-year period. The largest source of admissions was the emergency medical unit (36%) and the smallest the pediatrics department (8%). Of those admitted, 69.8% were male. The mean age was 27 years (2-62). The most frequent causes of morbidity in the admitted patients were traumatic brain injury (24.6%), acute respiratory distress syndrome (22.9%) and seizure disorder (8%). Median length of stay was 3.0 days (interquartile range: 1.0-27.0). The overall mortality rate was 32.5%.
The top four causes of death in the ICU were respiratory illness (24%), followed by sepsis with multiorgan failure (20%), trauma (16%) and central nervous system infection (12%). Infection-related morbidity and mortality remain very high and call for the institution of aggressive preventive strategies. The increasing frequency of trauma admissions needs due attention. Sepsis causes a high number of deaths, though it is overtaken by respiratory illnesses. Improving the overall ICU system may achieve better outcomes in resource-limited countries.
Introduction: ICU mortality has been widely studied in the literature in relation to outcome indices that primarily reflect organ failure [1]. However, early mortality, within the first 48 hours of admission, has been little documented. The aim of this study was to analyze factors related to early mortality in the ICU. Retrospective study at a second-level hospital over a 12-month period. Patients who died in the ICU were included and classified according to the timing of death, identifying those who died within the first 48 hours of ICU admission. The variables analyzed were age, sex, comorbidity, Charlson index, APACHE II, need for supportive treatments, most frequent admission diagnoses, origin, and decisions to limit supportive treatment. The statistical analysis was carried out using SPSS. 138 patients were included during the study period; 72 (52.2%) died within the first 48 hours of admission. No differences in the need for supportive treatments were observed: more than 90% of patients received mechanical ventilation and vasoactive therapy. Table 1 shows the characteristics of the patients. Half of ICU deaths occurred within the first 48 hours of admission. Severity at ICU admission was the main factor related to early mortality. Severe stroke and coronary disease were the most frequent causes of early death in the ICU.
In August 2018 the Royal College of Anaesthetists published guidelines on the care of the critically ill woman in childbirth and enhanced maternal care [1]. Approximately 11,000 babies are born across the area covered by Leicester University Hospitals, which includes two large maternity units and is part of the UK ECMO network. This audit set out to assess current practice and form a basis for future planning, and is likely to be representative of most major obstetric centres. A retrospective audit of all patients admitted to intensive care units in Leicester over the 12-month period following publication of the guidelines. The focus was on patients admitted to general adult intensive care, excluding all patients cared for in 'enhanced obstetric care' units. 9 simple standards were proposed relating to accessibility, resuscitation, follow-up and multidisciplinary learning. In total, 49 women were identified, with a broad range of diagnoses. Intensive care services are split across 3 hospitals, and we found this led to a number of problems. Trained staff to resuscitate a newborn were easily accessible, but no centre had taken steps to provide the necessary equipment pre-emptively. None of our critical care units had a plan for perimortem caesarean section. Ongoing review by the obstetric and midwifery teams was very variable. Contact with the infant and breastfeeding support were also poor. Despite the large number of deliveries, significant work needs to be done to come into line with the new national guidelines for the critically ill woman in childbirth.
Clearly defined pathways around escalation of care, resuscitation of both mother and baby, integration of the care of the mother and the infant in the first few days of life, and multidisciplinary learning events are being produced de novo in response to these guidelines, some of which will be illustrated in the associated poster.
… the Interprofessional Collaboration Scale [3]. Data were analyzed with IBM SPSS 25.0. Results: Cooperative attitudes, with an average score of 49 to 75, were of average significance. Interprofessional cooperation, with an average score of 2.568, indicates that the level of cooperation is high, and quality of working life, averaging 125 to 150, is very good. As far as professional satisfaction is concerned, nurses are happy, content and satisfied with their work, despite workload and burnout. Conclusions: Interprofessional cooperation in the ICU of the General Hospital of Larissa is high, but satisfaction with wages, resources, working environment and conditions is low. In addition, the results showed that improvement in communication between hospital staff has a positive impact on the quality of professional life (Table 1).
Contrasting with previous reports, decreased admissions per unit population in the older and oldest age groups, and in those with high comorbidity, suggest that resource constraints may have influenced admission discussion and decision-making over the 10-year study period in Wales. Further investigation is warranted.
ICU discharge into weekends and public holidays: an observational study of mortality. N Mawhood, T Campbell, S Hollis-Smith, K Rooney. Bristol Royal Infirmary, General Intensive Care Unit, Bristol, United Kingdom. Critical Care 2020, 24(Suppl 1):P355
Introduction: Up to a third of in-hospital deaths in ICU patients occur following ward stepdown [1]. Discharge time seems to be associated with in-hospital prognosis, but meta-analyses have not shown a difference between weekday and weekend discharge [2, 3]. However, papers that examined discharge 'into' out-of-hours days, particularly on Fridays, have found differences [3]. Our aim was to assess whether discharge from the ICU 'into' out-of-hours days (OOH: weekends and public holidays) is associated with in-hospital mortality or re-admission to the ICU, and whether these patients were seen on the wards OOH by medical staff. All adults discharged from the general ICU to a ward at the Bristol Royal Infirmary in December 2016-18 were included. In-hospital mortality rates were assessed for each day, with 'into weekdays' defined as Sunday to Thursday and 'into OOH' as Friday, Saturday and the day before a public holiday. A subset of patients with data on re-admission to the ICU was also examined. All available notes from patients discharged into OOH in 2018 were reviewed. The study included 1732 patients, with a subset of 443 with readmission data. 117 sets of notes were reviewed from patients discharged into OOH (Figure 1). In-hospital mortality was significantly higher in patients discharged into OOH (5.1% vs 7.6%, P=0.012). Within the subset, OOH discharge was associated with in-hospital mortality or readmission to the ICU (6.8% vs 11.9%, P=0.044), though the readmission rate alone was not (1.7% vs 2%, P=0.35). Of patients discharged into OOH, once on a ward 64% were reviewed by a specialty doctor but 20.5% were not seen. This is the first study to examine ICU discharge 'into' OOH days including public holidays.
We found increased hospital mortality with OOH discharge, similar to other studies [3]. Up to a fifth of high-risk ICU stepdown patients were not reviewed by a doctor on OOH days.
Exploring the experiences of potential donors' family members (FM) in a follow-up clinic is crucial to analyze the effects of organ procurement (OP) on the bereavement process, to gain insight into the reasons for family refusals (FR), and to improve family care during OP. A mixed-methods study involving FM at 3 and 12 months after the patient's death was developed and approved by the local ethics committee. FM of potential donors after brain (DBD) and cardiac death (DCD) treated in Careggi Teaching Hospital, Florence (Italy) were eligible if adult and consenting. Invitation letters were sent to those entitled 2 months after the death, and those who actively responded were involved in an encounter with a multidisciplinary group including a clinical psychologist, two nurses and two cultural anthropologists with expertise in OP.
Organ replacement procedures such as ECMO (extracorporeal membrane oxygenation), LVAD (left ventricular assist device) and dialysis are routinely used to treat multi-organ failure (MOV). Globally, transplantation programs struggle with increasing organ shortage. Patients (pts) with MOV are a potential source for procurement; however, outcome data after kidney transplantation (KTX) from such donors are sparse. We retrospectively studied cadaveric KTX at the Charité Berlin in 2018 and identified donors with ongoing organ replacement procedures. Donor and recipient risk factors were assessed. Overall patient and graft outcomes were analyzed at 12 months post-transplant. A total of 220 kidneys were transplanted. We identified 11 KTX from 7 donors with MOV (6 following cardiopulmonary resuscitation, 5 with acute renal failure, 4 of them on dialysis) (Figure 1). In 3 donors, a veno-arterial ECMO was implanted during ECLS resuscitation. One donor needed a veno-venous ECMO due to ARDS, and 1 donor had an LVAD implanted due to cardiac failure. The donor age was 41 ± 10.5 years. In addition, 6 donors had at least one cardiac risk factor. The kidney donor risk index averaged 0.94 (SD ± 0.14) and S-creatinine prior to KTX was 2.41 (SD ± 1.27).
One way to expand the potential donor pool is donation after circulatory death (DCD), and a strategy to reduce the complications related to the ischemic time is the use of normothermic regional perfusion (NRP) with extracorporeal membrane oxygenation (ECMO) [1, 2]. We compared standard NRP with the addition of an adsorption system for inflammatory mediators (CytoSorb®) during the normothermic regional reperfusion phase on regional ECMO, which is intended to reduce cellular oxidative damage, assessed as a reduction in the levels of proinflammatory substances. We report a case series of 9 DCD Maastricht category IIIA donors treated with NRP on ECMO to maintain circulation before organ retrieval, in association with CytoSorb® in 5 patients. During perfusion, from the start of NRP (T0), blood samples were collected 3 times, every 60 minutes (T1, T2, T3). During treatment with CytoSorb®, lactate levels progressively decreased, and AST and ALT increased less than without CytoSorb®, as a sign of improved organ perfusion (Figure 1). NRP with CytoSorb® might help to limit irreversible organ damage and improve transplantation outcomes [2]. Development and implementation of uniform guidelines will be necessary to guarantee the clinical use of these donor pools.
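A minimal sketch of the kind of per-timepoint summary behind the marker trends described above (e.g., lactate at T0-T3 for donors perfused with versus without the adsorber). All values are synthetic placeholders, not the case-series data.

```python
# Minimal sketch of summarising a perfusion marker (e.g., lactate, mmol/l)
# at T0-T3 for donors treated with vs without the adsorber; values synthetic.
from statistics import mean

lactate = {
    "cytosorb":    [[6.0, 4.8, 3.5, 2.6], [5.4, 4.1, 3.0, 2.2]],  # per donor, T0..T3
    "no_cytosorb": [[5.8, 5.5, 5.1, 4.9], [6.2, 6.0, 5.7, 5.6]],
}

for group, donors in lactate.items():
    # zip(*donors) regroups the per-donor series into per-timepoint columns
    per_timepoint = [mean(vals) for vals in zip(*donors)]
    print(group, [round(v, 1) for v in per_timepoint])
```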
Introduction: Shock is a common complication of critical illness in intensive care unit (ICU) patients undergoing major surgery, and it is the most common cause of death in postsurgical ICUs. There are several ICU scoring systems for predicting the likelihood of mortality, such as APACHE or SOFA; nevertheless, they are used inconsistently, as they also depend on the reliability of physicians' predictions. In this sense, gene expression signatures can be used to evaluate the survival of patients with postsurgical shock. Methods: mRNA levels in the discovery cohort were evaluated by microarray to select the most differentially expressed genes (DEGs) between the groups of patients who did and did not survive 30 days after their operation. Selected DEGs were evaluated by quantitative real-time polymerase chain reaction (qPCR) in the validation cohort to determine the reliability of the expression data and to compare their predictive capacity with that of established risk scales.
Introduction: This study evaluates the prognostic ability of frailty and comorbidity scores in patients with septic shock. The 90-day mortality rates of individual medical conditions are also compared. The burden of comorbid illness and frailty is increasing in the critical care patient population [1]. Outcomes from septic shock in patients with chronic ill-health are poorly understood.
Interstitial lung disease is a group of diseases associated with poor prognosis in the intensive care unit, despite major improvements in respiratory care in the last decade. The aims of our study were to assess factors associated with hospital mortality in interstitial lung disease patients admitted to the intensive care unit and to investigate the long-term outcome of these patients. We performed a retrospective study in the intensive care unit of a teaching hospital highly specialized in interstitial lung disease management between 2000 and 2014. A total of 196 interstitial lung disease patients were admitted to the intensive care unit during the study period. Overall hospital mortality was 55%. Two years after intensive care unit admission, 70/196 patients were still alive (36%). One hundred eight patients (55%) required invasive mechanical ventilation, of whom 80% died in hospital (Figure 1). Acute exacerbation of interstitial lung disease was associated with hospital mortality (OR 5.4 [1.9-15.5]), especially in the case of acute exacerbation of idiopathic pulmonary fibrosis. Multiorgan failure (invasive mechanical ventilation with vasopressor infusion and/or renal replacement therapy) was associated with very high hospital mortality (64/66; 97%). Survival after an intensive care unit stay is good enough that patients with interstitial lung disease should not be denied invasive mechanical ventilation, except in the case of acute exacerbation in idiopathic pulmonary fibrosis. If urgent lung transplantation and extracorporeal membrane oxygenation are ruled out, multiorgan failure should lead to consideration of withholding or withdrawing life support therapies.
AGI is a malfunction of the GI tract in ICU patients, associated with prolonged mechanical ventilation, enteral feeding failure and a high mortality risk. The WGAP of ESICM proposed a grading system for AGI with four grades of severity: AGI grade I, a self-limiting condition; AGI grade II (GI dysfunction), in which interventions are required to restore GI function; AGI grade III (GI failure); and AGI grade IV, GI failure that is immediately life-threatening.
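A toy encoding of the WGAP/ESICM AGI severity grades just summarised, useful for seeing how the grades order and partition. The short descriptions paraphrase the abstract, not the full guideline; the helper function is an illustrative assumption.

```python
# Toy encoding of the WGAP/ESICM AGI severity grades summarised above;
# descriptions paraphrase the abstract, not the full guideline.
from enum import IntEnum

class AGIGrade(IntEnum):
    I   = 1  # self-limiting condition
    II  = 2  # GI dysfunction: interventions required to restore GI function
    III = 3  # GI failure
    IV  = 4  # GI failure that is immediately life-threatening

def needs_intervention(grade: AGIGrade) -> bool:
    """Grades II and above require intervention under this grading scheme."""
    return grade >= AGIGrade.II

print(needs_intervention(AGIGrade.I))    # -> False
print(needs_intervention(AGIGrade.III))  # -> True
```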
The aim was to evaluate the feasibility of using AGI grades I and II as predictors of malnutrition and 1-year mortality in critically ill patients. Methods: Single-center retrospective cohort study in a tertiary university hospital (2015-2017). AGI grade III and IV patients were excluded. Anthropometric data, GI symptoms (vomiting, diarrhea), feeding intolerance, gastric residual volumes and abdominal hypertension were recorded. Daily prescribed caloric intake was calculated using a standard protocol and daily achievement of caloric intake was recorded. The mNUTRIC score was calculated for all patients; a score ≤5 was used to diagnose malnutrition. 200 patients (59% men, mean age 65 years) who stayed in the ICU for >48 hours were included in the study. 52% were at high nutritional risk. 1-year mortality was 31%. The prevalence of AGI II was 14%. Age, gender, BMI, mortality and energy intake did not differ significantly between patients with AGI II and those with AGI I (Table 1). Logistic …
The study aimed to assess the effects of ICU admission on frailty and activities of daily living in the ≥80 population at 6 months. A prospective observational study, with the data used as a subset of the VIP-2 trial [1]. Research ethics committee approval was obtained from the Mater Misericordiae University Hospital (MMUH). Inclusion criteria were age ≥80 years and acute admission to the ICU from May to July 2018. Data were collected on 20 consecutive patients. Frailty and activities of daily living (ADL) were assessed using the Clinical Frailty Scale (CFS) and the Katz Index of Independence in Activities of Daily Living (KATZ). Results: Pre-admission frailty by CFS was present in 60% of patients, increasing to 93% at 6 months (Figure 1). 74% of survivors at 6 months had a CFS score increase of ≥1 point. Pre-frail and frail CFS patients suffered an average 2-point deterioration in their Instrumental Activities of Daily Living (IADL). By KATZ, 60% of patients were fully functional pre-admission, deteriorating to 13% at 6 months. 74% of patients declined by 1 ADL at 6 months. 60% of the deceased had been deemed fully functional initially. We demonstrate an association between an ICU admission event and enduring functional decline at 6 months. ICU admission resulted in patients acquiring on average 1.5 new IADL limitations, regardless of their initial CFS. This echoes the study by Iwashyna et al., who also showed similar deteriorations in IADL and cognitive impairment [2]. KATZ may be best used to describe functional decline; 74% of patients developed at least one new limitation. However, the CFS takes IADLs into account and thus may be more sensitive in predicting the functional outcome of an ICU event at 6 months.
Fig. 1 (abstract P365). Clinical frailty score 6-month trend.
Frailty: an independent factor in predicting length of stay for the critically ill. T Chandler, R Sarkar, A Bowman, P Hayden. Medway Maritime Hospital, Critical Care, Gillingham, United Kingdom. Critical Care 2020, 24(Suppl 1):P366
Frailty has attracted attention in the healthcare community in recent years, as it is associated with worse outcomes and increased healthcare costs [1]. Our objective was to study the impact of frailty, as recorded by the Clinical Frailty Scale (CFS), on hospital length of stay (LOS). A retrospective analysis of data from consecutively admitted critical care (CC) patients (Jan'19-Oct'19) was performed. Electronic health records were used to collect demographics, CFS and clinical outcomes. Statistical analysis was performed using STATA. Student's t-test, and simple and multiple linear regression (adjusted for age and disease severity/ICNARC score), were used for comparison between groups and to assess the group effect. We excluded extreme outliers (LOS >50 days; n=13). Frailty was defined as CFS >4.
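A minimal sketch of the adjusted analysis just described, written in Python with statsmodels rather than the authors' STATA. The data frame, its variable names and the values are synthetic placeholders; only the model form (LOS regressed on frailty, adjusted for age and severity score) follows the abstract.

```python
# Minimal sketch (Python/statsmodels rather than the authors' STATA) of a
# linear regression of LOS on frailty adjusted for age and disease severity;
# the data frame below is a synthetic placeholder.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "los_days": [8, 15, 10, 22, 6, 18, 12, 25],   # hospital length of stay
    "frail":    [0, 1, 0, 1, 0, 1, 0, 1],         # CFS > 4
    "age":      [62, 78, 55, 81, 49, 74, 66, 85],
    "icnarc":   [14, 22, 12, 27, 9, 24, 16, 30],  # severity score
})

model = smf.ols("los_days ~ frail + age + icnarc", data=df).fit()
print(model.params)   # coefficient on 'frail' = adjusted LOS difference
```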
Of the 848 patients (58% male), 554 (66%) were emergency admissions and the rest elective (Table 1). 288 (40%) were non-frail. The mean LOS was 15 ± 12 days in frail and 10 ± 9 days in non-frail patients (P<0.0001). For emergency patients, LOS was 16 (±12) and 10 (±10) days in the two groups (P<0.0001); for elective patients, LOS was 12 (±10) and 8 (±7) days for frail and non-frail respectively (P=0.01). After adjustment, LOS was significantly higher in frail patients: by 5 days (95% CI 3-7; P<0.0001), 4 days (95% CI 1-6; P=0.002) and 5 days (95% CI 4-8; P<0.0001) for the total cohort, elective and emergency admissions respectively. LOS was 6 days higher in frail than non-frail patients (P<0.001) among CC survivors. Frailty was associated with significantly increased LOS in this cohort, independent of age and illness severity. Hospital capacity planning should take this into consideration when modelling bed allocation.
Robust clinical governance requires analysis of patient outcomes during an ICU admission [1]. On one adult ICU, weekly mortality meetings are used for this purpose and aid multidisciplinary reflection on individual patient deaths. However, such reviews run the risk of being subjective and of failing to acknowledge themes that may relate to preceding or subsequent deaths. This paper describes a new mortality review process in which: a) reviews are structured using the Structured Judgement Review (SJR) framework [2]; and b) themes are generated over an extended period of time to create longitudinal learning from death. The SJR framework was developed by NHS Improvement for the new medical examiner role, looking at inpatient deaths. We adapted it to better suit the ICU, creating a novel review structure. This involves recording explicit judgement comments and using a scoring system to analyse the quality of care during the patient's stay, with a focus on elements of care delivered on the ICU. Tabulation of this information allows analysis over time, identifying trends across all patients and in specific subgroups. This framework has been rolled out at the St George's cardiothoracic ICU weekly mortality meetings. Themes that have emerged include parent-team ownership, delayed palliative care referrals and inadequate documentation of mental capacity. This will continue as part of a three-month trial and, following review of this trial, may be extended to other critical care units in the trust. This system allows greater insight into patient deaths in a longitudinal fashion and facilitates local identification of problems at an early stage in a way that is not possible within the traditional mortality review format. The nature of the process means that key areas for change can be identified as a routine part of the clinical week.
… [1]. In this study, we evaluated three distinct machine-learning methods for predicting possible patient deterioration after surgery. The data were collected retrospectively from the Catharina Hospital in Eindhoven. This dataset contained all the surgeries conducted in the hospital from 2013 to 2017. The variables in this dataset were tested on their ability to differentiate between patients with a normal recovery and patients with an unplanned ICU admission after being admitted to the ward. The dataset contained 44 variables related to the preoperative screening, the surgery or the recovery room. All variables were tested for statistical significance using univariate logistic regression (LR), from which a subset of 34 statistically significant (p<0.05) variables was created. These variables were used to train three different types of models: LR, a support vector machine (SVM) and a Bayesian network (BN). The network structure of the BN was designed using expert knowledge and the probabilities were inferred from the data. The three models were validated using five-fold cross-validation.
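A minimal sketch of five-fold cross-validated AUC estimation of the kind used to compare these models, shown for the logistic regression case with scikit-learn. The features and labels are synthetic placeholders, not the hospital dataset.

```python
# Minimal sketch of five-fold cross-validated AUC for a logistic regression,
# as used to compare the models above; the data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                                    # 5 candidate predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)  # outcome

auc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc")
print(f"AUC: {auc.mean():.2f} (fold range {auc.min():.2f}-{auc.max():.2f})")
```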
This yielded the following areas under the receiver operating characteristic curve: 0.82 (0.79-0.86) for LR, 0.82 (0.78-0.88) for SVM and 0.65 (0.63-0.68) for BN (Fig. 1). The results indicate that machine learning is a promising tool for the early prediction of patient deterioration. The BN was included because it permits incorporating clinical domain knowledge into the learning process; however, its performance was inferior to that of the LR and SVM. In future work, we will investigate alternative domain-aware methods and compare their performance with that of clinical experts.
Introduction: Intensive care unit (ICU) admission decisions for patients with a malignancy can be difficult, as clinicians have concerns about unfavourable outcomes, such as mortality [1]. A diagnosis of malignancy is associated with an almost 6-fold increased likelihood of refusal of ICU admission [1]. Recent large long-term mortality studies of patients with a malignancy admitted to the ICU are scarce. Therefore, our aim was to compare the mortality of patients with either a hematological or a solid malignancy with that of the general ICU population, all with an unplanned ICU admission. All adult patients registered in a national intensive care evaluation registry with an unplanned ICU admission from 2008 to 2017 were included. We divided these patients into 3 cohorts: cohort 1 (all patients with a hematological malignancy), cohort 2 (all patients with a solid malignancy), and cohort 3 (a general ICU population without malignancy). The primary outcome was 1-year mortality; the secondary outcomes were ICU and hospital mortality. We included 10,401 (2.2%) patients in cohort 1, 35,920 (7.6%) in cohort 2 and 423,984 (90.2%) in cohort 3 (Table 1). The 1-year mortality in cohorts 1, 2 and 3 was 60.1%, 46.2% and 28.3%, respectively (p<0.001). Age, comorbidities, organ failure, and type of admission (i.e. surgical or medical) were positively associated with 1-year mortality in all cohorts (p<0.05). One-year mortality is higher in both patients with a hematological malignancy and patients with a solid malignancy than in the general ICU population. In addition, several factors were positively associated with 1-year mortality: age, comorbidities, medical ICU admission, and organ failure. Future research should focus on predictive modelling in order to identify patients with a malignancy who may benefit from ICU admission.
Introduction: Drug abuse is associated with immunosuppression through multiple mechanisms. Despite that, the only study retrospectively reviewing drug abusers in the ICU demonstrated fewer infections and better outcomes.
We compared matched patient populations in order to fully understand whether drug abuse is a risk factor for infection and a predictor of poorer prognosis, as is perceived by most physicians. We hypothesized that drug abusers admitted to the ICU would fare as well as, or better than, non-abuser ICU patient populations. Methods: This is a prospective study done between the years 2010-2012 on the entire patient population of the Detroit Medical Center. After the drug abuse population was identified, controls were matched according to age and admitting ICU unit. Patients' charts were reviewed and data regarding baseline demographics, infectious complications and outcomes were extracted. Data were retrospectively collected for 323 drug abusers and 305 matched controls. Comorbidities and hospital admission diagnoses were significantly different between the two groups. Disease severity scores were significantly higher in the drug abusers' patient group (DAPG) on admission and during the ICU stay. The DAPG had significantly more organ failure: more need for ventilation (30.5% vs 46.4% in the DAPG; p<0.001), more ARDS (1% vs 3.7%, p=0.03), more renal failure (33% vs 45.5%, p=0.002) and more need for renal replacement therapy (6.6% vs 11.2%, p<0.05). They had a longer hospital length of stay (LOS). There was no difference in ICU or hospital mortality. Multivariable modeling did not find drug abuse to be an independent risk factor for hospital or ICU mortality (Hosp: OR = 1.37, P = 0.3397; ICU: OR = 1.43, P = 0.07), but it was a risk factor for a longer hospital LOS (ME = 1.46, P < 0.0001). Drug abuse is not an independent risk factor for mortality or ICU LOS. Drug abusers should be evaluated like other patients, based on baseline comorbidities and disease severity.

This is a small audit which, although it did not include the general ICU, still reflects the need to encourage clinicians and patients to speak freely regarding escalation plans. Medical decisions are clinician-led; however, this audit was carried out by nursing staff, as we have a duty to advocate for our patients' involvement in medical care [2].

A retrospective analysis of independent risk factors of late death in septic shock survivors C Sivakorn 1, C Permpikul 2, S Tongyoo 2 (Fig. 1). The PaP and Katz scales seem to be adequate for predicting mortality of critically ill patients admitted to a medical ICU. This finding may help in the elaboration of future ICU mortality scoring systems, as well as in more rational use of resources. However, further multicenter studies are needed to better elucidate these results.

This last group was chosen because of its experience and specific training in the field of bioethics, serving as a control or reference group. A total of 444 respondents participated in the study: 22.2% were emergency physicians, 14.8% intensivists, 11.2% emergency nursing, 6.2% ICU nursing, 24.9% resident doctors, 13.8% medical students and 6.9% other professions. We observed variability in the responses not only between different groups of professionals but even within the same group, reflecting the difficulty of decision making. Variability was observed regarding decisions in end-of-life ethics conflicts. A high degree of similarity with the Master in Bioethics group was observed in the responses given by medical students.

Barriers and facilitators to framing goals of patient care (GOPC) and factors motivating decision making are relatively unexplored [1, 2, 3].
A three-part survey of physicians at an Australian hospital in a culturally and linguistically diverse suburb (Table 1). Identification of levels of confidence and of barriers and facilitators to GOPC discussion and decision making was the main outcome measure. Factors influencing decision-making were analysed through scenarios. Results: 22 out of 96 eligible participants responded; 12 female, 10 male, clinical experience 4-31 years. Level of confidence was ranked between "somewhat confident" and "very confident." All but one respondent had six months of ICU experience. There were no differences in the level of confidence among physician groups. 14 barriers and 8 facilitators were identified; poor prognosis and patient or family request were the most common facilitators, while conflict between treating teams and the patient/surrogate and language barriers were the most common barriers. Factors driving GOPC decision-making included clinical factors, value judgement, communication, prognostication, justice and avoidance. Numerous barriers and facilitators were identified. Factors driving decision making did not just consider clinical factors; conflict and …

We aimed to investigate physician-related factors contributing to individual variability in end-of-life (EoL) decision-making in the intensive care unit (ICU). Qualitative study with semi-structured interviews with 19 specialists in critical care (experience 2-32 years) from 5 Swedish ICUs. Data were analyzed in accordance with the principles of thematic analysis. Most of the respondents felt that the intensivist's personality played a major role in EoL decisions (Table 1). Individual variability was considered inevitable. Views on acceptable outcome: respondents reported that the possible outcome for patients was interpreted very differently and subjectively among colleagues, and what seemed an acceptable patient outcome to one doctor was not acceptable to another. Values: most of the respondents were well aware that they might be affected by their own values and attitudes in the decision-making process. Interestingly, several respondents mentioned that they thought that patients marginalized by society, especially drug abusers, could be at risk of receiving decisions to limit life-sustaining treatments (LST) more often than others. None of the respondents thought that their own religious beliefs played any part in decision making. Fear of criticism: among the less experienced respondents there was a clear sense of fear of making a questionable assessment of the patient's medical prognosis. There was a fear of criticism from colleagues who were not directly involved in the decision-making and might have made another decision. This created a wish among younger respondents to defer or avoid participating in decision-making. Physician-related individual variability in EoL decisions primarily consisted of differing views on acceptable outcome, values and fear of criticism.

(Figure 1). Within each quartile of SOFA score, mortality was highest in patients with pneumonia and peritonitis and lowest in patients with cellulitis (see Figure 1).

The Sepsis-3 consensus definition identified organ dysfunction as the hallmark feature of sepsis [1]. In developing Sepsis-3, the sequential organ failure assessment (SOFA) score was chosen for its prognostic value and relative ease of clinical implementation [2]. We propose an update, based on epidemiologic data from two intensive care databases, that more effectively captures organ dysfunction in the context of Sepsis-3.
Using the MIMIC-III (exploration) and eICU (validation) databases, we extracted patients with suspicion of infection to form the study cohort. The predictive power of each SOFA component was assessed using the area under the curve (AUC) for in-hospital mortality. A logistic model with the LASSO penalty was used to find an alternative, statistically optimal score (see the sketch below). Results: By utilising alternative markers of organ dysfunction (e.g. lactate, pH, urea nitrogen) we demonstrated a significant improvement in AUC for several versions of the new score, SOFA2.0 (Figure 1). The SOFA score can be updated to reflect current advances in clinical practice. Using epidemiologic data, we have shown that substitution of existing components with more powerful measures of organ dysfunction may provide an improved score with greater predictive power. Moreover, SOFA2.0 exhibits equivalent ease of implementation but better reflects organ dysfunction in the context of Sepsis-3.

Introduction: The risk of acute organ failure (AOF) in cancer patients (pts) on systemic cancer treatment is unknown. However, 5% of non-hematologic and 15% of hematologic cancer pts will need admission to an intensive care unit (ICU). IPOP-SCI-2017/01 is a prospective cohort study designed to ascertain the cumulative incidence of AOF in adult cancer pts. Single-centre prospective cohort study with consecutive sampling of adult cancer pts admitted for unscheduled inpatient care while on, or up to 8 weeks after, systemic cancer treatment. The primary endpoint was AOF as defined by quick SOFA. Six months of accrual was expected to yield 400 pts, to infer the population risk of AOF with a standard error of 1%. Between 08/2018 and 02/2019, 10392 pts were on systemic anticancer treatment, 358 had unscheduled inpatient care and were eligible for inclusion, and 285 were included. Median age was 64 years, 51% were male, 52% had an adjusted Charlson Comorbidity Index (CCI) > 3 and hematologic cancers accounted for 22% of pts. The cumulative risk of AOF on hospital admission was 35% (95%CI: 31-39), and of AOF during hospital stay 40% (95%CI: 35-44). AOF was associated with older age, CCI > 3, hematologic malignancy, shorter median time from diagnosis and > 1 prior line of therapy. On admission, 62% of pts were considered not eligible for artificial organ replacement therapy (noAORT), and 34% of pts who developed AOF while in hospital were judged noAORT. Overall, 17 (15%) of AOF pts were admitted to the ICU, 31.5% for AORT. Median follow-up was 9.5 months (min 6; max 12). Inpatient mortality was 18%, with an ICU mortality rate of 59% and a median cohort survival of 4.5 months (95%CI: 3.5-5.4). On multivariate analysis, AOF was an independent poor prognostic factor (HR 1.6; 95%CI 1.2-2.1). The risk of AOF in cancer pts admitted for unscheduled inpatient care while on systemic treatment is 35%, and the risk of ICU admission among AOF pts is 15%. AOF in cancer pts was an independent poor prognostic factor.

A severity-of-illness score in patients with tuberculosis requiring intensive care U Lalla, E Irusen, B Allwood, J Taljaard, C Koegelenberg Tygerberg Academic Hospital, Internal Medicine, Division of Pulmonology and ICU, Cape Town, South Africa Critical Care 2020, 24(Suppl 1):P383 We previously retrospectively validated a 6-point severity-of-illness score aimed at identifying patients at risk of dying of tuberculosis (TB) in the intensive care unit (ICU).
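A minimal sketch of the LASSO-penalized logistic approach described in the SOFA2.0 abstract above, assuming scikit-learn; the data are synthetic stand-ins for candidate organ-dysfunction markers, not MIMIC-III or eICU extracts:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: candidate markers (e.g. lactate, pH, urea nitrogen
# alongside classic SOFA components); y = in-hospital death.
X, y = make_classification(n_samples=5000, n_features=20,
                           n_informative=8, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

scaler = StandardScaler().fit(X_tr)

# The L1 (LASSO) penalty shrinks uninformative markers to exactly zero,
# leaving a sparse, statistically optimal score.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso.fit(scaler.transform(X_tr), y_tr)

kept = np.flatnonzero(lasso.coef_)
auc = roc_auc_score(y_te, lasso.predict_proba(scaler.transform(X_te))[:, 1])
print(f"retained {kept.size} markers, validation AUC {auc:.2f}")
```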
Parameters included septic shock, human immunodeficiency virus with CD4 <200/mm3, renal dysfunction, a ratio of partial pressure of arterial oxygen to fraction of inspired oxygen (PaO2:FiO2) <200 mmHg, diffuse parenchymal infiltrates and no TB treatment on admission. The aim of this study was to validate and refine the severity-of-illness score in patients with tuberculosis requiring intensive care. We performed a prospective observational study with a planned post-hoc retrospective analysis, enrolling all adult patients with confirmed TB admitted to the medical intensive care unit from 1 February 2015 to 31 July 2018. Descriptive statistics and chi-square or Fisher's exact tests were performed on dichotomous categorical variables, and t-tests on continuous data. Patients were categorized as hospital survivors or non-survivors. The 6-point score and the refined 4-point score were calculated from data obtained on ICU admission (a sketch of the revised score follows below). Results: Forty-one of 78 patients (52.6%) died. The 6-point scores of non-survivors were higher (3.5±1.3 vs 2.7±1.2; p=0.01). A score ≥3 vs. <3 was associated with increased mortality (64.0% vs. 32.1%; OR 3.75; 95%CI 1.25-10.01; p=0.01) (Table 1). Post hoc, a PaO2:FiO2 <200 mmHg and no TB treatment on admission failed to predict mortality, whereas any immunosuppression did. A revised 4-point score (septic shock, any immunosuppression, acute kidney injury and lack of lobar consolidation) demonstrated higher scores in non-survivors (2.8±1.1 vs. 1.6±1.1; p<0.001). A score ≥3 vs. ≤2 was associated with higher mortality (78.4% vs. 29.3%; OR 8.76; 95%CI 3.12-24.59; p<0.001) (Table 1). The 6-point severity-of-illness score identified patients at higher risk of death. We were able to derive and retrospectively validate a simplified 4-point score with superior predictive power.

Chronic critical illness remains a scientific challenge, from its conceptualization to its impact on patient prognosis [1]. We evaluated the long-term evolution of ICU survivors by identifying the real burden of prolonged critical illness on survival, quality of life and hospital readmissions. We conducted a prospective cohort in 16 Brazilian hospitals including 1616 ICU survivors with an ICU stay >72h. We compared the patients diagnosed with chronic critical illness with the other patients. Telephone follow-up was performed at 3 and 6 months. Quality of life was measured by the SF-12 questionnaire. We observed that 38% of patients met some definition of chronic critical illness. Chronically critically ill patients had higher mortality at 6 months (p=0.012). This difference was mainly due to higher in-hospital mortality (p=0.0001). Mortality after hospital discharge was similar between groups. There was no difference in the hospital readmission rate at 6 months.

Various scores have been developed to predict pulmonary complications, such as ARISCAT for patients at risk of postoperative pulmonary complications [1] and LIPS for patients at risk of lung injury [2]. The aim of this study was to compare these scores with ours for predicting pulmonary complications in mechanically ventilated patients in the SICU. This prospective observational study was conducted in the SICU of a university hospital. Adult patients admitted to the SICU who required mechanical ventilation >24 hours were included. The primary endpoint was the composite of pulmonary complications including pneumonia, ARDS, atelectasis, reintubation, and tracheostomy.
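Returning to the tuberculosis severity score above, the revised 4-point score is simple enough to show as a minimal sketch; the encoding below is a hypothetical illustration of how one point per criterion could be tallied, not a published implementation:

```python
# One point each for septic shock, any immunosuppression, acute kidney
# injury, and lack of lobar consolidation; a score >= 3 flagged markedly
# higher mortality in the abstract above.
def tb_severity_score(septic_shock, immunosuppression,
                      acute_kidney_injury, no_lobar_consolidation):
    return sum([septic_shock, immunosuppression,
                acute_kidney_injury, no_lobar_consolidation])

# Hypothetical patient with three of the four criteria present.
score = tb_severity_score(septic_shock=True, immunosuppression=True,
                          acute_kidney_injury=True, no_lobar_consolidation=False)
print(score, "high risk" if score >= 3 else "lower risk")
```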
Multivariate analysis was performed to identify risk factors for pulmonary complications, and a predictive score was developed. ROC analysis was performed to compare the power of ARISCAT, LIPS and our newly developed score for predicting pulmonary complications.

Outcomes in intensive care units have been reported to be better in higher-volume units [1, 2]. We compared outcomes for high-risk patients between low- and higher-volume units. Audit data from Irish ICUs are analysed and reported by the Intensive Care National Audit & Research Centre (ICNARC) in London. ICNARC reports risk-adjusted mortality rates in all patients and in low-risk patients (predicted mortality rate <20%) for each unit, using the ICNARCH-2015 model to predict the risk of death. We used these data to calculate the proportion of high-risk patients (predicted mortality >20%) in each unit, the mortality rate for high-risk patients and the risk-adjusted mortality rate, and we compared overall risk-adjusted mortality between low- and high-volume units (a worked SMR example is sketched below). The median number of annual new-patient admissions among the 18 participating units was 390; units below this were defined as low-volume and those above as high-volume units. The proportion of all admissions to each unit who were high-risk ranged from 8% to 54% (mean 34%). Unit mortality rates for high-risk patients ranged from 33% to 69%. The ratio of observed to expected mortality (standardized mortality ratio, SMR) for high-risk admissions in each unit ranged from 0.87 to 1.34 (mean 1.07) (Fig. 1).

Introduction: ADL weakening is often seen after intensive care and is called post-intensive-care syndrome (PICS). This is also seen even outside the ICU and has been proposed to be called post-acute-care syndrome (PACS), especially in elderly patients. In patients with infection, the SOFA score is well known for predicting in-hospital mortality, but there are no tools for predicting ADL weakening during admission. To search for risk factors for ADL weakening during admission other than age, we conducted a retrospective observational study. The subjects were surviving patients with infection, aged 16 to 89, who were admitted to our department from April 1, 2018 to May 31, 2019. Information on basic characteristics, laboratory data on admission and adjunctive therapies was extracted from our database. We used the Barthel Index (BI) for ADL evaluation; the BI at discharge was assessed by nurses. We stratified patients by a BI at discharge over 60 or not, and investigated factors that predicted it. We compared each factor between the 2 groups and performed a logistic regression analysis with those that had a clinically or statistically significant effect.

Despite improved outcomes of intensive care unit (ICU) patients, sleep deprivation remains a major concern after ICU discharge. Multifaceted causes make it difficult to treat and understand [1]. Not many studies have explored sleep deprivation beyond the ICU. This is evidenced by findings from a recent systematic review [2] which included 8 studies, with only one study [3] reporting sleep deprivation beyond the ICU. The aim of this paper is to present findings on sleep deprivation beyond the ICU from a larger study that examined the experience of critical illness in the ICU and beyond in the context of daily sedation interruption. Hermeneutic phenomenology was used to conduct the study. Twelve participants aged 18 years and above who fulfilled the enrolment criteria were enrolled into the study. The cohort comprised 7 male and 5 female participants.
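As a minimal sketch of the standardized mortality ratio used in the ICNARC volume-outcome analysis above: observed deaths divided by the sum of model-predicted death probabilities. The predicted risks below are hypothetical placeholders, not ICNARCH-2015 outputs:

```python
import numpy as np

# Hypothetical per-patient predicted death risks and observed outcomes
# for one unit (1 = died in hospital).
predicted_risk = np.array([0.25, 0.40, 0.70, 0.22, 0.55, 0.80, 0.35, 0.30])
died = np.array([0, 1, 1, 0, 1, 1, 0, 0])

high_risk = predicted_risk > 0.20          # the abstract's high-risk threshold
observed = died[high_risk].sum()
expected = predicted_risk[high_risk].sum()
print(f"SMR = {observed / expected:.2f}")  # >1: more deaths than predicted
```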
In-depth face-to-face interviews were conducted at two weeks after discharge and repeated at six to eleven months. Interviews were audio-taped, transcribed and thematically analysed. Significant statements were highlighted and categorized into emergent themes. Six participants continued to experience sleep deprivation up to eleven months after ICU. Two cited dreams about ICU, three could not explain why they continued to fail to sleep, and one stated that he continued hearing ICU alarms in the silence of the night. Sleep deprivation continues beyond ICU due to nightmares, delusional memories and unexplained reasons. Further research is needed to establish causes of sleep deprivation and explore ways to promote sleep in critical illness survivors after ICU discharge.

Frailty is increasingly seen as an independent syndrome. Frail patients now account for an increasing proportion of hospital and critical care admissions [1]. We aimed to compare frailty and mortality in our intensive care unit. The Clinical Frailty Score (CFS) was incorporated within the electronic health record (EHR) in 2019. We performed this retrospective analysis on data collected between Jan '19 and Oct '19. The predictor and outcome for this study were frailty and hospital mortality respectively. All demographic data, acute physiology scores, critical care and hospital outcome data were automatically collected in the EHR and recorded. We used a cut-off of CFS >3 to define frail, and CFS ≤3 non-frail. Chi-squared tests and simple and multiple logistic regression were used; adjustment was made for ICNARC score and age (sketched below). The total number of patients was 848, of whom 140 (16.5%) died in hospital. Among the patients <65 years (n=392), 79 (20%) were recorded as frail or vulnerable. The numbers of elective and emergency admissions were 292 (34%) and 556 (66%) respectively. In the frail and non-frail groups, mortality rates were 30% and 9.5% (p<0.001) respectively, with an odds ratio of 4.14 (95% CI 2.8-6; p<0.001).

Age is a well-known risk factor for Critical Care (CC) outcome and is incorporated into many prognostic tools; however, this has been criticized for assuming normal baseline physiology in the young. In recent years, frailty in CC prognostication has been of interest, with meta-analysis correlating worsening outcomes with increasing frailty [1]. In this study, we compared the effect of frailty versus age for determining hospital survival in critically ill patients.

We conducted a prospective cohort in 16 Brazilian hospitals including 1616 survivors of an ICU stay >72h. We compared chronically critically ill patients (ICU stay >10 days) with the other patients. We performed in-person psychological and functional assessment of patients within 48 hours of ICU discharge and by telephone at 3 and 6 months. The prevalence of chronic critical illness was 26%. Regarding outcomes, chronically critically ill patients had a higher incidence of depressive symptoms than other patients immediately post-ICU discharge (p = 0.004), as well as a higher incidence of muscle weakness (p <0.001). However, in subsequent evaluations we found no difference between groups regarding psychological symptoms (depression, anxiety and post-traumatic stress). Higher functional dependence was observed in chronically critically ill patients, but without difference in the quality of life score, in both the physical (p = 0.87) and mental (p = 0.84) domains.
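A minimal sketch of the adjusted logistic regression used in the frailty-mortality analysis above (assuming statsmodels; the data, coefficients and column names are simulated for illustration only):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated patient-level data: frailty flag (CFS > 3), age, ICNARC score,
# and in-hospital death, mirroring the adjusted analysis described above.
rng = np.random.default_rng(0)
n = 800
df = pd.DataFrame({
    "frail": rng.integers(0, 2, n),
    "age": rng.normal(62, 15, n),
    "icnarc": rng.normal(18, 6, n),
})
logit_p = -4 + 1.2 * df.frail + 0.02 * df.age + 0.05 * df.icnarc
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Multiple logistic regression adjusted for age and severity; the
# exponentiated 'frail' coefficient is the adjusted odds ratio.
model = smf.logit("died ~ frail + age + icnarc", data=df).fit(disp=False)
or_frail = np.exp(model.params["frail"])
ci = np.exp(model.conf_int().loc["frail"])
print(f"adjusted OR for frailty: {or_frail:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```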
Chronically critically ill patients, when compared to other patients with a stay >72h, have a higher incidence of depressive symptoms at ICU discharge. This difference disappears at follow-up. Chronically critically ill patients present higher levels of functional dependence but without repercussions on quality of life scores.

Introduction: Activation of the inflammatory response after cardiac arrest (CA) is a well-documented phenomenon that may lead to multi-organ failure and death. We hypothesized that white blood cell count (WBC), one marker of inflammation, is associated with one-year mortality in ICU-treated CA patients. We used a nationwide registry with data from five academic ICUs to identify adult CA patients treated between January 1st, 2003 and December 31st, 2013. We evaluated the association between the most abnormal WBC within 24 hours of hospital admission and one-year mortality. We accounted for baseline risk of death using multivariable logistic regression (adjusted for age, gender and 24h sequential organ failure assessment [SOFA] score). A total of 5,543 patients were included in the analysis; of those, 2,387 (43%) were alive one year after CA. We plotted WBC against baseline risk of death and, through graphic examination of a locally weighted scatterplot smoothing (lowess) curve (sketched below), found the lowest risk of death to be associated with a WBC of 12 (×10^9/l) (Figure 1).

MRPs were identified by a specialist ICU pharmacist during this programme and classified by their significance on a scale of one to four. Logistic regression was used to determine whether demographic factors were associated with the occurrence of a clinically significant MRP (a significance score of two or above; Figure 1). The adjusted model included age, ICU LOS, hospital LOS, APACHE II, number of days of renal replacement therapy, number of days of ventilation, the number of medications prescribed at ICU discharge, and the WHO analgesia classification at InS:PIRE. There were increased odds of having a clinically significant MRP for hospital LOS (OR …). Results: 62.8% (n=115) of patients required at least one pharmacy intervention. The median number of interventions required per patient was one (IQR 0-2); the maximum number was six. 198 MRPs were recorded in this cohort. The most common intervention was clarifying duration of treatment (n=44), followed by education (n=33) and correcting drug omissions (n=27). The BNF drug class most frequently associated with MRPs was neurological (n=65), comprising analgesics (n=45) and psychiatric medications (n=20) (Figure 1). This was followed by cardiovascular medications (n=40), gastrointestinal medications (n=34), nutritional medications (n=25), and others (n=34). Many ICU survivors experience MRPs. The most common class of MRP was neurological, reflecting the high incidence of chronic pain and psychiatric illness in this population.

Following discussion with ICU staff, ward staff and FY1 doctors, a formal standardized handover system was introduced. This involved a verbal handover to the appropriate FY1 by an ICU doctor, with the patient's drug chart rewritten in ICU at the time of handover. The next change was to display posters on the wards to alert staff that the medical team are to be contacted when a patient comes to the ward from ICU, and to ensure the drug chart is completed. The baseline data showed a median time delay of 4 hours, with one patient waiting 14 hours for a drug chart.
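A minimal sketch of the lowess curve examination used in the cardiac-arrest WBC analysis above, assuming statsmodels; the data are simulated to mimic a U-shaped risk relationship and are not registry data:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Simulated data: worst WBC in the first 24 h (x) against a model-based
# baseline risk of death (y), with a nadir placed near WBC 12 by construction.
rng = np.random.default_rng(1)
wbc = rng.uniform(2, 40, 500)
risk = 0.3 + 0.002 * (wbc - 12) ** 2 + rng.normal(0, 0.05, 500)

# Locally weighted scatterplot smoothing; returns (x, fitted) pairs sorted by x.
smoothed = lowess(risk, wbc, frac=0.3)
nadir = smoothed[np.argmin(smoothed[:, 1])]
print(f"lowest smoothed risk at WBC ~ {nadir[0]:.1f}")
```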
Following the interventions, the median time delay decreased to 0 hours within 4 months, as demonstrated in Figure 1. The changes have received positive feedback from ICU staff, ward staff and FY1 doctors. The aim of reducing the time delay by 50% has been achieved, with the median time delay now 0 hours. This has improved patient safety by significantly reducing delays in medications and through the introduction of a standardized handover. It has also provided an opportunity for junior doctors on the wards to seek clarification regarding medications and the clinical management plan for the patient, and has established a communication channel between ICU and the wards, making patient care safer and more effective.

Telemonitoring outside the ICU is scarce, but with innovative wearables measuring respiratory and heart rate wirelessly, the culture around intrahospital telemonitoring should change. However, culture is known to be one of the most crucial success factors in innovation, especially in health care. Human design thinking is a promising tool in health care innovation but is rarely used in a multidisciplinary team to initiate an innovation culture and stimulate sustainable collaboration. The aim of this study was to initiate a pilot project with a multidisciplinary team to start using wearables for the Early Warning Score (EWS) on a clinical ward. Human design thinking was used to write a value proposition on wearables in clinically admitted neutropenic hematologic patients in an academic center. A multidisciplinary team was formed to cover all disciplines involved in the technical, clinical and administrative parts of the project. A vendor was chosen based on its product specifications in relation to the existing hospital monitoring infrastructure. In design thinking sessions, critical appraisal of multiple telemonitoring factors was performed by sub-teams and a Canvas project plan was constructed. The project team was formed of registered nurses, physicians, IT specialists and Electronic Health Record consultants; a critical care physician was appointed as project leader. The main critical factors were: seamless transmission of both heart and respiratory rates, with appropriate movement filtering, to the nurses' smartphones; direct uploading into the electronic health record with automated EWS calculation; and a nurse-driven protocol on EWS follow-up. Philips Healthcare, with their IntelliVue Guardian wearable biosensor, was the chosen vendor (Figure 1). Design thinking in a multidisciplinary health care team could positively influence the innovation culture. Scientific evaluation of this wearable will focus on both nurses' acceptance and data storage, and is expected in the summer of 2020.

Severity, readmission and length of stay were lower in patients discharged directly home. It seems to be a safe way to discharge low-risk, short-stay patients, and it seems to save resources and reduce costs, as well as the need for hospital beds. However, further studies are needed to actually evaluate this safety.

Forty-four cultures were analyzed with ePlex (Figure 1). Complete agreement with conventional diagnostics was observed in 38/44 cases. No false-positive results were observed, yielding a sensitivity and specificity of 90% and 100% respectively for target pathogens. Time to result was, on average, 10.4 h faster with ePlex compared to conventional diagnostics.
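A minimal sketch of the diagnostic-accuracy summary in the ePlex evaluation above; the counts below are illustrative placeholders chosen to land near the reported figures, not the study's cell-level data:

```python
# Sensitivity and specificity from panel-versus-reference counts.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# e.g. 38 concordant positives of 42 on-panel pathogens, no false positives
# against 100 true-negative results (all counts hypothetical).
sens, spec = sens_spec(tp=38, fn=4, tn=100, fp=0)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```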
Antimicrobial therapy could have been optimized in 5 patients based on the ePlex result, but treatment was only changed in one case (E. coli CTX-M+), which received meropenem 8.5 h before the antibiogram was available. The ePlex blood culture panels provide high accuracy and significantly faster results. The current implementation offers substantial potential value at minimal cost and is a feasible approach to 24-h/7-day blood culture diagnostics in many hospital settings. However, efforts to increase adherence are needed.

The rapid increase of extended-spectrum β-lactamase (ESBL)-producing pathogens worldwide makes it difficult to choose appropriate antibiotics in patients with Gram-negative bacterial infection. Cica-beta reagent (KANTO CHEMICAL, Tokyo, Japan) is a chromogenic test to detect beta-lactamases such as ESBL from bacterial colonies. The purpose of the study was to reveal whether Cica-beta reagent could detect ESBL-producing pathogens directly from urine, rather than from bacterial colonies, to make a rapid bedside diagnosis of the antibiotic susceptibility of Gram-negative pathogens. We conducted a prospective observational study from July 2019 to October 2019. Patients were eligible if urinary culture tests were performed and Gram-negative pathogens were detected at ≥1+ in their urine samples. The urine sample was centrifuged at 200 × g for 5 min. The supernatant was re-centrifuged at 1500 × g for 5 min and the pellet was mixed with Cica-beta reagent. The test was considered positive when the enzymatic reaction turned from yellow to red or orange (Fig. 1).

The bundle approach could be an effective strategy to prevent hospital acquisition of drug-resistant pathogens in ICUs.

In the ASPECT-NP trial, C/T was noninferior to MEM for the treatment of HABP/VABP. We evaluated outcomes from that study in the subgroup of pts failing current antibacterial therapy for HABP/VABP at enrollment. Methods: ASPECT-NP was a randomized, controlled, double-blind, phase 3 trial in which mechanically ventilated pts with HABP/VABP received 3 g C/T or 1 g MEM every 8 h for 8-14 days. Pts with >24 h of active gram-negative antibacterial therapy within 72 h prior to the first dose of study therapy were excluded, except those pts failing current treatment (i.e. signs/symptoms of the current HABP/VABP persisting/worsening despite ≥48h of antibiotic treatment). Primary and key secondary endpoints, respectively, were 28-day all-cause mortality (ACM) and clinical response at test of cure (TOC; 7-14 days after end of therapy) in the intent-to-treat (ITT) population. Pts failing current antibacterial therapy for HABP/VABP were prospectively categorized as a clinically relevant subgroup. At baseline, failing current therapy for HABP/VABP was reported in 53/362 (15%) C/T and 40/364 (11%) MEM ITT pts, mostly piperacillin/tazobactam (34%), 3rd/4th-generation cephalosporins (31%), fluoroquinolones (29%), and aminoglycosides (12%). Baseline demographic and clinical characteristics in this subgroup, including prior therapy regimen, were generally similar between treatment arms. There were greater proportions of patients with ESBL+ Enterobacterales (30%) and Pseudomonas aeruginosa (25%) in the C/T arm than in the MEM arm (20% and 13%, respectively).
Lower 28-day ACM was seen with C/T than with MEM, as evidenced by 95% confidence intervals for treatment differences that excluded zero (Figure 1); statistical significance cannot be assumed because subgroup analyses in this study were not corrected for multiplicity. Conclusions: C/T was an effective treatment for HABP/VABP pts who had failed initial therapy.

Catheter-related bloodstream infection (CRBSI) is a common serious infection associated with increased mortality in intensive care units (ICUs). One of the most important strategies to prevent CRBSI is to minimize the duration of central venous catheterization. We built a medical team consisting of doctors, nurses and pharmacists in the ICU to discuss whether patients needed a central venous catheter (CVC) in terms of monitoring hemodynamics and administering drugs, and, from April 2019, to recommend catheter removal to attending physicians every day. The purpose of this study was to evaluate whether our team-based approach could shorten the total duration of catheterization and reduce CRBSI. This was a retrospective historical control study conducted from April 2018 to October 2019 in the ICU of a tertiary care hospital in Japan. Every patient admitted to the ICU during the study period was eligible if a CVC was inserted. Patients were divided into 2 groups: Conventional (April 2018 to March 2019) or Intervention (April 2019 to October 2019). We set the primary endpoint as onset of CRBSI. The secondary endpoints included the duration of central venous catheterization, the length of ICU stay and hospital mortality. CRBSI was defined as bloodstream infection in patients with a CVC not related to another site. We included 428 patients: 259 in the Conventional group and 169 in the Intervention group. A reduced, though nonsignificant, tendency toward CRBSI was observed in the Intervention group [hazard ratio 0.341 (95% confidence interval 0.074-1.557; p = 0.213)]. The Intervention group was significantly associated with a reduced duration of central venous catheterization (5 days vs 7 days; p < 0.01). No difference was observed in the length of ICU stay or in-hospital mortality between groups. The team-based approach to assessing CVC necessity could shorten the duration of central venous catheterization and might reduce CRBSI.

Introduction: Empiric antibiotic therapy decisions are based upon a combined prediction of infecting pathogen and local antibiotic susceptibility, adapted to patients' characteristics. The objective of this study was to describe the pathogen predominance and to evaluate the probability of covering the most common Gram-negative pathogens in ICU patients with respiratory infections. Methods: Data were collected from multiple US and European hospitals as part of the SMART Surveillance Program (2018). MIC (mg/L) testing was performed by broth microdilution, with susceptibility for P. aeruginosa and Enterobacterales defined by agent-specific breakpoints (ceftolozane/tazobactam …). Results: 78 hospitals from 24 countries provided 3384 Gram-negative respiratory isolates from patients located in an ICU in the US (22%), Eastern Europe (40%) and Western Europe (38%) in 2018. The 4 most common pathogens isolated were P. aeruginosa (26%), K. pneumoniae (15%), E. coli (13%), and A. baumannii (10%). Among Enterobacterales, 30% (588/1955) were ESBL positive. Figure 1 provides the probability of covering the most common respiratory Gram-negative pathogens from ICU patients.
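The coverage probability used in the SMART analysis above is simply the proportion of isolates susceptible to each agent; a minimal pandas sketch follows, with hypothetical MICs and a placeholder breakpoint (substitute the actual CLSI/EUCAST values per agent):

```python
import pandas as pd

# Hypothetical MIC data (mg/L) for ICU respiratory isolates; the breakpoint
# below is a placeholder for illustration, not an official value.
isolates = pd.DataFrame({
    "pathogen": ["P. aeruginosa", "P. aeruginosa", "K. pneumoniae", "E. coli"],
    "agent": ["ceftolozane/tazobactam"] * 4,
    "mic": [2.0, 8.0, 1.0, 0.5],
})
breakpoint_s = {"ceftolozane/tazobactam": 4.0}  # susceptible if MIC <= breakpoint

isolates["susceptible"] = isolates.apply(
    lambda r: r.mic <= breakpoint_s[r.agent], axis=1)

# Probability of coverage = proportion susceptible, per pathogen and agent.
coverage = isolates.groupby(["pathogen", "agent"])["susceptible"].mean()
print(coverage)
```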
Co-resistance between commonly prescribed first-line β-lactam antibiotics is common: when nonsusceptibility (NS) to one agent was present, susceptibility to other β-lactams was generally <35%. Ceftolozane/tazobactam provided the most reliable in vitro activity compared to other β-lactam antibiotics, ensuring wide coverage of the most common Gram-negative respiratory pathogens with high susceptibility levels in both empiric and adjustment prescribing scenarios. Further studies are needed to define the clinical benefits that may translate from these findings.

Evaluation of compliance of ICU staff for VAP prevention strategies on the outcome of patients A Kaur Fortis Hospital, Critical Care, Mohali, India Critical Care 2020, 24(Suppl 1):P426 Ventilator-associated pneumonia is the most common nosocomial infection diagnosed in adult critical care units. It is associated with prolonged duration of mechanical ventilation, increased ICU stay and increased mortality. It continues to be a major challenge for critical care physicians despite advances in diagnostic and treatment modalities. The primary objective of the study was to determine the compliance of ICU staff with a VAP prevention bundle; the secondary objective was to determine the incidence, risk factors and outcome of VAP patients. Single-center, prospective, observational study carried out from February 2017 to July 2018. Patients mechanically ventilated for more than 48 hours and satisfying the inclusion and exclusion criteria were enrolled in the study. VAP was diagnosed using the CDC criteria and the clinical pulmonary infection score. VAP preventive strategies were employed and the compliance of ICU staff was assessed. A total of 1617 patients were admitted to the ICU over the set time period, of whom 483 were ventilated for more than 48 hours. Among them, only 166 patients fulfilled the inclusion and exclusion criteria and were enrolled in the present study. Excellent compliance was observed for head-end elevation, sedation vacation, stress ulcer prophylaxis and heat-moisture-exchanger filter use, good compliance for oral care and hand hygiene, and moderate to poor compliance for subglottic suctioning. The incidence of VAP was 24.7%, with a VAP rate of 30.87/1000 ventilator days. There was a significant correlation between primary diagnosis, hemodialysis, massive blood transfusion and development of VAP (p<0.05). Mean duration of ventilation (p<0.001) and mortality (p<0.05) were significantly higher in VAP patients. Conclusions: Improvement in compliance with the VAP bundle and reduction of risk factors can help decrease the incidence of VAP and related morbidity and mortality.

Preventive strategies are effective in reducing ventilator-associated pneumonia (VAP) in adults [1, 2]. In the paediatric population there are no data about VAP prevention, so we introduced a new bundle (VAP-p) based on the available evidence for adults. This was designed as a before-after study. We enrolled all patients admitted to the 8-bed medical-surgical paediatric ICU at Gemelli Hospital in Rome requiring mechanical ventilation for at least 48 hours. Patients with pre-existing tracheostomy were excluded. The VAP-p bundle was introduced in 2018 in order to improve quality of care.
Our bundle consisted of twice-daily oral hygiene with chlorhexidine swabs, daily checks of oral bacterial colonization, and aspiration prevention. Comparison was made with a historical group including patients admitted before VAP-p introduction (2016 to 2017). All data on demographics, antimicrobial therapy, ICU stay and treatments were collected. Results: 162 patients were included (82 after and 80 before VAP-p introduction). 5 (6%) events of VAP were recorded in the VAP-p group compared to 16 (20%, p=0.01) before; the VAP-p group also had fewer VAP events per days of mechanical ventilation (1/100 compared to 3.3/100, p=0.01). Multivariate analysis yielded an OR of 0.23 (95%CI 0.07-0.81) for VAP incidence after bundle introduction. The mortality rate was slightly reduced in the VAP-p group (2.4% vs 6.2%, p=ns). Patients who developed VAP required more days of mechanical ventilation and had a higher mortality rate (12 vs 5 days, p<0.001, and 14% vs 3%, p=0.047, respectively). Our VAP-p bundle seems effective in reducing VAP incidence in the critically ill paediatric population.

Introduction: Ceftolozane/tazobactam (C/T) is a new antibiotic against MDR Gram-negative bacterial infections, whose target population is critically ill patients. Although the safety of the 2/1 g dose administered as a 3-hour infusion has already been assessed, these patients may be under renal replacement therapy (RRT) and suffer changes in their volume of distribution (Vd) that can affect antibiotic concentrations. The objective was to determine the concentrations reached by 3 g C/T (3-hour infusion) in septic patients on RRT (CVVHDF) and the interdose behavior. We used the PRISMAFLEX RRT machine with Oxyris and M100 filters. An HPLC-UV method was used for simultaneous quantification of C/T. The study population consisted of three obese critically ill patients with sepsis on CVVHDF while receiving 3 g C/T every 8 hours. Samples of prefilter blood, postfilter blood and effluent were taken 30 min before infusion and 1, 2 and 4 hours after the end of infusion. We found great interpatient variability, with the lowest C concentration values in the patient with the most hemodynamic instability, using the Oxyris filter. Even though Cmax was lower than reported in healthy subjects, we found similar values of AUC and t½ in comparison with healthy-population studies (these summary metrics are sketched below). The Cmax of T was also compromised in comparison with values reported in healthy subjects, but with higher AUC and t½. CVVHDF contributes to C/T clearance. The M100 filter showed the least clearance and higher values of AUC and t½. The extraction rate was similar in all patients and filters (Figure 1). The Cmax achieved may be impaired because of the varying Vd caused by obesity and RRT, without affecting the antibiotic's characteristics and behaviour. We conclude that, because of the variety of clinical conditions, the C concentration is compromised particularly in hemodynamically unstable patients. However, the small sample does not allow us to extrapolate these results. The extended infusion seems adequate to achieve the interdose antibiotic concentration.

The use of biomarkers in sepsis is useful for early diagnosis and prognosis. The desired marker should be sensitive, specific, fast and accurate. Procalcitonin (PCT) measurement is approved by the FDA even though its efficacy is still under question. The determination of torque teno virus (TTV) could be a useful marker [1]. We analyzed 55 samples from 23 patients admitted to the ICU with clinical suspicion of sepsis. Analytical data on C-reactive protein (CRP), neutrophils and procalcitonin were collected.
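A minimal non-compartmental sketch of the interdose PK summary metrics (AUC and t½) reported in the ceftolozane/tazobactam CVVHDF abstract above; the sampling times and concentrations below are invented for illustration, not study measurements:

```python
import numpy as np

# Hypothetical concentration-time profile over one 8-h dose interval.
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # h after start of infusion
c = np.array([5.0, 60.0, 48.0, 30.0, 12.0])    # mg/L ceftolozane

# AUC over the interval by the trapezoidal rule.
auc = np.trapz(c, t)                            # mg*h/L

# Terminal half-life estimated from the last two sampling points.
k_el = np.log(c[-2] / c[-1]) / (t[-1] - t[-2])  # terminal elimination rate (1/h)
t_half = np.log(2) / k_el
print(f"AUC0-8 {auc:.0f} mg*h/L, t1/2 {t_half:.1f} h")
```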
The SOFA and APACHE II scales were calculated and patients were stratified according to these values into good and poor prognosis groups. Quantitative TTV determination was carried out using quantitative PCR [2]. We calculated the area under the curve (AUC) of TTV plasma levels as a function of time. The statistical analysis involved Mann-Whitney U and Spearman tests, with chi-square for qualitative variables. Results showed a non-significant (NS) inverse relationship between the TTV AUC and the patient's proinflammatory level. A tendency (NS) was found between poor prognosis and PCT and CRP median values, which were higher in the poor prognosis group. A trend showed a lower TTV DNA count related to worse prognosis. An inverse relationship was found between PCT and CRP values and TTV copies/ml plasma, with a NS correlation in the case of PCT. There was a clear trend between neutrophil expansion and the slope of the regression line fitted to TTV loads over the first two study steps.

Fig. 1 (abstract P432). Patient PK/PD measurements

(p value >0.05), suggesting that the adsorptive mechanism was not primarily mediated by plasma protein. HA330 was saturated after adsorption of a total of 280.28±12.33 mg of VAN. The adsorptive kinetics showed an exponential reduction of VAN mass that reached a plateau after 30 minutes of circulation. In our study, simulating in vivo conditions of HP using HA330 during sepsis, a rapid and clinically relevant removal of VAN was shown. After 2 hours of HP, we suggest assessing the VAN plasma concentration, and a loading dose of VAN should be considered. However, as the potential interactions with other drugs are not known, further in vivo studies are warranted to confirm these findings.

Assessing the volume of blood taken for blood culture and culture positivity: do we need to take less blood? It is commonly accepted that larger blood culture (BC) volumes (BCV) increase the yield of true-positive cultures, and optimally 20 cc of blood should be obtained per set (2 bottles). Only scarce data exist on the matter of optimal BCV. It is unknown what minimal volume is acceptable for BC. The objective of this study was to determine the association between BCV and the rate of positive BC. Blood taken for cultures in BD BACTEC Plus Aerobic/F negative bottles was collected from ICUs and acute care floors at 8 hospitals at the DMC over 6 months. Blood volume was estimated automatically from blood background signal data in the BD BACTEC FX instrument. Cultures were analyzed for each bottle. Data were summarized for every month as the average volume and number of cultures taken and the rate of positive BC for every unit. Units were classified according to unit type (ICU, Medicine, Surgery, Mixed, Emergency Department (ED), Organ/BMT, or "other", which did not fit the previous categories) and analyzed as a group. A total of 23795 cultures were taken in 84 units. There was a positive association between blood volume (BV) and positive BC rate for ED and "other" units (IRR=1.27, p=0.006 for the ED; IRR=5.00, p<0.001 for "other" units); all other unit types had no association between BV and positive BC rate (Figure 1). A secondary analysis excluding pediatric units gave very similar results. When comparing BV between unit types, the ED and "other" units had significantly lower BV (2.4 ml in the ED and 3.5 ml in "other" units, compared to 4.9 ml in the ICU, 4.7 ml in surgery, 4.2 ml in mixed units and 7.7 ml in BMT). The correlation between BV and positive BC rate is probably limited to units taking very low BV for cultures.
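An incidence rate ratio like the one reported in the blood-culture-volume analysis above can come from a Poisson model of positive-culture counts with an offset for cultures drawn; a minimal sketch, assuming statsmodels and illustrative unit-month counts:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical unit-month data: positive cultures, total cultures drawn,
# and mean fill volume per bottle (ml); values are illustrative only.
df = pd.DataFrame({
    "positives": [4, 6, 9, 3, 12, 8],
    "total":     [120, 150, 160, 90, 200, 170],
    "mean_vol":  [2.4, 3.1, 4.9, 2.1, 7.7, 4.2],
})

# Poisson GLM with log(total cultures) as offset: the exponentiated volume
# coefficient is the incidence rate ratio (IRR) for positivity per extra ml.
X = sm.add_constant(df["mean_vol"])
fit = sm.GLM(df["positives"], X, family=sm.families.Poisson(),
             offset=np.log(df["total"])).fit()
print(f"IRR per ml: {np.exp(fit.params['mean_vol']):.2f}")
```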
Units taking volumes above 4 ml showed no improvement in positive BC rate when higher volumes were taken. Better prospective studies should be done to further establish the minimal BCV needed and to spare unnecessary blood loss in hospitalized patients without compromising BC yield.

De-escalating antibiotics in sepsis with the use of T2MR in a 35-bed Greek university ICU C Vrettou, E Douka, I Papachatzakis, K Sarri, E Gavrielatou, E Mizi, S Zakynthinos 1st ICU Department, University of Athens, Evangelismos General Hospital, ICU, Athens, Greece Critical Care 2020, 24(Suppl 1):P440 In septic patients, the early use of appropriate empiric antibiotic therapy reduces morbidity and mortality. De-escalation refers to narrowing the broad-spectrum antibiotics once the pathogen and sensitivities are known. T2 magnetic resonance (T2MR) is a novel method of detecting ESKAPE pathogens. We aimed to investigate whether using T2MR technology can expedite de-escalation of broad-spectrum antibiotics. This is a prospective observational study conducted in our 35-bed university ICU. Inclusion criteria were critically ill patients aged >18 years with newly diagnosed sepsis and clinical suspicion of ESKAPE bloodstream infection. A sample for T2MR and a blood culture (BC) sample were collected simultaneously from the patients enrolled. The T2MR Bacteria panel test was run according to the manufacturer's guidelines and the BCs were processed according to the hospital's standard procedures. We recorded clinical data and administered antibiotics. Results: 26 patients were included in the study. Mean time to culture positivity was 84 hours, while mean time to T2MR result was 3.5 hours. In 20 patients the results of T2MR were in concordance with the BCs; in the remaining 6 cases, the BCs were negative while the T2MR detected one or more ESKAPE pathogens. There were no false-negative results. De-escalation of at least one drug was applied in 8 patients (30.8%), no escalation in 15 patients (57.7%) and antibiotic escalation in 3 (11.5%). Conclusions: T2MR provides a quicker detection time that could shorten the time to targeted therapy. In our population this corresponded to early (within 6-12h) antibiotic de-escalation in approximately one third of the included patients.

Antibiotic stewardship in ICU. A single experience L Forcelledo 1, E García-Prieto 1, L López-Amor 1, E Salgado 1, J Fernández Dominguez 2, M Alaguero 3, E García-Carús 4 Increasing antibiotic resistance in microorganisms has urged interventions such as antibiotic stewardship programs in the ICU, focused on reducing the inappropriate use of antibiotics by improving antibiotic selection, dosage, administration route and length of therapy, as well as on improving clinical outcomes and reducing antibiotic resistance. Retrospective study in which antibiotic consumption was analysed and measured in days of therapy (DOTs) between 2015 and 2018 in a medical-surgical ICU of a university hospital where a multimodal educational program was established. Specific training in infectious diseases in critically ill patients and periodic clinical and formative sessions were provided for ICU staff, and specific leaders within the ICU staff were designated.

Fig. 1 (abstract P439). Correlation of blood culture positivity rate with blood culture volume by unit type

Results: 4128 patients were admitted to the ICU.
There was a reduction of 20.5% in DOTs (Figure 1) and a reduction in antimicrobial resistance rates (14.36 in 2015 vs 7.4 in 2018 [days of resistant microorganisms/1000 patient-days]) without an impact on ICU global mortality (19.7% in 2015, 19.4% in 2018). The resistant bacteria registered were Acinetobacter baumannii, methicillin-resistant S. aureus, ESBL- and carbapenemase-producing Enterobacteriaceae, multiresistant Pseudomonas aeruginosa and Clostridium difficile. The saving in antimicrobial consumption was €391,500 (a 70% reduction). ICU stay decreased from 8.2 days (2015) to 6.9 (2018), with no variation in mean APACHE II (17.8). The biggest decreases in antibiotic consumption were in colistin (related to the reduction in resistant bacteria, especially Acinetobacter baumannii), linezolid and piperacillin/tazobactam; the latter was even more marked in 2018 due to a shortage of supplies, which meant an increase in meropenem. The application of an antibiotic stewardship program in the ICU succeeded in reducing antibiotic consumption, antibiotic resistance and costs without an impact on clinical outcomes such as mortality or ICU stay.

Clinical outcomes of isavuconazole versus voriconazole for the primary treatment of invasive aspergillosis: subset analysis of Indian data from the SECURE trial P Kundu, S Kamat, A Mane Pfizer Limited, Medical Affairs, Mumbai, India Critical Care 2020, 24(Suppl 1):P442 The SECURE trial was designed to compare the safety and efficacy of isavuconazole (I) versus voriconazole (V) for primary treatment of invasive mould disease caused by Aspergillus and other filamentous fungi. The present analysis is aimed at comparing the Indian subset of patients with the overall trial population and at ascertaining any similarity or difference in the primary efficacy endpoint and safety/tolerability between these two groups. In the SECURE trial, 258 patients in one group received (I) and another 258 patients received (V). The Indian subset had 29 patients. We performed a qualitative analysis, as the sample size of the Indian subset was small. Non-inferiority of (I) to (V) in terms of all-cause mortality from first dose to day 42 was assessed in the overall population. The treatment difference between the (I) and (V) groups in the Indian subset of patients was analyzed, as was the proportion of patients who had to discontinue treatment due to TEAEs. The all-cause mortality in the overall trial population met the noninferiority margin (Table 1); in the Indian subset, it was higher for (I) than (V). There was a lower incidence of ocular, hepatobiliary, and skin and subcutaneous tissue disorders in the (I)-treated patients (see Table 1). In the Indian subset, these adverse events were also less frequent in the (I) group, but statistical inference could not be done due to the small sample size. However, a similar trend of fewer patients discontinuing therapy due to TEAEs in the (I)-treated patients was seen in both the overall population and the Indian subset. The all-cause mortality in the Indian subset was higher in the (I) patients. A trend similar to the overall population regarding safety parameters favoring (I) was seen in the Indian patients. Considering the significantly higher prevalence of IA in India, a suitably powered study design is necessary to draw definitive conclusions on the non-inferior efficacy and better safety and tolerability of (I) over (V) in patients with IA.
Introduction: Ventilator-associated pneumonia (VAP) is one of the most frequent healthcare-associated infections, correlated with increased mortality, extended hospital stay and prolonged mechanical ventilation. Considering the latest outbreak of multiresistant A. baumannii infections in critically ill patients with VAP, there is growing concern regarding the challenges of antibiotic therapy in these patients. Although ceftazidime-avibactam is considered to have limited effect on A. baumannii, it is reported to have synergistic activity in combination with other antibiotics. We performed a retrospective, observational study which included 24 ICU patients diagnosed with VAP (CPIS >6). OXA-23 A. baumannii was isolated from tracheal secretions using a rapid molecular diagnostic platform (Unyvero A50 System). Patients were divided into two groups according to antibiotic therapy: group A, meropenem + colistin, and group B, meropenem + colistin + ceftazidime-avibactam. Statistical analysis was performed using GraphPad 6, applying t-tests and Kaplan-Meier curves, with in-hospital mortality as the primary outcome and days of mechanical ventilation and hospital stay as secondary outcomes. Mean age was 46 years in group A and 52 in group B, and in both groups the mean Charlson comorbidity index was 3 points. Survival was higher in the group treated with ceftazidime-avibactam (67% vs 57%, p = 0.08) (Fig. 1). Length of stay was significantly decreased in group B (26.5 days vs 43 days in group A, p = 0.046). The number of days under mechanical ventilation was also decreased in the ceftazidime-avibactam group (19 vs 22), but this was not statistically significant. In light of the important threat of multiresistant A. baumannii and the lack of therapeutic options, the synergistic activity of ceftazidime-avibactam in combination with other antibiotics may be a promising approach to lower mortality and hospitalization in critically ill patients diagnosed with VAP.

Impact of patient colonization on admission to intensive care on 28 and 90 days mortality G Dabar 1, C Harmouch 2, E Nasser Ayoub 3, Y Habli 4, G Sleilaty 5, J Infections caused by multiresistant bacteria are a major health problem, especially in ICUs, and may be associated with high mortality rates. Colonization precedes infection in most instances; therefore it may be a marker of a poor outcome. We tried to determine the impact of colonization on mortality at 28 and 90 days in a population of patients admitted to one medical and one surgical ICU in the same institution. Medical record review over three years (2016-2018) of all patients admitted to one surgical and one medical ICU at Hotel Dieu de France Hospital staying more than 24h. Colonization with resistant bacteria was defined as MRSA, ESBL, MDR, and VRE. All patients received a nasal and rectal screen on ICU admission; in intubated patients, tracheal aspirate was considered colonization in the absence of clinical respiratory tract infection. Demographics, APACHE, SOFA, immunosuppression, Charlson comorbidity index, length of stay, mechanical ventilation, hospitalization and antibiotic use in the previous 3 months were collected. Mortality at 28 and 90 days was assessed through medical records or phone calls. The Pearson chi-square statistic was calculated for the association of colonization and mortality at 28 and 90 days, and subsequently the odds ratio was estimated.
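A minimal sketch of the chi-square test and odds ratio estimation just described, assuming scipy; the 2x2 counts below are illustrative, not study data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = colonized / not colonized,
# columns = died / survived at day 28.
table = np.array([[18, 42],
                  [25, 215]])

chi2, p, dof, _ = chi2_contingency(table)

# Odds ratio with a 95% CI from the standard error of the log-OR.
(a, b), (c, d) = table
odds_ratio = (a * d) / (b * c)
se = np.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se)
print(f"chi2={chi2:.1f}, p={p:.3f}, OR={odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```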
Introduction: Critically unwell patients have been observed to respond unpredictably to traditional intermittent dosing (ID) schedules of vancomycin, likely due to the complex physiological derangements caused by critical illness. Continuous infusion (CI) of vancomycin has been suggested to overcome such problems by allowing more regular therapeutic drug monitoring and subsequent effective dose titration [1]. This study, conducted at a tertiary intensive care unit, reports our experience following implementation of a continuous vancomycin infusion protocol. Prospective data were collected over two consecutive three-month periods, initially capturing plasma levels for ID (target level 15-20 mg/L) and then reviewing plasma concentration levels under a CI protocol (target level 20-25 mg/L). Patients receiving renal replacement therapy were excluded. A total of 22 intermittent vancomycin prescriptions were administered and dosing levels observed. In the three-month CI period, 26 patients received CI vancomycin and levels were subsequently checked. The CI protocol resulted in increased blood sampling (107 samples in the CI group vs 73 samples in the ID cohort). Two non-serious incidents were reported in the CI cohort relating to the preparation of vancomycin. Both groups had a comparable median time to therapeutic range (48 hours). However, the CI vancomycin group had a greater proportion of first samples outside the desired therapeutic range (70% vs 36%) (Figure 1). As therapy continued, CI vancomycin demonstrated a greater propensity towards consistent therapeutic levels than that observed with ID: 83% of patients on a CI regime achieved the desired target levels compared to 77% in the ID cohort (Fig. 1).

It was positive for single or multiple microbes in 9 (26.5%) and 22 (64.7%) samples respectively. Single or multiple resistance genes were detected in 5 (25%) and 20 (80%) samples respectively. BFPCR was positive only for bacteria in 13 (38.2%), only for viruses in 2 (5.9%) and for both in 16 (47.1%) cases. Influenza A was found in 10 (29.4%) cases. The most common organisms in community- and hospital-acquired pneumonia were Streptococcus pneumoniae (4/12) and A. baumannii (10/22) respectively. Bacterial cultures were concordant with BFPCR in 11/11 (100%) of positive cases. Decisions to change antibiotics could be taken earlier based on BFPCR (p<0.001) than if based solely on cultures, both in culture-positive cases (9.7±14.3 vs 50.03±6.0 hrs) and in culture-negative cases (14.7±14.9 vs 48.0±4.3 hrs), where antibiotics would have remained unchanged. Based on BFPCR, antibiotics were escalated in 17 (50%) patients, and teicoplanin (11/19) was most often stopped. BAL BFPCR results were obtained significantly earlier, identified more organisms and more bacterial resistance than culture reports, and led to more frequent and earlier antibiotic changes.

Severe community-acquired pneumonia (SCAP) is a frequent cause of hospitalization and mortality. Ceftaroline is efficacious for treatment of CAP (PORT risk class III or IV). The most severe patients were excluded from the clinical trials, so the efficacy of ceftaroline in this kind of patient is unknown. Methods: This is a health-record-based retrospective before-after study in a tertiary care hospital. All SCAP patients admitted to the ICU between November 2017 and February 2019 who received ceftaroline were included. The control group included patients meeting the same inclusion criteria but receiving ceftriaxone. Propensity scores were used to adjust for potential baseline differences between groups (one possible implementation is sketched below).
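The abstract does not specify how the propensity scores were applied; as one hedged possibility, here is a minimal inverse-probability-of-treatment-weighting (IPTW) sketch with scikit-learn on synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data for a two-arm comparison: baseline covariates
# (e.g. age, SOFA, intubation status...) and a treatment indicator.
rng = np.random.default_rng(2)
n = 300
X = rng.normal(size=(n, 4))
treated = (X[:, 0] + rng.normal(size=n)) > 0

# Propensity score: modelled probability of receiving treatment
# given baseline covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
weights = np.where(treated, 1 / ps, 1 / (1 - ps))   # IPTW weights

# Weighted outcome means compare groups with balanced baselines.
outcome = rng.normal(size=n)
diff = (np.average(outcome[treated], weights=weights[treated])
        - np.average(outcome[~treated], weights=weights[~treated]))
print(f"IPTW-adjusted difference: {diff:.2f}")
```

Matching on the propensity score, rather than weighting, would be an equally plausible reading of the methods sentence.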
Levofloxacin or azithromycin was administered in both groups. The primary outcome was the change in SOFA score over the first 96 h; secondary outcomes were days of mechanical ventilation, respiratory failure at 96 h, need for rescue antibiotics, length of stay and mortality. Results: There were 28 patients in the ceftaroline group and 43 in the ceftriaxone group. Baseline characteristics were similar except for more intubated patients in the ceftaroline group (Figure 1). There was less respiratory failure at 96 h in patients treated with ceftaroline (-73.3% vs. -51.6%; p=0.015), but no differences in other organ failures, mortality, days of mechanical ventilation or LOS. There was more need for rescue antibiotics in the ceftriaxone group (7.1% vs 46.5%; p=0.001). We found more Streptococcus pneumoniae isolation in the ceftaroline group (18 (64.2%) vs 11 (25.5%); p=0.001) and more empiric use of oseltamivir (16 (57.1%) vs 11 (25.5%); p=0.012), but no more influenza infections (11 (39.2%) vs 8 (18.6%); p=0.101). S. aureus was detected in 1 patient in the ceftaroline group and in 5 in the ceftriaxone group.

Introduction: Acute respiratory failure (ARF) due to pulmonary infection is a common cause of intensive care unit (ICU) admission. Immigration patterns and iatrogenic immunosuppression have made tuberculosis (TB) a common disease in Western Europe. Severe TB requiring ICU care is rare; nevertheless, mortality associated with active TB and ARF is high [1]. Methods: Adult patients with TB admitted to the ICU from 2014-2018 were identified retrospectively. Diagnosis was based on positive cultures of sputum, bronchial aspirates or bronchoalveolar lavage fluid. Demographic characteristics, reasons for admission, HIV status, anti-TB treatment and mortality were recorded. Results: A total of 25 patients with TB were admitted to the ICU. Mean APACHE II score was 20.2±6.9. Sixteen were male. Mean age was 49.1±14.7 years. Eight (32%) were HIV-positive, 3 (12%) had type 2 diabetes mellitus and 3 (12%) had chronic liver disease. Six (24%) had other causes of immunosuppression. The main causes of ICU admission were ARF due to non-Mycobacterium tuberculosis pathogens in 64%, acute liver failure in 12%, and septic shock of non-respiratory cause in 8%. Overall, 52% were on anti-TB treatment at the time of admission. TB involved the lung parenchyma in all patients. Pleural involvement was present in 12% and lymph node involvement in 20%. Extrapulmonary sites were present in 28%: urogenital, gastrointestinal, bone marrow. Pathogens identified in superinfections were: gram-positive cocci 16%, gram-negative bacilli 20%, fungi 16%, MDR pathogens 4%. One HIV-positive patient suffered ARF due to Pneumocystis jirovecii. Overall, 64% died during the ICU stay. Conclusions: Despite its often latent evolution, mortality of TB patients admitted to the ICU is extremely high. ARF due to superinfection seems to be the main cause of ICU admission and mortality. A better preventive approach in these patients may improve their outcome.

Introduction: Human African Trypanosomiasis (HAT) is rarely encountered by critical care clinicians but is an important differential for fever in the returning tropical traveler. Late disease is characterized by seizures, fever and multi-organ failure [1, 2]. We present an anonymized case presenting from an endemic area in Zambia referred for tertiary critical care management. The patient was too obtunded to give informed consent and his relatives could not be contacted despite extensive efforts.
Methods: A middle-aged man with no past medical history from rural Zambia presented to a local clinical officer post with fever and arthralgia. He was treated twice with anti-malarial medication without resolution of symptoms. Two months later he was admitted febrile and obtunded to a local hospital with worsening confusion. He was transferred 8 hours by ambulance to our facility in Lusaka, the only public tertiary critical care unit in Zambia. Results: GCS on arrival was E3M4V2 without localizing neurology. Microbiology investigations were negative, including for toxoplasma, cryptococcus, HIV and malaria. The patient suffered a generalized seizure followed by a sustained GCS of 3 and was admitted to the ICU for invasive ventilation and seizure control. Peripheral blood smears demonstrated trypanosomes consistent with HAT secondary to Trypanosoma brucei rhodesiense. He was commenced on melarsoprol but rapidly deteriorated, with signs of melarsoprol-induced arsenic encephalopathy and subsequent tonsillar herniation. His death was confirmed by neurological criteria. Conclusions: ICU management of fulminant HAT involves supportive neurocritical care plus melarsoprol, a toxic arsenic compound with common side effects of hepatotoxicity and dysrhythmia. Arsenic encephalopathy occurs in 10% of late HAT, with a fatality rate of 70% [1]. Early diagnosis is associated with a 95% survival rate in developed-world travelers repatriated from endemic areas [2].

Lithium chloride to prevent endothelial damage by serum from septic shock patients (in vitro study). A Kuzovlev 1

Introduction: The aim of the study was to investigate the effectiveness of lithium chloride (LiCl) as an agent that prevents damage to an endothelial cell monolayer exposed to serum from multiple-trauma patients with septic shock. Methods: Serum was obtained from 5 patients with septic shock (Sepsis-3) and 5 healthy donors. Monolayers of Ea.hy926 endothelial cells were incubated for 3 h at 37°C with healthy donor serum or with septic patient serum, without LiCl or with LiCl at concentrations of 0.01 mmol, 0.1 mmol, 1 mmol or 10 mmol. LiCl was added 1 hour before the change of serum. After incubation, cells were washed, fixed with 2% paraformaldehyde solution and permeabilized with 1% Triton X-100 solution. Fixed cells were stained with primary antibodies to VE-cadherin and then incubated with secondary antibodies conjugated with Oregon Green 488 fluorescent dye, as well as with red phalloidin and Hoechst 33342 dye. Images were acquired by fluorescence microscopy and processed with ImageJ 1.44p and MetaVue 4.6. Western blotting was used to detect VE-cadherin, claudin and GSK-3beta. Statistics included the Mann-Whitney test and the chi-square test. Results: Incubation of an endothelial monolayer with 5% serum from septic shock patients led to loss of VE-cadherin contacts and a decrease in claudin. Preincubation with 0.01 mmol LiCl did not prevent dismantling of claudin, actin and VE-cadherin; 0.1 mmol LiCl partially prevented it (p>0.05), while higher concentrations (1 mmol, 10 mmol) almost completely protected the endothelial monolayer from destruction of intercellular contacts (p<0.05). Serum had almost no effect on the phospho-GSK-3β level after 5 min, 15 min, 30 min and 1 h, but caused a significant (60%) decrease in its level after 2 and 4 h. LiCl (1 mmol) caused a significant increase in phospho-GSK-3β as early as 15 min and up to 4 h after exposure. Conclusions: LiCl prevents septic damage to the endothelial cell monolayer in vitro in a GSK-3beta-mediated way.
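For illustration, the two-group Mann-Whitney comparison used throughout the experiments above can be run as follows; the fluorescence values are invented placeholders.

```python
# Two-group Mann-Whitney comparison; the VE-cadherin junction-signal values
# (arbitrary fluorescence units per image field) are invented placeholders.
from scipy.stats import mannwhitneyu

septic_serum      = [12.1, 9.8, 11.4, 10.2, 8.9]       # no LiCl
septic_serum_licl = [18.6, 17.2, 19.9, 16.8, 18.1]     # 1 mmol LiCl pretreatment

stat, p = mannwhitneyu(septic_serum, septic_serum_licl, alternative="two-sided")
print(f"U={stat}, p={p:.4f}")   # p < 0.05 would support protection by LiCl
```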
Introduction: The autonomic nervous system (ANS) controls both heart rate and vascular tone, which are known to be impaired during septic shock (SS). Acute inflammation is presumed to increase arterial stiffness of large arteries in experimental studies [1]. The objectives of this work were to verify whether standard SS resuscitation modulates mechanical vascular properties and whether alterations in these vascular properties and ANS activity are correlated. Methods: A protocol of fecal peritonitis septic shock and standard resuscitation (fluids and noradrenaline) was applied to 6 pigs. The arterial blood pressure waveform was recorded in the central aorta and in the femoral and radial arteries. The characteristic arterial time constant tau was computed at the three arterial sites, based on the two-element Windkessel model [2]. The total arterial compliance (AC) and the total peripheral resistance (TPR) were also estimated. Baroreflex sensitivity (BRS), low-frequency (LF, 0.04-0.15 Hz) spectral power of diastolic blood pressure, and indices of heart rate variability (HRV) were computed to assess ANS functionality. Results: Septic shock induced severe vascular disarray, decoupling the usual pressure wave propagation from central to peripheral sites, as shown by the inversion of pulse pressure (PP) amplification, with higher PP in the central aorta than in the peripheral arteries during shock. The time constant tau, together with AC and TPR, was independently decreased. Decreases in BRS, LF power and HRV indicated ANS dysfunction. After the administration of fluids and noradrenaline, both vascular and autonomic dysfunction persisted, and the two were found to be significantly correlated. Conclusions: Measures of mechanical vascular function and ANS activity could represent a useful end-point to guide further clinical investigations and refine our understanding of SS mechanisms, especially under medical treatment.

Introduction: Lipopolysaccharide (LPS) is a component of gram-negative bacteria known for its activation of the host immune system. The phospholipid transfer protein (PLTP) has previously been shown to promote the binding of LPS to lipoproteins, to limit inflammation and to lower mortality following injection of LPS or bacterial infection. The aim of the present study was to investigate the role of PLTP and lipoproteins in the detoxification of LPS from the peritoneal cavity. Methods: LPS was injected intraperitoneally (IP) (1 mg/kg) into wild-type (WT) and PLTP knockout (PLTP-KO) mice (n=9 per group). The mass concentration and activity of LPS were quantified by LC-MS/MS analysis of 3-hydroxymyristate and LAL bioassay, respectively. Lipoprotein fractions in plasma were separated by ultracentrifugation (n=10 vs n=12). Results: Following intraperitoneal injection, clearance of intra-abdominal LPS was faster and plasma neutralization was more efficient in WT than in PLTP-KO mice (Figure 1). Indeed, LPS found in the plasma of WT mice was proportionally less active, supporting a higher capacity of WT mice to neutralize LPS (Figure 1B). Quantitative measurement of LPS in portal blood 15 minutes after IP injection revealed that plasma LPS associates rapidly with the lipoprotein fraction (HDL plus LDL), and in higher proportion than in PLTP-KO mice (66 [62-72]% vs 50 [41-54]%, respectively; p<0.01). In line with previous studies, these observations indicate that LPS readily associates with lipoproteins in a PLTP-mediated neutralizing process.
Finally, even with a heavy LPS load (25 mg/kg), the bulk of LPS was still found in the lipoprotein fraction (80 [80-90]%), suggesting that lipoproteins plus PLTP in WT mice have a high capacity to detoxify intraperitoneal LPS. Conclusions: In a model of peritonitis, lipoproteins and PLTP were found to be key players in the peritoneal clearance and neutralization of LPS. This emerges as a key pathway for the resolution of the inflammatory response in peritonitis.

Introduction: Autotaxin (ATX, Enpp2) is a secreted enzyme present in biological fluids that catalyses the production of lysophosphatidic acid (LPA). LPA is a bioactive phospholipid evoking various cellular responses in most cell types. Upregulated ATX levels have been reported in various chronic inflammatory diseases. Given the established role of LPA in the inflammatory response, we investigated a possible role for the ATX/LPA axis in LPS-induced endotoxemia. Methods: LPS was injected intraperitoneally (20 mg/kg) in mice producing 50% ATX levels (ATX df/+, heterozygous null mutant mice), in mice producing 20-30% reduced ATX levels upon inducible inactivation (R26CreER T2/Enpp2 n/n mice) and in mice expressing 150-200% increased ATX levels (Enpp2-Tg mice). Kaplan-Meier survival analysis was performed. ATX activity was measured using the TOOS activity assay. Results: ATX df/+ mice, which produce almost 50% reduced serum ATX levels, showed increased survival compared to their littermate controls. For the inducible inactivation of ATX, Enpp2 n/n targeted mice were crossed with R26CreER T2 mice, and tamoxifen induction enabled temporal control of floxed gene expression. R26CreER T2/Enpp2 n/n mice were more protected against LPS-induced endotoxemia than control mice. Enpp2-Tg mice overexpressing autotaxin, with a 2-fold increase in plasma levels, did not display improved survival rates compared to the control group. Conclusions: ATX participates in systemic inflammation, as reduced circulating ATX levels decrease LPS-induced lethality in mice. An excess of circulating ATX does not exacerbate the systemic inflammatory response to LPS.

Introduction: Pneumonia (Pn) is a prevalent and severe infectious lung disease. Host genetics plays an essential role in the pathogenesis of infectious diseases including Pn [1]. The aim of the study was to analyze the variability of genes associated with neutrophil activation in pneumonia. Methods: To identify differentially expressed genes (DEGs) in community-acquired (CAP) and hospital-acquired pneumonia (HAP), the dataset «Genome-wide blood transcriptional profiling in critically ill patients - MARS consortium» (GSE65682) from Gene Expression Omnibus was analyzed (logFC≥2.0, FDR-corrected P-value<0.05; this selection step is sketched below). DEGs associated with neutrophil activation were selected according to gene ontology GO:0042119 («neutrophil activation»). Using the GTEx Portal and the Blood eQTL browser, we searched for eSNPs (expression single nucleotide polymorphisms) in whole blood for neutrophil activation genes differentially expressed in CAP/HAP. These eSNPs were further analyzed for their association with Pn via the Global Biobank Engine (GBE). Results: A total of 46 DEGs from GSE65682 corresponded to GO:0042119 genes (43 up- and 3 down-regulated), of which 39 were common to CAP and HAP. Functional enrichment of the 46 DEGs based on DisGeNET detected the top 5 diseases associated with these genes (FDR-corrected P-value<0.05): chronic myeloid leukemia; sepsis; asthma; lung diseases; allergic asthma.
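A minimal sketch of the stated DEG selection criteria (|logFC| ≥ 2.0 with Benjamini-Hochberg FDR-corrected P < 0.05); the small gene table is hypothetical except for the two gene names mentioned in this abstract.

```python
# Sketch of DEG selection: |logFC| >= 2.0 with Benjamini-Hochberg FDR < 0.05.
# The tiny table is hypothetical; only TNFAIP6 and IL18RAP are gene names taken
# from this abstract, GENE_X and GENE_Y are invented.
import pandas as pd
from statsmodels.stats.multitest import multipletests

degs = pd.DataFrame({
    "gene":  ["TNFAIP6", "IL18RAP", "GENE_X", "GENE_Y"],
    "logFC": [3.1, 2.4, 0.7, -2.2],
    "pval":  [1e-6, 4e-5, 0.03, 2e-4],
})

# Correct the raw P-values for multiple testing (Benjamini-Hochberg FDR)
degs["fdr"] = multipletests(degs["pval"], method="fdr_bh")[1]

# Keep genes passing both the fold-change and the FDR thresholds
selected = degs[(degs["logFC"].abs() >= 2.0) & (degs["fdr"] < 0.05)]
print(selected)
```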
For these 46 genes, 1366 eSNPs common to the GTEx Portal and the Blood eQTL browser were identified. More than half of all variants were located on chromosome 2 and influenced the expression of the TNFAIP6 and IL18RAP genes. Among all eSNPs we identified variants associated with Pn in the GBE (Table 1). Conclusions: We identified genes related to neutrophil activation whose genetic variability was associated with pneumonia.

Methods: Sepsis was induced in wild-type C57Bl6 mice (n=41) and Cse knockout mice (n=41) by i.p. injection of 10^8 cfu/mouse of MDR P. aeruginosa. Similar experiments were repeated after cyclophosphamide-induced neutropenia. Survival was recorded for 7 days. Mice were sacrificed for determination of bacterial load and myeloperoxidase (MPO) activity as a surrogate marker of myeloid cell recruitment. Cytokines were measured in serum with the Legendplex inflammatory panel. Total leukocytes from mouse spleens, with or without pretreatment with the H2S donor GYY4137, were incubated with 1 x 10^4 cfu/mL MDR P. aeruginosa, and bacterial clearance was recorded. Results: We observed a significant decrease in survival of Cse-/- mice compared to Cse+/+ mice (12% vs. 47%; p: 0.025). This survival advantage was eliminated in neutropenic mice (17% for both groups, p: 0.873). Cse-/- mice had increased pathogen load in the liver (6.57 ± 0.13 vs 5.26 ± 0.50, p: 0.029) and lung (6.70 ± 0.17 vs 5.29 ± 0.55, p: 0.035). MPO activity was lower in Cse-/- mice in the liver (634 ± 71 vs 1029 ± 179, p: 0.048) and lung (7627 ± 585 vs 11121 ± 1468, p: 0.34). Cse+/+ mice had increased serum levels of IL-23 (121.13 ± 33.68 vs 31.41 ± 7.02 in Cse-/-, p: 0.001), MCP-1 (4769.91 ± 908.83 vs 1940.37 ± 1062.65, p: 0.026) and GM-CSF (22.91 ± 4.66 vs 8.11 ± 1.92, p: 0.004). Phagocytic activity of leukocytes from Cse-/- mice was reduced compared to Cse+/+ mice; this deficit was eliminated after GYY4137 pretreatment (Fig. 1). Conclusions: Deficiency of host-derived H2S leads to increased susceptibility to MDR P. aeruginosa infection due to inefficient neutrophil chemotaxis and neutrophil-mediated phagocytosis. Acknowledgement: Funded by the ITN Horizon 2020 Marie-Curie European Sepsis Academy.

Introduction: Neuroinflammation often develops in sepsis along with increased permeability of the blood-brain barrier (BBB), which leads to septic encephalopathy [1]. The barrier is formed by tight junction structures between the cerebral endothelial cells [2]. We investigated the expression of tight junction proteins related to endothelial permeability in brain autopsy specimens from critically ill patients who died with sepsis, and analyzed the relationship of BBB damage with measures of systemic inflammation and systemic organ dysfunction. Methods: The case series included all adult patients who died with sepsis in the years 2007-2015 with brain specimens taken at autopsy available. Specimens were categorized according to anatomical location (cerebrum, hippocampus, cerebellum). Immunohistochemical staining was performed for occludin, ZO-1 and claudin. Patients were categorized as having BBB damage if there was no expression of occludin in the endothelium of cerebral microvessels. Results: 38% (18/47) developed multiple organ failure before death, and 74.5% (35/47) had septic shock. The deceased with BBB damage had higher maximum SOFA scores (16 vs. 14, p=0.04) and more often had procalcitonin levels above 10 (56% vs. 28%, p=0.045). BBB damage in the cerebellum was more common in cases with C-reactive protein above 100 mg/L compared with CRP below 100 (69% vs. 31%, p=0.025).
Absence of ZO-1 expression in cerebral meningeal samples was associated with BBB damage (17% vs. 0%, p=0.046). Positive blood cultures (n=22) were associated with absence of ZO-1 expression in cerebellar glial cells (92% vs. 44%, p=0.018). Conclusions: In fatal sepsis, a damaged BBB, defined as loss of cerebral endothelial expression of occludin (Figure 1), is related to severe organ dysfunction and systemic inflammation. Loss of ZO-1 in endothelial cells is associated with BBB damage, and sepsis contributes to ZO-1 loss in cerebellar glial cells.

Introduction: Oxylipins are oxidative breakdown products of cell membrane fatty acids. Animal models have demonstrated that various vasoactive oxylipin pathways may be implicated in septic shock pathophysiology, but these have been poorly studied in humans. Methods: Oxylipin profiling was performed on serum samples collected on enrolment to the VANISH (Vasopressin vs. Norepinephrine as Initial Therapy in Septic Shock) trial. Samples were analysed with liquid chromatography-mass spectrometry. Patients were followed up until 28 days. Results: Samples were collected from 154 of 409 (37.7%) patients on inclusion in the trial, and 39 (25.3%) had died by 28 days. Non-survivors had higher levels of a number of oxylipins, including 14,15-dihydroxyeicosatrienoic acid (DHET) (p<0.01), 11,12-DHET (p=0.03), 15(S)-hydroxyeicosatetraenoic acid (p=0.02) and 14-hydroxyoctadecapentaenoic acid (p=0.04), but lower levels of the precursor eicosapentaenoic acid (p=0.012). When corrected for multiple comparisons with the Benjamini-Hochberg procedure, only 14,15-DHET remained significant (p=0.025). Although there was a difference in median 14,15-DHET levels between survivors and non-survivors, many values were below the level of detection (n=84/154 (54.5%)); we therefore also analysed 14,15-DHET as a binary variable (Figure 1). Patients with detectable 14,15-DHET were more likely to die (HR 2.4 [95% CI 1.2-4.6], p<0.01) and had higher median lactate (p=0.01) and total SOFA score (p<0.01) than patients in whom baseline 14,15-DHET was undetectable. Conclusions: Our study suggests that the oxylipin 14,15-DHET may be associated with septic shock severity and 28-day mortality. These results are consistent with the known vasodilatory actions of this class of oxylipin. More work is needed to confirm its exact role in septic shock and whether this pathway is amenable to therapeutic intervention.

Introduction: Activation of neutrophils is a mandatory stage and a sensitive marker of systemic inflammatory conditions that can lead to the development of multiorgan failure. The aim of the study was to investigate the anti-inflammatory effects of lithium chloride on human neutrophils in vitro. Methods: The study was carried out on neutrophils isolated from the blood of 5 healthy donors. Half of the neutrophils were activated with 100 μM fMLP and half with 100 ng/ml lipopolysaccharide (LPS); their activity was then evaluated with fluorescent antibodies to the degranulation markers CD11b and CD66b. Intact and activated neutrophils were treated with a lithium chloride solution (9 mmol). Immunoblotting was used to assess GSK-3b activity in neutrophils. The Mann-Whitney criterion with p<0.05 was used for statistics. Results: Lithium chloride 9 mmol decreased the expression of CD11b on intact neutrophils by 16% (p=0.07) and of CD66b by 15% (p=0.07). fMLP increased CD11b expression on neutrophils 2.6-fold (p=0.0007) and CD66b 2.5-fold (p=0.0022).
Addition of lithium chloride solution to fMLP-activated neutrophils reduced the expression of CD11b (p=0.0317) and CD66b (p=0.0079). LPS increased CD11b and CD66b expression 2.1-fold (p=0.0007 and p=0.0022, respectively); addition of lithium chloride reduced the expression of CD11b (p=0.0317) and CD66b (p=0.0079) on neutrophils. fMLP led to a 47% dephosphorylation of GSK-3b (p<0.05), while lithium chloride increased its phosphorylation by 387% (p<0.05). Adding lithium chloride to fMLP-activated neutrophils restored the level of GSK-3b phosphorylation by 277% compared to controls (p<0.05). Conclusions: Lithium chloride modulates the inflammatory activation of neutrophils by bacterial components through the phosphorylation of GSK-3b in neutrophils.

Human host immune responses to lipopolysaccharide: a comparison study between the in vivo endotoxemia model and ex vivo lipopolysaccharide stimulation using an immune profiling panel. DM Tawfik

Introduction: Sepsis, a leading cause of mortality among critically ill patients in the ICU, was recently recognized by the WHO as a global health burden. Patients with sepsis exhibit an early hyper-inflammatory immune response that can lead to organ failure and death. In our study, we assessed immune modulation in the human in vivo endotoxemia model and compared it to ex vivo lipopolysaccharide (LPS) stimulation using 38 transcriptomic markers. Methods: Eight healthy volunteers were challenged with intravenous LPS in vivo. In parallel, blood from another 8 volunteers was challenged with LPS ex vivo. Blood was collected before and 4 hours after the LPS challenge and tested with the Immune Profiling Panel (IPP) prototype on the FilmArray® system. Results: The IPP showed that markers of innate immunity dominated the response to LPS in vivo, mainly markers related to monocytes and neutrophils. Comparing the two models, in vivo and ex vivo, revealed that most of the markers were modulated in a similar pattern (68%). Some cytokine markers such as TNF, IFN-γ and IL-1β were under-expressed ex vivo compared to in vivo. T-cell markers were either unchanged or up-modulated ex vivo, compared to down-modulation in vivo. Interestingly, markers related to neutrophils were expressed in opposite directions, which might be due to cell recruitment and feedback loops in vivo. Conclusions: The majority of IPP markers showed similar patterns of expression post-LPS challenge in both models, except for several markers related to neutrophils and T cells. The IPP tool was able to capture the early immune response in the human in vivo endotoxemia model, a translational model mimicking the immune host response in septic patients.

Introduction: Serum levels of the tyrosine kinase receptor Mer and its ligand Gas6 predict mortality in septic patients in the intensive care unit. However, whether their early measurement at emergency department (ED) presentation also predicts mortality and organ failure still needs to be clarified. Methods: In this multicentre observational study, septic patients admitted to 5 Italian EDs were included [1]. At ED presentation, blood samples were taken for routine biochemical analyses and for serum Mer and Gas6 measurement. Urinalyses, blood gas analyses and chest X-ray were routinely performed. Mortality at 7 and 30 days, as well as the presence of organ damage such as acute kidney injury (AKI), thrombocytopenia, PT-INR derangement and sepsis-induced coagulopathy (SIC), was evaluated according to baseline levels of Mer and Gas6.
In conclusion, neither Mer nor Gas6 was an early predictor of mortality in septic patients at ED presentation. However, Mer independently predicted the development of SIC, thrombocytopenia and PT-INR derangement in this population.

Glycocalyx shedding correlates with positive fluid balance and respiratory failure in patients with septic shock. N Takeyama, Y Kajita, T Terajima, H Mori, T Irahara, M Tsuda, H Kano. Aichi Medical University, Department of Emergency and Critical Care Medicine, Aichi, Japan. Critical Care 2020, 24(Suppl 1):P463

Introduction: Endothelial hyperpermeability is thought to play a major role in septic shock-related organ failure. The aim of this study was to clarify the relationship between glycocalyx shedding and respiratory failure, SOFA score, plasma angiopoietin (Ang)-2 level and patient survival. Methods: Plasma samples were collected from 30 septic shock patients from admission to ICU discharge and from 10 healthy volunteers. Plasma syndecan (Syn)-1 and Ang-2 were measured, and clinical data were also collected. Septic shock patients were classified into 3 groups according to the time course of Syn-1 levels: excess Syn-1 (>400 ng/ml) during days 0-3 that remained high during days 4-7 (Group I); excess Syn-1 during days 0-3 that decreased during days 4-7 (Group II); and a moderate increase (<400 ng/ml) during days 0-7 (Group III). Results: Plasma Syn-1 levels were positively associated with increased Ang-2 levels (r2=0.41, p=0.005), suggesting that Ang-2 is involved in endothelial hyperpermeability. Fluid balance and ventilator-free days (VFD) differed significantly between Group I and Group III. SOFA score, APACHE II and patient outcome did not differ between Groups I, II and III. Conclusions: The positive correlation between glycocalyx shedding and fluid balance indicates that plasma Syn-1 may be a valuable marker of endothelial hyperpermeability. The negative correlation between glycocalyx shedding and VFD indicates that plasma Syn-1 may be a valuable marker of respiratory failure. The value of plasma Syn-1 for prognosis and organ failure excluding ARDS in patients with septic shock requires further investigation.

Serial procalcitonin measurements in the intensive care unit at Hiroshima University Hospital. K Hosokawa, S Yamaga, M Fujino, K Ota, N Shime. Hiroshima University Hospital, Department of Emergency and Critical Care Medicine, Hiroshima, Japan. Critical Care 2020, 24(Suppl 1):P464

Introduction: Serum procalcitonin (PCT) is a promising biomarker for differentiating bacterial infections from other inflammatory states. Moreover, including serial PCT measurements in the management of acute respiratory infection reduces the duration of antibiotic therapy without increasing mortality. However, limited real-world information is available regarding the use of PCT in intensive care units (ICUs). Methods: We extracted and analysed data from January 1 to December 31, 2018 from all orders and results of PCT measurements in the ICU (26 beds) at Hiroshima University Hospital. Results: A total of 1,252 PCT measurements from 409 ICU patients were included. In 170 patients, PCT was tested ≥3 times during a single ICU stay. Serial PCT measurements showed a fade-out pattern (76 [45%] patients), a second-day-peaked decrease pattern (35 [21%] patients) and a series-of-negatives pattern (30 [18%] patients); a toy classification of these patterns is sketched below.
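A toy rule-based classifier for the three serial-PCT patterns enumerated above; the 0.5 ng/mL positivity threshold and the shape rules are assumptions made for illustration, as the abstract does not state its exact criteria.

```python
# Toy rule-based classifier for the three serial-PCT patterns named above.
# The positivity threshold (0.5 ng/mL) and the shape rules are assumptions.
def classify_pct_pattern(values, positive=0.5):
    """values: list of serial PCT measurements (ng/mL), ordered by day."""
    if all(v < positive for v in values):
        return "series of negatives"
    if values.index(max(values)) == 1 and values[-1] < values[1]:
        return "second day-peaked decrease"
    if values[0] == max(values) and values[-1] < values[0]:
        return "fade-out"
    return "other"

print(classify_pct_pattern([8.2, 4.1, 1.9, 0.8]))   # fade-out
print(classify_pct_pattern([3.0, 9.6, 5.2, 2.0]))   # second day-peaked decrease
print(classify_pct_pattern([0.2, 0.3, 0.1]))        # series of negatives
```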
Compared to patients who demonstrated the fade-out pattern, those who demonstrated the second-day-peaked decrease pattern had higher mortality rates (3% vs. 20%, p<0.01). Conclusions: Approximately one-third of ICU patients with decreasing serial PCT values demonstrated the second-day-peaked decrease pattern. Since this group of patients had poorer survival, further studies are needed to clarify the association between a late rise in PCT levels and delayed therapeutic intervention.

Methods: The research was performed on 200 full-term newborns with no clinical signs of bacterial infection. On days 1, 5 and 20, the plasma concentrations of IL-1ß, IL-6, IL-8, TNF-α, G-CSF, sFas, FGF and NO were determined by capture ELISA; CD3CD19, CD3CD4, CD3CD8, CD69, CD71, CD95, HLA-DR, CD34, CD14, CD3CD56 and lymphocytes in apoptosis were determined by immunophenotype analysis. By applying statistical cluster population analysis to the immunological criteria under study, we evaluated the feasibility of sepsis diagnosis at admission to the intensive therapy unit. The diagnostic rule for sepsis was formulated by applying the "decision tree" approach in the "R" statistical environment. Results: The cluster analysis confirmed the presence of two clusters (presence or absence of sepsis; these two components explain 60.81% of the point variability). The diagnostic rule for the early diagnosis of sepsis is as follows: disease develops provided that, during the first 48 hours, CD95≥16.8% and NO≤9.6 mkmol/l; or CD95≤16.8%, CD34≤0.2% and CD69≥4.12%; or CD95≤16.8%, CD34≤0.2%, CD69≤4.12% and AnnexinV-FITC+PI- lymphocytes ≥12.3%. Forty-five newborns showed confirmed sepsis development. The accuracy of this diagnostic rule was 95.41%; sensitivity 97.06%; specificity 94.67%; false-positive rate 5.33%; false-negative rate 2.94%; positive predictive value 89.19%; negative predictive value 98.61%. Conclusions: The combined determination of CD95, CD69, AnnexinV-FITC+PI-, CD34 and the plasma concentration of NO enables the preclinical diagnosis of sepsis development.

Efficacy of pancreatic stone protein in diagnosis of infection in adults: a systematic review and meta-analysis of raw patient data. J Prazak 1, P Egimann 2, I Irincheva 3, MJ Llewelyn 4, D Stolz 5, LG De Guadiana-Romualdo 6, R Graf 7, T Reding 7, HJ Klein 8, YA Que 1

Fig. 1 (abstract P469). Impact of 24 h lactate and bio-ADM values in patients with elevated lactate levels at admission. The green curve in the left KM plot illustrates data from 75 patients with 5 events, the red curve 70 patients with 18 events. The green curve in the right KM plot illustrates data from 28 patients with 4 events, the red curve 96 patients with 48 events. Of note, the difference in numbers between admission (n=328) and 24 h (n=269) is related to initial mortality.

Introduction: Adrenomedullin (AM) is a peptide synthesized in vascular endothelial cells and cleared by the lungs. The use of AM as an inflammatory biomarker and its predictive value have been studied in critically ill patients, but not yet in veno-venous extracorporeal membrane oxygenation (ECMO).
The purpose of this study was to describe plasma levels of AM in patients supported with ECMO for acute respiratory failure. Methods: AM (normal values <0.55 nmol/L) was measured at 5 time points: immediately before (T0), 24 h after (T1) and 72 h after (T2) ECMO initiation, and immediately before (T3) and 72 h after (T4) ECMO removal, in consecutive patients with severe respiratory failure supported with ECMO enrolled in the GATRA study (NCT03208270) at Fondazione IRCCS Ca' Granda - Policlinico of Milan. Data are reported as median (25th-75th percentile). Statistical analysis was performed using logistic and random-effects regression models (to account for repeated measurements within individuals). Results: A total of 131 measurements were taken in 32 consecutive patients. AM (nmol/L) decreased along the course of ECMO: T0=2.0 (1.5-6.4), T1=2.0 (1.5-6.4), T2=1.6 (1.1-3.1), T3=1.3 (0.8-2.0), T4=0.9 (0.6-2.1) (mean difference -0.65, 95% CI -0.96 to -0.35). AM was lower in patients with viral compared to bacterial ARDS (mean difference -2.7, 95% CI -5.2 to -0.2) (Figure 1). AM was higher in more severe patients (SOFA≥10, n=14) than in less severe patients (SOFA<10, n=18): 5.7±4.8 vs 1.4±0.8 nmol/L, p<0.001. Basal values of AM did not predict mortality at 28 days (OR=0.8, 95% CI 0.5-1.2) after conditioning on SOFA score and respiratory failure etiology. Conclusions: AM plasma values seem to be higher in more severe patients and in patients with bacterial ARDS. AM decreased along the ECMO course but did not predict mortality in our group of patients. Fig. 1 (abstract P471). Plasma adrenomedullin during ECMO.

Introduction: Heparin-binding protein (HBP) is released from activated neutrophils upon stimulation of β2 integrins. This pro-inflammatory effect generates the hypothesis that it could serve as a sepsis biomarker for patients admitted to the emergency department (ED). Methods: The PROMPT study (ClinicalTrials.gov NCT03295825) took place at the EDs of six Greek hospitals. Participants were admitted with suspected acute infection and at least one vital sign change. HBP was measured in plasma by enzyme immunosorbent assay. Sepsis was diagnosed by the Sepsis-3 criteria. The primary study endpoint was the sensitivity for the diagnosis of sepsis; outcome prediction was the secondary endpoint. Results: A total of 371 patients were enrolled; 166 had sepsis. The most common infections among patients without and with sepsis were upper respiratory tract infections in 30.2% and 1.2%, community-acquired pneumonia in 6.8% and 28.3%, and acute pyelonephritis in 9.3% and 28.3%. Median HBP was 24.0 and 32.7 ng/ml, respectively (p: 0.027). Analysis of the area under the curve (AUC) showed that the best discriminatory cut-off for sepsis was 19.8 ng/ml. The comparative diagnostic performance of HBP versus the qSOFA score is shown in Figure 1. The odds ratio for sepsis with HBP above 19.8 ng/ml was 2.07 (p: 0.001). At the same cut-off point, the sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) for the prediction of early death within 72 hours were 100%, 35.7%, 4.1% and 100%, respectively. Conclusions: HBP is more sensitive but less specific than qSOFA for the diagnosis of sepsis in the ED. Its rule-out prediction of early death appears to be its greatest merit.
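A sketch of how a best discriminatory cut-off can be derived from the ROC curve via Youden's J (sensitivity + specificity - 1), as in the HBP analysis above; the biomarker values below are simulated, not PROMPT data.

```python
# Deriving a best discriminatory cut-off from the ROC curve via Youden's J.
# The HBP values are simulated for illustration, not trial data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
sepsis = np.r_[np.ones(100), np.zeros(100)]            # 1 = sepsis, 0 = no sepsis
hbp = np.r_[rng.lognormal(3.5, 0.6, 100),              # simulated HBP, ng/mL
            rng.lognormal(3.0, 0.6, 100)]

fpr, tpr, thresholds = roc_curve(sepsis, hbp)
youden_j = tpr - fpr                                   # J for every threshold
best_cutoff = thresholds[np.argmax(youden_j)]
print(f"AUC={roc_auc_score(sepsis, hbp):.2f}, best cut-off ~{best_cutoff:.1f} ng/mL")
```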
Chronobiological and recurrence quantification analysis of temperature rhythmicity in critically ill patients

Introduction: The rhythmicity and complexity of several circadian biomarkers, such as melatonin, cortisol and temperature, have been found to be modified by critical illness. We examined potential alterations of core body temperature (CBT) fluctuations and complexity in three groups (N=21): patients with septic shock upon ICU admission (Group A, N=10), patients who developed septic shock during ICU hospitalization (Group B, N=6) and controls (Group C, N=5). Methods: The hourly average CBT was computed for 24 h upon ICU admission and discharge in Groups A and C, as well as during septic shock onset in Group B. Cosinor analysis of the CBT curves was performed, yielding estimates of the mesor (mean value), amplitude (difference between peak and mean values) and acrophase (phase shift of maximum values in hours). Complexity of the CBT signals was evaluated with recurrence quantification analysis (RQA). Results: No significant alterations in any circadian feature within groups were found, except for amplitude. Controls exhibited increased entry CBT amplitude (0.45 ± 0.19) compared to Groups A (0.28 ± 0.18, p<0.05) and B (0.32 ± 0.13, p<0.05). Higher entry CBT amplitude in Groups B and C was related to lower SAPS II (r = -0.72 and -0.84, p<0.003) and APACHE II scores (r = -0.70 and -0.6, p<0.003) respectively, to reduced ICU and hospital stay in Group B (r = -0.62 and -0.64, p<0.003) and to entry SOFA score in Group C (r = -0.82, p<0.003). Recovery CBT time series appeared more periodic than those at ICU entry in all groups, with a more random CBT signal pattern upon ICU entry.

Results: Among 23,011,601 individuals, 159,691 received inpatient treatment for sepsis; 41% had severe sepsis. 21% of sepsis and 28% of severe sepsis patients had an explicitly coded hospital-acquired infection (HAI). The proportion of HAI was higher in patients who received ICU treatment than in patients without ICU treatment (35% in ICU vs 14% in non-ICU sepsis patients; 37% in ICU vs 17% in non-ICU severe sepsis patients). Table 1 shows the foci of explicitly coded HAI. Nosocomial pneumonia was the most common HAI in all patient groups. CLABSI occurred more frequently in ICU-treated patients, affecting 21%. CAUTI and C. difficile infections were more common among non-ICU-treated sepsis patients; more than one quarter of non-ICU-treated sepsis patients had a C. difficile infection. Conclusions: HAI are common causes of sepsis and pose a significant healthcare burden. The proportion of patients affected and the distribution of foci differ between non-ICU- and ICU-treated sepsis patients, with important implications for sepsis management within hospitals.

Impact of a sepsis protocol triggered by the Ramathibodi Early Warning Score (REWS) in IPD sepsis on clinical outcomes. S Matupumanon 1, Y Sutherasan 2, D Junhasawasdikul 2, P Theerawit 3

Introduction: Sepsis is now identified and managed early during triage in the emergency department. However, there is less focus on the management of patients at the ward level. We aimed to evaluate the impact of the implementation of a sepsis protocol on clinical outcomes in in-patients with new-onset sepsis. Methods: We conducted a prospective observational cohort study among adult medical patients admitted to the general wards of a university hospital. A 25-month pre-protocol period (August 2016 to August 2018) was assigned to the control group, and a 14-month protocol period (September 2018 to October 2019) was allocated to the protocol group.
An in-patient sepsis protocol comprising a nurse-initiated sepsis alert (REWS ≥ 2 plus suspected infection), prompt antibiotics, lactate measurement and fluid resuscitation was implemented (Table 1). The implementation of the in-hospital sepsis protocol was associated with significant improvements in patient outcomes, namely lactate measurement, starting antibiotics within 1 hr, fluid management, and a shorter length of ICU stay.

ICU routine nursing procedures interfere with cerebral hemodynamics in a prolonged porcine fecal peritonitis model. SL Liu 1, DC Casoni 2, W Z'Graggen 3, D Bervini 3, D Berger 1, SJ Jakob 1

Introduction: Routine nursing procedures (NP) can interfere with blood pressure and cardiac output and may therefore alter cerebral hemodynamics in critical illness. This may be a risk factor for sepsis-associated encephalopathy. Methods: 20 sedated and mechanically ventilated pigs were randomized to fecal peritonitis or controls (n=10 each). After 8 hours of untreated peritonitis, the animals were resuscitated for 76 hours (resuscitation period, RP). NP [assessment of sedation (AS), tracheal suctioning (TS), change in body position (CP), lung recruitment maneuver (RM)] were performed at baseline and 8 h, 32 h, 56 h and 72 h after the start of the RP. Systemic and cerebral hemodynamics and O2 saturations were recorded continuously.

Introduction: Shock is the most common cause of death in the postsurgical ICU, including septic shock and hypovolemic shock, with mortality reaching 50-60% in septic shock. The inadequate response of the immune system to infection triggers a potent inflammatory cascade, in which C-reactive protein (CRP) is an essential key to the amplification and maintenance of this cascade. The gene encoding CRP is located on the proximal long arm of human chromosome 1 (1q32). The GT polymorphism in the promoter sequence of the CRP gene (rs2794521) has been associated with invasive pneumococcal disease. We therefore analyzed the relationship between the rs2794521 polymorphism and the risk of developing septic shock in postsurgical patients. Methods: An observational, retrospective, single-center study was conducted on a sample of Caucasian patients undergoing major abdominal surgery, part of whom developed septic shock, while patients who developed systemic inflammatory response syndrome were used as controls. The rs2794521 polymorphism was analyzed by

Introduction: Vasoactive medications are commonly used in sepsis treatment but may be associated with peripheral ischemia and the well-publicized complication of limb and digit loss. Yet the association between limb and digit threat and the intensity, duration and pattern of vasopressor exposure is unknown. Methods: We studied adults (2010-2014) at 12 hospitals in an integrated health system who met Sepsis-3 criteria. We identified the time to clinically apparent limb or digit threat using clinical adjudication among those with vasopressor-dependent sepsis (i.e. >1 hour of vasopressors at sepsis onset) who had a surgical evaluation within 28 days of sepsis onset. We defined daily vasopressor intensity as 0 to 4 vasopressors administered. We then created a time-dependent model for threat, with mortality as a competing risk, using a weight function to estimate the varying contribution of vasopressors over time. We determined the subdistribution hazard (SH) ratio of threat for various patterns of vasopressor exposure and intensity, adjusted for age, baseline risk factors, and Sequential Organ Failure Assessment (SOFA) score at sepsis onset (a toy cumulative-incidence calculation under competing risks is sketched below).
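A minimal nonparametric sketch of the cumulative incidence of limb/digit threat with death as a competing event, in the spirit of the competing-risks model described above; the event times and codes below are hypothetical, not study data.

```python
# Aalen-Johansen-type cumulative incidence for a competing-risks setting:
# event 1 = limb/digit threat, event 2 = death (competing), 0 = censored.
# Times and event codes are hypothetical placeholders.
import numpy as np

time  = np.array([2, 3, 3, 5, 6, 8, 10, 12, 15, 20])   # days from sepsis onset
event = np.array([1, 2, 0, 1, 2, 1,  0,  2,  1,  0])

surv, cif = 1.0, 0.0          # overall event-free survival, cumulative incidence
for t in np.unique(time[event > 0]):
    n_risk   = (time >= t).sum()
    d_threat = ((time == t) & (event == 1)).sum()
    d_death  = ((time == t) & (event == 2)).sum()
    cif  += surv * d_threat / n_risk                 # increment for threat
    surv *= 1 - (d_threat + d_death) / n_risk        # update event-free survival
    print(f"day {t}: cumulative incidence of threat = {cif:.3f}")
```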
Results: Of 110,621 adults with sepsis, 13,147 (12%) were vasopressor-dependent (age 66 [IQR 56-77]; 7,040 [54%] male; maximum SOFA score 8 [SD 5]). Of these, 3,664 (28%) died and 117 (0.9%) had evaluations for limb or digit threat 4 [IQR 1-8] days after sepsis onset. The model-based weight function showed that the contribution of vasopressors to threat was stable over time (Fig 1A). Overall, a 1-unit increase in cumulative vasopressor exposure was associated with risk of threat (SH ratio 2.60 [95% CI 1.60-4.23], p<0.001). For various patterns of vasopressor exposure, greater intensity was associated with an increased risk of threat (Fig 1B). Compared to constant exposure, increasing and peak patterns were associated with the greatest SH (Fig 1C). Conclusions: Cumulative vasopressor exposure was associated with an increased risk-adjusted hazard of limb or digit threat following sepsis.

Fig. 1 (abstract P509). Relationship between vasopressor exposure and limb or digit threat following vasopressor-dependent sepsis. Panel A shows the estimated contribution of daily vasopressor intensity prior to surgical evaluation for limb or digit threat, with mortality as a competing risk. Panels B and C explore the relationship between threat and both cumulative vasopressor exposure and the pattern of exposure following sepsis onset. (B) The maximum cumulative vasopressor exposure was associated with the highest risk of limb or digit threat (SHR 17.5) compared with the reference exposure pattern (SHR 1.0). (C) Increasing (SHR 1.2) and peak (SHR 1.2) patterns of cumulative exposure were associated with an increased SH of limb threat, while a decreasing pattern was associated with a lower risk (SHR 0.8) compared with constant intensity (SHR 1.0). Abbreviations: SHR, subdistribution hazard ratio.

Fig. 1 (abstract P005). Isolated microorganisms.

P415 Introduction: The aim of this study was to compare factors associated with ICU mortality for VAP due to multidrug-resistant (MDR) Klebsiella spp. of monobacterial (MO) versus polybacterial (PO) origin. Methods: A retrospective analysis was carried out of patients treated in the ICU with MDR Klebsiella spp. strains as pathogens of VAP during a three-year period. Results: Data from 71 patients were evaluated. MO vs PO origin of MDR Klebsiella spp. VAP was found in 35 (49.3%) vs 36 (50.7%) cases, p=0.906. ICU mortality was 9/35 (25.7%) in the MO group and 17/36 (47.2%) in the PO group, p=0.060. Statistically significant differences between survivors and non-survivors in MO and PO VAP due to MDR Klebsiella spp. were found in medians of neutrophilosis 78

P430 Introduction: We studied the population structure and resistome of MDR Enterobacterales and Pseudomonas aeruginosa isolates, C/T-susceptible or -resistant, recovered from lower respiratory, intra-abdominal and urinary tract infections of ICU patients in 11 Portuguese hospitals (STEP study). Results: In E. coli, two VIM-2 producers were found (ST131-B2-H30-O25:H4-CTX-M-15 and ST88-C-H39-O22:H4) (C/T MIC=0.5/4-1/4 mg/L). A KPC-3-producing ST5463-cladeV-H160-O164:H56 isolate (16/4 mg/L) was also detected. The most frequent ESBL E. coli clone was ST131.

CPR Klebsiella pneumoniae (32 patients), Candida spp. (21 patients). The comparison subgroup consisted of 217 patients with bacteremia caused by non-ESKAPE pathogens. We evaluated days of mechanical ventilation, duration of antibiotic therapy (AMT), ICU length of stay (LOS), hospital LOS and mortality (Table 1). Results: Mortality in patients with bacteremia caused by non-ESKAPE pathogens was 23.5%, Candida spp.

Vancomycin mass removal over 120 minutes of hemoperfusion using HA330: bars refer to vancomycin mass (mg), with blue (experiment 1) and red (experiment 2) bars using blood and the green (experiment 3) bar using balanced solution; yellow dashes are mean mass values of the three experiments (with standard deviations) and the yellow line represents the reduction curve over time.

Table 1 (abstract P438). Results; *p-value versus non-ESKAPE subgroup.

P453 Translational value of the microbial profile in experimental sepsis studies. SP Tallósy 1, A Rutai 1, L Juhász 1, MZ Poles 1, K Burián 2, D Érces 1, A Szabó 1, M Boros 1. Methods: Invasive hemodynamic monitoring and blood gas analyses were performed on anesthetized animals between 18-24 h of sepsis. Respiratory, cardiovascular, renal, hepatic and metabolic dysfunction was evaluated with the species-specific Sequential Organ Failure Assessment (ssSOFA) score; the microbial profile was determined with selective media and MALDI-TOF MS in the initial inoculum and in abdominal fluid taken 20 h after sepsis induction. Results: A strong correlation was found between the initial dose of the inoculum (CFU) and the ssSOFA scores for organ dysfunction (rats: r=0.656, p=0.0186; pigs: r=0.570, p=0.0391).

P466 Introduction: Pancreatic stone protein (PSP) has shown promise as a biomarker of infection; however, its diagnostic potential has not been systematically evaluated. We performed a systematic review and meta-analysis of available data on PSP to evaluate its value for detecting infection in adults and to determine a plasma or serum threshold value. Methods: The PubMed and Cochrane Library databases were searched for studies on PSP in adult patients, and their raw data were analyzed to estimate the best PSP cut-off value for detecting infected patients using Youden's index. The cut-off sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were computed and compared to those for procalcitonin (PCT) and C-reactive protein (CRP). Finally, we explored the potential value of a model combining all three biomarkers to detect infection. Results: From a total of 44 potentially eligible published studies, 5 containing 631 patients were included in the quantitative analysis. Among them, 370 patients suffered from a clinically confirmed infection. The median

Methods: Appropriate statistical tests were applied using SPSS 23. CD64 was expressed as the percentage of neutrophils expressing positivity. Results: Sixty patients were analyzed. All parameters were compared between survivors and non-survivors. Demographics were comparable. The most common source of sepsis was the lungs, and the majority were admitted for medical reasons. Non-survivors had a significantly increased number of days with septic shock. At day 8, median values of all the biomarkers and the SOFA score were significantly higher in the non-survivor group (p<0.05). There was a decreasing trend in all 3 biomarkers and the SOFA score among survivors. On multivariate logistic regression analysis, increased CD64 and CRP levels between baseline and day 8, increased days with septic shock and increased SOFA

Introduction: We characterized the association of C-reactive protein (CRP) with extracellular vesicles (EVs) in plasma from sepsis patients and assessed a commercial CRP adsorbent (Pentrasorb, Pentracor, Hennigsdorf, Germany) to deplete free and EV-associated CRP. In addition, we characterized the potential pro-inflammatory effects of EV-bound CRP on monocytes and endothelial cells. Methods: Monocytes and human umbilical vein endothelial cells (HUVECs) were stimulated with isolated EVs (20,000 g, 30 min). Monocyte IL-8 secretion was quantified by ELISA; the activation of HUVECs was assessed by their expression of ICAM-1 and E-selectin using confocal microscopy. Results: Septic plasma (n=30) contained 227.0±88.6 mg/L CRP vs. 0.7±0.4 mg/L in healthy controls (n=5). Both total EVs and CRP+ EVs were significantly elevated in septic plasma. Incubation of septic plasma with Pentrasorb resulted in depletion of free CRP (247.2±72.6 mg/L before vs. 1.8±0.7 mg/L after adsorption) as well as a significant reduction in CRP+ EVs. EVs from CRP-depleted septic plasma induced significantly lower IL-8 levels. HUVEC ICAM-1 and E-selectin expression, however, did not increase upon stimulation with septic EVs. Conclusions: Treatment of septic plasma with Pentrasorb efficiently removes free CRP and detaches CRP from the EV surface, resulting in reduced pro-inflammatory effects.

Flow cytometry confirmed the association of monocytes with platelets and platelet-derived EVs, as well as the uptake of EVs by monocytes. Conclusions: Storage of isolated monocytes induces a shift towards CD16-expressing pro-inflammatory monocytes, which seems to be mediated by residual platelets and platelet-derived EVs. It remains to be clarified whether EVs released from activated platelets can also trigger a shift towards pro-inflammatory, intermediate monocytes in vivo.

Methods: Ethical approval was provided by the UCL Research Ethics Committee (5060/001). Paired parametric analyses were performed and data are displayed as mean ± 95% CI. Results: Plasma calprotectin concentration began to increase 1.5 hours after endotoxin administration, was significantly higher than baseline by 2 hours (356.7 ng/mL vs. 737 ng/mL, p<0.01), peaked at 4 hours (mean 1373 ng/mL, Figure 1) and normalized by 24 hours. Calprotectin peaked earlier than comparator soluble mediators (procalcitonin 8 hrs; CRP 24 hrs) and exhibited 100% sensitivity, with all participants demonstrating at least a 2-fold increase from baseline (mean 3.84x). Calprotectin displayed greater baseline variability (SD 147.9 ng/mL) than either CRP or procalcitonin. Conclusions: Our results indicate the potential of plasma calprotectin as a biomarker for bacterial infection. It increases earlier and peaks more rapidly than standard biomarkers, although higher baseline variability was observed.

P501 A multicenter randomized controlled study on landiolol for the treatment of sepsis-related tachyarrhythmia: subanalysis of the J-Land 3S study. O Nishida, Kagoshima University Graduate School of Medical and Dental Sciences, Department of Emergency and Intensive Care Medicine.

Methods: We analyzed a retrospective cohort of electronic health records from adult sepsis patients at 12 UPMC hospitals from 2010 to 2014. We defined Sepsis-3 by (i) suspected infection (e.g., administration of antibiotics or body fluid culture) and (ii) organ dysfunction (e.g., 2 or more SOFA points) in the first 6 hours of care. Data were organized by hour and included vital signs, lab values, and treatments (e.g., total hourly IV fluids (mL) and norepinephrine-equivalent dose). For each hour we describe (i) available data elements, (ii) presence of Sepsis-3, and (iii). Results: By hour 6, most patients had vital signs (99%; n=70,559), basic labs (88%; n=62,719) and fluid cultures (94%; n=66,995), while serum lactate was completed in 24% (n=17,818). Conclusions: Early sepsis care patterns are variable. IV fluids were given during the early hours, when uncertainty about sepsis was greatest, while vasopressors were administered after Sepsis-3 elements were present.

P504 Effects of abdominal negative pressure treatment on splanchnic hemodynamics and liver and kidney function in a porcine fecal peritonitis model. SL Liu 1, Department of Intensive Care Medicine. Methods: Splanchnic hemodynamics and laboratory parameters were measured at baseline (BL, start of the resuscitation period, RP) and 24 h, 48 h and 72 h after the start of the RP. Two/three-way RM-ANOVA or mixed-effects analyses and Student t-tests were performed. Results: NPT in controls had no effect. After sepsis induction, mean arterial pressure (MAP) decreased by 15 (7-22) mmHg and cardiac output (CO) by 1.3 (0.7-2.2) L/min, while arterial lactate increased by 0.2 (0.1-0.5) mmol/L. Sepsis and resuscitation were associated with increasing hepatic and renal arterial flows (p≤0.002 for both) and increasing prothrombin time. NPT in sepsis resulted in numerically less noradrenaline administration (0.3±0.3 ug/min/kg with NPT vs. 0.8±0.7 ug/min/kg without, p=0.310) and a smaller positive fluid balance (2.8±0.4 ml/h/kg with NPT vs. 3.1±0.4 ml/h/kg without, p=0.241). Conclusions: In our experimental fecal peritonitis model, NPT impaired neither splanchnic hemodynamics nor abdominal organ function. Whether NPT helps to reduce noradrenaline and volume administration in abdominal sepsis should be evaluated in further studies.

P505 Association between a C-reactive protein gene polymorphism (rs2794521) and the risk of developing septic shock in postsurgical patients after major abdominal surgery. P Martínez-Paz 1, Valladolid, Spain; 2 Hospital of Medina del Campo.

Notably, the three groups received a comparable per-kg dose of acetaminophen. No difference was found between groups in terms of toxic effects. Patients carrying CYP3A5p showed a more pronounced effect on body temperature compared with WT and UGT1A1p carriers, but this did not reach statistical significance (Fig. 1B). Only 50% of patients reached a temperature <38°C at t2 and only 20% <37.5°C. Conclusions: Polymorphisms in enzymes involved in the metabolism of acetaminophen are relatively common. CYP3A5p seems to lead to a higher peak plasma concentration and slightly increased efficacy in fever control. Fig. 1: Panel A, variations of acetaminophen plasma levels 30 minutes (t1) and 2 hours (t2) after administration of an IV dose of 1 g of paracetamol in WT patients and patients carrying mutations; Panel B, body temperature variations in WT patients and patients carrying mutations.

Clinical Research, Investigation, and Systems Modeling of Acute Illness (CRISMA) Center, Department of Biostatistics. Methods: We determined phenotype cohesiveness using the probability of assignment at presentation, defining core members as ≥90% and marginal members as <90% probability. We determined how members transitioned to other phenotypes over 24 hrs using t-distributed stochastic neighbor embedding (tSNE) plots and determined the odds (95% CI) of transition. Results: We studied 37,198 adult sepsis encounters (median age 69). The odds of ever transitioning from the presenting phenotype increased significantly for marginal members vs core members. Fig. 1: (B) proportion of encounters transitioning from the phenotype at presentation within 24 hrs, by arrival phenotype assignment and probability of membership; (C) tSNE plots for α-type, ß-type, γ-type and δ-type, with core (dark), marginal (light) and non-members (grey) in the plots on the left, and core, marginal, non-members and transitioning members (black) on the right.

A great disaster affects family- and friend-performance of BCPR by diminishing the willingness of family and friend bystanders to follow the instructions provided by dispatchers.

The experimental method IFiTEM could be an alternative to FIBTEM in cases where assessment of the internal coagulation pathways is prioritized (i.e. heparinized patients on extracorporeal supports).

Patients undergoing limitation of life-sustaining therapy had lower Karnofsky Scale scores; therefore, this scale may be useful to guide end-of-life decisions in the future, but further studies with larger numbers of patients are needed.

Readmission after discharge home from critical care: a qualitative study. C Robinson 1, F Nicolson 1, P MacTavish 1, T Quasim 2, JM McPeake 1. 1 NHS Greater Glasgow and Clyde, Glasgow, United Kingdom; 2 University of Glasgow, NHS Greater Glasgow and Clyde, Glasgow, United Kingdom. Critical Care 2020, 24(Suppl 1):P392

Introduction: Readmissions to acute care occur in a high number of critically ill patients within 90 days of hospital discharge [1]. Biomedical factors such as frailty and pre-existing comorbidities have been identified as drivers of readmission; however, at present there are limited data on the influence of social problems on readmission. This study, using a grounded theory approach, sought to understand the drivers of readmission to acute care from a patient/caregiver perspective. Methods: Ethical approval was granted by the West of Scotland Research Ethics Service (19/WS/0091). A grounded theory approach was used to explore, from a patient and caregiver perspective, what the drivers for readmission are [2]. Using a clinical database, we identified patients with an ICU admission ≥3 days who were readmitted to acute care within 90 days of hospital discharge. The researcher attended the ward and, after discussion with the direct care team, conducted a semi-structured interview with the patient and/or caregiver. Interviews were recorded and transcribed verbatim. The transcripts were analysed to generate initial codes, followed by the development of categories and sub-categories. Theoretical sampling was undertaken.
Results: 15 participants were interviewed; 10 (66.6%) were patients and 5 (33.3%) were caregivers. The themes that emerged from the data were: pain and polypharmacy; lack of social support and/or isolation; strained relationships with primary care providers; and information provision across the patient journey. Subsequent theory development is underway to understand how this learning could help reduce readmissions in future.

Conclusions: Both social and biomedical drivers are likely to contribute to acute care readmission in this group. Future interventional work is required to identify modifiable factors to reduce this burden for patients and the healthcare service.

Introduction: Frailty has been shown to have prognostic relevance for patients with critical illness. Since a wide range of tools has been described to screen for frailty, we aimed to describe the association of two frailty screening tools, the Clinical Frailty Scale (CFS) score and the modified Frailty Index (mFI), in critically ill patients.

Methods: We performed a post-hoc analysis of a multicenter cohort of patients admitted to six Canadian Intensive Care Units (ICU) between February 2010 and July 2011. Frailty was identified using the Clinical Frailty Scale (CFS) and the modified Frailty Index (mFI). Concordance of the frailty screening tools was evaluated with partial Spearman rank correlation and intraclass correlation (ICC). Discrimination and predictive ability of the tools for hospital mortality, 1-year mortality, hospital readmission and adverse events were compared using the concordance statistic (C-statistic) and calibration plots, adjusting for age, sex, Sequential Organ Failure Assessment (SOFA) score and ICU admission source.

Results: The cohort included 421 patients. The prevalence of frailty was 32.8% (95% confidence interval [CI] 28.3%-37.5%) with the CFS and 39.2% (95% CI 34.5%-44.0%) with the mFI. Concordance between the two tools was low [ICC 0.37 (95% CI 0.29-0.45) and partial correlation coefficient 0.38 (95% CI 0.29-0.47)], even after adjustment. Hospital and 1-year mortality were greater for frail compared with non-frail patients using both tools. Similarly, both tools found frail patients were less likely to be living independently after hospital discharge, and more likely to be rehospitalized, when compared with non-frail patients.

Conclusions: While the CFS and mFI show low concordance, both showed good discrimination and predictive validity for hospital mortality. Both tools identify a subgroup of patients more likely to have worse clinical outcomes.
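Concordance between two frailty screens of the kind compared above can be sketched in a few lines. The cut-offs below (CFS >= 5 and mFI >= 0.21 to classify "frail") and the data are assumptions for illustration; Spearman correlation and Cohen's kappa are used here as simple stand-ins for the partial rank correlation and ICC reported by the authors.

    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(0)
    cfs = rng.integers(1, 10, size=421)      # Clinical Frailty Scale scores, 1-9 (synthetic)
    mfi = rng.uniform(0.0, 0.6, size=421)    # modified Frailty Index values (synthetic)

    rho, p = spearmanr(cfs, mfi)             # rank correlation of the raw scores
    frail_cfs = cfs >= 5                     # assumed cut-off, not from the abstract
    frail_mfi = mfi >= 0.21                  # assumed cut-off, not from the abstract
    kappa = cohen_kappa_score(frail_cfs, frail_mfi)  # agreement of the binary calls
    print(f"Spearman rho={rho:.2f} (p={p:.3f}), kappa={kappa:.2f}")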
Introduction: The Post-Intensive Care Syndrome (PICS) is a myriad of physical, psychiatric and cognitive disorders secondary to critical illness, leading to a decreased quality of life and an important socioeconomic burden. This study aimed to identify whether conformity to a PICS Prevention Bundle was able to reduce the incidence of the syndrome at ICU discharge.

Methods: All patients admitted to the ICU from January 1st to December 31st 2018 were included. Conformity to each of the ten components of the PICS Prevention Bundle was assessed daily, and the patients were evaluated for anxiety, depression, cognitive dysfunction, muscular weakness, mobility impairment and nutritional risk at ICU discharge and at a 3-to-6-month follow-up consultation. The patient cohort was divided into terciles according to bundle conformity for the analysis.

Results: Of the 1145 enrolled patients, 352 (31%) were evaluated at ICU discharge, and 103 (29%) attended the follow-up consultation. There was no difference in baseline characteristics between the cohorts. There was no correlation between the prevalence of PICS at discharge and bundle conformity during the ICU stay (90% vs. 87% vs. 87%, p=0.834), though there was a decrease in nutritional risk and days on mechanical ventilation (Table 1). After 3 to 6 months there was a reduction in the prevalence of any kind of PICS, mobility impairment, muscular weakness and nutritional risk. The patients who developed PICS were older and had a higher Simplified Acute Physiology Score III at ICU admission.

Conclusions: Higher adherence to a PICS Prevention Bundle was not able to prevent the occurrence of the syndrome.

Introduction: Post intensive care syndrome (PICS) is well recognized following general ICU care [1]. Intensive Care Syndrome: Promoting Independence and Return to Employment (InS:PIRE) is a multidisciplinary complex intervention designed to address PICS [2]. With a paucity of evidence on PICS after cardiothoracic intensive care, we aimed to evaluate PICS and the feasibility of the InS:PIRE intervention in this population.

Methods: Those attending the clinic received 5 weeks of intervention including individual appointments with an ICM nurse, physician, pharmacist, and physiotherapist. A café area facilitated peer support alongside psychology group sessions. The primary outcome was quality of life measured by EQ-5D-5L. Further surveys included pain, mental health, and self-efficacy. Questionnaires were taken at baseline, 3 and 12 months.

Results: Over 5 cohorts, 27 patients attended; 67% were male, median age was 66 years (IQR 61-75), median APACHE II score was 17 (IQR 14-18.5), and median ICU length of stay was 13 days (IQR 9-21). A total of 14 (53%) patients completed surveys at one year. Scheduled admissions represented 56% of those attending. Mean EuroQol EQ-VAS score was 70/100 (SD ±18) at baseline, increasing to 78/100 (SD ±16) by 1 year (Table 1). Those with problems in at least one domain of EQ-5D-5L fell from 92% at baseline to 73% at 1 year, with the breakdown shown in Table 1. Severe problems were seen in 22%, falling to 18% at 1 year. HADS demonstrated an anxiety or depression rate of 44%. The Brief Pain Inventory identified 14 patients (52%) with ongoing chronic pain. Mean self-efficacy was 32/40 (SD ±6) at baseline and 34/40 (SD ±5) at 1 year.

Conclusions: Cardiothoracic intensive care patients have ongoing and persistent features of PICS with significant effects on health-related quality of life. Further, the InS:PIRE multi-professional complex intervention is feasible within this specialist group.

… A screening approach might be implemented whenever screening of the total ICU population is not deemed feasible.

Introduction: Influenza is an acute viral illness with a significant financial burden. Point-of-care testing (POCT) for influenza is available and has demonstrated accuracy [1, 2]; the current gap in knowledge is the question of the opportunity cost of influenza testing. If POCT is financially a less costly test, this could free up scarce resources.

Methods: The study adopts a cost minimisation approach. The point-of-care test is the Roche Cobas® Liat® machine, which can detect flu A/B, and is compared with the West of Scotland Specialist Virology Centre's established in-house multiplex real-time PCR assay. The model was developed using Microsoft Excel and has 2 arms comparing the above-mentioned tests.

Results: The model estimates that the total cost of POCT per patient tested is £3926.33 compared with £4053.92 for lab testing (Figure 1). This is a saving of £127.60 per patient when POCT is used. The result swings in favour of the lab test when POCT specificity falls to 95.72%. If the lab could provide the result of influenza testing within 12 hours, the result would swing in favour of lab testing. Zanamivir, which will potentially be used increasingly in the intensive care setting, can more than double the difference between the 2 tests in favour of POCT.

Conclusions: This research suggests that POCT offers potential cost savings in the ICU setting. This is the case as long as POCT specificity is higher than a threshold of 97.52% and the lab takes longer than 12 hours to return the result. The sensitivity analysis should allow for external validity given the usual variations in ICU practice.
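A two-arm cost-minimisation model of the type described can be expressed as an expected-cost function per arm. Everything in the sketch below (the cost components, their values, and the structure tying specificity to cost) is a hypothetical reconstruction, since the abstract reports only the per-patient totals from the underlying Excel model.

    def arm_cost(test_cost, turnaround_h, specificity, prevalence,
                 isolation_cost_h=10.0, fp_treatment_cost=200.0):
        """Expected cost per patient for one testing arm.

        All parameter values here are illustrative assumptions, not the
        abstract's inputs."""
        # Every patient is isolated until the result returns.
        isolation = turnaround_h * isolation_cost_h
        # False positives among the non-infected receive unnecessary treatment.
        fp = (1 - prevalence) * (1 - specificity) * fp_treatment_cost
        return test_cost + isolation + fp

    poct = arm_cost(test_cost=120.0, turnaround_h=0.5, specificity=0.97, prevalence=0.2)
    lab = arm_cost(test_cost=40.0, turnaround_h=24.0, specificity=0.995, prevalence=0.2)
    print(f"POCT £{poct:.2f} vs lab £{lab:.2f} per patient")

Varying the specificity and turnaround inputs in such a function is the mechanism behind the break-even thresholds the authors report.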
Introduction: The aim of the present study was to describe the demographic, clinical and microbiological aspects and the outcome of patients with intensive care unit-related (ICU-related) bacteremia. Moreover, we aimed to study patient outcome in association with colistin susceptibility.

Methods: Retrospective, single-center study in a 16-bed ICU over 12 months, from 1/10/2018 to 30/9/2019. ICU-related bacteremia was defined as bacteremia in patients with an ICU stay >48 hours or ICU readmission (first admission ≥1 month before). Only the first episode of bacteremia was considered. The primary outcome was 28-day mortality. Data regarding clinical, demographic and outcome characteristics were retrieved from the patient files. The hospital's ethics committee approved the present protocol. Moreover, patients with bacteremia due to colistin-resistant pathogens were compared with patients affected by colistin-sensitive microbes.

Results: Forty episodes of Gram-negative ICU bacteremia were collected during the aforementioned period in 40 patients (61.5% male), with a mean age and APACHE II of 60.6±16.8 years and 18±8.1, respectively. Bacteremia occurred at an average of 13.7 days. The responsible isolates were resistant to carbapenems in 82.5% of the episodes. The majority of the events were due to a single isolate (85%). Acinetobacter baumannii and Klebsiella pneumoniae represented the majority of the implicated microbes (35% and 37.5%, respectively). The crude 28-day mortality was 30%. Finally, we could not detect any difference in mortality between colistin-sensitive and colistin-resistant pathogens (Figure 1).

Conclusions: The present study denotes that, in a setting of extensively drug-resistant pathogens with limited treatment options, Gram-negative bacteremia in the ICU is associated with increased mortality.

Characterization of resistance mechanisms affecting ceftolozane/tazobactam in Enterobacterales and Pseudomonas aeruginosa ICU isolates using whole genome sequencing (STEP study). M Hernández-García 1, CC Chaves 2, JM Melo-Cristino 3, DS Silva 4, AR Vieira 5, MP F. Pinto 6, JD Diogo 7, EG Gonçalves 8, JR Romano 9, RC Cantón 1. 1 Hospital Ramón y Cajal-IRYCIS, Microbiology Department, Madrid, Spain; 2 …

Introduction: Clostridium difficile infection (CDI) is the main cause of hospital-acquired diarrhoea [1]. The aim of this study was to compare the characteristics of CDI during the years 2011 and 2015.

Methods: A retrospective observational study was carried out in the Lithuanian University of Health Sciences hospital, the largest tertiary care teaching facility in the country. According to Department of Infection Control records, patients (pt) with (w.) diarrhoea and a first positive stool test for C. difficile toxin A/B were included.
Age, Charlson Comorbidity Index (CCI) score, profile of the hospital department (medical (MD), surgical or ICU) where CDI was diagnosed, type of CDI (healthcare-associated (HA), hospital- or community-acquired) and rate of risk factors (RF) were estimated for both 2011 and 2015. IBM SPSS 23.0 was used; Pearson's chi-square and Fisher's exact tests were used for statistics. P<0.05 was considered statistically significant.

Results: In total, 7 pt from 2011 and 72 from 2015 were enrolled. In 2011, n=4 (57%) pt were ≥65 years old; in 2015, n=45 (63%) (p=0.045). In 2011, CCI>5 was estimated in n=6 (86%) pt in comparison with n=46 (64%) in 2015 (p=0.025). In 2011, n=0 (0%) of CDI cases were HA; in 2015, n=12 (17%) (p=0.01). In 2011, n=5 (71%) of CDI cases were diagnosed in MD, in comparison with n=60 (83%) in 2015 (p=0.01). In 2011, in the 12 weeks prior to CDI, n=5 (71%) pt had been admitted to hospital, n=7 (100%) had been treated w. antibiotics, n=4 (50%) w. PPIs, n=5 (36%) w. H2 antagonists and n=3 (43%) w. immunosuppressants, in comparison with n=51 (71%), n=69 (96%), n=36 (57%), n=26 (71%) and n=21 (29%) in 2015, respectively (p>0.05).

Conclusions: The overall rate of CDI cases among in-hospital patients increased tenfold between 2011 and 2015. In 2015, more elderly patients had CDI, and severe comorbidities were less frequent in comparison with 2011. In 2015, more cases of CDI were hospital-acquired and occurred in medical departments. The rate of risk factors for CDI remained unchanged.

… These results indicate a possible relationship between TTV DNA count and immunological alteration. TTV quantitative determination could be useful as a proinflammatory marker in sepsis, with some benefits: low cost, easy determination and good correlation with immune system functionality. It will be necessary to perform a larger study to check our hypothesis and to establish a TTV level threshold that may allow anticipation of the disease prognosis.

Introduction: Acute kidney injury (AKI) is a serious complication of sepsis and is associated with high morbidity and mortality. Combination antimicrobial regimens with vancomycin (VCM) and broad-spectrum beta-lactams (BSBL), such as piperacillin-tazobactam and cefepime, have been identified as potentially nephrotoxic combinations, but existing studies have not provided sufficient evidence. The aim of this study was to evaluate in detail the association between combination antimicrobial therapy and the risk of AKI in septic patients.

Methods: This investigation was a post hoc analysis of 2 prospective nationwide cohorts enrolling consecutive adult patients with sepsis in intensive care units in Japan. In this study, progression of AKI was defined as an increase of one or more points in the renal sub-score of the Sequential Organ Failure Assessment score from day 1 to day 4. We regarded anti-pseudomonal penicillins, fourth-generation cephalosporins, and carbapenems as BSBL. Multivariable logistic regression analysis including a two-way interaction term (VCM x BSBL) was performed to assess the add-on effects of each antimicrobial agent on the progression of AKI.

Results: The final study cohort comprised 1837 patients with sepsis. Among them, 45 received VCM without BSBL, 1055 received BSBL without VCM, 249 received both VCM and BSBL, and 488 received other types of antimicrobials. The administration of VCM was associated with an increased risk of AKI in patients with BSBL [odds ratio (OR), 1.57 (0.96-2.57); p=0.072]. However, this tendency was not evident in patients without BSBL [OR, 0.23 (0.03-1.56); p=0.133]. The interaction effect on the progression of AKI between VCM and BSBL was statistically significant (p for interaction=0.038).

Conclusions: The regression model including the two-way interaction term suggested that the combination of VCM and BSBL might synergistically increase the risk of AKI in patients with sepsis.
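A minimal sketch of a logistic model with a two-way interaction term of the kind described above, using the statsmodels formula interface on synthetic data; the covariate set and data are invented, and the authors' fully adjusted model is not reproduced.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 1837  # cohort size from the abstract; all values below are synthetic
    df = pd.DataFrame({
        "vcm": rng.integers(0, 2, n),    # vancomycin given (0/1)
        "bsbl": rng.integers(0, 2, n),   # broad-spectrum beta-lactam given (0/1)
    })
    # Synthetic outcome with a built-in synergistic effect, for illustration only.
    logit_p = -1.5 + 0.1 * df.vcm + 0.2 * df.bsbl + 0.5 * df.vcm * df.bsbl
    df["aki"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    # 'vcm * bsbl' expands to vcm + bsbl + vcm:bsbl; the vcm:bsbl row of the
    # summary carries the p-for-interaction analogous to the one reported.
    model = smf.logit("aki ~ vcm * bsbl", data=df).fit()
    print(model.summary())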
Introduction: Increasing resistance to carbapenems due to carbapenemase production is one of the main current problems of antibacterial resistance in the burn ICU. Production of several types of carbapenemases (KPC, NDM and OXA-48) is common in K. pneumoniae strains. Carbapenemase production is a marker of extreme antibacterial resistance. The aim of our study was to investigate the epidemiology of nosocomial infections caused by KPC-, NDM- and OXA-48-producing K. pneumoniae strains in the burn ICU.

Methods: A total of 26 patients with nosocomial infections caused by 26 carbapenem-resistant strains of K. pneumoniae were included in the study, of whom 3 had lower respiratory tract infection and 23 had skin and skin structure infection. Initial identification of isolates was performed in the laboratory by an automatic microbiological analyzer. All K. pneumoniae isolates were examined for the presence of blaNDM, blaOXA-48 and blaKPC genes by PCR.

Results: Baseline characteristics of patients: Me (IQR) of age, 50 (39; 64) years; Me (IQR) of TBSA, 40 (29; 52) percent; Me (IQR) of ICU LOS, 30 (21; 35) days. Inhalation injury was diagnosed in 10 (38.4%) patients. A total of 11 patients died; the mortality rate was 42.3%. All patients were diagnosed with nosocomial infection caused by K. pneumoniae. Of the 26 K. pneumoniae strains, 1 (3.8%) was found to produce KPC, 3 (11.5%) to produce NDM and 20 (76.9%) to produce OXA-48. Only 5 (19.2%) carbapenem-resistant K. pneumoniae isolates were not producing carbapenemases. Among the 20 patients infected by OXA-48-producing K. pneumoniae, the mortality rate was 40%. Of the 23 patients infected by OXA-48- or NDM-producing K. pneumoniae, 10 died; the mortality rate was 43.5%. Of the 5 patients infected by non-carbapenemase-producing K. pneumoniae, none died.

Conclusions: Carbapenemase-producing strains are widely spread among carbapenem-resistant strains of K. pneumoniae in the burn ICU. Mortality of patients infected by OXA-48- or NDM-producing K. pneumoniae strains reaches 43.5%.

Introduction: The rationale for blood purification as adjunctive therapy during sepsis lies in its capacity to remove endogenous and exogenous toxins, but currently no recommendations exist [1]. A critical point may be the potential interaction with antimicrobial therapy, which remains the mainstay of sepsis treatment. The aim of our study was to investigate vancomycin (VAN) removal during blood purification using an in vitro model of hemoperfusion (HP) with the HA330 cartridge (Jafron, Zhuhai City, China), the most widely used in China and currently available in Europe.

Methods: This is an experimental study. Three independent experiments were performed: we injected 250 mg of VAN into 500 ml of whole blood from healthy donors (experiments 1 and 2) or into 500 ml of balanced solution (experiment 3) in order to assess membrane saturation. A closed circuit (blood flow of 250 ml/min) simulating HP was run using the HA330. Samples were collected from the arterial line at 0, 5, 10, 15, 30, 45, 60, 90 and 120 minutes; VAN plasma concentrations were measured and removal was evaluated using mass balance analysis. Differences in mass removal were assessed using the Kruskal-Wallis test.

Results: Figure 1 shows VAN mass at each timepoint. We observed no difference between the whole-blood and balanced-solution experiments (p-…
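Mass-balance accounting of the kind used to quantify removal can be sketched directly: the drug mass remaining in the closed circuit is concentration times circuit volume, and the difference from the injected dose is attributed to the cartridge. The 250 mg dose and 500 ml volume follow the abstract; the concentration series is invented, and plasma/whole-blood partitioning is ignored for simplicity.

    # Vancomycin mass balance during simulated hemoperfusion (illustrative values).
    dose_mg = 250.0
    volume_l = 0.5                                  # 500 ml closed circuit
    times_min = [0, 5, 10, 15, 30, 45, 60, 90, 120]
    conc_mg_l = [500, 420, 380, 350, 310, 290, 275, 260, 250]  # hypothetical series

    for t, c in zip(times_min, conc_mg_l):
        in_circuit = c * volume_l                   # mg still circulating
        removed = dose_mg - in_circuit              # mg attributed to the cartridge
        print(f"t={t:>3} min: removed {removed:6.1f} mg "
              f"({100 * removed / dose_mg:4.1f}% of dose)")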
Introduction: The aim of this study was to determine whether routine blood-borne virus (BBV) testing in the ICU contributes to the discovery of undiagnosed BBV infections. ICU patients may require renal replacement therapy (RRT). Sharing RRT equipment carries a risk of BBV transmission, which mainly relates to hepatitis B (HBV), hepatitis C (HCV) and HIV. Since 2012, all Glasgow Royal Infirmary ICU patients undergo routine BBV screening, with RRT machines allocated for patients with specific BBV statuses. Routine BBV testing is beneficial to both the individual and society. HCV is a pertinent health issue in Scotland. The Scottish government aims to eliminate HCV by 2030 and is researching innovative and cost-effective methods to identify undiagnosed infections.

Methods: This single-centre retrospective observational study examined prospectively collected clinical data from 1069 ICU admissions. Proportions were compared using a two-proportion z-test, and a logistic regression model was used to determine whether deprivation quintile was independently associated with the seroprevalence of BBVs.

Results: The BBV seroprevalence in the cohort studied was 0.45% (HBV), 11.7% (HCV) and 0.91% (HIV). The seroprevalence of HBV in the cohort studied was similar to that of Scotland (p=0.11), but the seroprevalences of HCV (p<0.001) and HIV (p=0.01) were statistically significantly higher than those of Scotland. Due to the small number of reactive test results for HBV and HIV, the relationship between deprivation and BBV seroprevalence was explored for HCV only. The only independent variable associated with a reactive anti-HCV test result was "current or previous illicit drug use" (adjusted odds ratio 40.2; 95% confidence interval 21.1-76.4; p<0.001).

Conclusions: This study shows that routine BBV testing in the ICU is useful in discovering new BBV infections. This is, to our knowledge, the first observational study focusing on the value of routine BBV testing in an ICU setting.

A continuous infusion vancomycin protocol is a safe, acceptable and effective alternative to intermittent dosing of vancomycin in critical care.

Ceftaroline is an efficacious treatment in patients with severe CAP admitted to the ICU. It relates to earlier resolution of respiratory failure and less rescue antibiotics. We need an adequately pragmatic trial to confirm our findings.

Organ dysfunction in scrub typhus: incidence and risk factors. A Sarkar 1, A Guha 2, R Dey 3

Introduction: … [1, 2, 3, 4, 5]. It spreads by the bite of the larval stage of trombiculid mites, or chiggers [1]. Clinical features may include fever, headache, myalgia, lymphadenopathy, eschar and skin rash. It may also cause pneumonia, renal failure, shock, meningoencephalitis and multiple organ failure [1, 2]. Our study aims to discuss the incidence of organ dysfunction in a comprehensive way, considering the overall population of patients with identified scrub typhus infection. There is a lack of data in eastern India regarding the incidence of and risk factors for developing multiorgan dysfunction syndrome (MODS) in scrub typhus.

Methods: In this retrospective study we examined the incidence of involvement of various organs and the risk factors associated with the development of MODS in scrub typhus. We collected data from December 2016 to November 2019 in a tertiary care hospital in Kolkata. We included all patients with fever, a positive scrub typhus IgM antibody, and age over 14 years. The SOFA score was used in evaluating patients with MODS.
Exclusion criteria comprised patients with coinfection along with scrub typhus.

Results: In a cohort of 114 patients, multiorgan dysfunction syndrome was seen in 27 (23.68%); the mean age in the MODS group was 50.0±14.96 years (mean±SD). In the MODS group, fever duration was 11±3.58 days (mean±SD), and the interval from treatment to defervescence was 5.11±2.39 days (mean±SD). Among patients with MODS, hematologic involvement was seen in 7 patients (25.92%), hepatic involvement in 19 (70.37%), renal involvement in 17 (62.96%), neurologic involvement in 24 (88%), respiratory involvement in 14 (51.85%) and cardiovascular involvement in 8 (29.63%); ICU transfer was necessary in 20 patients (74.07%) and intubation was needed in 14 (51.85%). Hospital mortality among patients with MODS was 3 patients (11.11%); no mortality was seen in patients without MODS. Other parameters evaluated among patients with MODS included eschar in 1 patient (3.7%), seizure in 7 (25.93%), hepatosplenomegaly in 26 (96.3%), leucopenia in 3 (11.11%), leucocytosis in 13 (48.14%), thrombocytopenia in 7 (25.92%), decreased hemoglobin in 22 (81.48%) and transaminitis in 19 (70.37%). The risk factors associated with the development of MODS were platelet count, bilirubin, transaminitis, Glasgow Coma Scale, time interval from treatment to defervescence, hemoglobin, total leucocyte count and fever duration.

Conclusions: Scrub typhus is an important cause of acute febrile illness in this part of the country and is frequently associated with organ dysfunction. However, the overall mortality is low, which is similar to other studies done before [2].

… score at baseline were significant (p<0.05) predictors of mortality. The highest area under the ROC curve was obtained for the number of days with septic shock (0.857), followed by the increase in CD64 between baseline and day 8 (0.798). Though serial PCT levels significantly increased among non-survivors, they did not predict mortality. Serial levels of biomarkers in ICU patients may predict mortality. Larger trials are needed to confirm the results.

Methods: Plasma sTREM-1 levels were retrospectively measured at Days 1-2, 3-4 and 6-8 in 116 septic shock patients from the IMMUNOSEPSIS cohort (NCT02803346), included between 01/2016 and 12/2018, using a validated ELISA method. The associations between sTREM-1, mHLA-DR, 28-day survival status, and the occurrence of ICU-acquired nosocomial infection (NI) were assessed.

Results: Neither sTREM-1 nor mHLA-DR levels at D1-2 were associated with the occurrence of ICU-acquired NI. However, 28-day mortality was significantly higher in patients with a D1-2 sTREM-1 value above the median (39.6% vs 11.3%, p=0.0103; median=539 pg/mL). A significant inverse correlation was found between mHLA-DR at D6-8 and sTREM-1 at D1-2 (Spearman ρ -0.378, p<0.0001) and at D6-8 (ρ -0.382, p<0.0001). At D6-8, when stratifying patients based on sTREM-1 (400 pg/mL) and mHLA-DR (5000 AB/C), patients combining elevated sTREM-1 and low mHLA-DR presented with significantly higher 28-day mortality (47.6% vs 8.7%, p=0.0003, chi-squared test) and NI incidence (31.8% vs 12%, p=0.044) compared with patients with low sTREM-1 / high mHLA-DR.

Conclusions: This study shows for the first time that TREM-1 pathway activation is associated with septic shock-induced immunosuppression, as shown by an inverse correlation between sTREM-1 at baseline and mHLA-DR expression at D6-8. Persistently high sTREM-1 values and low mHLA-DR expression in septic shock patients are significantly associated with higher rates of ICU-acquired infection and mortality.
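The stratified mortality comparison above is a 2x2 chi-squared test; a minimal sketch with scipy is shown below. The cell counts are approximate back-calculations from the reported percentages (10/21 = 47.6%, 2/23 = 8.7%) and are assumptions, not the study data.

    from scipy.stats import chi2_contingency

    # Columns: high sTREM-1 / low mHLA-DR vs low sTREM-1 / high mHLA-DR.
    # Rows: died by day 28, survived. Counts are illustrative reconstructions.
    table = [[10, 2],
             [11, 21]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, p={p:.4f}")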
Introduction: Sepsis mortality remains high [1]. The Surviving Sepsis Campaign (SSC) recommends guiding resuscitation by normalization of lactate levels [2]; however, this is debated [3]. We have shown that plasma levels of bio-adrenomedullin (bio-ADM) were associated with patient outcome during sepsis [4]. We therefore aimed to evaluate the added value of bio-ADM over lactate measurement in the AdrenOSS cohort.

Methods: This is a post-hoc analysis of the Adrenomedullin and Outcome in Severe Sepsis and Septic Shock (AdrenOSS) cohort study. The AdrenOSS study is a prospective observational study conducted in twenty-four centers, which included 583 septic patients [4]. We studied the relationship between the initial evolution of lactate plasma levels combined with the bio-ADM level at 24h and outcome, in patients for whom both markers were available at admission and one day later ("24h"). Bio-ADM levels below 70 pg/mL were considered low, and high if greater than 70 pg/mL [4].

Results: In patients with high lactate levels (>2 mmol/L) at admission (n=328), lactate normalization (<2 mmol/L) at 24h was associated with better outcome than persistently high lactate at 24h (28-day mortality 15.9% vs 41.9%, respectively; HR 3.3 [2.0-5.3], p<0.001) (Figure 1). Among patients with decreasing lactate, high and low bio-ADM levels at 24h identified patients with different outcomes (28-day mortality 7% vs 26% for low vs high bio-ADM, respectively; HR 4.4 [1.6-11.7], p<0.005). High and low bio-ADM levels at 24h also differentiated the outcome of patients with persistently elevated lactate (HR 4.5 [1.6-12.3], p<0.005). In patients with low initial lactate, neither lactate nor bio-ADM added prognostic value.

Conclusions: Our data suggest that measurement of bio-ADM in addition to lactate may help physicians refine risk stratification and therefore guide resuscitation during sepsis.

The effect of fluid replacement in sepsis, severe sepsis and septic shock in the first 24 hrs on clot quality and microstructure. S Pillai 1, G Davies 2

Introduction: The inflammatory response in sepsis can lead to a spectrum of coagulation system defects [1]. Sepsis and severe sepsis are associated with a hypercoagulable state in which the clot microstructure is known to be tight and highly elastic, and potentially resistant to fibrinolysis (Figure 1). Conversely, septic shock is associated with a hypocoagulable state in which the clot microstructure is loose and structurally weak. The study aimed to investigate the effect of fluid resuscitation and replacement on clot microstructure over 24 hours.

Methods: 100 patients (50 sepsis, 20 severe sepsis and 30 septic shock) were included in the study. All these patients received standard fluid replacement therapy with crystalloids. Blood samples were collected at 0, 4 and 24 hours. Clot microstructure (df), standard markers of coagulation and inflammatory markers were measured.

Results: In the sepsis group, following fluid administration the df reduced initially and then remained stable (1.78 at 0 hours, 1.74 at 4 hours, 1.73 at 24 hours; normal df range 1.73 ± 0.04).
In the severe sepsis group, the df reduced initially and then increased (1.80 at 0 hours, 1.71 at 4 hours, 1.76 at 24 hours), and in septic shock the df was very low to start with, with only a slight increase following fluid administration (1.66 at 0 hours, 1.68 at 4 hours, 1.67 at 24 hours). The hypercoagulable state and clot quality in both the sepsis and severe sepsis groups improved with fluid resuscitation; however, despite an early improvement in clot quality, ongoing fluid resuscitation resulted in a markedly reduced functional clot with very low clot strength and functionality.

Conclusions: This study demonstrates that df, as a marker of clot quality and function, may have potential in guiding fluid and component replacement in critical illness and injury.

Introduction: This study analyses the prognostic ability of white blood cell count (WBC), neutrophil:lymphocyte ratio (NLR) and C-reactive protein (CRP). Hypo- and hyper-immune responses have been associated with increased mortality from septic shock [1].

Methods: Patients with septic shock (Sepsis 3.0) admitted to Queen Elizabeth Hospital Birmingham between December 2017 and July 2019 were included. The primary outcome was 90-day mortality. Data were tested for normality, presented as median (IQR) and analysed using a Mann-Whitney U test. Categorical data were presented as percentages and analysed using a chi-squared test. A p value of <0.05 was used to determine significance. A multivariate binary logistic regression analysis was conducted using age, APACHE II, Charlson comorbidity index, performance status and initial lactate as covariates. A Hosmer-Lemeshow test of >0.05 indicated good fit.

Results: 474 patients were admitted with septic shock. The majority (61%) were male, with a median age of 64 (55-75) and a 90-day mortality of 37%. On day 1, WBC was lower in patients who died compared with patients who survived (9 (7-15) vs …). Patients who died of septic shock had a lower WBC, NLR and CRP response early on compared with survivors. This may represent early immunoparesis that allows infection to propagate unchecked. However, this was not independently associated with mortality when confounding factors were accounted for.

Introduction: A specific mitochondrial metabolite, itaconic acid, is formed upon proinflammatory activation. Attempts by various researchers to find itaconic acid in the peripheral blood of patients with sepsis have been unsuccessful [1]. Some phenylcarboxylic acids (PhCAs) are known to be microbial metabolites and sepsis biomarkers; they also affect mitochondrial functions [2].

Methods: Concentrations of PhCAs (phenyllactic, p-hydroxyphenylacetic and p-hydroxyphenyllactic acids) and mitochondrial metabolites (succinic and itaconic acids) were measured by gas chromatography-mass spectrometry in 48 serum samples from 8 patients on the 1st day of diagnosis of sepsis and in 35 serum samples from 22 patients at late stages of sepsis (SEPSIS-3); the control group comprised 20 donors.

Results: Itaconic acid was found in low concentrations (0.5-2.3 μM) only at the early stage of sepsis. Multiple increases in the levels of PhCAs and mitochondrial metabolites were detected in patients with late-stage sepsis in comparison with the early stage and donors (p<0.001). The increased succinic acid concentration (up to 100-1000 μM) is the result of succinate dehydrogenase inhibition by intermediates of microbial metabolism (PhCAs), which was confirmed by in vitro experiments in isolated mitochondria (Fig. 1).

Conclusions: Itaconic acid may be a promising marker of the early stage of sepsis, which needs to be confirmed.
Introduction: Prediction of severe events in clinical sepsis is challenging. For such prediction we aimed to compare the novel biomarker calprotectin in plasma with routine biomarkers.

Methods: In a prospective study, blood samples were collected from consecutive patients who triggered the sepsis alert in the emergency department of our hospital. C-reactive protein (CRP), procalcitonin, neutrophils and lymphocytes were analysed according to routine practice. P-calprotectin was analysed using a specific particle-enhanced turbidimetric assay (Gentian Diagnostics AS). The composite endpoint, termed severe event, was defined as death or admission to the intensive care unit (ICU)/high dependency unit (HDU) within 48 hours of arrival.

Results: The study included 367 patients with written informed consent, of whom 335 were considered to have infection (defined as an obtained blood culture and subsequent antibiotic therapy for at least 4 days or until discharge or death) and 32 had no infection. Seventy-four patients (22%) with infection developed a severe event. Mean p-calprotectin was 2.99 mg/L (standard deviation (SD) 2.10) among patients with infection and 2.35 mg/L (SD 2.64) among patients without infection (p=0.02). In patients with infection, mean p-calprotectin was 3.81 mg/L (SD 3.18) among those with, and 2.75 mg/L (SD 2.50) among those without, a severe event (p=0.006). Analysis of the area under the receiver-operating characteristic (ROC) curve for prediction of severe events showed superiority of p-calprotectin compared with procalcitonin and the neutrophil-lymphocyte ratio, both regarding all sepsis alert cases and regarding the patients with infection (p<0.05 for all comparisons) (Fig. 1). In addition, there was a trend toward superior performance compared with CRP (p=0.10 and 0.15).

Conclusions: In sepsis alert patients, p-calprotectin was elevated in those who subsequently developed severe events. P-calprotectin was superior to traditional biomarkers for prediction of severe events.
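ROC-area comparisons like the one above are commonly done with DeLong's test; a paired bootstrap over patients, sketched below on synthetic stand-ins for the severe-event labels and the two markers, is a simple alternative that yields a confidence interval for the AUC difference.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 335
    y = rng.integers(0, 2, n)                   # severe event yes/no (synthetic)
    calpro = y * 1.0 + rng.normal(0, 1.2, n)    # stronger marker (synthetic)
    pct = y * 0.5 + rng.normal(0, 1.2, n)       # weaker marker (synthetic)

    diffs = []
    for _ in range(2000):                       # paired bootstrap over patients
        idx = rng.integers(0, n, n)
        if len(set(y[idx])) < 2:                # resample must contain both classes
            continue
        diffs.append(roc_auc_score(y[idx], calpro[idx]) -
                     roc_auc_score(y[idx], pct[idx]))
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    print(f"AUC difference 95% CI: ({lo:.3f}, {hi:.3f})")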
Introduction: Rapid diagnosis of acute infections and sepsis is critical in Emergency Departments (EDs). Current tests have slow turnaround times, low sensitivities, and/or signals from contaminant or commensal organisms. Empirical antimicrobial treatment may result in severe adverse events and contributes to antimicrobial resistance. Diagnostics that distinguish bacterial from viral infections and noninfectious etiologies support clinicians in efforts toward antimicrobial stewardship.

Methods: In a prospective, non-interventional study in the EDs of 6 sites in Greece (PROMPT study, NCT03295825), we evaluated HostDx Sepsis, a host response test for suspected acute infections and suspected sepsis. HostDx Sepsis measures 29 human mRNA targets and employs advanced machine learning to differentiate patients with bacterial and viral infections and noninfectious etiologies. Adult patients presenting with suspected acute infection and at least one vital sign change were enrolled. Whole blood RNA was quantified using the NanoString nCounter. Predicted probabilities of bacterial and viral infection were calculated (BVN-1 algorithm). Patients were adjudicated in a retrospective chart review by 3 independent infectious disease specialists blinded to HostDx Sepsis results.

Results: Among 396 patients adjudicated as bacterial (56), viral (45), non-infected (1) or indeterminate (294), the Area Under the Receiver Operating Characteristics curve (AUROC) of HostDx Sepsis for predicting bacterial vs. viral/non-infected patients was 0.92, and the AUROC for viral vs. bacterial/non-infected patients was 0.87 (Fig. 1).

Conclusions: Our results indicate that HostDx Sepsis distinguishes bacterial from viral infections and other etiologies with high accuracy. HostDx Sepsis is currently being developed as a rapid point-of-care device with a turnaround time of less than 30 minutes. HostDx Sepsis may therefore assist ED doctors in making appropriate treatment decisions earlier, towards the ultimate goal of antimicrobial stewardship.

Introduction: We studied the diagnostic value of a leukocyte deformability assay that rapidly quantifies the immune activation signatures of sepsis in an undifferentiated population of adults presenting to the ED. ED clinicians must balance the benefits of early intervention against the risks of indiscriminate use of resource-intensive interventions. There are no currently available rapid diagnostics with acceptable performance to achieve this balance.

Methods: We prospectively enrolled adult patients within 5 hours of presentation with suspicion of infection in two EDs in the USA. EDTA-anticoagulated blood was drawn and analyzed using deformability cytometry [1]. Procalcitonin (PCT) levels were also measured. Patients were retrospectively adjudicated for Sepsis-3 by a physician committee using the entire medical record. Diagnostic performance characteristics and receiver operating curves were used to examine the diagnostic performance of the assay as well as of PCT.

Results: Of the 190 patients enrolled, 17.4% were adjudicated as septic. The leukocyte deformability assay demonstrated 91% sensitivity, 68% specificity, and 97% negative predictive value for a single cutoff. The AUC was 0.86 (Figure 1). PCT with a cutoff of 0.5 ng/mL had 55% sensitivity, 87% specificity, and 90% negative predictive value. The AUC for PCT (as a continuous variable) was 0.8.

Conclusions: The leukocyte deformability assay of immune activation signatures demonstrated superior diagnostic performance for sepsis when compared with PCT. The assay's diagnostic performance and rapid turnaround time of 5 minutes may positively impact patient outcomes while minimizing indiscriminate use of valuable resources in the ED.

Introduction: It is already known in the literature that high levels of mid-regional proadrenomedullin (MRproADM) are related to organ dysfunction in infections regardless of source and pathogen [1]. Similarly, microcirculatory impairment has been reported in sepsis. We examined the correlation between microcirculatory dysfunction and MRproADM as a sign of early organ failure.

Methods: We included 20 consecutive adult patients with suspected infection, sepsis or septic shock admitted to our Intensive Care Unit (ICU) as a first hospital admission with an expected ICU stay of >24 hours. MRproADM was measured daily during the first five consecutive days, and sublingual microcirculation was assessed with Incident Dark Field (IDF) technology at T1, T2 and T5. We collected information on SAPS II and APACHE scores, and the SOFA score at each timepoint.

Results: Ten patients had septic shock, 5 sepsis and 5 infection. Three patients died during the ICU stay. An MRproADM clearance of 20% or more between T1 and T2 was associated with improvement of the MFI (Mann-Whitney U test, median increase 12.35% versus 2.23%, p=0.005) (Figure 1). An MRproADM >1.42 nmol/l at ICU admission was associated with a worse SOFA score at all timepoints. Moreover, the MRproADM level at admission was significantly related to ICU mortality (AUC 0.941 [0.819-1]; p=0.017). MRproADM showed no relation to the absolute value of the MFI.
Conclusions: The study shows a good correlation between the clearance of the biomarker and the improvement in MFI. Moreover, our results support previous findings on the prognostic value of MRproADM in terms of SOFA score and ICU mortality.

Clinical performance of a rapid sepsis test on a near-patient molecular testing platform. R Brandon 1, J Kirk 2, T Yager 2, S Cermelli 2, R Davis 2, D Sampson 2, P Sillekens 3, I Keuleers 3, T Vanhoey 3. 1 Immunexpress, Seattle, United States; 2 Immunexpress, Seattle, United States; 3 Biocartis NV, Mechelen, Belgium. Critical Care 2020, 24(Suppl 1):P481

Introduction: The purpose of this study was to clinically validate a new, rapid version of the SeptiCyte™ assay on a near-patient testing platform (Biocartis Idylla™). SeptiCyte™ LAB is the first-in-class sepsis diagnostic to gain FDA clearance but has a complex workflow and a turnaround time (TaT) of ~6 hours. The assay in Idylla™ cartridge format is called SeptiCyte™ RAPID.

Methods: SeptiCyte™ LAB was translated to the Biocartis Idylla™ near-patient testing platform and analytically validated. For this study, 0.9 mL of peripheral blood PAXgene™ solution from previously collected patient samples was pipetted directly into the cartridge and inserted into the Idylla™ reader. Patients were part of an independent cohort (N=200) from intensive care units located in the USA and Europe. SeptiCyte™ RAPID results were reported as a SeptiScore™ between 0 and 10, with higher scores representing a higher probability of sepsis. Assay performance measures included technician hands-on time (HoT), assay TaT, failure rates, and the area under the ROC curve based on comparison with retrospective physician diagnosis.

Results: Average HoT was 2 minutes, and average TaT was 65 minutes. Clinical samples could be processed immediately with SeptiCyte™ RAPID and did not require the 2-hour pre-incubation of PAXgene blood, greatly improving TaT. Correlation of SeptiScore™ values between LAB and RAPID, based upon a subset of samples run on both platforms, was very high (R² > 0.97). Estimated ROC AUC performance for discriminating sepsis from non-infectious systemic inflammation (NISI/SIRS) was similar to that previously reported for SeptiCyte™ LAB.

Conclusions: This is the first demonstration of a validated, fully integrated, rapid, reproducible, near-patient, immune-response sepsis diagnostic, providing actionable results in ~1 hour to differentiate sepsis from non-infectious systemic inflammation/SIRS.

Accuracy of SeptiCyte™ for diagnosis of sepsis across a broad range of patients. R Brandon 1, K Navalkar 2, D Sampson 2, R Davis 2, T Yager 2. 1 Immunexpress, Seattle, United States; 2 Immunexpress, Seattle, United States. Critical Care 2020, 24(Suppl 1):P482

Introduction: The purpose of the study was to demonstrate the sepsis diagnostic performance of the SeptiCyte™ biomarkers in subjects other than critically ill adults, and in hospital locations other than the ICU. SeptiCyte™ LAB was the first immune-response sepsis diagnostic assay to gain FDA clearance (K163260) and, as part of gaining this clearance, clinical validation was performed on adult patients admitted to intensive care (ICU) only [1]. We therefore performed an in silico analysis across a broad range of patients using the SeptiCyte™ host immune response biomarkers and algorithm.

Methods: Peripheral blood gene expression data, including public and private datasets, were chosen based on quality, annotation, and clinical context for the intended use of SeptiCyte™.
Multiple comparisons were performed within datasets to better understand the diagnostic performance in certain cohorts, including healthy subjects. Diagnostic performance was determined using the area under the curve (AUC).

Results: Table 1 shows some characteristics of the selected datasets and patients, including the number of datasets (N=22) and comparisons (N=55), the number of cases (N=2234) and controls (N=2089) used in comparisons, patient category and hospital location. SeptiCyte™ AUCs for the three groups of adults, adult/pediatric and pediatric/neonates were 0.88, 0.85 and 0.87, respectively, which is similar to that previously reported (0.82-0.89) [1].

Conclusions: These results suggest that the SeptiCyte™ signature has diagnostic utility beyond adults suspected of sepsis and admitted to the ICU. This signature has now been translated to the near-patient testing platform Biocartis Idylla™ (as SeptiCyte™ RAPID), which promises rapid (~1 hour) diagnosis of sepsis in a broad patient population following further validation.

Introduction: Extracorporeal cardiopulmonary bypass (CPB) in particular is known to induce severe inflammation. Postoperative inflammation is associated with a sepsis-like syndrome including endothelial barrier disruption, volume depletion and hypotension. Sphingosine-1-phosphate (S1P) is a signaling lipid regulating permeability and vascular tone. In septic humans, decreased serum S1P levels have been identified as a marker of sepsis severity. We addressed three main issues: (1) Are serum S1P levels affected by cardiac surgery? (2) Are potential alterations of serum S1P levels related to changes in acute-phase proteins, S1P sources or carriers? (3) Is the invasiveness of the surgery a factor that may influence serum S1P levels?

Methods: 46 elective major cardiac surgery patients were prospectively enrolled in this study. Serum samples were drawn pre-procedure, post-procedure and on day 1 and day 4 after surgery. We analyzed S1P and its potential sources: red blood cells (RBC) and platelets. We further quantified levels of other inflammatory markers and documented other clinical parameters.

Results: Median serum S1P levels in all patients before the procedure were 0.77 (IQR 0.61-0.99) nmol/ml. Serum S1P levels decreased after surgery, whereas all other inflammatory markers increased. Serum S1P levels dropped by 58% in the on-pump and 31% in the off-pump group. Changes in serum S1P levels were associated with S1P sources and carriers: albumin, HDL and vWF:AG activity. Patients with a full recovery of their serum S1P levels after surgery, compared with their individual baseline, presented with a lower SOFA score (P>0.05) and a shorter ICU stay (P<0.05).

Conclusions: Serum S1P levels are disrupted by open heart surgery, and levels might be negatively affected by endothelial injury or loss of S1P sources. Low serum S1P levels may contribute to prolonged ICU stay and worse clinical status. Future studies may investigate the beneficial effects of S1P administration during cardiac surgery.

Introduction: The aim of this study was to measure and correlate the expression of nCD64, mHLA-DR, PCT (procalcitonin) and qCRP (quantitative C-reactive protein) to predict the development of sepsis and its outcome.

Methods: In this tertiary-centre-based longitudinal cohort study, a total of 110 patients were enrolled in whom sepsis was suspected on the basis of clinical diagnosis and supported by lab investigations. They were divided into two groups: sepsis/case and non-sepsis/control. Disease severity in the ICU was assessed by the Sequential Organ Failure Assessment (SOFA) score.
Blood samples for routine lab investigations and biomarkers were taken at the time of ICU admission, before administration of the first dose of antibiotics, at D0/D1. Assessment of biomarkers was done simultaneously with TLC at D0/D1 and D3, and during follow-up of patients until their final outcome.

Results: There was no significant (p>0.05) mean change in PCT, qCRP, SOFA, nCD64 or mHLA-DR from Day 1 to Day 3; however, the mean change was higher among cases than controls. On comparison of mHLA-DR between the groups across time periods, mHLA-DR was significantly (p=0.0001) lower among septic patients than controls at both Day 1 and Day 3. All biomarkers correctly predicted cases in different percentages of patients, with different sensitivities and specificities. There was no significant (p>0.05) association of mortality with the study biomarkers except for PCT.

Conclusions: In our study, the diagnostic value of PCT in differentiating sepsis from non-sepsis was similar to that of nCD64 among all biomarkers studied. No advantage of nCD64 or mHLA-DR over PCT was found in diagnosis or in correlation with disease progression and mortality.

Introduction: AQP4 is a water channel protein contributing to astrocyte and immune cell migration, blood-brain barrier maintenance and cell survival [1, 2]. AQP4 genetic variants represent biomarkers associated with outcome after traumatic brain injury and intracerebral hemorrhage [3, 4]. Linking AQP4 genetic polymorphism to the course of sepsis has not been studied.

Methods: The study cohort included 124 ICU patients diagnosed according to the SEPSIS-3 consensus. The AQP4 rs11661256 polymorphism was studied by analyzing PCR products in a 2% agarose gel using an AQP4-specific polynucleotide tetraprimer set. Data were analyzed by log-rank test (MedCalc 18.11.3), and odds ratios/hazard ratios were computed. Statistical significance was determined by Fisher test (FT) or Mann-Whitney test.

Results: 23 of 124 sepsis patients had the minor allele A of SNP rs11661256, located within the regulatory 3' region of the AQP4 gene. Septic shock occurred more frequently in homozygous carriers of the AQP4 C allele vs. patients with AA or CA genotypes: OR=3.75 (95%CI: 1.47-9.56), P=0.006 (FT). Lethality in septic shock patients (n=85) was significantly increased compared with sepsis patients without shock (n=39) (82% vs. 20%, P=0.001, FT). Maximum SOFA values were significantly lower in patients with minor allele A compared with CC carriers (9.6 vs. 12.0, respectively; P=0.008). In the post-surgery group of patients, carriers of AC or AA genotypes had significantly increased survival compared with patients with CC genotypes: chi-square=5.804; HR=0.455 (95%CI: 0.24-0.863) for lethality; P=0.016 (Figure 1).

Conclusions: The association of minor allele A of AQP4 SNP rs11661256 with survival in sepsis patients seems secondary to a link between the SNP and decreased development of multiorgan failure and septic shock, which contribute to mortality.

Validation of presepsin as a biomarker of sepsis in comparison to procalcitonin, IL-6 and IL-8. V Chantziara 1, F Kaminari 1, C Sklavou 1, S Fortis 2, P Kogionou 2, S Perez 2, A Efthymiou 1. 1 Saint Savvas Hospital, ICU, Athens, Greece; 2 Saint Savvas Hospital, Cancer Immunology and Immunotherapy Center, Athens, Greece. Critical Care 2020, 24(Suppl 1):P486

Introduction: Sepsis is an everyday challenge for the intensivist, and biomarkers are useful tools for the identification and treatment of this syndrome. We sought to validate presepsin as a biomarker of sepsis in comparison to PCT (procalcitonin) and interleukins (IL-6, IL-8).
Methods: We enrolled 25 patients, 18 men and 7 women, of average age 58 (39.5-74) years, APACHE II 16 (13.5-20.5), SAPS II 48 (40.5-58.5), SOFA 8 (6.5-9). Eleven patients were septic on admission (according to the Surviving Sepsis Campaign: International Guidelines for Management of Sepsis and Septic Shock: 2016), 9 had a septic episode during their hospitalization in the ICU, while 5 patients never endured sepsis. We measured presepsin, procalcitonin, IL-6 and IL-8 during sepsis and on remission.

Results: All septic patients had increased values of presepsin, PCT, IL-6 and IL-8 during sepsis, with a cutoff value for presepsin of 800 pg/ml, while the values of these biomarkers were significantly decreased during remission or in comparison with non-septic patients (presepsin p=0.002, PCT p≤0.001, IL-6 p≤0.001, IL-8 p=0.004). All patients who were not septic survived, while among septic patients 8 died (40% mortality). Presepsin correlated significantly with PCT, IL-6 and IL-8 (p<0.05).

Conclusions: Presepsin is a valid biomarker of sepsis and correlates significantly with PCT, IL-6 and IL-8.

Introduction: Clinical sepsis phenotypes have been proposed at hospital presentation. These phenotypes, biomarker profiles and outcomes have not yet been reproduced in prospective data. Even less is known about the biologic mechanisms that drive these distinct groups. Thus, we sought to validate the clinical phenotypes and to determine markers of innate immunity, coagulation, tolerance and tissue damage in a prospective cohort.

Methods: We prospectively studied patients meeting Sepsis-3 criteria within 6 hours of presentation at 12 hospitals in Pennsylvania (2018-2019) using automated electronic alerts. Using 29 clinical variables, we predicted phenotypes (α, β, γ, δ) for each patient using Euclidean distance anchored to the published SENECA phenotype centroids. Discarded blood was analyzed in a subset (n=160) for markers of innate immunity (e.g. IL-6, IL-10), coagulation (e.g. antithrombin III, e-selectin), tolerance (e.g. HO-1, IGFBP7) and tissue damage (e.g. serum lactate, bicarbonate).

Results: Among 549 patients, the α-type was present in 146 (27%), the β-type in 140 (25%), the γ-type in 168 (31%) and the δ-type in 95 (17%; Figure 1a). On average, the β-type was older and more comorbid (mean 73, SD 9 yrs; mean Elixhauser 4.6, SD 2.2), with renal dysfunction (median creatinine 1.8 [IQR 1.1-2.9] mg/dL; p<0.01 for all). The δ-type had more acidosis (mean HCO3- 20.1, SD 4.7 mEq/L), higher serum lactate (median 1.8 [IQR 1.0-3.5] mmol/L; p<0.01 for both) and higher inpatient mortality (13%; Figure 1b). The γ- and δ-types had greater markers of innate immunity and abnormal coagulation (e.g. IL-6, ICAM; p<0.01 for both), while markers of increased tissue damage (lactate) and poor tolerance (HO-1) were present in the δ-type compared with the α-type (Figure 1c).

Conclusions: The distribution and characteristics of the clinical sepsis phenotypes were reproduced in a prospective validation cohort. Similar to the SENECA study, distinct biomarker profiles of tissue damage, innate immunity and poor tolerance were present for the δ-type.
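Nearest-centroid phenotype assignment as described in the methods above reduces to an argmin over Euclidean distances in the standardized 29-variable space. In this numpy sketch both the centroids and the patient vectors are random placeholders; the published SENECA centroids are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(3)
    labels = np.array(["alpha", "beta", "gamma", "delta"])
    centroids = rng.normal(size=(4, 29))   # placeholder centroids (z-scored space)
    patients = rng.normal(size=(549, 29))  # placeholder standardized patient vectors

    # Distance of every patient to every centroid, then nearest-centroid label.
    d = np.linalg.norm(patients[:, None, :] - centroids[None, :, :], axis=2)
    phenotype = labels[d.argmin(axis=1)]
    print(dict(zip(*np.unique(phenotype, return_counts=True))))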
Introduction: The effect that neoadjuvant chemotherapy and hyperthermic intraperitoneal chemotherapy (HIPEC) may have on the postoperative kinetics of biomarkers remains unknown. Some studies demonstrate that neoadjuvant chemotherapy and HIPEC do not invalidate the use of inflammatory markers in postoperative patient monitoring, but none have compared biomarker kinetics between patients who underwent HIPEC and those who underwent cytoreduction surgery only. Our main purpose was to identify a differing pattern in C-reactive protein (CRP).

Methods: We conducted a single-center observational study from January 2015 to November 2019, including all patients who underwent cytoreductive surgery with or without HIPEC. CRP was measured daily until the seventh postoperative day. We compared patients with and without HIPEC.

Results: A total of 19 patients were included, of whom 15 were female. Mean age was 63 yrs (44-76). No clinical or demographic differences were observed between groups. No documented infection was found. After surgery, CRP increased markedly in both groups. The CRP time-course from the day of surgery onwards was significantly different in HIPEC patients (9.78 ± 3.95 mg/dL vs 14.80 ± 5.63 mg/dL; p=0.035). Multiple comparisons between HIPEC and non-HIPEC patients were performed, and CRP concentration was significantly different on the 5th and 7th POD (Figure 1). No differences were found in other biomarkers (leucocytes and platelets) or in body temperature.

Conclusions: After a major elective surgical insult, CRP levels increase markedly, independently of HIPEC. The serum CRP time-course showed a higher pattern in HIPEC patients despite no infection being detected.

Decreased thrombin generation potential is associated with increased thrombin generation markers in sepsis-associated coagulopathy. D Hoppensteadt 1, F Siddiqui 1, E Bontekoe 1, R Laddu 1, R Matthew 2, E Brailovsky 3, J Fareed

Introduction: Sepsis-associated coagulopathy (SAC) is commonly seen in patients and leads to dysfunctional hemostasis, in which uncontrolled protease generation results in the consumption of clotting factors. The purpose of this study was to determine the thrombin generation potential of baseline blood samples obtained from SAC patients and to demonstrate their relevance to thrombin generation markers.

Methods: Baseline citrated blood samples were prospectively collected from 49 patients with SAC at the University of Utah clinic. Citrated normal controls (n=50) were obtained from George King Biomedical (Overland Park, KS). Thrombin generation studies were carried out using a fluorogenic substrate method. TAT and F1.2 were measured using ELISA methods (Siemens, Indianapolis, IN). Functional antithrombin levels were measured using a chromogenic substrate method.

Results: Peak thrombin and AUC levels were lower in the SAC patients in comparison with the higher levels observed in normal plasma (Table 1). The SAC group showed a much longer lag time in comparison with the normal group. Wide variations in these parameters were observed in the SAC group. F1.2 and TAT levels in the SAC group were much higher in comparison with normal, and functional antithrombin levels were decreased in the SAC group.

Conclusions: These results validate that thrombin generation markers such as F1.2 and TAT are elevated in patients with SAC. However, thrombin generation parameters are significantly decreased in this group in comparison with normal. This may be due to consumption of prothrombin following activation of the coagulation system. Thus, persistent thrombin generation with simultaneous consumption of clotting factors such as prothrombin contributes to the consumption coagulopathy observed in sepsis patients.

Introduction: Procalcitonin (PCT) is used in the ICU as an inflammatory marker to monitor bacterial infections and guide antibiotic therapy. Whether PCT can predict bacteremia, and could therefore prevent the expenses attached to blood cultures, is unknown. We investigated whether PCT can predict the outcome of blood cultures in the ICU and reduce expenses.
Methods: A single-centre observational cohort study was performed in a Dutch community teaching hospital. Adult patients staying in the ICU who were suspected of bacteremia were included. Simultaneously with the drawing of blood cultures, samples for PCT measurement were obtained. Expenses for PCT measurement and blood cultures were calculated.

Results: In the study period of one year, a total of 120 patients were included. Three patients were excluded because of incomplete data. Of the 117 included patients, ten had positive blood cultures. There was a significant difference in PCT levels between patients with positive blood cultures and patients with negative blood cultures (8.01 ng/ml vs 0.71 ng/ml) (Figure 1). The negative predictive value for negative blood cultures was 97% when PCT was below 2 ng/ml. There was no difference in CRP levels between the two groups (148 mg/l vs 179 mg/l, p=0.83). A set of negative blood cultures in our centre costs 35 euros; positive blood cultures cost significantly more, depending on the micro-organisms found. PCT costs only 8.50 euros per measurement. Thus, if blood cultures are omitted when the PCT level is below 2 ng/ml, a cost reduction of 38% can be achieved.

Conclusions: A PCT value below 2 ng/ml is a good predictor of negative blood cultures in ICU patients suspected of bacteremia. PCT-guided blood culture management in these patients could lead to a significant cost reduction.

Fig. 1 (abstract P490): PCT values in patients with positive blood cultures and patients with negative blood cultures.

Introduction: The level of cfDNA in plasma is a promising prognostic candidate biomarker in critical illness [1]. Oxidized cfDNA (ocfDNA) has not been studied as a biomarker, although its functional role in cellular stress has attracted the attention of researchers [2]. The goal of our study was to assess the early prognostic value of plasma cfDNA/ocfDNA for sepsis in a NICU setting.

Methods: The cohort included 115 NICU patients diagnosed with stroke, intracerebral hemorrhage (ICH), anoxia or encephalopathy. cfDNA was isolated from day 1 plasma and stained with PicoGreen. Oxidized DNA was determined using DNA immunoblotting with anti-8-oxo-desoxyguanosine antibodies. Genotyping of allelic variants of the TLR9 rs352162 gene was performed using PCR with designed allele-specific tetraprimers, followed by electrophoretic separation of the products. Statistics were performed by the Fisher test and Mann-Whitney test.

Results: Sepsis was diagnosed by SEPSIS-3 criteria in 35 patients (30.4%). The average NICU stay was 8.8±11.1 days. Circulating DNA plasma levels on day 1 predicted future sepsis development (Figure 1): the OR for cfDNA was 7.54 (95%CI: 3.03-18.76), P<0.001; the OR for ocfDNA was 5.57 (95%CI: 1.64-18.97), P=0.008. The power of both performed tests with alpha=0.05 was 1.0. The log-rank test demonstrated a better predictive value for cfDNA vs. ocfDNA (Figure 1). Concentrations of cfDNA, but not ocfDNA, on day 1 correlated significantly and positively with maximum SOFA values during hospitalization, and with day 3 and pre-outcome leukocyte counts and neutrophil-to-lymphocyte ratios, in the limited cohort of NICU patients with the TLR9 rs352162 CC genotype, but not in the other patients with TLR9 genotype CT+TT.

Conclusions: An increased level of plasma cfDNA predicts sepsis development in the NICU better than ocfDNA. Further studies are warranted to clarify the possible utility of determining the TLR9 rs352162 polymorphism for sepsis risk stratification early upon NICU admittance.
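The 38% saving reported in the PCT abstract (P490) above follows from simple expected-cost arithmetic once a fraction of patients below the 2 ng/ml threshold is assumed; the quoted prices (35 euros per culture set, 8.50 euros per PCT) are from the abstract, while the fraction of roughly 0.62 is an assumption chosen to reproduce the reported figure.

    culture, pct = 35.00, 8.50           # euros, as quoted in the abstract

    def cost_per_patient(f_below_threshold):
        """Expected cost when cultures are drawn only if PCT >= 2 ng/ml."""
        return pct + (1 - f_below_threshold) * culture

    baseline = culture                    # strategy: culture every suspected patient
    guided = cost_per_patient(0.62)       # assumed fraction with PCT < 2 ng/ml
    print(f"saving: {100 * (baseline - guided) / baseline:.0f}%")  # -> ~38%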
admission was related to higher severity of illness and a longer ICU stay in all groups. Reduced CBT fluctuations upon ICU admission were found in more severely ill patients with worse clinical outcomes, while more periodic CBT patterns were correlated with high CBT rhythmicity and better outcome. The impact of sex on sepsis incidence and mortality has been examined in previous studies, and sex is increasingly recognized as one key factor in sepsis [1]. Some studies indicate that women have better immunologic responses to infections [2]. Later investigations suggest this advantage is linked to immune-modulating genes located on the X chromosome [3]. The purpose of this study was to reveal sex differences in the incidence of and mortality from sepsis in a large population-based cohort. Methods: 64,049 adult participants in the HUNT2 study (1995-97) were followed from inclusion through the end of 2011. Incident bloodstream infections (BSI) from all local and regional hospitals in Nord-Trøndelag county were identified through linkage with the Mid-Norway Sepsis Register, which includes prospectively registered information on BSI used as a specific indicator of sepsis. We estimated the age-adjusted cumulative incidence of first-time BSI and compared the risk of a first-time BSI and BSI mortality in men and women using age-adjusted Cox proportional hazard regression. During a median follow-up of 14.8 years, 1840 individuals experienced at least one episode of BSI, and 396 died within 30 days after a BSI. Cumulative incidence and cumulative mortality curves are shown in Fig. 1a. Introduction: The proportion of hospital-acquired infections (HAI) among sepsis patients is unknown in Germany. Systematic differences in HAI foci between sepsis patients with and without ICU treatment are insufficiently described. Retrospective cohort study based on nationwide health claims data of the German statutory health insurance AOK. Incident inpatient sepsis cases were identified in 2013/2014 among insured persons >15 y without preceding sepsis in the 24 months prior to the index hospitalization. Sepsis was defined according to explicit sepsis ICD-10 codes (incl. severe sepsis/septic shock). HAI were defined based on specific ICD-10 codes for surgical site infection, catheter- Introduction: Elevated renin is associated with an increased risk of death in patients with vasodilatory shock (VS). Recent data show that patients with VS and elevated renin levels have improved survival when treated with angiotensin II (Ang II) + standard care (SC) vs placebo + SC. Patients with acute respiratory distress syndrome (ARDS) can develop angiotensin-converting enzyme (ACE) defects that can lead to elevated renin levels and insufficient endogenous Ang II production. We hypothesized that patients with severe ARDS and elevated-renin shock would have improved survival when treated with Ang II + SC vs placebo + SC. In the randomized, placebo-controlled, double-blind ATHOS-3 study, 321 patients with severe VS receiving >0.2 μg/kg/min of norepinephrine or the equivalent were randomized to intravenous Ang II (n=163) or placebo (n=158). In a post hoc analysis, we assessed the subset of patients with elevated renin (defined as a renin level greater than the median value of the overall ATHOS-3 population) and ARDS (defined by a PaO2/FiO2 ratio <300) at the time of randomization. Survival to 28 days was compared between the Ang II group (n=41) and the placebo group (n=61).
In patients with elevated renin and ARDS, baseline age, Acute Physiology and Chronic Health Evaluation II score, and blood pressure were similar in the Ang II and placebo groups. The median serum renin level was 459.5 pg/ml (IQR: 285.8-1036.0), compared with a normal range of 5-58 pg/ml. A significantly higher proportion of patients receiving Ang II survived to day 28 compared with those in the placebo group (51% vs 31%; p=0.01). Elevated renin identified patients with VS and ARDS who were most likely to gain a survival benefit from Ang II. Elevated renin is likely caused by an ACE defect and may identify an important subset of patients with a biotype that responds well to Ang II therapy. Introduction: Elevated renin levels have been shown to be associated with an increased risk of death and more severe acute kidney injury (AKI) in patients with vasodilatory shock (VS). Recent data show that patients with VS and elevated renin levels have improved survival when treated with angiotensin II (Ang II) + standard care (SC) vs placebo (PBO) + SC. We hypothesized that VS patients with severe AKI and elevated renin levels would have improved survival and enhanced renal recovery with Ang II treatment. In the randomized, PBO-controlled, double-blind ATHOS-3 study, 321 patients with severe VS received >0.2 μg/kg/min of norepinephrine or the equivalent and were randomized to intravenous Ang II + SC (n=163) or PBO + SC (n=158). In a post hoc analysis, we assessed the subset of patients with elevated renin (defined as a renin level greater than the median value of the overall ATHOS-3 population) and severe AKI (defined as AKI requiring renal replacement therapy [RRT] at baseline). Survival and renal recovery were assessed in patients treated with Ang II + SC (n=45) and PBO + SC (n=60). In patients with elevated renin and severe AKI, baseline age, Acute Physiology and Chronic Health Evaluation II score, and blood pressure were similar between the Ang II + SC and PBO + SC groups. The median baseline serum renin level in the whole group was 352.5 pg/ml (IQR: 115.9-785.4; normal range for serum renin: 5-58 pg/ml). A significantly higher proportion of patients receiving Ang II + SC vs PBO + SC survived to day 28 (43% vs 22%, respectively; p=0.03). Ang II recipients also had a higher rate of discontinuation from RRT by day 7 (43% vs 12%; p=0.04). In this study, elevated-renin shock patients with AKI treated with Ang II + SC gained a survival benefit and earlier discontinuation of RRT compared with those receiving PBO + SC. Elevated renin is likely caused by an angiotensin-converting enzyme defect and may identify those patients with a biotype that responds well to Ang II therapy. Most clinical trials have concluded that anticoagulation is ineffective for sepsis-induced coagulopathy [1]. However, post hoc analyses of randomized controlled trials report positive results [2], suggesting that anticoagulation is effective in specific populations exhibiting coagulopathy. Furthermore, anticoagulants should be administered in the early phase [3]; however, methods for precisely predicting the progression of sepsis-induced coagulopathy are not established. This study aimed to create and evaluate a prediction model of coagulopathy progression using machine-learning techniques.
We performed a subgroup analysis of data from a retrospective cohort study involving adult septic patients in 40 Japanese institutions from January 2011 to December 2013 and used the Japanese Association for Acute Medicine disseminated intravascular coagulation (DIC) score as the index of DIC severity. The ability to predict ΔDIC ([DIC score on day 3] - [DIC score on day 1]) was evaluated using various statistical methods. Using variables available at the outset, we compared the predictive ability of Random Forest (RF) and Support Vector Machine (SVM) models with that of multiple linear regression analysis. A total of 1110 adults with sepsis were included in the analysis. The root mean square error (RMSE) in ΔDIC score was 2.1168 for the multiple linear regression model, compared with 1.6508 for RF and 1.9394 for SVM. Thus, the RF method predicted the progression of sepsis-induced coagulopathy more accurately than multiple linear regression analysis. Conclusions: RF, a machine-learning technique, was superior to multiple linear regression analysis in predicting the progression of sepsis-induced coagulopathy (an illustrative code sketch of such a comparison follows below). This prediction model might enable the use of anticoagulation in an early phase. This study examined the efficacy and safety of landiolol, an ultrashort-acting β1-blocker, for treating sepsis-related tachyarrhythmia according to patient background characteristics. The J-Land 3S study (JapicCTI-173767) was conducted in patients with sepsis, diagnosed according to the Sepsis-3 criteria, and tachyarrhythmia (atrial fibrillation, atrial flutter, or sinus tachyarrhythmia). The patients had a mean heart rate of ≥100 beats/min and required catecholamine administration to maintain a mean blood pressure of ≥65 mmHg. The efficacy endpoint was the percentage of patients whose heart rate was controlled within 60-94 beats/min at 24 h after registration. The safety endpoint was the incidence of adverse events within 168 h after registration. Subgroup analyses of efficacy and safety were performed after stratifying the patients according to various background characteristics. A total of 151 patients were randomized, 76 to landiolol and 75 to the control group. The efficacy endpoint, the percentage of patients with a heart rate of 60-94 beats/min at 24 h after registration, was significantly higher in the landiolol group (54.7% vs 33.3%; Mantel-Haenszel test: p=0.0031). The incidence of adverse events was 63.6% and 59.5% in the landiolol and control groups, respectively, with no difference between the two groups. Most adverse events were related to sepsis or septic shock. The subgroup analyses showed that no patient background characteristic clearly affected the efficacy or safety of landiolol. Landiolol is a well-tolerated and effective therapeutic agent for controlling heart rate in patients with sepsis-related tachyarrhythmias; its safety and efficacy were not affected by the patient background characteristics investigated. Tissue oxygenation monitoring in sepsis R Marinova, AT Temelkov UMHAT Alexandrovska, Anesthesiology and intensive care, Sofia, Bulgaria Critical Care 2020, 24(Suppl 1):P502 Near-infrared spectroscopy (NIRS) was proposed as a concept at the end of the 20th century. This method offers noninvasive monitoring of oxy- and deoxyhemoglobin in tissues. NIRS can be measured on the thenar eminence or the forehead within a few centimeters of the skin. It was first applied as a monitoring technique in cardiovascular surgery.
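Returning to the coagulopathy-prediction abstract above: the following Python sketch (scikit-learn) shows the shape of such a model comparison, fitting multiple linear regression, SVM, and Random Forest regressors and comparing cross-validated RMSE for a continuous target standing in for ΔDIC. The data, feature set, and validation scheme are synthetic assumptions; the study's actual variables are not given in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

# Synthetic stand-ins: 1110 "patients" with 8 baseline variables and a
# continuous target playing the role of ΔDIC (day 3 minus day 1 DIC score).
rng = np.random.default_rng(0)
X = rng.normal(size=(1110, 8))
y = X[:, 0] - 0.5 * X[:, 1] + X[:, 2] ** 2 + rng.normal(scale=1.5, size=1110)

models = {
    "multiple linear regression": LinearRegression(),
    "support vector machine": SVR(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name:27s} RMSE = {rmse:.3f}")
```

On data with a nonlinear component like this, the tree ensemble typically achieves the lowest cross-validated RMSE, mirroring the pattern reported in the abstract.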
Patients with sepsis have changes in the microcirculation that are an important target for therapy. Invasive monitoring of oxygen delivery and consumption has been used in patients with sepsis, but, like every invasive technique, such monitoring carries risks. NIRS offers a noninvasive method for tissue oxygenation monitoring (StO2) and could be useful in patients with sepsis and septic shock. The aim of the study was to compare noninvasive tissue oxygenation monitoring with hemodynamic monitoring and lactate values in patients with sepsis. Methods: The study included 19 critically ill patients in the ICU of UMHAT Alexandrovska, Sofia. Ten of the patients fulfilled the criteria for a septic state; the other nine did not have sepsis. In both groups, tissue oxygenation (INVOS monitor), mean arterial pressure, mixed venous oxygen saturation, and lactate values were measured during the 72 h after ICU admission. Patients with sepsis showed significantly lower tissue oxygenation values than patients without sepsis. Tissue oxygenation values correlated well with mixed venous oxygen saturation, mean arterial pressure, and lactate values, but not significantly with APACHE scores. Conclusions: NIRS-based tissue oxygenation monitoring correlates well with hemodynamic monitoring and lactate values in patients with sepsis and could be used as a noninvasive monitor for guiding therapeutic strategies. Tissue oxygenation has no linear correlation with severity of illness in patients with sepsis and cannot be recommended as a guide in the early resuscitation stage of sepsis. Further investigations in this field are needed. ...the Sequenom MassARRAY platform, and a recessive inheritance model was selected (CC vs CT/TT). The possible association between the recessive CC form of the rs279451 polymorphism and septic shock risk was analyzed, demonstrating a statistically significant relationship (p=0.02) between the two. Among patients who developed septic shock, 79.2% presented a recessive inheritance pattern, while 54.5% showed the CT/TT genotype. In addition, the patients with the recessive form of the rs279451 polymorphism were selected and a statistical analysis was performed comparing those who developed septic shock with those who did not; a statistically significant relationship (p=0.036) was found between the presence of the recessive form of the polymorphism and the likelihood of developing septic shock. The recessive form of the rs279451 polymorphism is a risk factor for septic shock in patients after major abdominal surgery. Introduction: Sepsis remains one of the major causes of morbidity, with mortality rates as high as 50% worldwide, representing a significant clinical challenge with largely unmet therapeutic needs. RNA-based structures are emerging as versatile tools encompassing a variety of functions, capable of bypassing current protein- and cell-based therapies. RNA aptamers act as antagonists of disease-associated proteins. Here, the effects of an aptamer, Apta-1, were evaluated in animal models that mimic systemic inflammation in humans. A high dose of LPS endotoxin was used to induce systemic inflammation in mouse and non-human primate models. Apta-1 was administered intravenously in two doses after LPS challenge. Animals were monitored and blood samples were collected up to 72 hours after Apta-1 administration. Healthy and LPS-only-treated animals served as control groups.
Comprehensive analyses of clinical parameters, hematology, serum biochemistry, and markers of inflammation and tissue damage were performed. Results: Apta-1 increased survival of endotoxin-challenged animals by up to 80% in a dose-dependent manner and exerted profound effects on well-being and the recovery of healthy eating habits. Administration of Apta-1 led to delayed coagulation and enhanced fibrinolysis, and kept the complement cascade activated while preventing its further amplification. Expression of pro-inflammatory cytokines was reduced, while that of anti-inflammatory cytokines increased. Endogenous pro-inflammatory molecules (DAMPs) released from injured cells were preserved at healthy levels in animals treated with Apta-1. Systemic inflammation and sepsis lead to severe dysregulation of several arms of the innate immune response. Our studies showed that Apta-1 affects various components of this system and restores the organism's control over its dysregulated immune response. Thus, Apta-1 might be a promising therapeutic candidate for treating life-threatening conditions such as sepsis. Several preclinical studies have demonstrated beneficial effects of methane (CH4) administration in various inflammatory conditions. Our aim was to investigate the consequences of post-treatment with inhaled CH4 in a clinically relevant intra-abdominal sepsis model. Anesthetized minipigs were subjected to fecal peritonitis (0.6 g/kg, 5-9×10^6 CFU i.p.; n=22) or sham operation (sterile saline i.p.; n=5). Invasive hemodynamic monitoring with blood gas analyses was started between 16 and 24 hours; organ dysfunction parameters (PaO2/FiO2 ratio; mean arterial pressure; lactate, bilirubin, and creatinine; urine output; and platelet counts) were determined according to a modified porcine-specific Sequential Organ Failure Assessment (ps-SOFA) score, and the perfusion rate (PR) of the sublingual microcirculation was measured by incident dark field illumination imaging. The animals were divided into non-treated septic and septic shock groups (n=6 each) and CH4-treated septic and septic shock subgroups (n=5 each); CH4 inhalation was started from the 18th hour (2.2% CH4 in normoxic air; 500 mL/min). Despite the standardized induction, organ damage of heterogeneous severity evolved. In the septic and septic shock groups, the median ps-SOFA score reached 5 (4.75-5.65) and 13 (11.75-14), respectively. Septic shock was characterized by significant elevations of creatinine and bilirubin levels, while the platelet count decreased (from 332 to 76×10^9/L). Inhalation of CH4 increased the sublingual PR by 22% in the septic group, and creatinine and bilirubin levels decreased by 28% and 80%, respectively. CH4 post-treatment significantly decreased the ps-SOFA score (to 1; 0.5-2.75) and resulted in lower values in the septic shock group (to 10; 9.5-12.4). Methane post-treatment effectively influences sepsis-related end-organ dysfunction. Up to a severity threshold, it may be a promising additional organ-protective tool. Evaluation of sepsis awareness among various groups in Turkey: a survey study S Erel, O Ermis, Ö Nadastepe, L Karabıyık Gazi University School of Medicine, Anesthesiology and Intensive Care, Ankara, Turkey Critical Care 2020, 24(Suppl 1):P510 Introduction: Sepsis is a common life-threatening condition in critically ill patients [1]. Public awareness is important for early recognition of sepsis and improvement of outcomes [2]. We aimed to evaluate sepsis awareness among different groups of people.
Methods: Prospective paper-based surveys were administered between 1 July and 1 August 2019 to patients, relatives of patients, hospital staff, and members of the general public who gave consent to participate in the study. The questionnaire included ten questions on demographic information, occupational information for hospital staff, and sepsis awareness. A total of 588 people participated in the survey. Of these participants, 87 (14.3%) were patients, 50 (8.5%) were relatives of patients, 134 (22.8%) were physicians, 125 (21.3%) were medical students, 49 (8.3%) were nurses, 51 (8.7%) were other hospital staff, and 92 (15.6%) were members of the general public. Of these participants, 425 (72.3%) had heard of the word "sepsis", and 206 (35.0%) responded correctly regarding the definition of sepsis. 325 (55.3%) of the participants had heard the word "sepsis" during their education, but only 53 (9%) had heard it through the media. Among high school graduates, university graduates, and postgraduates, the rates of having heard the word "sepsis" and of correctly defining it were significantly higher than among primary school graduates and illiterate participants (p<0.05). Physicians, nurses, and medical students had heard of the word "sepsis" significantly more often than the other groups (p<0.005). Physicians and medical students responded more accurately regarding the definition of sepsis than the other groups (p<0.05). Public awareness of sepsis is limited compared with that of healthcare workers. Increasing public knowledge of sepsis through education and the media may contribute to raising public awareness and improving outcomes. The association between clinical phenotype cohesiveness and sepsis transitions after presentation JN Kennedy 1, EB Brant 1, KM DeMerle 2, CH Chang 3, S Wang 4, DC Angus 1, CW Seymour 5 1