Introduction The theft and tampering of controlled drugs (CDs) remains a prevalent patient safety issue. Sadly there are numerous reports of critical care staff stealing CDs for personal use or financial gain, and notably there have been cases where CDs were substituted for other medications in order to delay detection of the theft. This creates both the hazard of medication errors and potentially exposes patients to opioid-intoxicated healthcare workers. As most critical care staff have access to CDs, when drugs are found to be missing it can be difficult to identify the perpetrators. The implementation of a deterrent which also improves the methods of detection is therefore warranted.

Methods The Limpet, a device which incorporates a proximity sensor and a camera unit, was installed within the CD cupboard of the critical care unit. Whenever the cupboard was accessed, the date and time were recorded and a photograph was taken to identify the staff member. Mock thefts were subsequently undertaken by a designated staff member at random times. This allowed testing of the product to determine the number of times the 'thief' was correctly identified.

Introduction The purpose of this study was to assess our practice in identifying and managing inadvertent hypothermia and whether this could be improved by education and introduction of a protocol. Hypothermia is associated with multiple physiological abnormalities including reduced myocardial contractility, peripheral vasoconstriction, increased infection risk and impaired coagulation [1]. Inadvertent hypothermia may therefore be an avoidable risk factor in the critically ill. The UK National Institute of Clinical Excellence has issued guidance for avoidance of inadvertent hypothermia (temperature <36°C) during the perioperative period. We audited our practice against three standards from these guidelines: all patients should have at least 4-hourly temperature observations; no patient should become inadvertently hypothermic; and all inadvertently hypothermic patients should be rewarmed.

Methods Data were collected prospectively. Body temperature was recorded routinely by nursing staff using a tympanic thermometer. We noted any occasion where the body temperature dropped below 36°C, along with any associated interventions, such as the use of additional bed sheets or a forced air warming device. After the first audit period a simple education programme was delivered. We also introduced a departmental protocol for the prevention and management of inadvertent hypothermia. Six months later we re-audited our practice.

Results Data were collected from 130 patients (2,931 patient-hours) in the first audit period and from 87 patients (2,070 patient-hours) in the second audit period. In the first period 29% of patients had at least 4-hourly temperature measurements compared with 40% in the second period (P <0.01). The average number of overdue temperature observations per day was 1.4 in the first period and 0.9 in the second (P <0.01).
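To make the audit standards above concrete, here is a minimal sketch in Python (our illustration, not the authors' code; the readings are invented) that flags observation gaps longer than 4 hours and readings below the 36°C hypothermia threshold:

```python
# Minimal sketch of the audit standards: observations at least 4-hourly,
# and any tympanic reading below 36.0 degrees C counts as inadvertent
# hypothermia. All data below are hypothetical.
from datetime import datetime, timedelta

FOUR_HOURS = timedelta(hours=4)
HYPOTHERMIA_THRESHOLD_C = 36.0

readings = [  # (timestamp, tympanic temperature in degrees C) for one patient
    (datetime(2013, 1, 1, 8, 0), 36.4),
    (datetime(2013, 1, 1, 12, 0), 35.8),   # hypothermic -> should trigger rewarming
    (datetime(2013, 1, 1, 18, 30), 36.1),  # 6.5 h since last reading -> overdue
]

overdue = sum(1 for (t0, _), (t1, _) in zip(readings, readings[1:])
              if t1 - t0 > FOUR_HOURS)
hypothermic = [(t, c) for t, c in readings if c < HYPOTHERMIA_THRESHOLD_C]

print(f"overdue observations: {overdue}")
print(f"readings requiring intervention: {hypothermic}")
```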
Twenty-four per cent of patients became hypothermic in the first period compared with 22% in the second (P = 0.07); however, the time these patients remained hypothermic reduced from an average of 7.9 hours to 6.1 hours (P <0.01). An intervention was made and documented in 15% of cases in the first period and 46% in the second (P <0.001).

Conclusion We saw some improvement following an education programme and introduction of a clinical protocol, although there remains room for further improvement.

Introduction Ventilator-associated pneumonia (VAP) is the leading cause of death amongst hospital-acquired infections and is also linked to an increased length of stay and cost of care. The Institute for Healthcare Improvement ventilator bundle comprises a series of interventions which, when implemented together, have been shown to decrease the incidence of VAP. The aim of this study was to determine compliance with the bundle and, if below 95% [1], to devise strategies to improve it.

Methods A retrospective review of compliance with an existing VAP bundle was conducted for all adult patients ventilated in the ICU of a large district general hospital in 2011. The bundle comprised four elements: head up 30°, peptic ulcer prophylaxis, deep vein thrombosis (DVT) prophylaxis and sedation hold. The bundle was considered compliant only if all four were performed. The findings of the audit were presented to the department and, through subsequent discussions, barriers to compliance were identified. Following a period of education, a revised and updated bundle was implemented. A repeat audit covering 3 months was subsequently conducted.

Results Pre intervention, overall compliance with the bundle stood at 32%, and it subsequently increased to 63% post intervention (P <0.05). Compliance at the level of individual elements varied: head up 30°, 94%; ulcer prophylaxis, 91%; DVT prophylaxis, 85%; sedation hold, 37%. Post intervention, a statistically significant increase in compliance with the sedation hold was observed: 72% (P <0.05). The other individual elements did not show a statistically significant change. However, the new elements that were introduced demonstrated high levels of compliance: cuff pressure 20 to 30 cmH2O, 83%; and oral hygiene with chlorhexidine, 90%.

Conclusion Post intervention, a statistically significant improvement in overall bundle compliance was found, highlighting that by engaging all members of the multidisciplinary team in identifying barriers to compliance and delivering education, it is possible to improve compliance. While total compliance remained below the 95% target, the bundle redesign has provided a tool that records compliance with greater clarity owing to clearly defined exclusion criteria. It has also been a significant step towards improving the reliability of care delivered to patients, and it reinforces the concept that quality improvement is a continuous cycle.

Introduction The demand for intensive care has reached new heights, as medical advances and ageing populations have increased the proportion of patients with multiple comorbidities. Ideally, bed occupancy on the ICU should be around 70%, whereas occupancy beyond 80% has been associated with increased mortality. William Harvey Hospital is a district general hospital with nine ICU beds where the ICNARC national database suggested a consistently high occupancy. The purpose of this study was to assess bed occupancy and its impact on service delivery.
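As a minimal illustration of the occupancy metric referred to above (our sketch; the figures are invented, not the William Harvey Hospital data):

```python
# Bed occupancy over a period: occupied bed-days as a percentage of
# available bed-days. Inputs below are hypothetical.
def occupancy_rate(occupied_bed_days: float, beds: int, days: int) -> float:
    return 100.0 * occupied_bed_days / (beds * days)

rate = occupancy_rate(occupied_bed_days=3100, beds=9, days=365)
print(f"annual occupancy: {rate:.1f}%")
if rate > 80.0:
    # the threshold the Introduction above associates with increased mortality
    print("occupancy above 80%: expect pressure on quality indicators")
```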
Methods Data were collected between 2005 and 2013 to analyse trends in ICU bed occupancy, overnight discharges, surgical cancellations and nonclinical transfers. Data were gathered from April 2012 to March 2013 using quality indicators (Intensive Care Society): readmissions; premature discharges; discharges at night; cancelled planned surgery; and nonclinical transfers. Regression analysis was conducted to assess the correlation between mortality and the effects of excess occupancy, and the causality of increasing demand for ICU beds.

Results Bed occupancy remained high between 2005 and 2013, with a 15% increase. This was associated with increased overnight discharges (29%), surgical cancellations (90%) and nonclinical transfers (88%). Between 2012 and 2013, the calculated demand for beds was 174. There were 1.8 ICU beds per 100 hospital beds and 4.5 ICU beds per 100,000 population, ratios which dropped significantly when regional specialty services were included. A persistent gap between ICU bed supply and demand was associated with increased unfavourable outcomes in all quality indicators.

Conclusion A steady increase in occupancy over the last 8 years was due to multiple factors, including an increase in clinical services such as the coronary care centre, head and neck unit, and trauma centre. Presentation of our results at managerial level facilitated a 22% increase in bed capacity.

Introduction Physicians have an intense and irregular working life, so they need to use their time for continuing education in the most efficient way. This presents a major challenge to providers of continuing study programmes. Our online master programme Physico-Technical Medicine (PTM) addresses working physicians who want to acquire knowledge in the fields of biomedical engineering and medical physics [1]. We attach great importance to the special needs of our participants. We therefore investigated which general conditions are most important to the working physician regarding qualifying continuing education.

Methods The general conditions of our continuing education programme (PTM) were examined on the basis of evaluation questionnaires. Additionally, students were asked in interviews about their demands, and they could give anonymous feedback on feedback cards.

Results Flexibility: students appreciated flexibility regarding the time and environment of their learning activities. This flexibility enables students to use even small time frames or travelling time for learning, choosing their preferred time and place for their learning activities. Security in planning: students have to organize their time frames for learning alongside their daily work, their family life and possibly their academic career. Phases of attendance in particular should be planned and communicated as early as possible. Transparency of structure: participants ask for transparency in terms of the expected workload and expenditure of time, to plan their studies and stay focused. Relation to previous knowledge: learning success is supported by relating new content to previous knowledge from the student's first medical degree. Clinical examples in particular illustrate abstract topics, which also helps students to recognize the relevance of the content. Individual support: students need contact persons for learning organization, technical support, and questions regarding the course material.
Conclusion We could show that an extra-occupational continuing study programme for physicians should be adjusted to their special situation and therefore has to comply with particular requirements. We identified five main requirements, which are continuously fulfilled by the online master programme PTM and regularly surveyed through evaluations.

Introduction A collaborative approach to patient management has been shown to improve patient outcomes. In a resource-limited NHS, critical care beds are at a premium, so preventing unnecessary admissions by optimising ward-based management is essential. The objective of this service quality improvement project was to improve collaboration between the critical care directorate and the neurosurgical high care unit at a tertiary university teaching hospital. We proposed that the use of a simple ward round tool on collaborative rounds would facilitate systematic patient review, prompt early recognition of those critically unwell and improve patient outcomes.

Methods An initial observation period of behaviours and practice on the neurosurgical unit was conducted with qualitative and quantitative data collection. Following analysis of these outcome measures a simple ward round tool was constructed, with the mnemonic 'DON'T FORGET', and we devised a programme for collaborative ward rounds to take place three times per week over a 1-month period. During this

Introduction Daily blood tests are an essential diagnostic tool and routine practice in the management of ICU patients, but are associated with cost and a risk of anaemia. There is no evidence for the frequency or breadth of blood testing, but there is evidence that by rationalising blood tests, significant cost savings can be made without a negative effect on mortality or length of ICU stay [1]. Blood tests in our ICU are requested by nurses, a practice unique to the ICU compared with other hospital departments.

Methods We had previously carried out a survey to assess blood requesting practice in UK ICUs. This demonstrated that routine tests were, as in our ICU, nurse led, and that the majority of staff underestimated the costs of tests. Using the results of this survey and a literature review, we developed routine blood test guidelines for our general mixed 26-bed ICU/HDU. Following a period of staff education, we audited our blood requesting practice over 28 days.

Results Our blood testing did not comply with our guidelines. Over the 28-day period, €12,849.96 was spent on all blood tests. According to our guidelines, €2,914.96 was spent on inappropriate tests (€37,998.63 per year) (Table 1). Over 40% of this cost was spent on the coagulation screen.

Conclusion In our ICU, unnecessary blood tests carry a significant cost burden of approximately €38,000 per year. To limit unnecessary costs, consultants now lead requesting by completing a blood test pro forma on the evening ICU ward round, indicating which blood tests are clinically needed the following day.

Introduction ICU visits are generally limited to 2 hours per day, selected at any time during the 24 hours. At our institution we abide by the international regulations, which were recently modified to allow family members and significant others to be present throughout the day, accompanying their ICU-admitted relatives without interruption or time limits during their daily activities.
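Stepping back to the blood test audit above, the annual figure quoted there follows from simple pro-rata extrapolation of the 28-day audit; a one-line check (our arithmetic, using the cost reported above):

```python
# Pro-rata extrapolation of the 28-day audit cost to a calendar year.
COST_INAPPROPRIATE_28_DAYS_EUR = 2914.96  # reported 28-day spend on inappropriate tests
annual_estimate = COST_INAPPROPRIATE_28_DAYS_EUR * 365 / 28
print(f"estimated annual cost: EUR {annual_estimate:,.2f}")  # ~ EUR 37,998.59
# This matches the ~EUR 37,998.63 reported (the small difference is rounding).
```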
Methods From January 2012 through April 2012, 200 questionnaires were completed by families of ICU-admitted patients; each questionnaire included 13 questions concerning their overall degree of satisfaction with our unit and evaluating its quality. Accommodating spaces were offered to families of ICU patients, giving them freedom of movement within the ICU premises around their sick relatives; additionally, they were constantly encouraged by a designated secretary to share their experience and any insight that might improve the overall policy of our service.

Results Two hundred patients were included in the study; the average age of patients was 68.55 years and the average length of ICU stay was 5.2 days. Most families (85%) wished to be present with their relatives within the ICU premises when meals were served. These families expressed satisfaction at being able to offer psychological support to their ill relatives. Fifteen per cent of the families objected to the permitted number of visitors (one person) per visit, and 12% reported a lack of communication between the medical staff and patients' families.

Conclusion Compared with the previous restricted visiting policy in the ICU, the majority of families expressed general acceptance of, and satisfaction with, the newly adopted policy. Additionally, under this new policy, patients displayed marked improvement at the psychological level, being more cooperative and experiencing fewer episodes of agitation. The medical team also regarded the policy as a more profound and comprehensive way of supporting ICU patients. Further investigations are currently under way to strengthen the outcome of the study.

Introduction The aim of the study was to prospectively assess family satisfaction in the intensive care setting and identify fields of possible improvement.

Methods The study was performed over a 6-month period in a 10-bed ICU. A properly translated and validated Family Satisfaction in the ICU (FS-ICU 24) questionnaire was used. A total of 126 questionnaires were handed out to family members upon patient discharge. The number of questionnaires returned for evaluation was 120 (91.5% participation). Five-point Likert scale responses were transformed to percentage scores, with higher values representing better satisfaction.

Results Overall satisfaction with care was reported as very good or excellent by 81% of family members (32% and 49% respectively), both for ICU staff concern and caring and for symptom management. Favorable scores were also noted in terms of decision-making, with 89% of relatives reporting very good and excellent rates as regards information needs. On the other hand, most of the family members (73%) felt neither included nor excluded from the decision-making process. Moreover, a small number of them (6%) answered that they received poor emotional support. Finally, the frequency of communication with ICU nurses and the atmosphere in the waiting room were rated as only fair by 28% and 47% of subjects respectively.

Conclusion Most family members were highly satisfied with the ICU care their patients received. Nevertheless, our study showed that the fields of communication with the nurses and the waiting room atmosphere definitely need further improvement. Concerning emotional support and decision-making, the study revealed that the measures currently used in our ICU still do not suit all families and probably require a few changes.
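The Likert-to-percentage transformation mentioned in the FS-ICU methods above is not spelled out in the abstract; a common convention, shown here as an assumption, maps responses 1 to 5 linearly onto 0 to 100:

```python
# Hypothetical illustration of transforming 5-point Likert responses to
# percentage scores (higher = better satisfaction). The exact mapping used
# by the authors is not reported; this is one standard linear convention.
def likert_to_percent(response: int) -> float:
    if not 1 <= response <= 5:
        raise ValueError("expected a 5-point Likert response (1-5)")
    return (response - 1) / 4 * 100  # 1 -> 0, 3 -> 50, 5 -> 100

family_item_responses = [5, 4, 4, 3, 5]  # invented answers from one respondent
scores = [likert_to_percent(r) for r in family_item_responses]
print(f"item scores: {scores}; mean satisfaction: {sum(scores) / len(scores):.1f}")
```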
Introduction Significant improvements in chemotherapy, haematopoietic stem cell transplantation and general intensive care mean that more patients are now living longer and often present to the ICU at various stages of their disease [1,2]. Encouraging patients with haematological malignancy (HM) to express their wishes and values on end of life (EoL) prior to ICU admission would ensure that their autonomy is respected. The aim of our study was to determine whether patients with HM who were admitted to and died in the ICU had any form of advance care planning (ACP) documented prior to their admission.

Methods Data were collected on all adult patients with HM who were admitted to and died in the ICU of a tertiary haematology referral centre in London during the period of 1 year.

Results Information was collected for 34 patients and their characteristics are shown in Table 1. In 62% of the patients EoL decisions were made, and do-not-attempt-resuscitation documentation was found in 65% of the cases. Documentation on information exchange, and on participation in the deliberation or the decision-making process, was found in only 30% of the patients, even though more than 75% of the haematology physicians estimated the prognosis of these patients as moderate (17%) or poor (59%). In the vast majority of cases (31/34) the deaths occurred after withholding or withdrawal of treatment in the ICU.

Conclusion Only a small number of patients with HM who die in the ICU have documented ACP prior to admission. Since the majority of ICU patients lack personal decision-making capacity, ACP would ensure that care is consistent with patients' wishes, that EoL actions are congruent with their values, that the burden on family and healthcare providers is alleviated and that cost is decreased.

Introduction Perceived futility of care may jeopardize patient quality of care and increase ICU staff turnover. It is related to workload and interdisciplinary collaboration. Our aim was to evaluate concepts from team science, namely inclusive leadership (a style that invites and appreciates others' contributions) and psychological safety (a key antecedent of speaking up and learning behavior), and to determine their effect on perceived futility and intention to quit.

Methods A staff survey in four interdisciplinary ICUs and two intermediate care units of two tertiary care hospitals. The questionnaire contained validated scales to assess inclusive leadership of head nurses and attending physicians, nurse-physician collaboration, collaborative

Introduction The decision to withhold therapeutic intervention in a patient is a complex one, wrapped in profound ethical debates. The role of the physician is to cure, to treat and to alleviate suffering. When the first two goals are not possible, the medical role should be dedicated to end-of-life treatments adjusted to the patient's benefit [1,2].

Methods A retrospective cohort study including all adult patients with sepsis admitted to the emergency room (ER) of a tertiary care, university hospital between 1 July 2011 and 30 June 2012.

Results During the study period 162 patients with sepsis were admitted to the ER, of whom 40 (25%) had a decision to withhold therapy. Comparing this group with the group without therapeutic limitations, patients in the first group were older (81 ± 13 vs. 68 ± 14, P <0.001), with more comorbidities (90% vs. 66%, P = 0.004) and a higher proportion needing help in daily activities (Karnofsky performance status (KPS) <70%: 55% vs. 8%, P <0.001).
The hospital mortality in patients with a decision to limit therapeutic intervention was significantly higher (83% vs. 43%, P <0.001). Variables independently associated with the decision to withhold therapy were age (adjusted OR per year = 1.078, P <0.001), presence of comorbidities (adjusted OR = 4.632, P = 0.030), chronic wounds (adjusted OR = 5.965, P = 0.005) and the patient's need for help in daily activities (KPS <70%, adjusted OR = 5.391, P = 0.012). In the first group a lower proportion received antibiotics (70% vs. 99%, P <0.001), and when these were considered inadequate for the agent responsible for the sepsis episode they were less frequently changed (15% vs. 50%, P = 0.028). However, no differences were found regarding the time elapsed from admission to the ER until the first medical contact, or the time from recognition of sepsis to antibiotic administration, although the group with withholding decisions had fewer specimens collected for microbiology: blood cultures (68% vs. 91%, P <0.001) and other specimens (58% vs. 96%, P <0.001).

Conclusion In this study the decision to withhold therapy was independently associated with increasing age, the presence of comorbidities and loss of functional autonomy. For the same level of intervention, such as antibiotic administration, the decision to withhold therapy did not influence the efficacy of therapeutic measures.

Conclusion Despite declining rates worldwide, autopsy remains an important tool for quality and safety assurance. In this retrospective study, autopsy showed that knowledge of the correct premortem diagnosis would have altered therapy in 10% of critically ill cirrhotic patients.

Introduction There are no well-validated risk scores for patients with oncological and hematological malignancies presenting to the emergency department (ED). Previous research found the prognostic blood biomarker pro-adrenomedullin (proADM) to be associated with infection-related complications and mortality, and it may thus be helpful in managing febrile patients with malignancies. Yet the prognostic value of proADM in general oncological patients presenting to the ED remains unclear. The objective of this study was therefore to evaluate the prognostic potential of proADM and clinical parameters in a consecutive cohort of patients with oncological and hematological malignancies with regard to ICU admission and 30-day mortality.

Methods We enrolled all consecutive patients with oncological and hematological malignancies seeking ED care at a tertiary care hospital from February 2013 to October 2013. We prospectively collected various clinical features, and measured blood parameters including proADM upon admission. To assess outcomes, data from electronic medical records (ICU admission, length of stay (LOS), and postacute care location) were used, and we contacted all patients 30 days after hospital admission. Logistic regression models with the area under the receiver operating curve (AUC) were used to assess associations of baseline parameters and outcomes. We included a total of 469 patients with oncological and hematological malignancies, of whom 8.9% (n = 42) were admitted to the ICU and 18.7% (n = 88) did not survive until the 30-day follow-up. There was a strong association of initial proADM levels and 30-day mortality risk (odds ratio (OR) per 10-fold increase 9.9, 95% CI 4.3, 22.9) with an AUC of 0.67 (95% CI 0.60, 0.74).
This association remained significant after multivariate adjustment for initial vital signs (blood pressure, pulse, temperature) and comorbidities (chronic heart failure, chronic obstructive pulmonary disease, diabetes, coronary heart disease), with an adjusted OR of 9.0 (95% CI 3.1, 26.4). There was also a significant association of proADM and LOS (adjusted regression coefficient per 10-fold increase: 6.6, 95% CI 2.0, 11.2).

Conclusion This study including consecutive patients with oncological and hematological malignancies found a moderate association of proADM with 30-day mortality and LOS. proADM in combination with clinical parameters may help to improve site-of-care decisions for these patients in the future.

Introduction A prospective multicentre observational study was carried out to assess the extent to which critical care teams manage hospital patients who are cared for outside the critical care unit. The Health Service Executive (HSE) in Ireland is in the process of implementing a national Early Warning Score (EWS) and, at an EWS of 7 or above, a referral to critical care is recommended. This study recorded the EWS of patients referred to the critical care team and describes the subsequent interventions made by the critical care team and patient outcomes.

Methods Six critical care departments in university-affiliated hospitals across Ireland collected data on all referrals to the critical care team over a 6-week period. Data were anonymised, coded and analysed centrally.

Results A cumulative total of 399 calls were made to the critical care teams in the six hospitals. The most common reason for referral was to request a critical care review of a patient (n = 319, 79.9%). Other reasons for referral included cardiac arrest, requests to transfer patients from other hospitals and requests for vascular access. The average duration spent by the critical care team reviewing patients on the wards was 57 minutes. This increased to 67 minutes for cardiac arrest calls. Of the 319 critical care reviews, 160 (50.2%) patients were subsequently admitted to critical care. A total of 118 of these 160 had an EWS of 7 or above, while 42 scored less than 7 but were still deemed to require admission to critical care.

Conclusion Regardless of the EWS, critical care teams are heavily involved in the management of patients outside critical care units. Fifty per cent of patients reviewed by the critical care team subsequently required admission to a critical care unit. The trigger threshold (7 and above) for referral to a critical care team currently recommended by the EWS escalation protocol is more likely to predict the need for critical care admission. However, one in four patients referred below the threshold also required admission to a critical care environment. This study questions the safety of introducing such a protocol into acute hospitals. Will noncritical care staff be forced to wait until patients deteriorate further and reach the trigger threshold for referral, or will the role of the critical care team expand further to look after all patients with an abnormal EWS in hospital?

Introduction We aimed to assess actions taken in response to variations in the National Early Warning (NEW) score and to identify factors associated with a poor response. The NEW score is a physiological score which prescribes an appropriate response for the deteriorating patient in need of urgent medical care.
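The 'one in four' figure in the Irish EWS study above can be re-derived from the admission counts it reports (our arithmetic, nothing beyond the abstract's own numbers):

```python
# Of the 160 ward referrals admitted to critical care, how many sat below
# the recommended EWS escalation threshold of 7? Counts from the abstract.
ADMITTED = 160
ADMITTED_WITH_EWS_7_OR_ABOVE = 118
below_threshold = ADMITTED - ADMITTED_WITH_EWS_7_OR_ABOVE  # 42 patients
print(f"admitted despite EWS < 7: {below_threshold} "
      f"({below_threshold / ADMITTED:.0%} of admissions)")  # ~ one in four
```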
This allows enhanced observation and clinical review of patients, identifying those at risk of acute mortality.

Methods We performed a prospective observational study of adult patients admitted to an acute medical ward in a London district general hospital over a 2-week period. Patient characteristics, NEW score, time of day, day of week and clinical response data were collected for the first 24 hours of admission. Patients with less than a 12-hour hospital stay were excluded. The primary outcome measure was the quality of the clinical response. Data were analysed with univariate and multivariate logistic regression.

Results During the study period 200 patients were included, with a median age of 70 (20 to 102) years. NEW scores were evenly distributed between day and night (52% vs. 48%), with a greater proportion on weekdays compared with weekend days (82% vs. 18%). The majority of patients scored <5 (93% vs. 7%). Forty-seven (27%) patients received an inadequate clinical response. Univariate analysis showed no association with time of day (night 34% vs. day 38%, OR 0.83 (0.47 to 1.49), P = 0.556). However, day of the week (weekend 56% vs. weekday 32%, OR 2.8 (1.30 to 5.84), P = 0.01) and increasing score (NEWS ≥5 100% vs. NEWS <5 31%, OR 65 (3.8 to 1100), P <0.0001) were significantly associated with an inadequate response. Day of the week was independently associated with an inadequate response after adjusting for confounders (OR 3.08 (1.27 to 7.46), P = 0.013).

Conclusion The clinical response to NEW score triggers is significantly worse at weekends, highlighting an important patient safety concern.

Introduction The purpose of this study was to evaluate the impact of obesity on outcomes in patients with severe sepsis. Since obesity is considered an inflammatory disease and is associated with elevations in several inflammatory mediators important in the outcome of sepsis, the relationship between obesity and outcome in septic patients was studied.

Methods This retrospective cohort study included all patients over the age of 40 with a confirmed diagnosis of severe sepsis and an ICU stay at our academic medical center from 1 January 2005 to 31 March 2011. Obesity was defined as a body mass index of 30 or greater. Data on other patient demographics and the APACHE II score at the time of sepsis were collected from patient charts. Outcomes measured included in-hospital mortality, development of acute respiratory distress syndrome (ARDS), days on mechanical ventilation, hospital cost, and length of stay. We identified 824 patients who met the inclusion criteria for this study. Of these patients, 257 (31.2%) were classified as obese. The mean APACHE II score was similar between obese and nonobese patients (23.3 vs. 22.4; P = 0.068). Obese patients had a similar rate of in-hospital mortality (31.9% vs. 33.7%; P = 0.810) compared with nonobese patients, but a significantly higher rate of development of ARDS (49.4% vs. 34.4%; P <0.001). Obese patients also had significantly more days on mechanical ventilation (6.2 days vs. 5.0 days; P = 0.005). There was no difference in mortality between obese and nonobese patients on mechanical ventilation (34.4% vs. 39.5%; P = 0.26) or with ARDS (33.9% vs. 42.6%; P = 0.13). Hospital costs and length of stay did not differ between the groups.
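A minimal sketch of the exposure definition used in the obesity study above (our code; the patient values are invented):

```python
# Obesity defined as body mass index (BMI) of 30 kg/m^2 or greater.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    return bmi(weight_kg, height_m) >= 30.0

for weight, height in [(95.0, 1.70), (70.0, 1.80), (110.0, 1.75)]:  # hypothetical
    label = "obese" if is_obese(weight, height) else "nonobese"
    print(f"BMI {bmi(weight, height):.1f} -> {label}")
```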
Conclusion Obesity significantly increased the incidence of ARDS and days on mechanical ventilation in patients with sepsis. Previous work has reported that obesity is associated with elevations in inflammatory cytokines and adipokines, particularly IL-6, which is a known risk factor for ARDS. The higher rate of ARDS in obese patients with sepsis identifies a high-risk group where new therapies may be most beneficial and where new methods of preventing ARDS can be targeted.

Introduction Studies suggest that obesity may influence mortality in patients who develop sepsis. However, the mechanisms linked to improved outcomes are unclear. Our aim was to assess the impact of obesity on mortality at 30 and 180 days and on cytokine expression.

Methods We used the platform of a negative randomized controlled trial in subjects (n = 51) with a diagnosis of severe sepsis with ≥1 organ failure. The cohort of severe septic subjects was stratified by obesity status based on body mass index (BMI >30). Primary outcomes: 30-day and 180-day mortality; secondary outcome: difference in the median (IQR) of five inflammatory cytokines, namely tumor necrosis factor alpha (TNFα), TNFα-receptor 2, interleukin (IL)-6, IL-1-receptor-antagonist (IL-1ra) and IL-10. Median baseline cytokine levels were measured in serum by Luminex technology. Statistical significance was defined as P <0.05.

Results Fifty-one subjects with severe sepsis were included in the study; 37% of the patients were obese (BMI >30). Paradoxically, obese severe septic patients had lower 30-day mortality (n = 1 (5%) vs. n = 9 (28%), P = 0.069) and 180-day mortality (n = 1 (5%) vs. n = 13 (41%), P = 0.008) when compared with nonobese patients. The expression of TNFα, TNFα-receptor 2, IL-6, IL-1ra and IL-10 was not statistically significantly different between obese and nonobese severe septic patients.

Conclusion Obesity is associated with lower mortality rates at 30 and 180 days in patients diagnosed with severe sepsis. This survival benefit was not associated with lower cytokine production among obese patients. Further studies are needed to assess the mechanisms behind the survival benefit related to obesity in patients with severe sepsis.

Introduction Predicting long-term outcome in patients surviving a pneumonic or nonpneumonic COPD exacerbation remains challenging. This study investigates the association of clinical parameters and the prognostic blood marker pro-adrenomedullin (proADM), measured upon hospital discharge, with 6-year mortality in a well-defined cohort of COPD patients.

Methods We prospectively followed consecutive COPD patients from a previous Swiss multicenter trial (2006 to 2008) [1] over a 6-year follow-up and investigated all-cause mortality following hospital discharge. Patients and/or treating general practitioners were contacted by telephone interview to assess the vital status of patients. We used Cox regression models and the area under the receiver operating characteristics curve (AUC) to investigate associations of baseline predictors and mortality. Overall mortality in the 469 included COPD patients was 55% (95% CI 0.5 to 0.6), with a 14% (95% CI 0.1 to 0.2) mortality incidence rate per year. Patients with pneumonic COPD exacerbation had a more pronounced inflammatory response than patients with nonpneumonic exacerbation with regard to levels of initial C-reactive protein (median 158 mg/dl vs. 39 mg/dl, P <0.0001), procalcitonin (median 0.4 μg/l vs.
0.1 μg/l, P <0.0001) and proADM (median 1.3 nmol/l vs. 0.9 nmol/l, P <0.0001), but long-term survival was similar (HR 1.0, 95% CI 0.8 to 1.2). In univariate regression models, proADM was significantly associated with mortality after 1, 3 and 6 years (HR 16.1 (95% CI 6.9 to 37.7), 10.5 (95% CI 5.7 to 19.6) and 10.4 (95% CI 6.2 to 17.7), respectively). There was no effect modification by type of exacerbation. A model including clinical parameters (age, coronary heart disease, heart failure, diabetes mellitus, chronic renal failure, neoplastic disease, pneumonia, smoking status) and proADM showed good discrimination of long-term survivors from nonsurvivors, with an AUC of 0.74 (95% CI 0.6 to 0.7).

Conclusion Clinical parameters and discharge levels of proADM allow accurate long-term prognostication in COPD patients independent of the initial type of exacerbation. Focusing on the best use of long-term prognostic information to improve patient care and clinical outcomes seems promising and rational.

Introduction Based on expert opinion and case note review, the UK National Confidential Enquiry into Peri-operative Outcome has recommended the provision of perioperative level 2 and 3 care to support major surgery in older people, particularly those with comorbidity [1]. We wished to identify whether the need was uniform and whether any factors could predict the degree of organ support needed.

Methods A retrospective note review of all patients admitted to a level 2 critical care unit in the 12-month period from 1 January 2012 to 31 December 2012 undergoing revision hip surgery, either as a two-stage or a single-stage procedure. Surgery was undertaken at a national referral unit and was chosen to represent an appropriate group of older, comorbid patients. Predefined preoperative and perioperative data were collected from chart review, along with postoperative physiological data while the patient was in critical care. This included frailty, comorbidities, operative blood loss, anaesthetic technique, and the level and duration of organ support, including the need for additional medical review while on the unit. Frailty was assessed preoperatively by trained staff using the Rockwood assessment tool [2]. Data were analysed using Microsoft Excel for Mac 2011 and Stata/IC 11.2 for Mac.

Results A total of 182 patients with a mean age of 69.8 years (range 29 to 92) were identified. Frail patients were significantly more likely to need additional medical input in the postoperative period while on critical care (Figure 1, P = 0.002), but this was not significantly linked to the need for vasopressors, evidence of sepsis or choice of anaesthetic technique.

Conclusion In complex revision orthopaedic surgery, the need for postoperative level 2/3 support cannot be predicted from any preoperative or intraoperative factors, but patient frailty does indicate the need for medical input in the postoperative period.

Introduction ICU or hospital mortality rates have been reported as the endpoint of ICU therapy for many years. The aim of this study was to determine the 1-year mortality after discharge from the ICU in patients who were treated in the ICU for more than 72 hours, and to identify predictors of 1-year mortality.

Methods This study was conducted in a 20-bed mixed ICU of a teaching hospital. The study sample was extracted from a dataset of all ICU patients treated for more than 72 hours between 1 January 2007 and 1 October 2012.
Demographic characteristics and clinical characteristics at admission and during the ICU stay were collected. Characteristics of patients alive 1 year after ICU discharge were compared with those of patients who died within the first year after ICU discharge. Descriptive statistics were calculated. Multivariate analysis of 1-year mortality was performed using a logistic regression model with backward elimination. Survival was analysed by the Kaplan-Meier method using the time interval from the day of ICU discharge until death.

Results During the study period, 740 patients were treated for more than 72 hours in the ICU. The ICU mortality was 106/740 (14%). The data of 617 ICU survivors were further analysed (17 patients were lost to follow-up).

Conclusion Of patients treated for more than 72 hours in the ICU, 28% died within 1 year after ICU discharge, one-half of them during the hospital stay after ICU discharge. High age at ICU admission, high APACHE IV predicted mortality score, a high number of comorbidities, readmission and an admission diagnosis within the categories 'cardiovascular' and 'sepsis' are associated with an increased 1-year mortality after ICU discharge in this population. The burden of patients dying after ICU discharge underlines the necessity for clear ICU discharge criteria and post-ICU care.

physical composite score (PCS) 36.2, mental composite score 48.1; 50 = national average). Reduced muscle strength was associated with low scores on the SF-36 physical function and general health domains. Performance on the 6MWT correlated with the SF-36, including the PCS (P = 0.001). Screening positive for anxiety was associated with both poor 6MWT performance and reported dysfunction on the EQ-5D domains. ICU/hospital length of stay, number of days ventilated, severity of illness and organ dysfunction were not found to be predictive of muscle strength or physical functioning.

Conclusion Our study gives qualitative evidence that survivors of critical illness have reduced muscle strength, physical functioning and HRQL after hospital discharge. We have also shown that muscle weakness is predictive of overall physical functioning, which in turn impacted HRQL and mental health. No ICU risk factors were identified that predicted deficits in muscle strength or physical functioning.

Introduction The aim of this study was to determine an appropriate risk model to identify patients at high risk of prolonged ICU stay and to aid patient consent prior to cardiac surgery.

Methods Data were prospectively collected on 5,440 consecutive cardiac surgery cases between April 2009 and March 2012. The primary outcome measure was the combined outcome of prolonged ICU stay (length of stay greater than 20 days) and/or in-hospital mortality. Logistic regression was performed to assess the predictability of the logistic EuroSCORE against the primary outcome. Low-risk, medium-risk and high-risk groups were identified and the subsequent risk of 1-year mortality assessed. Survival status was determined at 1 year.

Results A total of 192 (3.5%) patients had a prolonged ICU stay and 187 (3.4%) in-hospital deaths occurred, resulting in a combined primary outcome of 349 (6.4%). At 1 year, 371 (6.8%) deaths had occurred. The risk of death in-hospital and at 1 year was significantly higher in patients with prolonged ICU stay (in-hospital mortality, 15.6% vs. 3.0%, P <0.001; 1 year, 27.6% vs. 6.1%, P <0.001). The mean logistic EuroSCORE for all patients was 10.9.
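As an aside on the model just described: a logistic regression with the reported OR of 1.04 per logistic EuroSCORE point converts a score into a predicted risk as sketched below (our illustration; the intercept is a made-up placeholder, as the abstract does not report one):

```python
# Turning a logistic EuroSCORE into a predicted probability of the combined
# outcome (prolonged ICU stay and/or in-hospital death) under a logistic model.
import math

BETA_PER_POINT = math.log(1.04)  # slope implied by the reported OR of 1.04
INTERCEPT = -3.5                 # hypothetical; in reality fitted to the cohort

def predicted_risk(logistic_euroscore: float) -> float:
    log_odds = INTERCEPT + BETA_PER_POINT * logistic_euroscore
    return 1.0 / (1.0 + math.exp(-log_odds))

for score in (5, 10, 20, 40):
    print(f"logistic EuroSCORE {score:>2}: predicted risk {predicted_risk(score):.1%}")
```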
Patients with a prolonged ICU stay had a significantly higher logistic EuroSCORE (20.3 vs. 10.6; P <0.001). The logistic EuroSCORE was a reasonable predictor of prolonged ICU stay/in-hospital mortality (OR 1.04, 95% CI 1.04 to 1.05, P <0.001), with an area under the receiver operating characteristic (ROC) curve of 0.72. The relationship between a patient's logistic EuroSCORE and the predicted risk of prolonged ICU stay, including the low-risk, medium-risk and high-risk groups, is shown in Figure 1. Around 50% of the entire cohort of patients had a logistic EuroSCORE of 10 or less and an associated risk of prolonged ICU stay of 5% or less.

Conclusion Using an existing risk prediction model, a patient's risk of prolonged ICU stay can be calculated using contemporaneous data. This information could be relevant for providing informed consent to cardiac surgery patients.

Introduction Information about lung cancer patients surviving critical illness is very scarce. Our aim was to evaluate the outcomes and the continuation of anticancer treatments in lung cancer patients surviving ICU admission.

Methods Secondary analysis of a prospective multicenter study including patients admitted for >24 hours to 22 ICUs in six countries in Europe and South America during 2011. Readmissions and patients in cancer remission for >5 years were excluded. Logistic regression was used to identify predictors of hospital mortality.

Results A total of 449 patients (small-cell lung cancer (SCLC) = 55; non-SCLC (NSCLC) = 394) were admitted to the ICUs, and 275 of them (SCLC = 29; NSCLC = 246) were discharged alive from the hospital. Among these, 200 (73%) patients were alive and 72 (26%) had died at 6 months; three (1%) patients were lost to follow-up. Mortality rates were far lower in the patient subset with nonrecurrent/nonprogressive cancer and a good performance status (PS), even in those with sepsis, multiple organ dysfunction, and a need for ventilatory support. Cancer recurrence or progression occurred in 53 (26%) hospital survivors. Anticancer treatments were recommended for 108 (39%) hospital survivors and administered to 102. Treatments used were variable combinations of surgical resection (7%), radiation therapy (34%) and chemotherapy (80%). The initial treatment plan required reduction or modification in 35 (34%) patients. Post-hospital mortality was nonsignificantly lower in the patients given the initial treatment plan than in the other patients (17% vs. 32%, P = 0.065). Poor PS was the only factor associated with a lower probability of receiving the initial treatment plan (OR = 0.20; 95% CI, 0.05 to 0.87; P = 0.032). At 6 months, 71% of patients were at home, 15% were hospitalized, and 7% were in hospice care; the location was unknown for 6% of patients. PS at 6 months was 3 to 4 in 19 (9.5%) survivors.

Conclusion Post-hospital mortality in critically ill lung cancer patients is relatively high, and many patients require anticancer treatments after discharge. PS before ICU admission is a major determinant of both mortality and the ability to receive optimal anticancer treatment in these patients.

Introduction The population of the UK is ageing, with the fastest increase in those ≥85 years. Increased age has been repeatedly associated with adverse outcome, and it is uncertain to what extent this relates to the changes of ageing in themselves or to other considerations. Age is a key variable in the majority of scoring systems that relate patient characteristics to adverse outcome.
We aimed to assess the change in the age distribution of patients admitted to our ICU over 20 years and to examine the relationship between patient age, mortality and length of stay (LOS).

Conclusion There are increasing numbers of older patients on ICUs in the UK. In analyses uncorrected for severity of illness or comorbidities, older patients are more likely to die on the ICU, and on the ward after ICU. They also spend longer in hospital prior to discharge.

Introduction UK military personnel injured overseas are repatriated to the Royal Centre for Defence Medicine (RCDM), based at the Queen Elizabeth Hospital Birmingham (QEHB) in Birmingham, UK. We report the demographics and outcomes of military patients treated on the ICU at RCDM using data from the Intensive Care National Audit and Research Centre (ICNARC) over a 6.5-year period.

Methods Data on 570 admissions of 527 patients to the ICU at RCDM/QEHB were analysed by ICNARC using standard methodology.

Results Some physiology and CCMDS data were missing for 175 patients. Age, sex and mortality are described in Table 1. A total of 90.9% of patients had traumatic injuries; 2.1% received CPR prior to ICU admission, 1.5% prehospital. A total of 20.6% had head, neck or spinal trauma. A total of 85.7% were transferred directly to the ICU from a military hospital overseas, the others coming to the ICU following surgery at RCDM. Of the 382 patients with APACHE II score data, the mean score was 11.0 (SD 4.9), probably reflecting stabilisation in military hospitals overseas or during aeromedical critical care transfer. The mean number of ICU days was 7.6 (SD 11.6) at Level 3 and 2.0 (SD 2.8) at Level 2. A total of 70.4% of patients required advanced respiratory support for a mean of 7.5 days, and 33% required advanced cardiovascular support for a mean of 3.7 days. The data on resource utilisation for this group of patients may inform the planning of critical care support for military operations overseas.

Introduction The rate of very old patients (≥80 years old) admitted to intensive care has increased in recent years. Older age is associated with a higher prevalence of chronic diseases, including cancer [1]. The aim of the present study was to describe the characteristics, outcomes and predictive factors of mortality in very old patients admitted to the ICU.

Methods We performed a retrospective analysis of all cancer patients aged 80 years or older admitted to the ICU between January 2009 and December 2012 in a tertiary reference cancer center. Data were collected from medical records.

Results A total of 597 patients with cancer were included in the analysis. Hospital mortality was 28.5%. These patients were more likely to have a solid tumor and localized disease, and to have undergone surgery, chemotherapy or radiotherapy recently. Variables associated with hospital mortality were lung cancer, metastatic cancer and, at ICU admission, vasopressor requirements, acute respiratory failure, low hemoglobin levels, elevated creatinine levels, elevated bilirubin levels, acidosis and hyperlactatemia. On multivariate analysis, the following were independent factors associated with hospital mortality: lung cancer (odds ratio (OR) = 6.3, 95% CI = 2.6 to 15.0, P <0.001), lactate levels on ICU admission (OR = 1.03, 95% CI = 1.02 to 1.04, P <0.001), bilirubin levels on ICU admission (OR = 1.16, 95% CI = 1.04 to 1.30, P = 0.007) and creatinine levels on ICU admission (OR = 1.58, 95% CI = 1.35 to 1.85, P <0.001).
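The adjusted odds ratios above come from a multivariable logistic model; the sketch below (simulated data and invented coefficients, not the study dataset) shows the standard way such ORs and their confidence intervals are obtained:

```python
# Fit a multivariable logistic model and exponentiate coefficients to get
# adjusted odds ratios with 95% confidence intervals. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 597  # same order of magnitude as the cohort above; values are synthetic
X = np.column_stack([
    rng.integers(0, 2, n).astype(float),  # lung cancer (0/1)
    rng.gamma(2.0, 1.5, n),               # lactate, mmol/l
    rng.gamma(2.0, 0.5, n),               # bilirubin, mg/dl
    rng.gamma(2.0, 0.8, n),               # creatinine, mg/dl
])
true_logit = -3.0 + 1.8 * X[:, 0] + 0.3 * X[:, 1] + 0.15 * X[:, 2] + 0.45 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(fit.params[1:])  # skip the intercept
conf_int = np.exp(fit.conf_int()[1:])
for name, oratio, (lo, hi) in zip(
        ["lung cancer", "lactate", "bilirubin", "creatinine"], odds_ratios, conf_int):
    print(f"{name}: adjusted OR {oratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```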
Conclusion Very old patients with cancer have acceptable rates of survival after ICU admission. As in younger patients, evaluation of organ function in the first hours of ICU admission can predict outcomes in this specific population.

Introduction Just over 1,200 curative oesophagectomies are carried out in the UK annually. Although in-hospital mortality rates have fallen (from 12 to 13% in 1998 to 2.5% in 2013), complication rates remain high [1], with anastomotic failure and respiratory failure common postoperatively [2]. The aim of this retrospective review was to examine the outcomes of patients who underwent oesophagectomy in our unit between January 2010 and October 2012.

Methods We examined demographic data, survival (30-day and 1-year) and length of ICU and hospital stay. Case notes were reviewed to identify postoperative complications including anastomotic breakdown, reintubation and respiratory failure. Data were analysed to examine the relationship between the use of postoperative noninvasive ventilation, intraoperative fluid volume and the incidence of postoperative complications.

Results Seventy-two patients were identified as having undergone an oesophagectomy between January 2010 and October 2012. Median age was 65 and 82% were male. One patient died within 30 days (1.39%) and nine patients had died by 1 year (12.5%). The median lengths of ICU and hospital stay were 4 days and 14 days respectively. Six patients had an anastomotic leak (of which two were chyle leaks). Use of noninvasive ventilation (in 23.1% of patients) was not associated with an anastomotic leak (chi-square, P = 0.53), nor was the amount of fluid given intraoperatively (Mann-Whitney U, P = 0.410). Six patients had to be reintubated, and this was associated with a significantly increased length of both ICU and hospital stay (Mann-Whitney U, P = 0.01 and 0.03 respectively). Lower P/F ratios were also associated with a significant increase in the length of both ICU and hospital stay (P = 0.007 and 0.043).

Conclusion The overall mortality and morbidity rates were comparable with those seen nationally. Our data suggest that the use of noninvasive ventilation was not associated with anastomotic breakdown. A lower P/F ratio in the postoperative period was associated with prolonged ICU and hospital stay.

Introduction The first description of SAPS II dates back to 1993 [1], but little is known about the accuracy of scoring in daily practice and the factors possibly affecting it. The purpose of this study was to evaluate the accuracy of SAPS II scoring by means of a nationwide survey.

Methods Twenty clinical scenarios, covering a broad range of illness severity, were randomly assigned to a convenience sample of clinicians and nurses of Swiss adult ICUs, who were asked to assign a SAPS II score (including details for each item of the score) for one scenario. Results were compared with a reference, as defined by five experienced researchers using a Delphi method, and cross-matched with data related to training and quality control for scoring, information on daily practice of scoring, and the structural and organizational properties of each participating ICU.

Results Sixty-three (81%) of 78 adult ICUs participated in this survey. A perfect match with every single reference item was found in 27 (7.8%) of 345 scorings. The participants' mean SAPS II score was 42.6 ± 23.4, with a bias of +5.7 (95% CI 2.0 to 9.5) compared with the reference score.
There was no evidence of variation in the bias according to case severity, number of beds in the unit, number of residents during work shifts, linguistic area, profession (physician vs. nurse), experience, initial SAPS II training, or the presence of a quality control system. The items with the highest scoring accuracy were bilirubin, temperature and chronic disease (93%, 93% and 91% respectively), whereas the lowest agreement was found for urinary output and the Glasgow Coma Scale (63% and 64%).

Conclusion This nationwide survey suggests a wide variability in SAPS II scoring results. On average, SAPS II was overestimated by more than 10%, irrespective of the profession or experience of the scorer or the structural characteristics of the ICU. At least one person per unit involved in scoring should be trained by the national society and should be responsible for scoring quality.

Introduction Early Warning Scores (EWS) are used in UK hospitals to identify patients who are acutely unwell or in need of urgent review. The National EWS (NEWS) [1] was implemented in our institution and led to a noticeable increase in the number of patients being triggered for escalation of medical care, although clinically this was thought unwarranted. This led to a potentially dangerous lack of faith in the NEWS among clinical staff. We assessed whether it was safe to move to VitalPAC™ EWS (ViEWS), a commercially available electronic EWS [2].

Methods All patients scored as 'high risk' by NEWS (a score of 6 or more) in a snapshot audit of patients in our 500-bed acute district general hospital were identified and reviewed clinically. All of these patients were then recategorised using ViEWS. The clinical safety of this recategorisation was then assessed.

Results Forty-six patients in our hospital were identified at the time of the snapshot as being high risk according to NEWS. After recategorising this cohort using ViEWS, 36 were classified as high risk (in this instance meaning a score of 5 or more). Subjectively, the authors did not have any clinical concerns about moving 10 patients out of the high-risk classification.

Conclusion ViEWS is more specific without being less sensitive. We have replaced NEWS with ViEWS and consider this clinically safe.

Introduction Blunt chest wall trauma accounts for over 15% of all trauma admissions to emergency departments worldwide [1]. Reported mortality rates vary between 4 and 60% [2]. Management of this patient group is challenging as a result of the delayed onset of complications. The aim of this study was to develop and validate a prognostic model that can be used to assist in the management of blunt chest wall trauma.

Methods There were two distinct phases to the overall study: the development phase and the validation phase. In the first phase, the prognostic model was developed through retrospective analysis of all blunt chest wall trauma patients (n = 274) presenting to the emergency department of a regional trauma centre in Wales (2009 to 2011). Multivariable logistic regression was used to develop the model and identify the significant predictors of the development of complications. The model's accuracy and predictive capabilities were assessed. In the second phase, external validation of the model was completed in a multicentre prospective study (n = 237) in 2012, and the model's accuracy and predictive capabilities were reassessed in the validation sample. A risk score was developed for use in the clinical setting.
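The two-phase workflow described in the Methods above can be sketched as follows (simulated data and invented coefficients, purely illustrative): fit a multivariable logistic model on a development sample and report the c-index, equivalent to the AUC, on an external validation sample:

```python
# Develop a prognostic logistic model on one sample and externally validate
# its discrimination (c-index / AUC) on another. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def simulate(n: int):
    age = rng.normal(60, 18, n)
    rib_fractures = rng.poisson(2, n)
    chronic_lung_disease = rng.integers(0, 2, n)
    anticoagulants = rng.integers(0, 2, n)
    spo2 = rng.normal(95, 3, n)
    X = np.column_stack([age, rib_fractures, chronic_lung_disease,
                         anticoagulants, spo2])
    logit = (-5.0 + 0.06 * age + 0.5 * rib_fractures
             + 1.0 * chronic_lung_disease + 0.8 * anticoagulants - 0.02 * spo2)
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
    return X, y

X_dev, y_dev = simulate(274)  # development phase (n as in the abstract)
X_val, y_val = simulate(237)  # external validation phase (n as in the abstract)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
c_index = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"external-validation c-index: {c_index:.2f}")
```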
Results Significant predictors of the development of complications were age, number of rib fractures, chronic lung disease, use of preinjury anticoagulants and oxygen saturation levels. The final model demonstrated an excellent c-index of 0.97.

Conclusion In our two-phase study, we developed and validated a prognostic model that can be used to assist in the management of blunt chest wall trauma patients. The final risk score provides the clinician with the probability of the development of complications for each individual patient.

Introduction Crush syndrome is often encountered in natural disasters. It is a critical condition leading to multiple organ failure; however, the mechanisms by which local traumatic injuries affect distant organs remain unknown. We turned to bone marrow-derived mononuclear cells (BMMNCs) as a therapeutic strategy against crush injury. Transplantation of BMMNCs can elicit protection and regeneration of damaged organs through paracrine properties, and its clinical application has been realized in the treatment of various ischemia-reperfusion-related diseases. We have previously reported that multiple organ damage based on a systemic inflammatory response is induced in crush injury, and that the pathogenesis is largely dependent on massive ischemia-reperfusion. We investigated whether BMMNCs could suppress systemic inflammation and improve mortality in a rat model of crush injury.

Methods To induce crush syndrome, both hind limbs of rats were compressed for 6 hours under weights (3.0 kg each). Rats in the treated group were intravenously administered BMMNCs (1 × 10^7 cells in 1 ml phosphate-buffered saline (PBS)) immediately after weight removal (BMMNC group), and PBS alone was administered in the control group (CR group). The sham group underwent the same procedure without compression of the hind limbs (SH group). The rats were observed for 7 days after the injury to evaluate survival. To estimate the anti-inflammatory effects of BMMNCs, sera were collected 3 hours, 6 hours and 24 hours after the injury. The levels of interleukin 6 (IL-6) and tumor necrosis factor alpha (TNFα) were measured by ELISA and statistically analyzed.

Results The survival rate at day 7 was 80.0% in the BMMNC group and 47.6% in the CR group, revealing that 7-day survival was significantly improved by the BMMNC injection (P <0.05). Crush injury-induced upregulation of serum IL-6 was significantly reduced by the BMMNC treatment at all time points (P <0.05). The level of TNFα decreased significantly in the BMMNC group compared with the CR group 24 hours after compression release (P <0.05). These findings suggest that transplantation of BMMNCs can avert the devastating course following crush injury by suppressing systemic inflammation.

Conclusion The administration of BMMNCs reduced the production of inflammatory cytokines and improved the survival rate in a rat model of crush injury. Cell therapy using BMMNCs might become a novel therapy for crush injury.

Introduction Some patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment due to suboptimal initial triage. Triage scores, such as the Manchester Triage System (MTS), have not been well validated in unselected medical patients.
Herein, we performed a prospective cohort study to assess the prognostic potential of the MTS and the prognostic biomarker proadrenomedullin (ProADM) to identify patients at high initial treatment priority, patients admitted to the ICU, and patients who die within a 30-day follow-up. Methods This was a prospective, observational cohort study including all consecutive adult medical patients seeking ED care between June 2013 and October 2013. We collected detailed clinical information including the initial MTS and measured ProADM levels on admission in all patients. Initial treatment priority was adjudicated by two independent, blinded physicians based on all available results at the time of ED discharge to the medical ward. To assess outcomes, data from electronic medical records were used and all patients were contacted by telephone 30 days after hospital admission. The prognostic performance of the MTS and ProADM was assessed in multivariate regression models, with the area under the receiver operating characteristic curve (AUC) as an overall measure of discrimination. Results We included a total of 1,452 patients (58% males, mean age 66.6 years). A total of 20.1% (n = 292) were classified as high treatment priority, 5.4% (n = 79) were admitted to the ICU and 4.4% (n = 64) died within 30 days. The initial MTS showed good prognostic accuracy for predicting treatment priority (AUC 0.75) and ICU admission (AUC 0.76), but not for mortality prediction (AUC 0.58). Initial ProADM levels were independent predictors of all three outcomes and significantly improved the MTS score, to AUCs of 0.78 for treatment priority, 0.80 for ICU admission and 0.84 for mortality. Conclusion Within this large cohort of consecutive unselected medical patients seeking ED care, the MTS instrument in combination with a prognostic biomarker (ProADM) allowed accurate initial risk assessment with regard to treatment priority, ICU admission and mortality. A combined score has the potential to significantly improve initial risk assessment, which may translate into faster and more targeted care and better clinical patient outcomes.

Introduction At the hospital we are faced with situations of violence, whether verbal or physical, especially in the emergency department (ED) [1]. The objective of this study was to evaluate the phenomenon of violence at hospitals in Lebanon, especially in the ED, and to recommend techniques to prevent it. Methods A questionnaire consisting of 18 questions was sent to caregivers in the EDs of three randomly selected hospitals in Beirut, Lebanon in 2012. A total of 111 people (nurses, aides, doctors, interns, residents, social workers and security guards) responded to the survey questionnaire. Results The majority of respondents were young women (62%) aged between 20 and 40 years (78%), with a nursing degree (74%) and professional experience of <5 years (48%). In total, 59% of respondents had experienced violence in the ED, mostly during the night (58%), from patients (31%) or their companions (68%). The caregivers most affected by violence were nurses (54%) and reception staff (46%). Violence could be verbal (threats 47%, insults 36%, criticism 18%) or physical (hitting 43%, slapping 40%, stabbing 17%). Patient dissatisfaction with care (42%) and patient anxiety (33%) were the most important factors in the generation of violence, which can have repercussions on care workers and their psychological wellbeing.
Conclusion Violence in the ED may be due to the heavy workload of caregivers causing delays in care. Secondly, patients in the ED may feel insufficiently informed and heard by the nursing staff. The prioritisation of emergencies by severity rather than by order of arrival can also be misunderstood [2]. Therefore, we recommend the following actions: encourage caregivers to improve their knowledge and training on the management of patients in emergency situations; train emergency caregivers in mediation, nonviolent communication and managing stressful situations; and increase the number of nurses and security guards in the ED and motivate them to ensure a better quality of care and minimize delays in care.

Introduction Intentional self-poisoning is one of the most common presentations to acute medical units across the UK [1]. To our knowledge, no studies have been published on the incidence of admissions to critical care in England after overdose. Our aim was to investigate the epidemiology, clinical features and outcomes of patients admitted to critical care after intentional self-poisoning to establish patterns in our community. Methods We performed a retrospective data collection using our critical care database 'Metavision' to select all patients admitted with a diagnosis of 'overdose'. Records were scrutinised to collect information on patient demographics, clinical features and medical management. Results Thirty-eight patients (male:female ratio 1:1.53) were admitted to critical care over a 1-year period (September 2011 to 2012). This represented 2.45% of total admissions to critical care during the same period. The sample had a significantly younger median age (45 years) than the standard patient population in critical care (68 years) during the same period (P <0.0001). Despite the young age and paucity of comorbidities, there was no difference in length of stay between overdose patients (2.0 days) and all other patients on critical care (1.58 days, P = 0.3). The median number of agents ingested was three (range 1 to 7), with 84.2% ingesting ≥2 agents. Hypnotics and antidepressants made up 45% of the agents ingested. A total of 92.1% of the sample was admitted out-of-hours or at weekends. Fifty per cent had a past history of overdose, and 25% had a history of alcohol misuse. A total of 79% of patients were referred to critical care due to a low conscious level, but only 50% required IPPV and 20% received vasopressors/inotropes. The mortality rate was 2.6%, with one further death 6 months after discharge due to alcoholic liver disease. The estimated financial cost was £80,555, or £2,119 per patient (57 level 3 bed-days, 35 level 2 bed-days). Conclusion Intentional self-poisoning mortality rates are low in spite of the number of patients admitted. Despite the young age of patients and lack of comorbidities, their length of stay is similar to the average for all patients admitted to the unit, representing a significant financial cost. Self-poisoning requiring critical care support is more common out-of-hours, when less senior expertise is available. Education of junior doctors in the management of overdose is therefore vital to ensure the early identification and appropriate treatment of these patients.
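The per-patient cost quoted above is a simple quotient of the total estimated cost over the number of admissions; a minimal check follows (no bed-day tariffs are given in the abstract, so only the division is reproduced):

```python
# Reproducing the per-patient cost quoted in the abstract.
# Only the division is taken from the abstract; no tariff data are given.
total_cost_gbp = 80_555      # estimated total cost over the year
n_patients = 38              # overdose admissions in the study period
level3_bed_days = 57
level2_bed_days = 35

per_patient = total_cost_gbp / n_patients
print(f"Cost per patient: £{per_patient:,.0f}")   # ≈ £2,120 (abstract: £2,119)
print(f"Total bed-days: {level3_bed_days + level2_bed_days}")
```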
Introduction A number of studies have demonstrated the significant and increasing morbidity and mortality associated with alcohol-related disease in the UK, and the corresponding impact this has on critical care services [1]. This study was designed to similarly evaluate the current burden of alcohol-related disease on admission rates to a regional ICU, but also to calculate the financial costs of such admissions. Methods Data-entry fields derived from recommendations in a public health publication [2] were introduced to the local electronic records system to enable prospective data collection for all admissions that were either wholly or partially attributable to alcohol consumption. Using locally defined values for the cost per day of an admission to the ICU, depending on the maximum level of organ support required during the admission, it was possible to calculate the total expense incurred by the unit for each alcohol-related admission. Results In 1 year from December 2012 to November 2013 inclusive, the ICU recorded 84 alcohol-related admissions, accounting for approximately 9% of annual unplanned admissions. With an average length of stay (5.8 days) similar to that of all other unplanned admissions, this totalled 534 ICU bed-days. A total of 86% of patients with alcohol-related conditions were male, with an average age of 46.4 years (range 15 to 83 years), and the largest proportion (42%) presented with chronic conditions partially attributable to alcohol consumption. The number of admissions per month varied from zero in May to a peak of 14 in November, with the largest share (40%) of admissions occurring over the autumn months (Figure 1). Eighty-nine per cent of patients with alcohol-related conditions required support for at least two organ failures, which equated to an overall cost to the unit of £725,308, or 12% of an approximate £6 million annual budget. Conclusion The results of this study support the hypothesis that alcohol-related disease contributes considerably to admissions to this ICU. Furthermore, they have shown that the financial impact was proportionally greater than the percentage of admissions attributable to alcohol consumption, reflecting the high frequency of multiorgan failure in this patient cohort.

Introduction The objective was to analyze the clinical features of multiple organ dysfunction syndrome (MODS) induced by severe heat stroke, and to investigate the pathogenesis, diagnosis and prevention strategies in patients with MODS caused by severe heat stroke. Methods We retrospectively studied nine cases of MODS caused by severe heat stroke, and systematically reviewed the relevant literature. Results (1) All nine patients with MODS caused by severe heat stroke had been exposed to hot and humid environments. All nine had severe heat stroke, comprising seven cases of classic heat stroke (77.8%) and two cases of exertional heat stroke (22.2%). (2) Among the nine cases, eight patients met the diagnostic criteria for SIRS. In addition, there was a significant increase in the blood PMN proportion and serum CRP in all nine patients. Of note, there was also a marked increase in serum IL-6 (average 36 ± 19 pg/ml; reference range 0 to 5.9 pg/ml) and TNFα (average 21 ± 10 pg/ml; reference range 0 to 8.1 pg/ml), while IL-8 was normal (average 22 ± 25 pg/ml; reference range 0 to 62 pg/ml). (3) A total of 34 organ dysfunctions were involved across the nine patients with MODS induced by severe heat stroke.
The kidney, circulation and liver accounted for 58.8% of all these organ dysfunctions. In decreasing order of incidence, severe heatstroke-induced organ dysfunction involved the kidney, circulation, liver, blood coagulation, metabolism, brain, lungs, and gastrointestinal tract. (4) During hospitalization, the most common complication in patients with MODS caused by heat stroke was pulmonary infection. (5) After early intensive organ-supportive treatment, most cases improved and recovered within several days. Seven patients survived, and the average length of stay in hospital was 9.5 days. Conclusion Severe heat stroke results in significant abnormal changes in inflammatory markers in patients with heat stroke-induced MODS. The types of organ dysfunction in heat stroke-induced MODS are usually distinct from those of infection and trauma. After active cooling and intensive organ function supportive treatment, most patients recovered in a relatively short period.

…shock. Patients need vasopressor infusion to maintain adequate delivery of oxygen, but this could have deleterious effects on skin perfusion and worsen the burn depth. This shock could result from the interplay of the initial hypovolemia and the release of multiple inflammatory mediators [1]. It has been shown that a low dose of hydrocortisone could reduce the shock duration, but the mechanisms involved remain unclear. We investigated the systemic genomic response after severe burn injuries and determined whether patterns of gene expression could be associated with a low dose of glucocorticoids. Methods Thirty patients with burns over 30% of total body surface area were enrolled into a randomized double-blind clinical study. Fifteen patients were treated with a low dose of hydrocortisone and 15 patients were treated with placebo. Whole blood samples were collected after shock onset (S1) before any treatment, 1 day after the start of treatment (S2), and 120 hours and 168 hours after the burn injury (S3/S4). Blood samples from 13 healthy volunteers were also collected. Pangenomic expression was evaluated with Affymetrix HG-U133plus 2.0 microarrays. Moderated t tests and an F test were used to compare burn patients with controls and to compare gene expression profiles between the two groups (B-H correction, P <0.05). Results Severe burn injury induced the deregulation of a considerable number of genes (n >2,200 at S1) in comparison with controls, with an increasing number of deregulated genes over time. Within burn patients, more than 300 genes were deregulated by hydrocortisone over time. The treatment had a rapid effect on gene expression: 339 and 627 genes were differentially expressed at S2 and S3, respectively. However, the number of these genes decreased drastically at S4 (only 24 genes significant). The genes identified at S2 were mostly related to decreases in the growth, development and quantity of leukocytes, but these biological processes were no longer significant at S3, indicating that the action of glucocorticoids in the response to burn injury is short lived and time dependent. Conclusion This study is an informative overview of the genomic responses after burn injuries. More importantly, it is the first study providing information about the mechanisms underlying the glucocorticoid-associated reduction in shock duration after burn.

Introduction The Managed Clinical Network for Care of Burns in Scotland (COBIS) was launched in April 2007.
Primary aims included establishing and maintaining a registry of complex burn injury in Scotland and setting up mechanisms to regularly audit the outcome of burn treatment against nationally agreed standards of care. In this abstract we present, on behalf of COBIS, 3-year incidence and mortality data for Scottish patients admitted with a complex burn injury. Methods From January 2010 onwards, data were prospectively collected for all patients in Scotland with complex burn injury admitted to Scottish burns units. Data collection was initially on a paper pro forma, but subsequently evolved into a web-based audit data capture system to securely link hospital sites involved in the delivery of care of complex burns. Data collected included the extent and mechanism of burn, presence of airway burn or smoke inhalational injury, comorbidities, complications, length of stay, interventions and mortality. Quality, completeness and consistency of data collection are audited, with feedback to the individual units. Results In a population of approximately 5.3 million, the annual incidence of complex burn injury is 499 to 537 (9 to 10 per 100,000). Major burns account for 5% of burn admissions. Hospital mortality from a burn is 1 to 2.2%. See Table 1. Conclusion From these data, Scotland now has comprehensive national figures for complex burn injury. This allows benchmarking against other international indices, few of which provide comprehensive data. COBIS data can now also be correlated with other mortality data sources. As data quality improves, detailed analysis of mortality data will allow COBIS to identify contributing issues affecting burns patients. One issue already identified is that patients with burns often die soon after their discharge from hospital of other related and unrelated causes. Subsequent analysis will allow COBIS to identify and address issues that may be contributing to these statistics.

Introduction The majority of burn injuries are considered accidental, although previous studies have identified demographic factors associated with a higher risk of burns, such as socioeconomic deprivation [1] and being from an ethnic minority group [2]. This study aims to identify population subgroups in London at high risk of burn injuries requiring admission to a burns centre, through geographic mapping and socioeconomic statistics.

Hemodynamic endpoints are preferred to tissue perfusion targets. Early antimicrobial therapy and de-escalation are routine practices without the use of infective biomarkers. Crystalloid is preferred to colloid for initial resuscitation. CVP and fluid challenge are still more popular than newer fluid responsiveness methods for preload assessment. Hydrocortisone is the most commonly prescribed steroid in septic shock, but the thresholds for initiation, frequency and discontinuation vary.

…shown that only 20 to 35% of patients trigger the clinical EWS prior to cardiac arrest. Jarvis and colleagues proposed that an EWS based on common laboratory findings can predict patient mortality [3]. The aim of this study, as part of a wider review of cardiac arrests in our hospital, was to determine whether the laboratory early warning score (LEWS) might be of use in identifying patients at risk of cardiac arrest in our trust. Methods Retrospectively collected data identified cardiac arrest calls that led to CPR or defibrillation over 6 months. The LEWS was calculated according to the formula devised by Jarvis and colleagues [3].
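The Jarvis formula itself is not reproduced in this abstract, so the following sketch is purely illustrative of how a laboratory-based score of this kind is assembled; the bands and integer weights are hypothetical placeholders, and the published coefficients in [3] should be used in practice:

```python
# Illustrative laboratory early warning score (LEWS) calculator.
# The cut-points and integer weights below are HYPOTHETICAL placeholders;
# the published Jarvis formula [3] defines the real bands and weights.
def lews(sodium: float, potassium: float, urea: float,
         creatinine: float, haemoglobin: float, wcc: float) -> int:
    score = 0
    if sodium < 130 or sodium > 150:        # mmol/l, hypothetical band
        score += 2
    if potassium < 3.0 or potassium > 6.0:  # mmol/l, hypothetical band
        score += 2
    if urea > 15:                           # mmol/l, hypothetical band
        score += 1
    if creatinine > 170:                    # umol/l, hypothetical band
        score += 1
    if haemoglobin < 90:                    # g/l, hypothetical band
        score += 1
    if wcc < 4 or wcc > 15:                 # x10^9/l, hypothetical band
        score += 1
    return score

# Trigger thresholds used in the audit: >=4 for males, >=5 for females.
print(lews(128, 5.8, 18, 200, 85, 16))
```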
LEWS ≥4 for males and ≥5 for females was taken as a 'trigger', as suggested by Jarvis and colleagues [3].

Introduction The aim of this audit was to evaluate whether guidelines produced in a local intensive therapy unit (ITU) with regard to blood sampling practices were being adhered to. The volume of blood taken and its cost were also evaluated. Methods A retrospective audit investigating the number of routine blood tests ordered on the ITU in January 2013 was performed. There were no exclusion criteria. Computer-based data collection systems were used to gather data on patient details and on when blood tests were processed. The collection bottles used were examined to see how much blood was needed to fill them. Thirty nurses were asked how much 'dead space' blood they discarded, and an average was recorded. The cost of the blood tests was also calculated. Following a period of education regarding the contents of the guidelines, the audit was repeated retrospectively in July 2013. Results The initial audit examined 901 patient-days. Urea and electrolytes (U&Es) and full blood counts (FBC) were requested in line with the guidelines. Liver function tests (LFTs), bone profile, magnesium and clotting screens were ordered approximately four times more often than advocated. Many bone profile and magnesium tests were probably inappropriate requests. Moreover, twice as much blood was taken from patients as recommended by the guidelines (almost 16 litres in total in January). The cost of the routine blood tests in January was €11,019. If the guidelines had been followed, the estimated yearly saving would be €65,588. During the repeat audit, 731 patient-days were examined. The number of times a U&E or FBC was requested was largely unchanged, but the number of times an LFT, bone profile, magnesium or clotting screen was ordered fell by approximately 50%. Almost one-third more blood was taken from patients than the volume suggested in the guidelines. The cost of the blood tests done in July was €5,423. Despite the improvement in the frequency of blood testing, an estimated €21,907 per year could still be saved. Conclusion The results underline that the unit's guidelines were not being followed, although the re-audit does show an improvement in adherence. Patients are being exposed to unnecessary blood tests, which not only is implicated in iatrogenic anaemia but also places a significant financial burden on the department. Continued staff education and encouragement are required in order to aid the transition from current to recommended practice.

Introduction The direct central line entry rate is believed to be a major contributor to the risk of central line infections. At Mayo Clinic there was historically no schedule for obtaining blood for analysis in the pediatric ICU. A policy was implemented in May 2013 to restrict nonemergent blood draws to three times daily. We subsequently conducted this study to determine whether implementation of this policy was associated with a reduction in blood draws as well as in central line unique entries. Methods Data from the laboratory as well as from the Central Line Unique Entry database were analyzed at baseline and after implementation of the policy change, to identify any decrease in the line entry/blood draw rate. As per Mayo Clinic policy, IRB approval was not required for a QI project. Results In the pre-implementation phase there were a total of 4,602 blood draws in 5,227 total patient-days (0.88 blood draws/patient-day).
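The draw rates quoted in this Results section are simple quotients; a minimal check, using the pre-implementation counts above and the post-consolidation counts reported in the next sentence:

```python
# Checking the blood-draw rates and relative reduction quoted in the abstract.
pre_draws, pre_days = 4_602, 5_227
post_draws, post_days = 1_095, 1_491

pre_rate = pre_draws / pre_days      # ≈ 0.88 draws/patient-day
post_rate = post_draws / post_days   # ≈ 0.73 draws/patient-day
reduction = (pre_rate - post_rate) / pre_rate

print(f"{pre_rate:.2f} -> {post_rate:.2f} draws/patient-day "
      f"({reduction:.0%} reduction)")   # ≈ 17%, as reported
```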
After consolidation, there were 1,095 blood draws in 1,491 patient-days (0.73 blood draws/patient-day; a 17% reduction). Of the pre-implementation line entries, 24.7% were arterial line entries, 50.5% central line entries and 12.5% were by peripheral venipuncture. After policy implementation, these figures were 10.9%, 49.7%, and 23.8%, respectively. The average central line unique entry rate after blood draw consolidation decreased from 10 to 6 line entries/central line-day. Consolidation of blood draws was associated with a cost saving of $7,200/year. Conclusion Consolidating time frames for blood draws in the PICU was associated with decreased central line entries, decreased utilization of vascular access teams, and decreased phlebotomy cost. We hypothesize that this policy will be associated with a decreased incidence of CLABSI when more patients are included for analysis.

Introduction Although various techniques to reduce intraoperative blood loss have been described, there is an absence of uniformity and consistency in their application. In the literature, blood loss in burn surgery is estimated to be at least 123 ± 106 ml per percentage body surface area excised [1]. Recently we developed a novel hemostatic technique using a silicone gel dressing (SI-AID®; ALCARE Co., Ltd, Tokyo, Japan) to stop intraoperative bleeding. Briefly, soon after tangential excision with the Humby knife, the wounds were sprayed with thrombin and with 1:100,000 adrenalin solution and wrapped tightly with SI-AID® for a full 10 minutes. Burn wounds on limbs were tangentially excised under tourniquet control and wrapped with SI-AID® before deflation of the tourniquet. After deflation of the tourniquets, we waited a full 10 minutes. When the SI-AID® was removed, any major bleeders were cauterized, and the grafts were applied after rinsing the wounds with warm saline. Methods This was a prospective observational study. From 1 January to 31 October 2013 we collected preoperative and 24-hour postoperative hemoglobin levels from patients who underwent tangential excision for burn injury, and calculated blood loss in the perioperative period. Data on the amounts of blood transfused, the excised area, and the harvest area were also collected.

Introduction Massive bleeding remains a leading cause of potentially preventable death after cardiovascular surgery [1]. Conventional coagulation tests (CCT) fail to characterize the multiple hemostatic abnormalities observed in surgical patients and are further limited by their slow results and poor correlation with transfusion requirements. We assessed the clinical impact of goal-directed coagulation management based on rotational thromboelastometry (ROTEM) in patients undergoing an emergent cardiovascular surgical procedure. Methods Over a 2-year period, data from 71 patients were collected prospectively and blood samples were obtained for coagulation testing. Administration of packed red blood cells (PRBC) and hemostatic products was guided by an algorithm using ROTEM-derived information and the hemoglobin level. Based on the amount of PRBC transfused, two groups were considered: high bleeders (≥5 PRBC; HB) and low bleeders (<5 PRBC; LB). Data were analyzed using the chi-square test, unpaired t test and ANOVA as appropriate. Results Preoperatively, the HB group (n = 31) was characterized by lower blood fibrinogen and decreased clot amplitude on ROTEM compared with the LB group (n = 40).
Intraoperatively, larger amounts of fibrinogen, fresh frozen plasma and platelets were deemed necessary to normalize the coagulation parameters in the HB group. Postoperatively, the incidence of major thromboembolic and ischemic events did not differ between the two groups (<10%), and the observed in-hospital mortality was significantly less than expected by the POSSUM score (22% vs. 35% in the HB group and 5% vs. 13% in the LB group). Conclusion ROTEM-derived information is helpful to detect early coagulation abnormalities and to monitor the response to hemostatic therapy. Early goal-directed management of coagulopathy may contribute to improved outcome after cardiovascular surgery.

There is a need for further clarification around coagulopathy and interventional radiology in the critical care setting. The low absolute incidence of bleeding complications and the risk of complications from transfusion lend further support to the view that FFP should be used therapeutically rather than as prophylactic 'cover' [1].

Introduction Few studies have investigated the use of viscoelastic devices for monitoring treatment with LMWHs, and to our knowledge there are no studies comparing different LMWHs or different viscoelastic methods. Methods Enoxaparin (Klexane) and tinzaparin (Innohep) were added to 2 ml citrated blood from 10 intensive care patients to obtain plasma concentrations of 0, 0.5, 1.0 and 1.5 IU/ml of enoxaparin and tinzaparin, respectively. The study was approved by the local ethics committee and written consent was obtained (from relatives). Clot formation and clot retraction were studied using ROTEM and ReoRox. Results ROTEM analysis showed prolonged clot formation (CT) with increasing concentrations of enoxaparin and, more so, tinzaparin. ReoRox analysis showed that the initiation of clot formation (COT1) was increasingly delayed with increasing doses of enoxaparin and, more so, tinzaparin, as was the progression of clot formation (COT2 − COT1), resulting in a prolongation of the time to complete clot formation (COT2). See Table 1. Conclusion Clot initiation was prolonged with both drugs, and this was detected by both ROTEM and ReoRox. Clot formation was more impaired with tinzaparin than with enoxaparin, and this was detected only by ReoRox.

Introduction Heparin is commonly given in our neonatal ICU (NICU) by continuous intravenous infusion. Heparin is diluted in parenteral nutrition bags and administered over a period of 24 hours with in-line filtration. However, there are no data on heparin stability in parenteral nutrition bags, especially on its compatibility with the 50% dextrose mainly present in such bags. The aim of our in vitro study was to determine heparin stability in parenteral nutrition bags prepared in a NICU after 24-hour infusion, and to assess whether or not it interacts with 50% dextrose. Methods We prepared both types of bag: parenteral nutrition bags whose composition was defined in the unit, including sodium heparin (77 IU/ml); and bags containing only sodium heparin diluted in 50% dextrose (193 IU/ml). These bags (n = 6 per type) were infused over a period of 24 hours with and without in-line filtration. Heparin activity was measured using a chromogenic anti-Xa method in freshly prepared bags (reference values for the other measurements), in bags after 24-hour infusion, and in effluents at the end of the infusion line after 24 hours. Results Heparin activity values measured in bags and effluents with and without in-line filtration after 24-hour infusion are shown for both types of bag assessed (Tables 1 and 2).
Results are expressed as median values (minimum to maximum) in percent.

Introduction The oral direct and selective factor Xa inhibitor edoxaban (Daiichi Sankyo) is currently available in Japan for the prophylaxis of venous thromboembolism (VTE) in patients undergoing major orthopedic surgery and is undergoing investigation in phase III trials for the prevention of stroke in patients with atrial fibrillation and for the treatment and secondary prevention of VTE. The primary complication of any available anticoagulant therapy is the risk of bleeding. Rapid reversal of anticoagulation may be necessary in patients requiring emergency treatment due to uncontrolled bleeding. Prothrombin complex concentrates (PCC) are frequently used to reverse the effect of vitamin K antagonists such as warfarin and have also been suggested to be potentially effective in reversing the effects of the new oral anticoagulants. The present study was therefore designed to determine whether the four-factor PCC Beriplex® can effectively reverse bleeding and normalize coagulation following edoxaban administration in a rabbit kidney injury model. Methods Rabbits were treated with a high intravenous bolus dose of edoxaban (1,200 μg/kg) followed by the administration of Beriplex® (25 to 75 IU/kg). Bleeding was assessed based on the time to hemostasis and the total blood loss after induction of a standardized kidney injury. In parallel, the following biomarkers of hemostasis were determined: factor Xa inhibition, prothrombin time (PT), activated partial thromboplastin time (aPTT), whole blood clotting time (WBCT), and thrombin generation (TGA). Results The results confirmed increased and prolonged bleeding in edoxaban-treated animals following standardized kidney injury compared with vehicle administration. Parallel monitoring of biomarkers of hemostasis showed prolongation of PT, aPTT and WBCT, and changes in thrombin generation parameters. Subsequent administration of Beriplex® resulted in a dose-dependent reversal of edoxaban-induced bleeding, as indicated by reduced time to hemostasis and total blood loss. Both parameters achieved statistical significance compared with placebo at the Beriplex® dose of 50 IU/kg under fully blinded study conditions. The biomarkers correlating best with Beriplex®-mediated reversal of edoxaban anticoagulation included PT, WBCT and endogenous thrombin potential. Conclusion In summary, Beriplex® treatment effectively reversed edoxaban-induced anticoagulation in an animal model of acute bleeding at clinically relevant dose levels.

…Dabigatran-treated animals were randomized (n = 6/group) to a single injection of idarucizumab at 30, 60 or 120 mg/kg i.v. or vehicle (control animals). Blood loss and hemodynamic variables were monitored over 4 hours or until time of death. Data were analyzed by ANOVA (± SD) and by the log-rank test. Results Dabigatran levels were 1,147 ± 370 ng/ml, with no differences between groups prior to injury. Blood loss (BL) in sham animals was 409 ± 53 ml 10 minutes after injury and 700 ± 107 ml after 4 hours (survival rate 100%). Anticoagulation with dabigatran (control animals) resulted in significantly higher BL 10 minutes after injury (801 ± 66 ml, P <0.05). Mortality in these animals was 100%, with a mean survival time of 121 minutes (range 90 to 153 minutes; P <0.05 vs. sham and idarucizumab-treated animals). Total BL in dabigatran-treated animals was 2,977 ± 316 ml. In contrast, treatment with idarucizumab was associated with a dose-dependent reduction in BL.
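Survival in this model was compared with a log-rank test. The following is a minimal sketch of such a comparison using the lifelines library; the durations and event indicators are invented placeholders, not the study's data:

```python
# Minimal sketch of a log-rank survival comparison, as used in this abstract.
# Durations below are invented placeholders, not the study's data.
from lifelines.statistics import logrank_test

# Observation capped at 240 minutes; event = 1 means death observed.
control_minutes = [90, 105, 121, 130, 148, 153]
control_events  = [1, 1, 1, 1, 1, 1]            # 100% mortality in controls
treated_minutes = [240, 240, 240, 240, 240, 200]
treated_events  = [0, 0, 0, 0, 0, 1]            # mostly censored (survived)

result = logrank_test(control_minutes, treated_minutes,
                      event_observed_A=control_events,
                      event_observed_B=treated_events)
print(f"log-rank p-value: {result.p_value:.4f}")
```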
Conclusion Bivalirudin is a valuable option for anticoagulation in patients with a VAD, and can be easily monitored with the aPTT. The use of a bivalirudin-based anticoagulation strategy in the early postoperative period may overcome many limitations of heparin, above all the risk of HIT, which is higher in patients undergoing cardiac surgery. Bivalirudin should no longer be regarded as a second-line therapy for anticoagulation in patients with a VAD.

Introduction Free hemoglobin (fHb) can scavenge nitric oxide and induce vasoconstriction [1]. The fHb content may be higher in older blood bags. We studied whether transfusion of old red blood cells (RBCs) increases plasma fHb in septic patients and whether this affects the microvascular response. Methods Twenty septic patients randomly received either fresh (<10 days storage) or old (>15 days) RBC transfusion. Plasma fHb was measured before and 1 hour after transfusion; the sublingual microcirculation was assessed with sidestream dark-field (SDF) imaging. The perfused boundary region (PBR) was measured as an index of glycocalyx damage [2]. The thenar tissue Hb index (THI) was measured by near-infrared spectroscopy. Results fHb increased in the old RBC group (Figure 1). THI increased in both groups, while SDF parameters were unaltered. Negative correlations were found between ΔfHb and changes in total vessel density (r = -0.57, P <0.01; Figure 2) and THI (r = -0.71, P <0.001). These relations were absent in patients with PBR <2.68 μm.

Introduction The present study investigated the relationship between age [1] and biomarkers of coagulation and fibrinolysis [2] and their impact on outcome in severe sepsis patients. Methods A prospective observational study of adult patients with severe sepsis was conducted in a single academic hospital. Plasma was analyzed for coagulation and fibrinolysis markers on days 1, 2, and 3. Patients were stratified by age at a cutoff of 67 years, the point of the maximal Youden index (sensitivity + specificity − 1) on a receiver operating characteristic plot for a logistic regression model of in-hospital mortality. Results In-hospital survival was significantly lower in older than in younger sepsis patients: 9/15 (60.0%) versus 27/28 (96.4%), P <0.05. Older patients had markedly higher total plasminogen activator inhibitor-1 (TPAI-1) on day 1, thrombin-antithrombin complex (TAT) on days 2 and 3, and fibrin monomer complex on day 2, and markedly lower plasminogen (PMG) on day 3 and alpha-plasmin inhibitor (αPI) on days 2 and 3 compared with younger patients (all P <0.05). Age was an independent predictor of high TAT on days 2 and 3, high fibrin monomer complex on day 2, low PMG on day 3, and low αPI on days 2 and 3 after adjusting for cofactors and covariables. TPAI-1 on day 2, TAT on day 2, and PMG on day 3 were risk factors for in-hospital mortality in older sepsis patients (Table 1).

Table 1 shows improvements in outcomes. Dialysis was discontinued in 20 (83%) of the 24 patients requiring dialysis at inclusion. Ecu was generally well tolerated, with no deaths or unexpected safety concerns. Meningococcal infections occurred in two patients, one of whom continued treatment.

Introduction Restrictive transfusion practice in recent randomized trials and systematic reviews continues to show favorable outcomes [1]. Despite this, substantial variability in transfusion practice persists [2].
It is important to identify contemporary variability and which patient, provider, and institutional factors drive it. We therefore performed a retrospective analysis hypothesizing that red blood cell…

Conclusion Anemia and elevated hematocrit seem to be associated with in-hospital mortality among patients with suspected infection. Treatment of high or low hematocrit as part of ED resuscitation could be a subject for further investigation.

Introduction Several prediction models have been described to identify the need for massive transfusion (MT) in trauma patients [1]. The purpose of this study was to validate the simplified scoring systems reported previously and to establish new criteria for use in the emergency department and the ICU in Japan. Methods We retrospectively analyzed trauma patients transported to our center over the most recent 2 years. Patients transferred from other hospitals with minor injuries or with confirmed cardiac arrest at the scene were excluded. Results A total of 297 trauma patients were included in this study. Thirty-one (10.4%) patients required MT. Sensitivity and specificity for the Assessment of Blood Consumption (ABC) score were 48% and 99%, respectively. Because blunt injuries account for most trauma patients in Japan, we established new simple criteria using significant factors derived from the examination on arrival. If trauma patients met any of the following conditions: shock index (SI) >1, base excess (BE) <-3 mmol/l, or a positive focused assessment with sonography for trauma (FAST), then sensitivity and specificity were 97% and 80%, respectively.

Conclusion Patients with sepsis receiving blood products, particularly platelets, were significantly more likely to develop ARDS, had more days on mechanical ventilation, and had higher mortality. The lack of an increase in mortality associated with PRBC transfusion may be due to the benefit in oxygen delivery or to sample size.

Introduction Perioperative red blood cell transfusion is commonly used to address anemia, an independent risk factor for morbidity and mortality in critically ill patients [1]; however, evidence regarding optimal blood transfusion practice in septic shock is lacking. The aim of this study was to define the best transfusion strategy in septic shock patients with regard to 28-day mortality and clinical outcomes: restrictive or liberal. Methods The Transfusion Requirements After Cardiac Surgery (TRACS) study is a prospective, randomized, controlled clinical noninferiority trial conducted between February 2009 and February 2010 in an ICU at a university hospital cardiac surgery referral center in Brazil; consecutive adult patients (n = 502) who underwent cardiac surgery with cardiopulmonary bypass were eligible, and analysis was by intention to treat. This is a randomized controlled parallel-group trial, which included 300 patients admitted to a cancer ICU with a diagnosis of septic shock. Patients were randomly assigned to a liberal strategy of blood transfusion (to maintain hemoglobin >9 g/dl) or to a restrictive strategy (hemoglobin >7 g/dl). Mortality at 28 days was the main outcome. Secondary outcomes were clinical complications, days free of organ dysfunction, ICU and hospital length of stay, adverse effects of transfusion, and 60-day mortality. Results A total of 136 patients were included in the first part of the trial. Mean age was 62 ± 14 years, SAPS 3 at admission was 65 ± 15, and all patients had a diagnosis of solid neoplasm.
Sixty-three patients (46.3%) were included in the liberal strategy and 73 patients (53.7%) in the restrictive strategy. Twenty-eight-day mortality was similar between groups (54% in the liberal group vs. 56.2% in the restrictive group; P = 0.395). Conclusion Among cancer patients with septic shock, the use of a restrictive transfusion strategy compared with a more liberal strategy resulted in similar rates of 28-day mortality.

Introduction Blood transfusion is associated with increased morbidity and mortality in the critically ill [1]. Adverse effects of transfusion may be mediated by changes in the blood product that accumulate with storage time [2]. The mechanisms, however, are largely unknown. Erythrocyte-derived microparticles (MPs) have been found in transfusion bags [3,4] and their concentration increases with storage duration [5]. We hypothesize that accumulation of MPs during storage induces a proinflammatory state in the recipient.

…in capillary leakage and subsequent pulmonary edema [2]. TRALI is a clinical diagnosis with the following criteria: acute onset within 6 hours of blood transfusion; PaO2/FiO2 ratio <300 mmHg, or worsening of the P:F ratio; bilateral infiltrative changes on chest radiograph; no sign of hydrostatic pulmonary edema (pulmonary arterial occlusion pressure ≤18 mmHg or central venous pressure ≤15 mmHg); and no other risk factor for acute lung injury [2]. Methods We describe a fatal case of TRALI in a patient with influenza A (H1N1), suggesting a relationship between a first-hit lung injury and a second lung impairment after blood transfusion. We report a 57-year-old female with no previous medical history. She had an acute onset of fever, cough, muscle pain and progressive dyspnea leading to acute respiratory distress syndrome. The test for influenza A (H1N1) was positive. She was recovering but, on day 12 of admission, after 1 hour of platelet transfusion, she developed intense tachycardia, dyspnea and hypoxemia. Her mechanical ventilation requirements increased dramatically. She had been planned for extubation on an FiO2 of 30% and positive end-expiratory pressure of 8 cmH2O, which rose to 100% and 14 cmH2O, respectively. The P:F ratio dropped to 62. Her leukocytes, 10.6×10⁹/l a few hours earlier, fell to 1.5×10⁹/l after the onset. The previous lactate had been normal, but jumped to 42 mg/dl. She had been free of vasopressors but, after the offending transfusion, she developed refractory shock and died approximately 24 hours after the blood transfusion. Conclusion To our knowledge, this is the first reported case of TRALI in an influenza A (H1N1) patient. Although blood transfusion can be life saving, it can also be a life-threatening intervention. Prevention remains the best strategy.

Introduction Prothrombin complex concentrate (PCC) has been suggested as a measure to terminate trauma-related and dabigatran-induced bleeding. Owing to the conflicting data concerning such therapy, we investigated the ability of a four-factor PCC to terminate massive bleeding following the infliction of multiple trauma in dabigatran-anticoagulated pigs. Methods After ethical approval, 24 male pigs were administered dabigatran etexilate (30 mg/kg bid p.o.) for 3 days. On day 4, dabigatran was infused into the anaesthetised animals prior to injury to achieve supratherapeutic levels. Twelve minutes after infliction of bilateral femur fractures and standardised blunt liver injury, animals randomly received PCC (25, 50 or 100 IU/kg; n = 6 per group) or placebo (n = 6).
Time-adjusted blood loss as the primary endpoint (observation period 300 minutes) and a panel of coagulation variables were continually measured. Data were analysed by two-way ANOVA and are presented as mean ± SEM. Results Concentrations of dabigatran prior to infliction of trauma were comparable between groups (590 ± 40 ng/ml). Anticoagulation with dabigatran and trauma caused severe coagulopathy, as shown by prolonged TEM variables (CT, CFT), PT and aPTT. Following PCC application these effects were partially reversed. Due to ongoing blood loss, both PT and TEM variables became prolonged over time in animals given PCC 25 IU/kg. Accordingly, the no-PCC (38.5 ± 4.7 ml/minute) and PCC 25 IU/kg (22.6 ± 5.5 ml/minute) animals showed the highest blood loss (P <0.05 vs. PCC 50 IU/kg and PCC 100 IU/kg), with mean survival times of 106 minutes (no-PCC) and 204 minutes (PCC 25 IU/kg), respectively. All animals in the PCC 50 IU/kg and PCC 100 IU/kg groups survived. Blood loss in these two groups was comparable (PCC 50 IU/kg: 5.9 ± 0.2 ml/minute; PCC 100 IU/kg: 6.0 ± 0.3 ml/minute). Conclusion The use of high doses of PCC reversed the anticoagulant effects of dabigatran, which was associated with a significant reduction of blood loss. However, the results of this study show that sufficient concentrations of PCC are necessary to overcome thrombin inhibition.

Introduction Fresh frozen plasma (FFP) is associated with the onset of acute lung injury [1], the mechanism of which is largely unknown. On the other hand, FFP may be beneficial, as a higher ratio of FFP to red blood cells decreases mortality in bleeding trauma patients [2] and is associated with an endothelial stabilizing effect in vitro [3]. We investigated the effect of transfusion with FFP on the host response and markers of endothelial damage in nonbleeding critically ill patients. Methods This was a substudy of a multicenter trial in which nonbleeding critically ill patients with an increased International Normalized Ratio (1.5 to 3.0) were randomized to omitting or administering a prophylactic transfusion of FFP (12 ml/kg) prior to an invasive procedure. In 38 patients randomized to receive FFP transfusion, we measured levels of factor VIII, von Willebrand factor, and markers of the proinflammatory response before and after transfusion. Data are presented as medians. Results FFP transfusion resulted in a significant decrease of TNFα (from 12.3 to 3.1 pg/ml, P = 0.01), von Willebrand factor (from 475 to 424%, P <0.01) and factor VIII (from 246 to 244%, P <0.01). FFP did not alter levels of IL-1β, IL1-RA, IL-8, IL-10, MCP1, MIP1A or sCD40L. Patients had some degree of lung injury at baseline, as reflected by a lung injury score of 2 (0.8 to 2.5), which did not change following transfusion. None of the patients developed TRALI. Conclusion FFP transfusion is not associated with a proinflammatory response in the critically ill. Rather, FFP seemed to have an endothelial stabilizing effect.

Introduction When treating trauma patients with severe hemorrhage, massive blood transfusions are often needed. Damage control resuscitation strategies can be used for such patients, but an adequate fresh frozen plasma:packed red blood cell (FFP:PRBC) administration ratio must be established. Methods We retrospectively reviewed the medical records of 100 trauma patients treated with massive transfusions from March 2010 to October 2012. We divided the patients into two groups according to the FFP:PRBC ratio: a high-ratio group (≥0.5) and a low-ratio group (<0.5).
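As a minimal sketch of this stratification step (the records file and column names below are hypothetical; the abstract does not describe its data layout):

```python
# Stratifying massive-transfusion patients by FFP:PRBC ratio.
# The file and column names are hypothetical illustrations.
import pandas as pd

records = pd.read_csv("massive_transfusion_records.csv")
records["ffp_prbc_ratio"] = records["ffp_units"] / records["prbc_units"]
records["group"] = records["ffp_prbc_ratio"].apply(
    lambda r: "high" if r >= 0.5 else "low")  # cutoff used in the abstract

print(records["group"].value_counts())  # abstract reports: 68 high, 32 low
```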
The patient demographics, fluid and transfusion quantities, laboratory values, complications, and outcomes of the two groups were analyzed and compared. Results There were 68 patients in the high-ratio group and 32 in the low-ratio group. There were statistically significant differences between groups in the quantities of FFP, the FFP:PRBC ratio, platelets, and crystalloids administered, as well as in the initial diastolic blood pressure. When comparing the incidence of complications, bloodstream infections were noted only in the high-ratio group, and the difference was statistically significant (P = 0.028). Kaplan-Meier plots revealed that the 24-hour survival rate was significantly higher in the high-ratio group (97.1% vs. 71.9% in the low-ratio group, P <0.001). The 30-day survival rate was also higher in the high-ratio group (67.6% vs. 56.2%), but the difference was not statistically significant (P = 0.117). Conclusion For treating patients with severe hemorrhagic trauma, raising the FFP:PRBC ratio to 0.5 or higher may increase the chances of survival (especially 24-hour survival). Efforts to minimize bloodstream infections during the resuscitation process must be increased.

…PT (27 ± 9 seconds). Following the 90-minute infusion of dabigatran, mean plasma levels of dabigatran increased to 1,423 ± 432 ng/ml. This supratherapeutic level was associated with a further prolongation of PT, aPTT, and the EXTEM variables CT and CFT. These changes in coagulation parameters were compounded by blood loss following trauma (total blood loss at 60 minutes: 1,978 ± 265 ml). Sixty minutes after trauma, four out of five animals had no measurable clot formation (EXTEM CFT ≥4,000 seconds) and clot strength (EXTEM MCF) had fallen to 11 ± 7 mm. Both PCCs and aDabi-Fab, but not rFVIIa, reversed the effects of dabigatran on the thromboelastometry parameters (clotting time and clot formation time) and PT at all time points. In contrast, the aPTT was normalised only by idarucizumab. Plasma concentrations of dabigatran remained elevated after PCC therapy, but were not measurable after idarucizumab.

Introduction Fluid resuscitation of hemorrhagic shock is frequently associated with reperfusion injury and secondary organ damage. Studies suggest that low doses of both nitrite and carbon monoxide may protect tissues and organs from this reperfusion injury by limiting mitochondrial free radical production. We explored the effects of very small doses of nitrite and carbon monoxide on tissue injury in a porcine model of hemorrhagic shock. Results Although no increase in blood nitrite concentrations was observed after inhalation, nitrite was associated with significant decreases in blood, muscle, and peritoneal fluid lactate concentrations (P <0.05), whereas both nitrite and carbon monoxide were associated with significant decreases in glycerol in peritoneal fluid (P <0.05).
Following resuscitation, the muscle mitochondrial respiratory control ratio was preserved in the nitrite and carbon monoxide groups and reduced in the control group. The adjuvant drugs had no effects on any gross hemodynamic parameters. We conclude that, at low doses, nebulized sodium nitrite and inhaled carbon monoxide are associated with tissue protection during resuscitation from severe hemorrhagic shock.

Introduction Currently, most non-invasive blood pressure (NIBP) monitoring is based on the oscillometric method and determines the blood pressure during cuff deflation [1]. On the other hand, measurement during cuff inflation may be advantageous, as cuff inflation requires lower cuff pressure and a shorter duration than deflation. In surgical patients during anesthesia, inflationary NIBP has reasonable accuracy compared with conventional deflationary NIBP [2]. Few studies have reported NIBP monitoring using the inflationary method in ER patients with various unstable conditions. The purpose of this study was to verify the usefulness of inflationary NIBP monitoring in the emergency department. Methods A total of 2,981 NIBP measurements were collected from 174 patients (age 56.5 ± 22.2 (range 7 to 92) years) accommodated in the resuscitation area of the ER at Keio University Hospital, alternating between two algorithms on a standard monitor (BSM-6000; Nihon Kohden Inc., Tokyo, Japan). One algorithm consisted of consecutive inflationary and deflationary measurements in a single cycle (dual algorithm, 1,502 measurements), performed to verify the success rate and precision of the data. The deflationary algorithm (1,479 measurements) consisted of only the conventional deflationary measurement, performed to verify the duration of the measurement cycle. Results The success rate of inflationary NIBP (completed by the inflationary method alone) was 69.0%. The bias and precision of systolic and diastolic pressure (difference in systolic and diastolic pressure between inflationary and deflationary NIBP) were -0.6 ± 8.8 and 3.5 ± 7.5 mmHg, respectively (Figure 1). Inflationary NIBP could also determine NIBP more quickly than deflationary NIBP (16.8 vs. 29.1 seconds, median) (Figure 2). Conclusion These data suggest that inflationary NIBP has reasonable accuracy and sufficient rapidity compared with deflationary NIBP in emergency room patients.

Introduction The time profile of the arterial pulse is known to have features such as the dicrotic notch and subsidiary peaks. Such features may provide useful information about the vascular system and are traditionally explained in terms of aortic valve closure or multiple reflections from impedance mismatches within the arterial system. However, experimental evidence of such reflections has been elusive. It has been proposed that arterial dynamics may obey a nonlinear equation [1]. This model predicts the existence of multipeaked solitons which can travel long distances without dissipation. We demonstrate that within the soliton model it is not necessary to model valve closure or wave reflection: single or multiple notches arise de novo even from featureless theoretical LV pressure pulse profiles. We show that a number of clinically relevant features of the invasive blood pressure are reproduced by the soliton model, and we examine the role of LV pulse energy on pulse wave shape and progression.
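The nonlinear equation invoked here, and solved in the Methods below, is of Korteweg-de Vries (KdV) type. The abstract does not print it, so the following is for orientation only: a commonly quoted form, with u the pulse variable, x the distance along the vessel, t time, and c0, α and β constants fixed by the arterial properties as in [1]:

```latex
% Canonical KdV form (assumed; the abstract does not print the equation):
\[
  \frac{\partial u}{\partial t}
  + c_0 \frac{\partial u}{\partial x}
  + \alpha\, u \frac{\partial u}{\partial x}
  + \beta \frac{\partial^{3} u}{\partial x^{3}} = 0
\]
% Its single-soliton solution, with amplitude-dependent speed:
\[
  u(x,t) = A \,\operatorname{sech}^{2}\!\left[
    \sqrt{\frac{\alpha A}{12\beta}}\,\bigl(x - v t\bigr)
  \right],
  \qquad v = c_0 + \frac{\alpha A}{3}.
\]
```

Because the propagation speed v grows with the amplitude A, higher-energy pulses travel faster and steepen, which is consistent with the steepening and acceleration with hypertension noted in the Conclusion below.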
Methods A model for the arterial pressure is given by solutions to a KdV equation with constants depending on the properties of the artery [1]. This can be solved with the initial condition of a parabolic left ventricular pressure pulse. The evolution of the arterial pulse along the arterial tree is shown in Figure 1. We also predict arterial pulses for increasing left ventricular ejection energies. Conclusion Our simple model explains many features of the arterial pulse observed in clinical practice, such as the development of the dicrotic notch, the change in shape along the arterial tree, and the steepening and acceleration with hypertension. Some phenomena that have traditionally been attributed to arterial wave reflections or to resonance of the invasive arterial pressure measurement can instead be explained by intrinsic properties of the arterial pulse.

Introduction Pay-for-performance programs and economic constraints call for solutions that improve the quality of healthcare without increasing costs. Many studies have shown decreased morbidity in major surgery when perioperative goal-directed therapy (PGDT) is used. We assessed the clinical and economic burden of postsurgical complications in the University HealthSystem Consortium (UHC) in order to predict the potential savings with PGDT. Methods Data from adults who underwent one of 10 major surgical procedures in 2011 were screened in the UHC database. Thirteen postsurgical complications were tabulated. In-hospital mortality, hospital length of stay and costs for patients with and without complications were compared. The risk ratios reported by the most recent meta-analysis were used to estimate the potential reduction in postsurgical morbidity with PGDT. Potential cost savings were calculated from the actual and anticipated morbidity rates.

…Patients were randomly selected, and catheterization was performed with the head in a neutral position (n = 50) or with the head turned 45° to the opposite side (n = 50). The US-guided catheterization procedure was performed in accordance with general principles. Once the needle entered the IJV and blood was aspirated, the US probe was set down, and the catheter was placed and fixed according to the Seldinger technique. For each procedure, the side of intervention, the number of attempts to successful catheter insertion, any arterial puncture, and the duration of the procedure (from skin contact of the needle to catheter insertion) were recorded. Complications were recorded when observed. Results The IJV lay anterolateral to the carotid artery (CA) in 66%, anterior in 4% and lateral in 30% of patients in the neutral position, versus 62% anterolateral, 28% anterior and 10% lateral with 45° rotation. Thus, although the position of the IJV relative to the CA was unchanged in a significant proportion of patients, the rate of anterior placement, which increases the risk of CA puncture, was significantly higher with rotation than in the neutral position (P = 0.001). No significant difference was found in procedure duration. Conclusion Our study showed that anterior placement of the IJV is less frequent in the neutral position, but that this confers no advantage in avoiding arterial puncture. The smaller working area in the neutral position can cause practical difficulties; however, procedure times did not differ between head positions.
Nevertheless, further studies are needed to evaluate whether complication rates and procedure durations are comparable in emergency and trauma patients in whom head rotation is not possible.

Introduction Peres [1] and more recently Lum [2] developed anthropometric formulas correlating patient height (H) with the ideal length of a central venous catheter (CVC) in order to identify the optimal tip position. The aim of this study was to compare the reliability of the anthropometric formulas with the method based on the intracavitary ECG. Methods We enrolled patients admitted to our ICU who were candidates for elective CVC insertion and had a detectable P wave on the surface ECG. The intracavitary ECG was used to identify the optimum tip location, since a maximal P wave indicates the cavo-atrial junction [3]. Post-insertion chest X-ray (CXR) was performed in all patients to verify the tip position. Assuming that the cavo-atrial junction is about 3 cm from the carina [4], the tip position was considered correct between 1 and 5 cm from the carina (between the lower 1/3 of the superior vena cava and the upper 1/3 of the right atrium). For each patient we retrospectively evaluated whether the catheter length calculated with Lum's and Peres' formulas from an estimated height would have been acceptable. Results Sixty-five CVCs were placed: 51 in the right internal jugular vein (IJV) and 14 in the left IJV. The mean catheter length by intracavitary ECG was significantly deeper than predicted by Lum's formulas (18.2 ± 1.9 vs. 16.7 ± 1.7 cm, P <0.001) and not different from Peres' formulas (18.2 ± 1.9 vs. 18.2 ± 1.7 cm, P = 0.8). On post-procedural CXR, 88% of the tips were in the target zone. In three cases (5%) the catheter went in the wrong direction but was immediately corrected during the procedure, since the intracavitary P wave did not change its amplitude. Compared with the intracavitary ECG, the incidence of malposition would have been significantly higher with Lum's formulas (48% vs. 12%, P <0.001) and Peres' formulas (51% vs. 12%, P <0.001). Conclusion The intracavitary ECG was associated with a lower incidence of tip malposition than Peres' and Lum's formulas. It is also the only technique that allows immediate correction of a primary malposition during catheter insertion.

Introduction An ultrasound-guided (UG) technique is the recommended procedure for central venous catheterization (CVC). But ultrasound may not be available in emergency situations, and guidelines therefore also propose that physicians remain skilled in landmark (LM) placement. We conducted this prospective observational study to determine the learning curve of the LM technique in residents who had learned only the UG technique. Methods During the first 3 months of their rotation in our ICU, residents inexperienced in CVC used only the real-time UG technique. During the following 3 months, residents were allowed to place CVCs by means of the LM technique when authorized by the attending physician. Results A total of 172 procedures (84 UG and 88 LM) were performed by the inexperienced residents during the study. The success rate was lower (72% vs. 84%; P = 0.05) and the complication rate was higher (22% vs. 10%; P = 0.04) for LM compared with UG procedures. Comparison between the last five UG procedures and the first five LM procedures performed demonstrated that the transition between the two techniques was associated with a marked decrease in the success rate (65% vs. 93%; P = 0.01) and an increase in the complication rate (33% vs.
Introduction An ultrasound-guided (UG) technique is the recommended procedure for central venous catheterization (CVC). However, ultrasound may not be available in emergency situations, and guidelines therefore also propose that physicians remain skilled in landmark (LM) placement. We conducted this prospective observational study to determine the learning curve of the LM technique in residents who had learned only the UG technique. Methods During the first 3 months of their rotation in our ICU, residents inexperienced in CVC used only the real-time UG technique. During the following 3 months, residents were allowed to place CVCs by means of the LM technique when authorized by the attending physician. Results A total of 172 procedures (84 UG and 88 LM) were performed by the inexperienced residents during the study. The success rate was lower (72% vs. 84%; P = 0.05) and the complication rate higher (22% vs. 10%; P = 0.04) for LM compared with UG procedures. Comparison between the last five UG procedures and the first five LM procedures demonstrated that the transition between the two techniques was associated with a marked decrease in the success rate (65% vs. 93%; P = 0.01) and an increase in the complication rate (33% vs. 8%; P = 0.01). After 10 LM procedures, residents achieved a success rate and a complication rate of 81% and 6%, respectively. Conclusion Residents who learn only the UG technique will not immediately be able to perform the LM technique, but require specific training based on at least 10 LM procedures. Whether the LM technique should still be taught for situations in which an ultrasound device is not available must therefore be addressed.

Introduction Bedside insertion of peripherally inserted central catheters (PICCs) results in tip malposition in up to 48% of cases [1], with catheters frequently terminating in the internal jugular vein (IJV) or in peripheral veins [2]. In an attempt to reduce tip malposition, we developed a standardized approach to PICC installation. This study aims to validate that method. Methods From a 34-bed adult ICU, we retrospectively reviewed PICC insertions over a 6-month period before the intervention program (control group). We designed a prospective interventional pilot study of 40 consecutive patients (intervention group). Patients in the intervention group were positioned in a standardized fashion and the PICC length was measured from easily identified anatomic landmarks. During PICC insertion, the patient's head was either rotated ipsilaterally to the site of PICC insertion or the ipsilateral IJV was manually compressed, depending on the patient's capacity to collaborate. Once the PICC was inserted, an ultrasound survey was conducted to identify the catheter in the subclavian vein (SC) and ensure its absence from the ipsilateral IJV. The primary endpoint was defined as PICC tip position, obtained from the post-procedural chest X-ray. A catheter was considered to be in an optimal position if the tip resided in the distal third of the superior vena cava (SVC), adequate if it resided between the subclavian vein (SC) and the distal third of the SVC, and aberrant if in any other location. Results In the retrospective control arm, 105 PICCs were reviewed for tip position. Optimal, adequate and aberrant positions were found in 22 (21%), 49 (47%) and 34 (32%) respectively, in comparison with 17 (43%), 15 (38%) and eight (20%) in the intervention group (P <0.05 between both groups). In the control arm, 11 (10%) PICCs terminated outside the central venous system, whereas none failed to achieve central venous access in the intervention arm. Conclusion Using the standardized method described above, PICC tip positioning can be greatly improved. In our results, 100% of catheters placed using the standardized method allowed for central venous access. This pilot study paves the way for a larger, multicentre evaluation of this bedside PICC installation method.

Introduction According to the European Society of Parenteral and Enteral Nutrition guidelines [1], post-insertion chest X-ray (CXR) is not necessary if the location of the tip has been verified during the procedure and if pleuro-pulmonary damage has been ruled out by other methods. The aim of this study was to assess the feasibility and safety of an echo-ECG-guided method of central venous catheter (CVC) insertion and to evaluate whether post-insertion CXR can be avoided. Methods We enrolled only patients admitted to our ICU who were candidates for elective CVC insertion and had a detectable P wave on surface ECG.
Our insertion protocol included: preliminary ultrasound (US) scan of the central veins and pleural space; US-guided puncture and US control of the correct direction of the guidewire; the intracavitary ECG method for tip location (cavo-atrial junction (CAJ) = maximal P wave); and a US scan of the pleural space to rule out pneumothorax (PNX). Post-insertion CXR was performed in all patients to rule out PNX and to verify tip location close to the CAJ (CAJ = 3 cm below the carina [2]). Tip location between 1 and 5 cm below the carina, in the lower 1/3 of the superior vena cava (SVC) or the upper 1/3 of the right atrium (RA), was considered acceptable [1].

Introduction Chest auscultation and chest X-ray are commonly used to detect postoperative abnormalities and complications in patients admitted to intensive care after cardiac surgery [1,2]. The aim of the study was to evaluate whether chest ultrasound represents an effective alternative to bedside chest X-ray for identifying early postoperative abnormalities. Methods A total of 151 consecutive patients (103 male and 47 female) were studied by chest auscultation, ultrasound and X-ray upon admission to intensive care after cardiac surgery. Six pathologic entities were explored by each method: postero-lateral pleural effusion and/or alveolar consolidation (PLAPS), alveolar-interstitial syndrome (AIS), alveolar consolidation (AC), pneumothorax (PTX), pleural effusion (PE), and pericardial effusion with or without cardiac tamponade. Positions of the endotracheal tube and central venous catheter were also checked. Results Ninety-four of the 151 patients included (62%) showed abnormalities on chest X-ray (AC 9%, AIS 25%, PLAPS 42%, PE 3.3%, PTX 2%). Compared with chest X-ray, chest ultrasound had a sensitivity of 86% and a specificity of 99% for AC, a sensitivity of 95% and a specificity of 100% for AIS, a sensitivity of 97% and a specificity of 98% for PLAPS, a sensitivity of 99% and a specificity of 100% for PE, and a sensitivity and specificity of 100% for PTX. Furthermore, chest ultrasound detected all pericardial effusions, while neither chest X-ray nor chest auscultation was able to identify them. Chest ultrasound identified all cases of endotracheal tube malposition (two patients) and central venous catheter malposition (two patients). There was a highly significant correlation between abnormalities detected by chest ultrasound and X-ray (κ = 0.90), but a poor correlation between chest auscultation and X-ray abnormalities (κ = 0.15). Conclusion Chest auscultation may help identify endotracheal misplacement and tension pneumothorax but may miss most major abnormalities. Chest ultrasound represents a valid alternative to chest X-ray to detect postoperative abnormalities and misplacements.

Introduction Assessment of volume status and responsiveness guides resuscitation strategy, and non-invasive techniques are desirable. Changes in carotid flow time have been proposed as a marker of volume status, but few data support their use. We sought to determine whether carotid flow time decreased in the volume-depleted state of acute blood loss, and whether volume-depleted individuals would demonstrate an increase in carotid flow time after a passive leg raise (PLR) maneuver. Methods Volunteers aged 18 to 55 presenting to the hospital's blood donor center for whole blood donation were eligible to participate. Individuals with a history of aortic or carotid artery disease, atrial fibrillation, or a contraindication to blood donation were excluded.
Prior to blood donation, an investigator performed an ultrasound of the right common carotid artery with a high-frequency linear transducer, obtaining a Doppler tracing of carotid artery flow. Measurements of peak velocity, systole time, and carotid flow time were obtained. A PLR was performed for 30 seconds, followed by repeat measurements of carotid velocity and flow time. Whole blood was then collected according to the blood donor center's protocol. Immediately after blood donation, repeat measurements of carotid flow and velocity were obtained in the supine position and after a PLR. Carotid flow times corrected for heart rate (FTc; one common correction is sketched below) were analyzed with Student's t test. The institutional review board approved the study. Results Eighty donors were screened for participation by two investigators; 68 consented and completed donation (60.3% female, mean age 31 years). Donors had a mean blood loss of 450 ml. The mean supine FTc after blood donation was 296 ms; this was significantly different from the FTc prior to donation (supine = 320 ms, PLR = 323 ms; P <0.0001). The mean FTc following blood donation and PLR was 321 ms, significantly different from the supine position after donation (P <0.0001), but not from pre-donation measurements. Conclusion Ultrasound-measured carotid flow time was significantly decreased in the setting of acute blood loss. A PLR-induced autobolus after blood loss restored FTc to pre-donation levels. Further investigation of FTc as a non-invasive predictor of volume responsiveness is warranted.

Introduction Central venous catheters play an important role in patient care; however, their use is associated with various complications, more frequently via the subclavian vein (SCV) route. A previous study showed that ultrasound-guided cannulation of the SCV in critical care patients is superior to the landmark method and should be the method of choice in these patients [1]. The aim of this study was to compare short-axis and long-axis approaches for ultrasound-guided subclavian vein cannulation with respect to indicators of success. Methods Eighty-three patients undergoing cardiac surgery and requiring central venous cannulation were randomized to receive long-axis or short-axis ultrasound-guided cannulation of the subclavian vein by a skilled anesthesiologist. First-pass success, unsuccessful placement, number of attempts, number of needle passes, skin and vessel punctures, time to successful catheterization and complications were considered as outcomes. Results The subclavian vein was successfully cannulated by ultrasound-guided techniques in all patients. Central venous cannulation failed in two cases with the short-axis view and in 10 cases with the long-axis view; in these, the alternative view was used successfully. The first-pass success rate was significantly higher in the short-axis group (73%) than in the long-axis group (40%) (P = 0.005). The procedure time, number of attempts, needle redirections, and skin and vessel punctures were significantly lower in the short-axis than the long-axis group (P <0.05). The overall number of complications did not differ significantly between groups, although artery puncture and hematoma occurred more frequently in the long-axis group. Moreover, the need to change the ultrasound-guided insertion technique was more frequent in the long-axis group. Conclusion Ultrasound-guided subclavian vein cannulation by an experienced operator has a higher first-pass success rate and a lower access time using the short-axis than the long-axis approach.
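The carotid flow time abstract above corrects flow time for heart rate (FTc) without stating which correction was used; the sketch below therefore assumes a Bazett-style normalisation (flow time divided by the square root of the cycle length in seconds), one common choice in this literature, with Wodey's linear correction as a known alternative.

```python
import math

def ftc_bazett(flow_time_ms: float, heart_rate_bpm: float) -> float:
    """Heart-rate-corrected flow time (ms), Bazett-style: FTc = FT / sqrt(RR),
    with the RR interval expressed in seconds. This is an assumed correction;
    the study may have used a different formula (e.g. Wodey's linear one)."""
    rr_s = 60.0 / heart_rate_bpm  # cycle length in seconds
    return flow_time_ms / math.sqrt(rr_s)

# Example: a measured flow time of 300 ms at 75 bpm (RR = 0.8 s)
print(round(ftc_bazett(300.0, 75.0), 1))  # 335.4 ms
```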
(Figure 1). The Bland-Altman plot showed 95% limits of agreement from -8.96% to +8.83% and a mean difference (bias) of -0.07% (Figure 2). Conclusion We may use %ΔSV measured by TTE after PLR to predict fluid responsiveness (FR); this approach is noninvasive and less time-consuming than invasive techniques.

Introduction Echocardiography is commonly used during both venoarterial (V-A) and venovenous (V-V) extracorporeal membrane oxygenation (ECMO). In many circumstances, transoesophageal echocardiography (TOE) is the preferred monitoring tool. It can aid cannula positioning (especially during double-lumen cannula placement for V-V ECMO) and weaning of V-A ECMO, and can diagnose causes of high access pressures and circuit flow problems. We use TOE as our preferred monitoring equipment before, during and after establishing ECMO. We sought to investigate how often information gained from TOE imaging had a major impact on management decisions. Methods A single-centre observational study at a tertiary referral institution. All patients supported with V-A or V-V ECMO during an 18-month period were included. Routine procedures, such as wire position checks during cannulation or information gained to assist weaning from V-A support, were not included. Results Twenty patients were supported with either V-A (all peripheral) or V-V ECMO during the observation period. In 12 patients (60%) TOE was instrumental in diagnosing potentially fatal complications or altered clinical management. In three patients on V-A support, afterload reduction and modulation of inotropic support were necessary due to extensive spontaneous echo contrast formation in the left ventricle and stagnant pulmonary blood flow; two of these three patients, immediately after establishing support, required intra-aortic balloon counterpulsation to reverse clot formation around the aortic valve and root.

Introduction Right-sided precordial leads (V3R to V5R) and posterior chest leads (V7 to V9) provide important information on the right ventricle and posterior wall. These additional-lead electrocardiograms (ECGs) improve diagnostic value in acute coronary syndrome patients [1]. However, these additional electrocardiograms are not routinely recorded because of the time-consuming procedure involved. Recently, synthesized versions of these six additional leads, derived from the standard 12-lead ECG system (Nihon Kohden Co. Ltd), have been developed [2,3], but their accuracy is not clear. The purpose of the present study was to evaluate the accuracy of synthesized ECGs at the ST part. Methods Standard 12-lead and actual V3R to V5R and V7 to V9 lead ECGs at Tokyo Medical University Hospital were successfully recorded and compared with synthesized ECGs at the J point, at the M point (defined as the point 1/16 of the RR interval later) and at the T wave amplitude. ECGs with complete right bundle branch block, complete left bundle branch block or paced rhythm were excluded. Results A total of 1,216 ECGs were correctly recorded. The differences between actual and synthesized ECGs at the J point, M point and T wave amplitude were very small. Means of the difference ± 2SD for V3R/V4R/V5R/V7/V8/V9 were: J point, 17 ± 1/14 ± 1/13 ± 1/12 ± 1/15 ± 1/18 ± 1 μV; M point, 15 ± 1/13 ± 1/12 ± 1/12 ± 1/12 ± 1/13 ± 1 μV; and T wave amplitude, 20 ± 3/32 ± 2/16 ± 2/37 ± 2/39 ± 2/43 ± 3 μV. There were positive correlations between all actual and synthesized ECG values for the J point, M point and T wave amplitude (Figure 1). Conclusion The ST part of synthesized V3R to V5R and V7 to V9 lead ECGs appears to be highly reliable. Synthesized additional-lead ECGs might be useful to diagnose ischemic heart disease.
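Several abstracts in this section, including the %ΔSV fragment above, summarise agreement as a Bland-Altman bias with 95% limits of agreement. A minimal sketch of that computation follows; the paired data are purely illustrative and come from no study.

```python
import statistics

def bland_altman(method_a: list, method_b: list):
    """Return (bias, lower LoA, upper LoA) for two paired measurement series.

    Bias is the mean of the pairwise differences; the 95% limits of
    agreement are bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative data only (paired cardiac output readings, l/minute):
a = [4.1, 5.0, 3.8, 6.2, 4.9]
b = [4.3, 4.8, 4.0, 5.9, 5.2]
bias, lo, hi = bland_altman(a, b)
print(f"bias={bias:+.2f}, LoA=({lo:+.2f}, {hi:+.2f})")
```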
Introduction Acute and chronic systemic inflammatory conditions are associated with aortic stiffening. Carotid-femoral pulse wave velocity (PWV), a marker of aortic stiffness, increases in patients with inflammatory diseases and correlates independently with levels of C-reactive protein (CRP). The effects of the massive inflammatory response of early sepsis on the mechanical properties of the aorta have not been investigated. The objective of the current study was to prospectively assess aortic stiffness in patients with early severe sepsis and septic shock and to relate it to inflammatory and haemodynamic variables and outcome. Methods We recruited patients meeting criteria for severe sepsis and septic shock within 24 hours of admission to the ICU. After haemodynamic stabilisation, PWV was recorded at inclusion and after 48 hours using dual-channel plethysmography. Severity of illness was assessed with APACHE II and serial SOFA scores; haemodynamic and inflammatory parameters (CRP, procalcitonin and fibrinogen) were recorded. A 28-day follow-up was performed to distinguish between survivors and nonsurvivors. Results Twenty consecutive general ICU patients (six with severe sepsis and 14 with septic shock) were enrolled in the study; median age 59 years (IQR 56.5 to 72), APACHE II score 17 (13 to 20.5), SOFA score 5 (IQR 4 to 9). At 28 days, six patients had died. Median initial PWV was 10.4 (IQR 6.9 to 12.1) m/second in patients with severe sepsis, and 6.8 (IQR 5.3 to 7.5) m/second in patients with septic shock (P = 0.13). After 48 hours, PWV in the severe sepsis and septic shock groups had become similar: 9.3 (IQR 7.3 to 11.1) m/second and 9.2 (IQR 7.8 to 13) m/second respectively (P = 0.96). PWV had significantly increased in survivors (7.8 to 12.3 m/second; P = 0.04) but not in nonsurvivors (6 to 7.8 m/second; P = 0.69). Higher PWV correlated with increasing systolic pressure and lower CRP levels (r = 0.73, P = 0.01). Conclusion In early sepsis, aortic stiffness is decreased in patients with greater disease severity, and in survivors it increases to median levels within 48 hours. The main factors associated with lower pulse wave velocity are lower systolic pressure and higher CRP levels. The association of high serum CRP levels with low aortic stiffness in patients with sepsis does not match data described in the literature [1].

Introduction Approximately 15 million people worldwide suffer from congestive heart failure (CHF). The most severe symptom of CHF is pulmonary congestion, an acute increase in extravascular lung fluid due to acute decompensation of heart failure. To date, no direct, reliable, simple and non-invasive method is available for the assessment of thoracic fluids. Continuous, dependable monitoring of pulmonary edema in hospital patients could improve management and reduce the duration of hospitalization. KYMA's solution is based on a matchbox-sized monitoring device, which monitors thoracic fluid content and trends. The μCor monitor transmits and receives electromagnetic signals that are propagated through tissue layers. Conduction is highly related to the amount of accumulated fluid in tissues. This trial investigated the correlation between KYMA's lung water index (LWI) and directly assessed extravascular lung water (EVLW) in a scenario of acute pulmonary edema induced in seven sheep.
Methods Acute pulmonary edema was induced by intravenous infusion of noradrenaline and dextran, with stepwise dose increases. KYMA measurements of LWI were compared with PiCCO extravascular lung water as the reference gold standard. The experiment continued until the maximum increase of EVLW and complete heart failure were reached. Results All seven sheep developed pulmonary edema, which was validated by increased LVEDP and EVLW. A linear correlation was found between invasive measurements of EVLW (PiCCO) and the non-invasive KYMA LWI. Conclusion KYMA's fluid index demonstrated excellent correlation with invasive lung water measurement (R2 = 0.96; P <0.0001) and was able to detect dynamic accumulation of EVLW in increments of 40 to 50 cm3 (Figure 1). Since the change in fluid content between a normal and a congested lung ranges between 250 and 500 cm3, the sensitivity demonstrated in this study supports its use for high-resolution, accurate thoracic fluid monitoring.

Introduction The aim of this work was to verify that the low incidence of hemodynamic interventions in hemodynamically monitored patients in our postoperative ICU is not caused by insufficient attention being paid to drops in the noncalibrated cardiac index (nCI). In the last 2 years there was a need for hemodynamic intervention in 92% of patients with perioperative cardiac output monitoring, while only 31% of these patients needed at least one hemodynamic intervention postoperatively in the ICU. Methods High-risk patients planned for elective major abdominal surgery were routinely monitored with LiDCOrapid; the monitoring continued overnight after surgery. The target nCI for postoperative monitoring, as well as the hemodynamic interventions to be made in case of a drop, were prescribed by the anesthetist in the protocol upon handover of the patient to the postoperative ICU. The hemodynamic protocol in the ICU was then nurse-driven, with compulsory recording of the patient's hemodynamic data in the documentation every 2 hours and at the time of each intervention. Data from the memory of the LiDCOrapid monitor were analyzed using LiDCOview software and compared with the patients' documentation. Results A total of 649 hours of hemodynamic records were analyzed in 40 patients chosen at random from the 121 patients monitored in the last 2 years. We found 22 drops in nCI that were noticed by the nursing staff and led to a volume challenge in accordance with the protocol. One-half of these interventions were carried out outside the 2-hour interval of protocol-required monitoring and documentation entry. We identified nine drops in nCI that were not reacted to. None of them happened within the first 4 hours postoperatively, and all of them were outside the 2-hour interval of protocol-required monitoring. In five of these episodes the pulse pressure also dropped, suggesting a degraded arterial-line waveform. Conclusion The need for hemodynamic intervention in postoperative care was quite rare in our patients. The drop in nCI was noticed in most cases and the patient was treated according to the protocol. However, isolated episodes of nCI drop were missed during the night after the operation. The results were pointed out to the nursing staff and a comparative survey will be conducted. As with any monitoring modality, (non)adherence to the protocol may become the factor limiting the benefit to the patient.
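The record review in the abstract above amounts to scanning a monitored time series for sustained drops below a prescribed nCI target. A hypothetical sketch of such a scan is given below; the function name, threshold and minimum duration are all illustrative and are not taken from the study's protocol.

```python
def find_nci_drops(samples, target, min_duration=3):
    """Return (start, end) index pairs where nCI stays below `target`
    for at least `min_duration` consecutive samples.

    `samples` is a chronologically ordered list of nCI readings; the
    threshold and duration here are illustrative, not from the study."""
    episodes, start = [], None
    for i, value in enumerate(samples):
        if value < target:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_duration:
                episodes.append((start, i - 1))
            start = None
    if start is not None and len(samples) - start >= min_duration:
        episodes.append((start, len(samples) - 1))
    return episodes

# Example: hypothetical target nCI of 2.5 l/minute/m2, readings every 15 minutes
readings = [2.8, 2.7, 2.4, 2.3, 2.2, 2.6, 2.9, 2.4, 2.8]
print(find_nci_drops(readings, target=2.5))  # [(2, 4)]
```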
Introduction The aim of this work was to evaluate perioperative hemodynamic monitoring for major abdominal surgery using the estimated continuous cardiac output (esCCO) technique available with the Nihon Kohden Vismo monitor, as compared with the LiDCOrapid hemodynamic monitoring system. Methods The esCCO technique is a novel noncalibrated, non-invasive method of cardiac output monitoring. It is based on the estimation of pulse pressure from the speed of pulse wave propagation, measured as the time delay between the ECG R-wave and the corresponding peripheral pulse wave detected by pulse oximetry; this delay is called the pulse wave transit time (PWTT) [1] (a simplified sketch of the relation follows this abstract). LiDCOrapid estimates noncalibrated cardiac output (nCO) by pulse wave contour analysis from an arterial line and is a well-established monitoring modality in our department. Patients planned for elective major abdominal surgery with an ASA grade of III and an expected operation duration over 90 minutes are routinely monitored with LiDCOrapid under a perioperative hemodynamic optimisation protocol. In accordance with the protocol, an arterial line was inserted and LiDCOrapid monitoring was started before the induction of anesthesia. At the same time, non-invasive esCCO monitoring was started. Results Ten patients were monitored, with a total of 141 esCCO-nCO measurement pairs. The correlation coefficient between esCCO and nCO was 0.65; in the Bland-Altman difference analysis, the bias was +1.2 l/minute and the 95% limits of agreement were ±2.6 l/minute. A change of cardiac output between two consecutive measurements detected by LiDCOrapid was detected by esCCO in 80% of cases. Conclusion Hemodynamic monitoring with esCCO yields cardiac output values different from those measured by LiDCOrapid. The reliability of PWTT is questionable when systemic vascular resistance changes.
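Published descriptions of the esCCO method estimate stroke volume linearly from PWTT and multiply by heart rate. The sketch below uses that simplified form; the coefficients K, alpha and beta are illustrative placeholders (the commercial algorithm derives them from patient demographics and calibration), so the numbers carry no clinical meaning.

```python
def escco_lmin(pwtt_ms: float, heart_rate_bpm: float,
               k: float = 1.0, alpha: float = -0.1, beta: float = 90.0) -> float:
    """Simplified esCCO-style estimate: SV ~ K * (alpha * PWTT + beta),
    CO = SV * HR. All coefficients here are illustrative placeholders;
    the commercial algorithm derives them from demographics/calibration.
    Note alpha < 0: a shorter transit time implies a higher stroke volume."""
    stroke_volume_ml = k * (alpha * pwtt_ms + beta)
    return stroke_volume_ml * heart_rate_bpm / 1000.0  # ml/min -> l/min

# Example: PWTT 250 ms at 70 bpm -> SV 65 ml -> CO 4.55 l/min
print(round(escco_lmin(250.0, 70.0), 2))
```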
Introduction The aim of this animal study was to determine the feasibility and accuracy of a new non-invasive system for the determination of cardiac output (CO) compared with a known device (PiCCO; Pulsion Medical). In ICUs and ORs, hemodynamic management using goal-directed therapy has been shown to improve patient outcomes [1,2]. The invasiveness of current technology may prevent clinicians from obtaining hemodynamic information. We developed a non-invasive system for the detection of hemodynamic variables utilizing a photo-acoustic sensor (PA-S) and indicator dilution. The PA-S utilizes a combination of a miniature ultrasound sensor, a laser diode and an optical detector to accomplish the measurement. Methods Five pigs weighing 25 to 29 kg were used in the study, which was approved by the local animal use and care committee. The animals were anesthetized, medically paralyzed, intubated and mechanically ventilated. Central venous and arterial catheters were placed for injection of the indicator and connection to the PiCCO system. The PA-S was placed in contact with the skin over the saphenous artery and adjusted to receive the maximum acoustic signal. The adjustment included the depth of ultrasonic penetration and the exact location on top of the blood vessel. Each indicator dilution was accomplished using three central venous injections of 15 ml normal saline at room temperature. Variation in CO was induced by removing up to 50% of the blood volume after splenectomy. The PA-S results were obtained using the empirical regression method. The resulting cardiac output readings from the PA-S were compared with the PiCCO readings. Results The correlation between the PA-S and the PiCCO system was R2 = 0.746 over a range of CO from 1 to 6 l/minute. The Bland-Altman analysis demonstrated a bias of -0.05 l/minute and a precision of 0.50 l/minute. Conclusion This animal study demonstrated the feasibility of the new PA-S for the determination of indicator dilution CO in a limited range. The system showed good correlation with the PiCCO system even in the case of vasoconstriction as a result of blood loss. Since the PA-S utilizes transpulmonary indicator dilution (the underlying Stewart-Hamilton relation is shown below), other variables such as intrathoracic blood volume, global end-diastolic volume and extravascular lung water can be calculated. Additionally, continuous CO can be obtained from the optical sensor signal utilizing arterial waveform analysis.

Introduction The aim was to evaluate the performance of arterial pressure-based cardiac output (APCO) [1] and pulse wave transit time-based cardiac output (esCCO) [2] monitors in Thai patients undergoing coronary artery bypass graft surgery (CABG) with cardiopulmonary bypass. Methods We studied 50 Thai surgical patients undergoing CABG with cardiopulmonary bypass and requiring pulmonary artery catheter and radial artery catheter placement as a standard of clinical care. All patients were measured for APCO using the Vigileo/FloTrac and for esCCO using the esCCO monitoring system. The data were compared with thermodilution cardiac output (TDCO) monitoring as the reference method, simultaneously at pre-induction, post-induction, and every 30 minutes thereafter until the completion of surgery. The bias and precision were assessed using Bland-Altman analysis. Results In total, 310 pairs of simultaneous measurements of APCO versus TDCO and 303 pairs of esCCO versus TDCO were obtained from 50 patients. Both APCO (R = 0.53, P <0.0001) and esCCO values (R = 0.56, P <0.0001) correlated with TDCO values. Changes in APCO (R = 0.63, P <0.0001) and changes in esCCO (R = 0.60, P <0.0001) both correlated with changes in TDCO. For APCO relative to TDCO, the bias, precision, and limits of agreement were 0.70, 1.63, and -2.5 to 3.9 l/minute, while those of esCCO were 1.20, 1.59, and -1.9 to 4.3 l/minute, respectively. The biases of APCO and esCCO differed significantly (P <0.001). Conclusion Despite overestimating cardiac output, APCO and esCCO calibrated with patient information showed acceptable trending compared with TDCO in Thai patients undergoing CABG with cardiopulmonary bypass. Compared with esCCO, APCO demonstrated no significant difference in precision but exhibited a lower mean bias.

Introduction Transpulmonary thermodilution is useful in critical care. Following PiCCO, VolumeView is now available. Although previous studies concluded that there was a significant correlation between VolumeView and PiCCO [1,2], all experiments and data analyses were done by Edwards Lifesciences. The aim of the present study was to compare the new VolumeView with PiCCO to clarify their compatibility and to give a neutral answer. Methods Six pigs (about 10 kg) were used, and sepsis models were made by LPS administration. All pigs were instrumented with a right (PulsioCath) and a left (VolumeView) thermistor-tipped femoral arterial catheter. The central venous catheter was inserted through the right jugular vein. CO, GEDV and EVLW were measured by the two systems. We performed measurements at 57 points.
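The indicator dilution measurements in the three abstracts above (photo-acoustic, thermodilution, and transpulmonary thermodilution) all rest on the Stewart-Hamilton relation, which for thermodilution is commonly written as follows (V_i: injectate volume; T_b, T_i: blood and injectate temperatures; K: a constant combining density and specific heat terms):

```latex
% Stewart-Hamilton equation for thermodilution cardiac output:
% CO is inversely proportional to the area under the thermodilution curve.
CO = \frac{V_i \,(T_b - T_i)\, K}{\int_0^{\infty} \Delta T_b(t)\, \mathrm{d}t}
```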
Results There was a good correlation between the two devices regarding CO and GEDV, but VolumeView showed higher GEDV than PiCCO. Regarding EVLW, there was no significant correlation between the two systems; VolumeView showed significantly higher EVLW than PiCCO (Figure 1 and Table 1).

Introduction Goal-directed therapy used in the perioperative period in patients undergoing cardiac surgery shortens the length of ICU stay [1]. We aimed to compare the postoperative results of liberal and restrictive fluid strategies used in patients undergoing pulmonary resection surgery (PRC). Methods We have been using the restrictive fluid strategy in our institute since March 2013. Patients who were on the liberal fluid regime were analyzed retrospectively; from March 2013 onwards, patients on the restrictive fluid strategy were analyzed prospectively. A total of 125 patients were included in the study. Age, duration of anesthesia, type of fluids given intraoperatively, fluid index (ml/kg/hour), fluid intake/output balance, and creatinine and lactate levels were compared with pulmonary and renal morbidity and length of ICU stay using multivariate analysis. Results A significant correlation (P <0.05) was established between pulmonary morbidity (n = 52) and the amount of crystalloid given intraoperatively, the fluid index and the fluid balance. The fluid index and inotrope usage correlated with postoperative creatinine levels (P <0.05). There was no correlation between perioperative lactate levels and either fluid balance or fluid index. Intraoperative blood loss, the amounts of crystalloid, colloid, blood and FFP given, fluid balance, duration of anesthesia and postoperative blood transfusion were found to be related (P <0.05) to the length of ICU stay. Four percent of the patients required renal replacement therapy and the overall mortality was 0.8%. Conclusion Protocols combining a restrictive perioperative fluid strategy with simultaneous protection of end organs, especially the kidneys, to reduce the morbidity of patients undergoing major surgery are currently the subject of discussion. We observed that the restrictive fluid strategy did not lead to global organ hypoperfusion, as monitored by lactate. Even though the fluid index correlated negatively with creatinine levels and renal failure, the need for renal replacement therapy was observed in only one case. In conclusion, postoperative pulmonary morbidity and length of ICU stay can be reduced in patients undergoing PRC by using a restrictive fluid strategy (4.2 ± 0.3 ml/kg/hour), without causing vital organ dysfunction.

Introduction The rationale for perioperative fluid therapy has been to preserve organ perfusion. However, a conservative postoperative fluid balance (FB) is now encouraged to avoid the effects of iatrogenic fluid overload. In addition, fluid expansion has been described as a confounder of acute kidney injury (AKI) diagnosis through dilution of serum creatinine (SCr). Methods We conducted a prospective audit of FB and renal function in patients undergoing major elective surgery admitted to critical care over a 28-day period. Fluid overload was defined as (positive FB) / (weight × 0.6) × 100% (transcribed in the sketch below). Results Thirty-two patients (56% female, median age 64) were studied over a median of 3 critical care days (range 1 to 7). Total FB was +3.9 l at discharge; however, 75% of this occurred intraoperatively, so that the positive FB in critical care was only +390 ml/day. Patients had a median 9% fluid overload at discharge. Two patients had transient AKI stage 1. Overall, SCr decreased significantly from preoperatively to discharge (P = 0.003), median 73 to 55 μmol/l, with decreases occurring both postoperatively and during the critical care stay (Figure 1).
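The audit above defines percentage fluid overload from the cumulative positive balance and estimated total body water (weight × 0.6); the formula transcribes directly:

```python
def fluid_overload_pct(positive_fb_l: float, weight_kg: float) -> float:
    """Percentage fluid overload = positive FB / (weight * 0.6) * 100,
    as defined in the audit; total body water approximated as 60% of
    body weight. Balance in litres, weight in kilograms."""
    total_body_water_l = weight_kg * 0.6
    return positive_fb_l / total_body_water_l * 100.0

# Example: +3.9 l cumulative balance in a 70 kg patient -> ~9.3%,
# close to the median 9% overload reported at discharge.
print(round(fluid_overload_pct(3.9, 70.0), 1))
```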
Conclusion We achieved near-neutral fluid balances after admission, but did not resolve intraoperative fluid accumulation. SCr fell significantly, even after there was no further net fluid accumulation. This suggests that a decrease in creatinine generation after major surgery, rather than fluid expansion, may largely account for SCr decreases in these patients.

Introduction The aims of the study were to assess the usefulness of pulse pressure variation (PPV) as a predictive marker of fluid responsiveness and to estimate the value of the central venous-arterial difference of carbon dioxide (PCO2cv-a) for predicting the outcome of critically ill septic patients. The question of whether a septic patient needs fluids or not is crucial. Although PPV is a very reliable predictor of volume responsiveness, there are many limitations to its application: cardiac arrhythmia, spontaneous breathing and low tidal volume ventilation prevent the extended use of this index. Methods This is a post-hoc analysis of data from a prospective observational study [1], which included a population of patients with severe sepsis or septic shock as the main reason for ICU admission. After an echocardiographic assessment of cardiac systolic performance, we measured PPV, central venous oxygen saturation, blood gases from arterial and central venous lines, central venous pressure, and systolic, diastolic, mean and pulse pressures, before and after a volume challenge. We calculated the change in pulse pressure before and after the volume challenge, and an increase of 9% was used as the criterion to define volume responsiveness [2]. Results Among the 72 patients (71% men, APACHE II score 23.2) included in this study, 41 (57%) were responders. Due to spontaneous breathing and cardiac arrhythmia, we were able to calculate PPV in only 18 patients (25%). Moreover, only eight (11%) calculations of PPV proved useful, because low tidal volume ventilation (4 to 6 ml/kg ideal body weight) was applied in the remaining 10 patients. We did not find any value of PCO2cv-a for predicting patient outcome (mortality 26.3%), since the difference between survivors and nonsurvivors was not statistically significant (7.7 vs. 7.9, P >0.05). Conclusion Because of its limitations, the usefulness of PPV, the marker with the best performance in the prediction of volume responsiveness, is very limited in practice.

Introduction The renal resistive index (RI) determined by Doppler ultrasonography allows a semiquantitative evaluation of the kidney vasculature at the bedside. Interpretation of the RI in clinical practice is difficult due to interaction with cardiac output, heart rate (HR) and blood pressure [1,2]. The impact of global hemodynamics on the RI remains to be evaluated. This study aimed to investigate the relationship between the RI and changes in central hemodynamics during a central hypovolemia model in healthy volunteers (HV). Methods Eleven healthy volunteers (27 ± 8 years; eight male) participated in this study. Two different models were used. The first was the head-up tilt (HUT) test: the complete maneuver consisted of a 10-minute step tilting the table from a supine position (Sup) to an angle of 70° (HUT) and back to supine (Sup').
The second model consisted of three consecutive Valsalva maneuvers. Global hemodynamics included stroke volume (SV), HR and mean arterial pressure, which were continuously and non-invasively measured with a Finometer. At least three RI readings were obtained and averaged from the right and left kidneys in all HV (the standard RI computation is sketched later in this passage). Results All HV had a significant decrease in SV from 83 ± 17 ml to 63 ± 14 ml and an increase in HR from 67 ± 10 bpm to 88 ± 13 bpm during HUT. Figure 1 shows the temporal changes of mean RI in both kidneys. A significant decrease in the RI in both kidneys was seen during HUT. After the move back to supine, the RI returned to baseline values, with a significant difference between the early measurements on the right kidney and the late measurements on the left kidney (0.67 ± 0.05 vs. 0.61 ± 0.05, P <0.05). Valsalva maneuvers significantly increased the RI in the right and left kidneys, from 0.6 ± 0.04 and 0.6 ± 0.05 to 0.7 ± 0.1 and 0.68 ± 0.15 (P <0.05), respectively. Conclusion These preliminary results showed that the Doppler renal RI was affected equally in both kidneys during HUT, suggesting an effect of hemodynamic alterations during our model of central hypovolemia.

Introduction Tissue hypoxia occurs frequently during surgery and may contribute to postoperative organ dysfunction [1]. We hypothesised that intraoperative optimisation of tissue oxygenation reduces postoperative complications, and we evaluated the feasibility of the optimisation protocol used. Methods We randomised 50 high-risk patients who underwent major abdominal surgery. Tissue oxygenation was monitored at the thenar eminence using near-infrared spectroscopy. All patients were treated according to a standard care algorithm. In addition, patients in the intervention group received dobutamine if necessary to keep tissue oxygenation ≥80%. Data were recorded continuously and complications were recorded during the hospital stay up to a maximum of 28 days. Results The number of complications tended to be lower in the intervention group (11 vs. 20). Eleven patients in the intervention group had no complication, versus seven in the control group. There was no significant difference between groups in length of stay in the ICU or in hospital. Dobutamine administration resulted in a 5% increase in tissue oxygenation. The cardiac index increased by 0.3 (0.0 to 0.6) l/minute/m2 (Figure 1). The overall protocol adherence was 94%. Conclusion Intraoperative optimisation of tissue oxygenation may potentially result in better outcomes after high-risk abdominal surgery. The protocol used may be considered feasible for clinical practice.
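For the renal Doppler abstract earlier in this passage: the RI formula is not restated there, but the standard Doppler computation is (peak systolic velocity − end-diastolic velocity) / peak systolic velocity, sketched below.

```python
def resistive_index(peak_systolic_vel: float, end_diastolic_vel: float) -> float:
    """Doppler resistive index: RI = (PSV - EDV) / PSV.
    Velocities in any consistent unit (e.g. cm/s); RI is dimensionless."""
    return (peak_systolic_vel - end_diastolic_vel) / peak_systolic_vel

# Example: PSV 40 cm/s, EDV 16 cm/s -> RI = 0.6,
# matching the baseline values reported in the study.
print(resistive_index(40.0, 16.0))  # 0.6
```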
Introduction Passive leg raising (PLR) has been suggested as a simple diagnostic tool to guide fluid administration in critically ill patients [1]. Although the basic principles of PLR have been well studied, little is known about the impact of the introduction of this technique into daily clinical practice. The aim of the study was to describe the changes in fluid balance of ICU patients before and after the introduction of PLR, and to determine the compliance of medical personnel with a PLR-driven protocol of fluid administration. Methods In this single-centre prospective study, mixed ICU patients equipped with a PiCCO system received fluid therapy on the basis of PLR test results in the first 48 hours of treatment, after careful introduction of a new PLR-driven protocol. Exclusion criteria were increased abdominal pressure, fracture of the leg or pelvis, deep vein thrombosis, head trauma and pregnancy. The control group consisted of patients admitted to the ICU 1 year prior to the introduction of the protocol. We included 21 patients in each group. Results There were no significant differences in fluid balance between the control and study groups after 24 hours (5.0 ± 2.9 l vs. 3.6 ± 2.7 l, P = 0.11) or 48 hours (5.7 ± 3.5 l vs. 4.8 ± 3.7 l, P = 0.39). However, compliance with the protocol was poor (56%). After 2/11 positive tests, fluid was not administered; and after 21/39 negative tests, fluid was administered despite the negative result (Figure 1). Conclusion The implementation of a PLR-driven protocol of fluid administration did not change mean fluid balances after the first 48 hours. Noncompliance with the protocol was an important confounder.

Introduction Identifying which patient will be a fluid responder is crucial. The aim of this prospective study was to evaluate which patients in septic shock are fluid responders using dynamic echocardiography with the distensibility index of the inferior vena cava (DIIVC), which has been shown to be predictive of an increase in CI [1]. Methods Over a period of 1 year, 42 adult patients were admitted to the ICU in septic shock. All patients were treated according to the Surviving Sepsis Campaign International Guidelines for Management of Severe Sepsis and Septic Shock 2012 [2]. All patients were mechanically ventilated and underwent transesophageal echocardiography in the bicaval view, with the DIIVC calculated in M-mode from the change in diameter over the positive pressure ventilation cycle using the formula (100 × (maximum diameter − minimum diameter)) / minimum diameter (transcribed in the sketch below). A value >18% was considered a predictor of an increase in CI >15%. The CI was measured continuously using the pressure recording analytical method on a Mostcare Vytech monitor. All patients received a bolus of 500 ml crystalloid through a central venous catheter only after reaching a central venous pressure (CVP) ≥10 mmHg during treatment. The results were expressed as mean with standard deviation and percentage. Results Sixteen patients (38%) had a DIIVC change <18% (on average 11 ± 5%) and, when subjected to the fluid challenge, did not have an increase in CI >15% (on average 8 ± 3%), proving not to be fluid responders. Twenty-six patients (62%) had a DIIVC change >18% (on average 45 ± 22%) and, when subjected to the fluid challenge, had an increase in CI >15% (on average 31 ± 8%), proving to be fluid responders.
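The DIIVC formula quoted in the abstract above transcribes directly; the >18% cut-off is the study's responder threshold:

```python
def diivc_pct(max_diameter_mm: float, min_diameter_mm: float) -> float:
    """Distensibility index of the inferior vena cava, as defined in the
    abstract: DIIVC (%) = 100 * (max diameter - min diameter) / min diameter,
    measured over the positive pressure ventilation cycle."""
    return 100.0 * (max_diameter_mm - min_diameter_mm) / min_diameter_mm

# Example: IVC varying between 17 and 21 mm over the ventilator cycle
value = diivc_pct(21.0, 17.0)
print(f"DIIVC = {value:.1f}% -> {'responder' if value > 18.0 else 'nonresponder'}")
```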
Introduction Dynamic arterial elastance (Eadyn), defined as the pulse pressure variation (PPV) to stroke volume variation (SVV) ratio, has been suggested as a predictor of the arterial pressure response to fluid administration (the ratio is transcribed in a sketch later in this passage). Rather than a steady-state assessment, Eadyn depicts the actual slope of the pressure-volume relationship, providing a dynamic evaluation of the arterial load. Thus, the higher the Eadyn value, the more likely arterial blood pressure is to improve after a fluid challenge (FC). The aim of this study was to assess the effectiveness of Eadyn measured non-invasively in preload-dependent, spontaneously breathing patients. Methods Patients admitted postoperatively and monitored with the Nexfin monitor were enrolled in the study. Patients were included if they were spontaneously breathing and had an increase in cardiac output ≥10% after a FC. They were classified according to the increase in mean arterial pressure (MAP) after FC into MAP-responders (MAP increase ≥10%) and MAP-nonresponders (MAP increase <10%). Eadyn was calculated from the PPV and SVV values obtained from the monitor. Results A total of 34 FCs from 26 patients were studied. Seventeen FCs (50%) induced a positive MAP response. Preinfusion Eadyn was significantly higher in MAP-responders (1.39 ± 0.41 vs. 0.85 ± 0.23; P = 0.0001) (Figure 1). Preinfusion Eadyn predicted a positive MAP response to FC with an area under the ROC curve of 0.92 (SE 0.04; 95% CI: 0.78 to 0.99; P <0.0001). A preinfusion Eadyn value ≥1.06 (grey zone: 0.9 to 1.15) discriminated MAP-responders with a sensitivity and specificity of 88.2% (95% CI: 64 to 99%). Conclusion Non-invasive dynamic arterial elastance, defined as the PPV to SVV ratio, predicted the arterial pressure increase after fluid administration in spontaneously breathing, preload-dependent patients.

Introduction According to Guyton and colleagues [1], expansion of blood volume can increase cardiac output (CO) in so far as the change in volume affects the mean systemic filling pressure (Pmsf). However, rapid stress relaxation occurs after acute intravascular volume expansion, so the effect of volume on Pmsf and CO may disappear after a few minutes. The objective of the present study was to describe the extent of the haemodynamic changes generated by a fluid challenge over 10 minutes in critically ill patients. Methods Patients admitted to the ICU were monitored with a calibrated LiDCOplus (LiDCO, UK) and Navigator (Applied Physiology, Australia) to estimate the Pmsf analogue (Pmsa). Then 250 ml of Hartmann's solution or Volplex was infused over 5 minutes. Data were recorded automatically from baseline to 10 minutes after the end of the fluid infusion. The time to maximum response and the percentage change between the baseline and last values of different haemodynamic parameters are also reported. Results Sixty fluid challenges were studied in 34 patients, with a mean duration of 5.3 (± 2.5) minutes. In 22 events (37%), CO increased more than 10% (responders). The change between baseline and last value was greater in nonresponders for heart efficiency (Eh) (-9.2% ± 9.7 vs. -3.1% ± 13.9, P = 0.05) but not for other haemodynamic variables. The time to maximal CO response was 2 minutes after the end of the infusion (Figure 1). Conclusion Stress relaxation damps down the effect of a fluid challenge after 10 minutes, except in terms of heart efficiency. The effect of a fluid challenge should be assessed up to 2 minutes after the end of the fluid infusion; failure to do so may mislead clinicians about the patient's fluid responsiveness.
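Eadyn, as defined in the elastance abstract above, is simply the PPV/SVV ratio; a direct transcription with the study's approximate cut-off:

```python
def eadyn(ppv_pct: float, svv_pct: float) -> float:
    """Dynamic arterial elastance: Eadyn = PPV / SVV (both in %),
    a dimensionless ratio, as defined in the abstract."""
    return ppv_pct / svv_pct

# Example: PPV 14%, SVV 11% -> Eadyn ~1.27, above the study's ~1.06 cut-off,
# suggesting MAP is likely to rise with fluid (per the abstract's findings).
print(round(eadyn(14.0, 11.0), 2))  # 1.27
```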
Introduction The latest sepsis guideline has emphasized early resuscitative fluid management [1]. Early goal-directed therapy (EGDT) has been shown to improve 28-day mortality in recent studies [2,3]. This strategy is based on improving tissue perfusion and oxygenation alongside other supportive and therapeutic measures. Technically and historically, central venous pressure (CVP) measurement is one of the most depended-upon methods for estimating fluid responsiveness and intravascular volume status during resuscitation. The Surviving Sepsis Campaign (SSC) guidelines recommended goal CVP levels of 8 to 12 mmHg in order to obtain appropriate tissue perfusion [1]. In this study, we aimed to re-evaluate the effectiveness of fluid resuscitation strategy in sepsis by comparing the effects of patients' daily fluid balance (DFB) and CVP on survival. Methods Patient records (APACHE II, length of stay (LOS), CVP, DFB, vasopressor and ventilator needs) were retrospectively collated, and a random sample of 100 patients (63 men and 37 women, age 64.2 ± 15.5 years) was statistically analyzed for survival (a bare-bones Kaplan-Meier estimator is sketched after this abstract group). Results The mean APACHE II score was 23.6 ± 7.7, LOS was 9.7 ± 10.0 days, the intubated period was 6.4 ± 8.6 days, the vasopressor period was 4.7 ± 5.5 days, CVP was 10.5 ± 5.5 mmHg, DFB was 1,147.9 ± 1,157.6 ml, and 42 patients survived. Kaplan-Meier survival and Cox regression analyses showed that CVP levels of 6 to 9 mmHg and a DFB of +800 to +900 ml, but not above, significantly predicted survival, as well as shorter LOS, fewer intubated days, lower vasopressor needs and the possibility of earlier discharge. On the other hand, excessive DFB and CVP levels correlated strongly with longer LOS and higher mortality rates, and the first 24-hour mean fluid balance alone was, surprisingly, not predictive. Conclusion Fluid resuscitation therapy is a double-edged sword. (1) As with insufficient volumes, excessive volumes also increase mortality. (2) Overall DFB seemed more important than the first 24-hour DFB.

Introduction Previous in vitro studies have shown that photometric assays may overestimate fibrinogen levels after hemodilution with HES. The in vivo effect of HES on fibrinogen assays was therefore studied. Methods Forty patients with intracranial tumor gave their consent to participate in this ethically approved study. Plasma fibrinogen levels were analyzed with ELISA, two photometric assays (Dade and Multifibren) and one mechanical assay (Hook). In addition, ROTEM FibTEM-MCF was analyzed. Results Twenty-five of the 40 patients received 1 l of HES. The mean reduction in hematocrit was 17%. ELISA values were lower than Hook and Multifibren. The FibTEM relative decrease of 43% differed significantly from the other assays.

Results To imitate early resuscitation in hypovolemic shock, medium was replaced with 40% v/v of L/R with dipyridyl plus H2O2 for 4 hours, followed by replacement with complete medium for 44 hours.

… vs. 23 ± 85 ml, P = 0.037) received in the OR, and in the mean volumes of hypotonic crystalloids administered in the first 24 hours (P = 0.002), 48 hours (P = 0.010) and 72 hours (P = 0.007) of ICU admission. Serum sodium, chloride, lactate and creatinine levels were not significantly different between the two groups. Conclusion The administration of hypotonic fluids in the first 72 hours after liver transplantation may be a risk factor for prolonged ICU admission. This effect appears independent of serum electrolyte levels or renal dysfunction.
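The DFB/CVP abstract earlier in this passage relies on Kaplan-Meier estimation. For readers unfamiliar with the computation, a bare-bones estimator over (time, event) pairs is sketched below; the data are purely illustrative and come from no study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.

    times: follow-up durations; events: 1 = death observed, 0 = censored.
    Returns [(t, S(t))] at each observed event time, where
    S(t) = product over event times t_i <= t of (1 - d_i / n_i),
    with d_i deaths among n_i subjects still at risk."""
    data = sorted(zip(times, events))
    n_at_risk, survival, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, round(survival, 3)))
        n_at_risk -= at_t
    return curve

# Illustrative only: days to death (event=1) or censoring (event=0)
print(kaplan_meier([5, 8, 8, 12, 15, 20], [1, 1, 0, 1, 0, 0]))
# [(5, 0.833), (8, 0.667), (12, 0.444)]
```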
Introduction Vasopressin is frequently used to maintain blood pressure in refractory septic shock [1]. This study tested the hypothesis that vasopressin, compared with norepinephrine, would decrease the severity of the septic state, the evolution to multiple organ dysfunction, the length of hospitalization, and mortality among patients with septic shock. Methods In this randomized, double-blind study, we assigned patients who still needed a vasopressor to restore tissue perfusion after fluid resuscitation to receive norepinephrine (0.05 to 2.0 μg/kg/minute) or vasopressin (0.01 to 0.03 U/minute) with low doses of norepinephrine. Both groups had the vasoactive drug infusions titrated and tapered to maintain a target mean blood pressure. The analysis included the total time of use and the dosage of vasopressors every 6 hours, progression to single organ dysfunction and multiple organ failure, length of ICU stay and hospitalization, and mortality at 7, 14 and 28 days after the start of infusions. Results A total of 407 patients underwent randomization, but 387 patients were included in this study (191 received vasopressin and 196 received norepinephrine only). The total time on vasopressors was 37 hours in the vasopressin group and 68 hours in the norepinephrine group (P = 0.02). Rates of single organ dysfunction and multiple organ dysfunction with vasopressin and norepinephrine were, respectively, 37.7% vs. 49.2% (P = 0.02) and 17.8% vs. 26% (P = 0.05). Length of ICU stay was 14 and 17 days (P = 0.29) and length of hospitalization was 23 and 28 days (P = 0.11), respectively. There was a significant difference between the vasopressin and norepinephrine groups in the mortality rate at 14 days (29.3% vs. 36.7%, P = 0.05) and 28 days (34% vs. 42.3%, P = 0.03), but there was no significant difference in 7-day mortality (21.2% vs. 23.9%, respectively; P = 1.1). Conclusion Early application of vasopressin reduced the time of vasopressor use, progression to multiple organ dysfunction, length of ICU stay, and mortality rates at 14 and 28 days as compared with norepinephrine only. This observed difference can be attributed to early restoration of tissue perfusion in the vasopressin group, making the state of septic shock shorter and reducing the potential for multiple organ dysfunction, which directly influenced patient survival.

Methods A 28-year-old man was admitted to the ICU following intubation and sedation for seizures due to acute alcohol withdrawal. A CT scan of the brain was normal. In the ICU the patient became haemodynamically unstable and fresh blood was aspirated from the nasogastric tube. Gastroscopy showed bleeding oesophageal varices and portal hypertensive gastropathy. The patient was treated with variceal banding, piperacillin/tazobactam (4.5 g three times daily) and terlipressin (2 mg four times daily) as per the protocol for variceal bleeding. He was successfully extubated 48 hours later. Results On admission the patient's serum sodium was 139 mmol/l. The patient received terlipressin for 4 days in the following regimen: 2 mg four times daily for 48 hours, then 1 mg four times daily for 24 hours, and then 0.5 mg four times daily for a further 24 hours before stopping. On the last day of terlipressin therapy, the patient's GCS dropped from 15 to 11. Serum sodium had fallen acutely to 116 mmol/l. The last two doses of terlipressin were cancelled and no other treatment for hyponatraemia was administered. His serum sodium and GCS returned to normal limits within 13 hours of terlipressin cessation, with no neurological consequences. Conclusion There are few case reports of terlipressin-induced hyponatraemia [1]. Hyponatraemia is considered a rare side effect brought about by the partial agonist effect of terlipressin on renal vasopressin V2 receptors. Sola and colleagues monitored serum sodium concentrations in patients with acute variceal bleeding. They found that rapid reduction in serum sodium was common (in up to 36% of patients) and one patient developed osmotic demyelination syndrome. Hyponatraemia resolved on cessation of terlipressin [2].
Close monitoring of serum sodium levels is essential in patients on terlipressin therapy, so that rapid drops in sodium can be identified and managed to prevent associated neurological complications.

Introduction Patients with distributive shock who require high-dose vasopressors have a high mortality. Vasopressors belong to two classes, catecholamines and vasopressin analogs, and each class has limitations. Patients receiving catecholamines often develop tachyphylaxis and metabolic complications (for example, lactic acidosis). Vasopressin analogs can cause mesenteric or myocardial ischemia and oliguria. Angiotensin II (ATII) is an endogenous peptide that increases blood pressure and aldosterone production. Preclinical data suggest a role for ATII in the treatment of sepsis-associated acute kidney injury. ATII may prove useful in patients who remain hypotensive despite catecholamine and vasopressin therapy. This is the first randomized clinical trial to date to evaluate ATII for use in distributive shock. Methods Twenty patients with distributive shock and a cardiovascular Sequential Organ Failure Assessment score ≥4 were randomized to either ATII infusion (n = 10) or placebo (n = 10) plus standard of care. ATII was started at a dose of 20 ng/kg/minute and titrated according to a protocolized schedule with the goal of maintaining a mean arterial pressure (MAP) of 65 mmHg. The infusion (either ATII or placebo) was continued for 6 hours and then titrated off. The primary endpoint was the effect of ATII on the standing dose of norepinephrine required to maintain a MAP of 65 mmHg. Secondary endpoints included the effect of ATII on urine output, serum lactate and creatinine clearance, as well as 30-day mortality. Results The mean age was 62.9 ± 15.8 years. Of the patients, 75% were male, 45% were Caucasian and 40% were African American. The 30-day mortality was similar in the ATII and placebo cohorts (50% vs. 60%, P = 1.00). ATII resulted in a marked reduction in norepinephrine dosing in all patients: the mean norepinephrine dose was 20.1 ± 16.8 μg/minute in the placebo cohort vs. 7.3 ± 11.9 μg/minute in the ATII cohort (P = 0.022). The most common adverse event was hypertension, which occurred in 20% of patients receiving ATII. Conclusion ATII is an effective vasopressor agent in patients with distributive shock requiring multiple vasopressors. The initial dose range of ATII that appears to be appropriate for patients with distributive shock is 2 to 10 ng/kg/minute. Further studies assessing the use of ATII in patients with distributive shock are warranted.

Introduction Patients with septic shock die mainly due to refractory shock. Vasopressin is commonly used as an adjunct to catecholamines to support blood pressure in refractory septic shock, but its effect on mortality is unknown. We hypothesized that vasopressin, as compared with norepinephrine, would decrease mortality among cancer patients with septic shock. Methods In this randomized, double-blind trial, we assigned patients who had cancer and septic shock and needed a vasopressor to receive norepinephrine or vasopressin in addition to open-label vasopressors. All vasopressor infusions were titrated and tapered according to protocols to maintain a target blood pressure. The primary endpoint was the mortality rate 28 days after the start of infusions.
Results A total of 107 patients underwent randomization in this first part of the trial, were infused with the study drug (53 patients received vasopressin and 54 norepinephrine), and were included in the analysis. There was no significant difference between the vasopressin and norepinephrine groups in the 28-day mortality rate (67.9% and 58.5%, respectively; P = 0.31). There were no significant differences in the overall rates of serious adverse events (5.3% and 5.5%, respectively; P = 1.00). Conclusion Vasopressin did not reduce mortality rates as compared with norepinephrine among patients with cancer and septic shock who were treated with catecholamine vasopressors.

Introduction A clinical study suggests that beta-blockers, by decreasing the heart rate (HR) while increasing stroke volume (SV), do not negatively affect cardiac output (CO), allowing an economization of cardiac work and oxygen consumption in patients with septic shock [1]. Whether this hemodynamic profile leads to an amelioration of myocardial performance is still unclear. The objective of the present study was therefore to elucidate whether a reduction in HR with esmolol is associated with an improvement in cardiac efficiency in patients with septic shock who remain tachycardic after hemodynamic optimization. Methods After 24 to 36 hours of initial hemodynamic stabilization, 24 septic shock patients with HR >95 bpm and requiring norepinephrine (NE) to maintain mean arterial pressure (MAP) between 65 and 75 mmHg despite adequate volume resuscitation received a continuous esmolol infusion to maintain the HR between 94 and 80 bpm. NE was titrated to achieve a MAP between 65 and 75 mmHg. To investigate myocardial performance, we simultaneously assessed LV ejection fraction (LVEF) and tricuspid annular plane systolic excursion (TAPSE) by echocardiography, and dP/dtMAX and the cardiac cycle efficiency (CCE), both estimated from the arterial pressure waveform. Data were obtained at baseline and after achieving the predefined HR threshold (T1). Results For a MAP between 65 and 75 mmHg, esmolol administration significantly decreased HR (115 ± 10 to 91 ± 7 bpm), NE dose (0.7 ± 0.4 to 0.5 ± 0.3 μg/kg/minute), and dP/dtMAX (1.1 ± 0.3 to 0.8 ± 0.3 mmHg/ms). Conversely, TAPSE (15 ± 3 to 20 ± 3 mm), CCE (-0.2 ± 0.4 to -0.03 ± 0.4 units) and SV (37 ± 8 to 42 ± 10 ml) significantly increased by the end of the study period (all P <0.05). CO (4.1 ± 0.8 to 3.9 ± 0.8 l/minute) and LVEF (46 ± 10 to 48 ± 10%) did not change. Conclusion In patients with established septic shock who remained tachycardic after hemodynamic optimization, titration of esmolol to reduce the HR to a predefined threshold economized cardiac function, resulting in a maintained CO with a lower HR and a higher stroke volume. This hemodynamic profile was characterized by improved cardiac efficiency, as indicated by the decrease in dP/dtMAX associated with an increase in CCE. Finally, the echocardiographic data suggest that reducing HR with esmolol positively affects right ventricular function.

Introduction In the intensive care setting, most physiologic parameters are monitored automatically. However, urine output (UO) is still monitored hourly by manually handled urinometers. This study evaluated an automatic urinometer (AU) and compared it with a manual urinometer (MU). Methods This was a prospective study in the ICU of a cardio-thoracic surgical clinic.
In postoperative patients (n = 36) with indwelling urinary catheters and an expected stay of 24 hours or more, hourly UO samples were measured with an AU (n = 220; Sippi®, Observe Medical, Gothenburg, Sweden) or a MU (n = 188; UnoMeter™ 500, Unomedical a/s, Birkeroed, Denmark) and thereafter validated by cylinder measurements. Readings taken with a malpositioned instrument were excluded. Data were analyzed with the Bland-Altman method. The performance of the MU was used as the minimum criterion of acceptance when the AU was evaluated. The loss of precision with the MU due to temporal deviation from fixed hourly measurements was recorded (n = 108). A questionnaire, filled out by the ward staff (n = 28), evaluated the ease of use of the AU compared with the MU. Results Bland-Altman analysis showed a smaller mean bias for the AU, +1.9 ml, compared with the MU, +5.3 ml (P <0.001). There was no statistical difference in measurement precision between the two urinometers, defined by their limits of agreement (±15.2 ml vs. ±16.6 ml, P = 0.11). The mean temporal variation with the MU was ±7.4 minutes (±12.4%), with limits of agreement of ±23.9 minutes (±39.8%), compared with no temporal variation with the AU (P <0.001). A total of 86% of the ward staff considered the AU superior to the MU (P <0.001). Conclusion The AU had a significantly lower bias than the MU, and the loss of precision of hourly UO due to temporal deviations with the MU was avoided with the AU. The AU was also rated more highly by the ward staff, reflecting a perception of higher reliability, easier use, less contact with urine bags and less time spent on measurements. These features may also indicate a favorable clinical impact on the normal ward, where staffing does not allow hourly measurements with a MU.
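The Bland-Altman analysis used above summarizes each device's error as a mean bias and limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch of that calculation, with illustrative paired volumes rather than study data:

```python
import numpy as np

def bland_altman(device_ml, reference_ml):
    """Mean bias and 95% limits of agreement (Bland-Altman method)."""
    device_ml = np.asarray(device_ml, dtype=float)
    reference_ml = np.asarray(reference_ml, dtype=float)
    diff = device_ml - reference_ml           # per-sample error vs. the cylinder
    bias = diff.mean()                        # e.g. +1.9 ml for the AU above
    half_width = 1.96 * diff.std(ddof=1)      # half-width of the limits of agreement
    return bias, (bias - half_width, bias + half_width)

# Illustrative volumes in ml (not study data): device reading vs. cylinder
bias, (low, high) = bland_altman([52, 48, 61, 55], [50, 47, 58, 54])
print(f"bias {bias:+.1f} ml, limits of agreement {low:.1f} to {high:.1f} ml")
```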
Introduction A fall in mean arterial blood pressure (MAP) during anaesthetic induction is common and may result in profound hypotension. Using the LiDCORapid (LiDCO Ltd, UK), we previously demonstrated that the fall in MAP is usually driven by a reduction in stroke volume (SV), presumed due to venodilation. Low-dose phenylephrine (PE), a venoconstrictor, commenced immediately prior to induction ameliorated these effects to a significant degree [1]. However, it is important that the pre-induction administration of PE should not itself cause marked changes in MAP and CO. We retrospectively studied a group of 40 high-risk patients about to undergo major peripheral vascular surgery in whom PE was commenced immediately pre-induction. The haemodynamic effects of this dose of PE on MAP, SV and CO were analysed. Methods A radial artery line was inserted prior to induction of anaesthesia, and baseline MAP, CO and SV were obtained using the LiDCORapid. The PE infusion was commenced at a rate of 1 to 2 mg/hour. Remifentanil was then administered using a target-controlled infusion pump until a 2 ng/ml predicted effect-site concentration (Ceff) was reached. Propofol was then administered to induce anaesthesia. Haemodynamic changes from the pre-PE baseline to commencement of propofol were recorded at 5-second intervals. The changes in MAP, SV and CO were assessed as the percent change from the values obtained immediately prior to PE and followed for at least 5 minutes until commencement of propofol. Results Patient demographics, mean (range): age 71 (46 to 89) years, 31 male, weight 80 (55 to 115) kg, ASA 3 (2 to 4). The changes in MAP and CO were not clinically or statistically significant. MAP increased by an average of only 2% (range -25 to +12); CO changed by 0% (range -23 to +16) (P = 0.35 and P = 0.36, respectively). Conclusion In a previous study [1], PE administered prior to induction markedly reduced the haemodynamic changes following induction of anaesthesia with remifentanil and propofol in high-risk patients. This effect was achieved without significant changes in MAP and CO in the 5- to 10-minute period between commencement of PE and induction of anaesthesia.

Introduction Acetaminophen (APAP) is commonly administered in the surgical ICU (SICU) for its analgesic and antipyretic effects. Case reports have described the potential for APAP to cause allergic reactions with and without hypotension. Furthermore, there have been case reports of APAP causing isolated hypotension in the absence of other allergic responses [1]. This has been well described following intravenous administration [2]. However, other routes of administration causing hypotension, and the associated diagnoses, remain to be elucidated. The present case series describes 11 patients with APAP-induced hypotension in the SICU of a Level I trauma center. Methods Patients admitted to the SICU over a 7-month time frame who were reported by the nursing staff to have experienced hypotension following the oral or rectal administration of APAP were included. Their electronic medical records were retrospectively reviewed to describe the change in systolic blood pressure (SBP) and mean arterial pressure (MAP) within the first hour following all administrations of APAP. Additional data collected consisted of patient age, sex, admission diagnoses and formulation/route of APAP administration. Conclusion It appears from this small case series that patients with ICH may experience more isolated hypotension following administration of APAP when compared with others. These changes seemed greater than previously reported for intravenous administration. It may also be that APAP solution exerts hypotensive effects more commonly than tablet or rectal formulations.

Introduction Profound myocardial depression can occur after cardiac surgery. The use of ventricular assist support through venoarterial extracorporeal membrane oxygenation (ECMO) has been reported positively. This study focuses on the outcomes of patients who, on suffering hemodynamic failure after cardiac surgery, were supported with ECMO during their stay in the surgical ICU of Incor FMUSP. Methods This was a retrospective, single-center, observational study. The records of 48 patients who underwent cardiac surgery and subsequently needed percutaneous or surgical implantation of ventricular assist devices were evaluated. The evaluation considered the following criteria, collected on data collection forms: baseline characteristics, indications for ventricular assistance, duration of support, length of ICU and hospital stay, and hospital mortality. Results Of the 48 patients included in the study, 26 (54%) were male and 31 (64%) were younger than 18 years. These patients developed cardiogenic shock within 72 hours after cardiac surgery. In all cases, ECMO was inserted after cardiac surgery: 32 (66%) were central ECMO, inserted in the operating room, and 16 were percutaneous, inserted in the ICU.
The median duration of ventricular assistance was 6 days (IQR 0 to 41), the length of ICU stay was 16 days (IQR 1 to 111), and hospital stay was 29 days (IQR 1 to 198). Twenty patients (41%) survived and were discharged from our hospital. Conclusion The use of mechanical circulatory assist devices is an efficient tool to manage seriously ill patients after cardiac surgery. This tool should be considered early in the diagnosis of cardiogenic shock after cardiac surgery.

Introduction Mortality from cardiogenic shock remains high [1] and, despite a physiological rationale, intra-aortic balloon counterpulsation (IABP) has recently been shown to be ineffective in reducing mortality [2,3]. Veno-arterial extracorporeal membrane oxygenation (V-A ECMO) may offer a survival advantage over IABP. The objective of this study was to describe the characteristics and outcomes of patients supported with IABP or Impella and to identify the characteristics of patients who die despite mechanical assistance, for whom a proposed V-A ECMO programme may be beneficial. Methods A retrospective cohort study in a 30-bed, medical-surgical ICU. All adult patients supported with IABP or Impella over the 2 years to March 2013 were identified and data were extracted by case-note review. Subgroup analyses were carried out for patients aged ≤65 and for those who fulfilled the modified Melbourne criteria for V-A ECMO [4]. Data collected included demographics, physiology and organ support at baseline and at 6, 12, and 24 hours, ICU and hospital outcomes, and cause of death. Comparisons between survivors and nonsurvivors were made with t tests/chi-squared tests as appropriate. Results A total of 129 patients were identified: 78% were male, mean age was 70 years (SD ±11.8), mean APACHE II score was 20 (±5) and ICU mortality was 44%. Comparing survivors with nonsurvivors, the only statistically significant difference was metabolic acidosis (-6.8 ± 5.3 vs. -10.9 ± 7.0 mEq/l; P <0.05). Heart rate, mean arterial pressure, lactate, central venous oxygen saturation, cardiac index, arterial blood pH and mechanical ventilation failed to show a significant difference. Eleven of these patients would have fulfilled the proposed criteria for V-A ECMO, with an ICU mortality of 36%. Conclusion Only metabolic acidosis was associated with mortality in patients supported with mechanical assist devices. Our data do not allow discrimination of survivors from nonsurvivors. Patients who fulfilled the proposed criteria for V-A ECMO showed a mortality similar to a recent series treated with V-A ECMO [4]. The proposed criteria do not identify a cohort, in this population, that would be expected to derive a mortality benefit from V-A ECMO.

Introduction The normobaric oxygen paradox (NOP) is a recent concept that postulates the use of intermittent hyperoxia to stimulate erythropoietin (EPO) production [1]. Hyperoxia increases oxygen free radicals and may lead to endothelial damage and vasoconstriction [2]. We evaluated the microvascular response to transient hyperoxia and its effects on EPO production. Methods Six hemodynamically stable patients, mechanically ventilated with FiO2 <50%, were included in this prospective observational study. Patients underwent a 2-hour period of hyperoxia (FiO2 100%). The sublingual microcirculation was evaluated with sidestream dark-field (SDF) imaging at baseline (t0), 2 hours after the start of hyperoxia (t1), and 2 hours after return to the basal FiO2 (t2).
SDF monitoring was also performed continuously during the FiO2 transitions for 2 minutes. EPO levels were assayed at baseline and for 2 days. Results An early vasoconstriction and a trend towards total vessel density (TVD) reduction were observed at t1 (Figure 1). The TVD tended to increase without returning to baseline levels at t2. EPO increased in four of six patients (P = NS). A negative correlation was found between the change in TVD after hyperoxia (t1 - t0) and the change in EPO (r = -0.88, P = 0.03). Conclusion Hyperoxia leads to vasoconstriction that appears reversible on cessation of hyperoxia. Further data are needed to verify the efficacy of the NOP in stimulating erythropoiesis in the critically ill. There might be a relation between the hyperoxia-induced reduction in vessel density and the EPO increase.

Introduction It is unknown whether lactate monitoring aimed at decreasing levels during the first hours improves outcome in patients undergoing cardiac surgery. The aim of this study was to evaluate the effect of lactate monitoring and resuscitation directed at decreasing lactate levels during the first 8 hours in patients admitted to the ICU with a lactate level ≥3.0 mEq/l. Methods Patients were randomly allocated to two groups. In the lactate group, treatment was guided by lactate levels with the objective of decreasing lactate by 20% or more per 2 hours for the initial 8 hours of ICU stay. In the control group, the treatment team had no knowledge of lactate levels (except for the admission value) during this period. The primary outcome measure was the incidence of complications within 28 days. Results The lactate group received more fluids and dobutamine. However, there were no significant differences in lactate levels between the groups. The rate of complications was similar between groups (11% vs. 7%, P = 0.087). Length of ICU stay was longer in the lactate group than in the control group (3.5 vs. 2.4 days, P = 0.047). Conclusion In patients with hyperlactatemia on ICU admission, lactate-guided therapy did not reduce complications and was related to a longer ICU length of stay. This study suggests that goal-directed therapy aiming to decrease initial lactate levels does not result in clinical benefit.

Introduction Blood lactate clearance, a surrogate of tissue hypoxia, is associated with increased mortality in septic patients. However, no study has directly measured lactate clearance at the tissue level in the post-resuscitation period of sepsis. This study aimed to examine the relative kinetics of blood and tissue lactate clearances and to investigate whether these are associated with outcome in ICU patients with severe sepsis or septic shock during the post-resuscitation phase. Methods A microdialysis catheter was inserted in the subcutaneous adipose tissue of the upper thigh and interstitial fluid samples were collected. Serial measurements of blood and interstitial fluid lactate levels were performed over a 48-hour period. Lactate clearance was calculated as: ((lactate(initial) - lactate(delayed)) / lactate(initial)) × 100%. Lactate(initial) is blood or tissue lactate within the first 24 hours after ICU admission (H0); lactate(delayed) is blood or tissue lactate at H4, H8, H12, H16, H20, H24 and H48 (H = hours).
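The clearance formula is simply the percent fall from the initial value, applied separately to blood and tissue lactate at each delayed time point. A minimal sketch of the calculation, with illustrative values (the function name and numbers are ours, not study data):

```python
def lactate_clearance(initial, delayed):
    """Percent clearance: ((initial - delayed) / initial) x 100.
    Positive values mean the lactate level fell; negative, that it rose."""
    return (initial - delayed) / initial * 100.0

h0 = 4.2  # lactate at H0, mmol/l (illustrative)
delayed_levels = {4: 3.6, 8: 3.1, 12: 2.4, 24: 1.9}  # H4..H24, mmol/l
for hour, level in delayed_levels.items():
    print(f"H0 to H{hour}: {lactate_clearance(h0, level):.0f}%")
```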
Results A total of 112 patients with septic shock (n = 79) or severe sepsis (n = 33) were examined. Tissue lactate clearance was higher than blood lactate clearance at H0 to H8 (P = 0.02), H0 to H12 (P = 0.008), H0 to H16 (P = 0.01), H0 to H20 (P = 0.01), and H0 to H24 (P = 0.02). Tissue lactate clearance was higher in survivors than in nonsurvivors at H0 to H12, H0 to H20 and H0 to H24 (P = 0.02 for all). Multivariate analysis showed that APACHE II, along with tissue clearances <30% at H0 to H12, H0 to H20 and H0 to H24, were independent outcome predictors. Blood lactate clearance was not related to survival. Conclusion In critically ill septic patients, after the initial resuscitation phase, adipose tissue clears lactate earlier than blood. High tissue lactate clearance, but not blood lactate clearance, is associated with a favorable clinical outcome.

Introduction The aim of this study was to determine the correlation between levels of serum lactate (SL) and conventional hemodynamic parameters (HPs) (mean arterial pressure (MAP), heart rate (HR), central venous pressure (CVP), urinary output (UP)) in patients with severe sepsis. In a subgroup, advanced HPs (central venous saturation (SvO2), peripheral temperature (PT), cardiac index (CI), global end-diastolic volume index (GEDI) and extravascular lung water (ELWI)) were compared with levels of SL. Methods An observational, prospective, single-center pilot study was performed in the intensive care (IC) unit of a medium-sized teaching hospital. Adult patients with severe sepsis were included and received standard goal-directed therapy (Surviving Sepsis Guidelines). Every patient received an arterial line and a central venous line in the upper diaphragm position. A subgroup received pulse contour cardiac output (PiCCO)-guided resuscitation and PT measurements. Pearson correlation coefficients (PCCs) were calculated between HPs and SL, which were measured every 4 hours for the first 48 hours after inclusion. P <0.05 was considered statistically significant. Results Twenty-five patients (12 men) were included. Mean age was 68 years (30 to 93), mean APACHE II score 31 (20 to 42). The most frequent reasons for IC admission were abdominal sepsis (n = 11) and pneumosepsis (n = 7). Mean HPs (with SD and range) were, respectively: … Figure 1 shows the relation between the two HPs with the highest correlation with SL. Conclusion The conventional HPs MAP, HR, UP and SvO2 are significantly correlated with levels of SL, but their clinical value might be limited by the relatively low correlation coefficients. In a small subgroup, PT correlated better with the level of SL.
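The Pearson coefficients reported here pair each 4-hourly lactate value with the simultaneous hemodynamic reading. A minimal sketch using scipy, with illustrative series (not patient data):

```python
from scipy.stats import pearsonr

# Simultaneous 4-hourly observations (illustrative values, not study data)
serum_lactate = [4.1, 3.8, 3.5, 2.9, 2.6, 2.2]            # mmol/l
peripheral_temp = [31.2, 31.8, 32.5, 33.4, 34.0, 34.6]    # degrees Celsius

r, p = pearsonr(serum_lactate, peripheral_temp)
print(f"PCC = {r:.2f}, P = {p:.4f}")  # P <0.05 considered significant
```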
Delayed assessment of serum lactate in sepsis is associated with an increased mortality rate. Introduction Lactate assessment early in the resuscitation of sepsis has been recommended as a diagnostic biomarker. An abnormal lactate, independent of blood pressure, is an indication for aggressive fluid resuscitation, and its normalization is a recommended endpoint of resuscitation. The objective of this study was to evaluate the effect of the timing of lactate assessment on patient outcomes in sepsis. Methods Data were compiled using the Clinical Vigilance for Sepsis electronic health record (EHR) screening tool, which identified consecutive patients from two hospital systems: over 12 months at a 300-bed community hospital and over 24 months at a 500-bed academic tertiary care center. The CV Sepsis alert screens the EHR to identify the presence of infection based on a multifactor alert system including laboratory values, vital signs, and treatment team documentation. A physician order for intravenous antibiotics was used as a surrogate for suspected infection. The database identified 37,160 consecutive patients treated for infection from a total of 216,550. Patients with a measured lactate were divided according to whether it was measured within 3 hours (eLac) or more than 3 hours (dLac) after sepsis identification, as recommended by the Surviving Sepsis Campaign. The CV Sepsis alert was the reference standard for time zero. The groups were compared for the occurrence of the primary outcome of in-hospital mortality. Results A total of 5,072 of 37,160 consecutive patients (13%) had a measured lactate. Sepsis patients experienced an overall 3% (1,186/37,160) mortality rate. In total, 4,153 (82%) patients had lactate measured within 3 hours and 919 (18%) had delayed measurement; early measurement was associated with a lower mortality rate (eLac 6.8% vs. dLac 24.7%, P <0.0001). There was no difference in average lactate levels between the groups (eLac 2.1 ± 2.6, dLac 2.3 ± 3.0, P = NS). A larger proportion of delayed-lactate patients had a lactate ≥4 mmol/l (dLac 12.6% vs. eLac 8.7%). Conclusion Delay in lactate assessment relative to clinical evidence of infection was associated with an increased mortality rate. The average lactate level in each group did not account for this effect: the timing of the assessment, not the lactate level, was prognostic of outcome. The mortality benefit associated with lactate assessment within the 3-hour guideline suggests that increased clinical awareness may lead to early initiation of time-sensitive interventions known to improve outcomes.

Introduction The objective of this study was to determine whether lactate clearance (LC) is a significant indicator of mortality in patients with colorectal perforation. LC has been associated with mortality in heterogeneous critically ill patients, but its role as a predictor of mortality in the homogeneous population of patients with colorectal perforation is unclear. Methods We retrospectively analyzed the clinical data of patients who underwent emergency surgery for colorectal perforation and were admitted to the ICU of our hospital from January 2003 to August 2013. Patients with traumatic, iatrogenic, and appendicitis perforations were excluded. The primary endpoint was survival to hospital discharge. The modified Sequential Organ Failure Assessment (mSOFA) score, a customized SOFA score excluding the central nervous system component [1], was used for prognostic scoring. The mSOFA score and several clinical factors were analyzed by univariate analysis as possible predictors of survival. We collected lactate levels and base excess (BE) measured during surgery and at 6, 12, and 24 hours after the first measurement and calculated the respective LC values. The associations of initial blood lactate level, LC, and BE with mortality were assessed by receiver operating characteristic (ROC) curve and logistic regression analyses.
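The ROC analysis described reduces to ranking patients by a candidate predictor (initial lactate, LC or BE) against hospital survival. A minimal sketch with scikit-learn, using invented outcomes and clearance values purely for illustration:

```python
from sklearn.metrics import roc_auc_score, roc_curve

died = [0, 0, 1, 0, 1, 1, 0, 1]            # 1 = died before discharge (illustrative)
lc_24h = [45, 60, 5, 38, -10, 12, 52, 8]   # 24-hour lactate clearance, % (illustrative)

# Low clearance should predict death, so score on the negated value
scores = [-x for x in lc_24h]
auc = roc_auc_score(died, scores)
fpr, tpr, thresholds = roc_curve(died, scores)
print(f"AUC = {auc:.2f}")  # thresholds can then be screened for the best cutoff
```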
Introduction Measurement of arterial lactate (A-LACT) levels has been used to monitor poor tissue perfusion, predict mortality and guide resuscitation. Peripheral venous lactate (V-LACT) has been regarded as an unreliable test, but it is a less invasive approach. We aimed to determine the correlation and agreement between A-LACT and V-LACT in order to establish the usefulness of V-LACT as a biomarker for assessment in sepsis. Results The arterial and venous lactate levels were strongly correlated in sepsis and septic shock. Conclusion Consequently, V-LACT may be substituted for A-LACT, particularly at lactate levels no higher than 4 mmol/l. However, trends rather than absolute values should generally be used.

Introduction A hyperdynamic left ventricular ejection fraction (HDLVEF) is a common finding in the ICU, thought to be associated with critical illness and possibly sepsis. The exact etiology of a hyperdynamic ejection fraction has yet to be determined, and the prognosis of these patients has not been well defined. Methods The cohort consisted of 2,632 adults admitted to the ICU with echocardiogram reports, using the MIMIC-II database, and was divided into those with HDLVEF and those with normal left ventricular ejection fraction (NLVEF). Those with impaired ejection fraction were excluded from the analysis. Baseline comparisons were performed using chi-squared tests for equal proportion, with results reported as numbers, percentages, and 95% CIs. Continuous variables were compared using t tests and reported as means with 95% CIs, while non-normally distributed data were compared using Wilcoxon rank-sum tests and reported as medians. Results Patients with HDLVEF had increased mortality in hospital, at 28 days and at 1 year when compared with patients with NLVEF. HDLVEF patients more frequently required renal replacement therapy (RRT), vasopressors and mechanical ventilation. Of the 2,632 patients, 1,220 were septic. There was an increased proportion of HDLVEF in the septic compared with the nonseptic group (11.2% vs. 8.6%, P = 0.026). Interestingly, other statistically significant associated comorbidities, more commonly seen in the HDLVEF group, were cancer, CHF, arrhythmias, and hypertension. See Table 1.

Introduction Pulmonary regurgitation (PR) that develops after right ventricular (RV) outflow reconstruction, including the Rastelli and Norwood procedures, may often result in serious cardiac events early after surgery. We hypothesized that PR may be associated with pulmonary vascular resistance (PVR) and RV contraction. Accordingly, we assessed the impact of PVR on PR and RV function using a swine model. Methods Eight pigs (14 ± 2 kg) underwent total resection of the pulmonary valve cusps under cardiopulmonary bypass (PR group). This group was compared with a control group (n = 6) that underwent bypass only. In both groups, the pulmonary regurgitant fraction (PRF) and cardiac output were measured with a pulsed Doppler flowmeter, and the percent segmental shortening of the RV (%RVSS) and RV end-diastolic dimension (RVDd) were measured by sonomicrometry. We also performed dobutamine stress evaluation as well as changing the PVR with carbon dioxide (PaCO2) and inhaled nitric oxide (NO). Results Bypass time was 18 ± 3 minutes in all animals. In the PR group, the PRF was 40 ± 4% and the RVDd was 53 ± 9 mm* (vs. control 34 ± 6 mm; *P <0.05). A significant reduction in the %RVSS (18 ± 1%* vs. control 22 ± 1%) and the cardiac output (2.1 ± 0.2 l/minute* vs. control 2.5 ± 0.3 l/minute) was observed. The PRFs were 60 ± 5% (PaCO2 >80 mmHg), 37 ± 2% (PaCO2 <20 mmHg), and 24 ± 2% (NO 20 ppm; PaCO2 40 mmHg), and were positively correlated with the PVR (Figure 1A). During dobutamine stress, the %RVSS increased (baseline 18 ± 1%, 5 γ 21 ± 2%, 10 γ 26 ± 3%) and was negatively correlated with the PRFs (Figure 1B).
Conclusion These results indicated that massive PR resulted in marked deterioration of RV performance; however, low PVR and high RV contractility may contribute to reducing the severity of PR and improving cardiac function. Nitric oxide may be a useful treatment modality.

Introduction Cardiogenic oscillation is the fluctuation in the flow tracing of mechanically ventilated patients. Large cardiogenic oscillation may cause autotriggering in adult patients after cardiac surgery [1] and inaccurate volume monitoring [2]. However, it is unknown to what extent cardiogenic oscillation is problematic in pediatric patients. We therefore prospectively surveyed cardiogenic oscillation in pediatric patients after cardiac surgery. Methods We enrolled 17 pediatric patients who underwent cardiac surgery using cardiopulmonary bypass. They were mechanically ventilated with pressure-controlled ventilation. We measured the amplitude of cardiogenic oscillation and compared values between ICU admission and before extubation. We performed statistical analysis with the t test and considered P <0.05 significant. Results Cardiogenic oscillation was 2.1 ± 0.6 l/minute just after surgery (Figure 1). Autotriggering occurred in seven of 17 patients when the triggering sensitivity was set at 1 l/minute. Before extubation, cardiogenic oscillation had significantly decreased to 1. …

Introduction Postoperative new-onset atrial fibrillation (PNAF) is very common after cardiac surgery. The inflammatory response due to surgery and cardiopulmonary bypass (CPB) may contribute to PNAF by inducing atrial dysfunction [1]. Corticosteroids reduce the inflammatory response and may thus reduce atrial dysfunction and PNAF [2]. The aim of this study was to determine whether dexamethasone protects against left atrial dysfunction and PNAF in cardiac surgical patients. Methods Patients undergoing cardiac surgery were randomized to a single dose of dexamethasone (1 mg/kg) or placebo after induction of anesthesia. Transesophageal echocardiography was performed after CPB. The primary outcome was left atrial total ejection fraction (LA-TEF) after sternal closure; secondary outcomes included left atrial diameter and PNAF, detected by Holter monitoring. Results Sixty-two patients were included. Baseline characteristics were well balanced. Postoperative LA-TEF was 36.4% in the dexamethasone group and 40.2% in the placebo group (P = 0.15) (Figure 1). Secondary echocardiographic outcomes also showed no significant differences (Table 1). The incidence of PNAF was 30% in the dexamethasone group and 39% in the placebo group (P = 0.47). Conclusion Intraoperative high-dose dexamethasone did not have any protective effect on postoperative LA-TEF or left atrial dimension and did not reduce the risk of PNAF in cardiac surgical patients.

Introduction Ranolazine, a piperazine derivative, is used as an antianginal drug in patients with chronic angina [1] and may improve coronary blood flow by reducing the compressive effects of ischemic contracture and by improving endothelial function [2,3]. In the present study we investigated the vascular effects of ranolazine on the endothelium, the adrenergic system and Ca2+ handling in isolated rat aorta. Methods Rat aortic segments (3 mm long) with and without endothelium were mounted for isometric tension recording in organ baths containing Krebs-Henseleit solution.
Electrical field stimulation (2, 4 and 8 Hz; 20 V; 0.25 ms pulse duration for 30 seconds) was delivered by a Grass S88 stimulator via two platinum electrodes positioned on either side of, and parallel to, the axis of the aortic segment. Concentration-response curves for ranolazine (10^-7 to 10^-4 M) were obtained in a cumulative manner using endothelin-1, noradrenaline, thromboxane A2 and KCl as constrictor agents. Results The contractile responses to electrical field stimulation were abolished by tetrodotoxin, guanethidine and prazosin, indicating that the contractile effect is due to the action of noradrenaline on alpha-adrenoreceptors. Ranolazine diminished (P <0.05) the neurogenic adrenergic contractions induced by electrical field stimulation in aortic rings with and without endothelium. Ranolazine produced concentration-dependent relaxation in rings precontracted with noradrenaline (Emax 86 ± 6%, n = 10; P <0.05) but not in rings precontracted with endothelin-1, thromboxane A2 or KCl. Neither L-NAME (10^-4 M), an inhibitor of nitric oxide synthase, nor indomethacin (10^-5 M), an inhibitor of cyclooxygenase, modified the relaxation induced by ranolazine. The calcium antagonist nifedipine (10^-6 M) reduced the relaxation induced by ranolazine. Conclusion These results indicate that ranolazine diminished the contractile response induced by adrenergic stimulation, suggesting an adrenergic-blocking effect. The relaxant effect of ranolazine on rat aortic vessels is not dependent on endothelium-derived factors (nitric oxide or dilator prostanoids) but involves interference with calcium entry through dihydropyridine calcium channels.

Introduction Early extubation after coronary artery bypass grafting does not increase perioperative morbidity and reduces the length of stay (LOS) in the ICU and in hospital [1]. Use of low-dose opioid-based general anaesthesia and time-directed protocols for fast-track interventions does not increase mortality or postoperative complications in low-to-moderate-risk patients and has been found to reduce time to extubation and shorten ICU stay [2]. Our mean time to extubation is 6 hours, although patients are assessed as safe to be weaned from mechanical ventilation at 2 hours after arrival in the ICU. This study aimed to identify factors that delay extubation in patients undergoing routine cardiac surgery at our institution. Methods A prospective analysis was performed on all patients after adult cardiac surgery from 14 May 2013 to 10 July 2013. Emergency surgical patients and those with intraoperative complications were excluded. Results A two-sample t test was used to analyse the data. Patient demographics are presented in Table 1. There were significant delays in time to extubation in those who received morphine prior to extubation compared with those who did not (P = 0.0184) (Table 2). There were no significant differences in LOS in the ICU or hospital. Factors such as age, EuroSCORE and type of operation did not influence time to extubation. Conclusion Administering morphine prior to extubation causes significant delays in weaning from mechanical ventilation. We plan to introduce intraoperative and postoperative protocols to facilitate rapid weaning from mechanical ventilation for elective cardiac surgical patients.

Introduction Tako-tsubo cardiomyopathy (TCM) is an acute cardiac syndrome with regional hypokinesia in the left ventricle (LV), often affecting the apex and causing apical ballooning.
TCM is frequent in patients with adrenergic overstimulation and is probably common in ICU patients [1]. In a TCM rat model we evaluated whether different anesthetic agents could attenuate LV akinesia in TCM. Methods Isoprenaline, which induces LV akinesia and apical ballooning within 90 minutes [2], was injected intraperitoneally (i.p.). We performed the study in two settings. In the first, spontaneously breathing rats (n = 12 in each group) were sedated with pentobarbital, ketamine, isoflurane or no anesthetic before i.p. isoprenaline. One additional group received the K-ATP channel blocker glyburide before sedation with isoflurane. In the second setting, rats were anaesthetized with ketamine + midazolam and mechanically ventilated, and the carotid artery was cannulated. Before i.p. isoprenaline, animals were randomized to no isoflurane (0 MAC), isoflurane 0.5 MAC or isoflurane 1.0 MAC (n = 12 in each group). Arterial blood gases were obtained before isoprenaline and 60 minutes after isoprenaline. Heart rate (HR), systolic blood pressure (SBP) and body temperature (BT) were recorded continuously. After 90 minutes, echocardiography was performed. The extent of akinesia was expressed as the percentage of total LV endocardial length. End-diastolic and end-systolic LV volumes were measured, and stroke volume (SV) and cardiac output (CO) were calculated. Results In spontaneously breathing rats, the degree of akinesia was significantly lower with pentobarbital and isoflurane (± glyburide), but not with ketamine, compared with controls. The degree of akinesia was lowest with isoflurane. In ventilated rats, the degree of apical akinesia (%) was significantly lower at 0.5 MAC (8.7 ± 7.3) and 1 MAC (5.7 ± 7.4) versus 0 MAC (17.7 ± 8.0). This was accompanied by a higher CO and SV. HR was lower at 1 MAC (by 6%) and SBP was lower at 0.5 MAC (106 ± 7) and 1 MAC (98 ± 7) versus 0 MAC (126 ± 8). BT and pH were lower in both isoflurane groups. In a multivariate model, isoflurane was the only variable independently associated with the degree of LV akinesia. Conclusion Isoflurane prevents experimental TCM and preserves LV function, an effect not mediated via opening of K-ATP channels. The effect cannot be explained entirely by attenuation of myocardial stress. Isoflurane sedation in the ICU might be an interesting approach for patients suffering from hyperadrenergic conditions at risk of developing TCM.

Introduction Preoperative therapy with angiotensin-converting enzyme inhibitors (ACEI) is common in patients undergoing cardiac surgery. The aim of this study was to evaluate the still-debated impact of preoperative ACEI use on postoperative renal function in cardiac surgery patients [1,2]. Methods A total of 624 consecutive patients who underwent cardiac surgery from July 2012 to October 2013 were evaluated. Data were prospectively collected in our clinic's electronic database and retrospectively analyzed with respect to preoperative ACEI therapy. The chi-square test was used for correlations. Endpoints of the study were the development of postoperative acute kidney injury (AKI) and the difference between hospital admission and discharge glomerular filtration rate (GFR). …

In-hospital mortality was higher in group 2 (0 vs. 4.6%, P = 0.094). Conclusion A primary visit to a local hospital was associated with a longer reperfusion time and with higher mortality. Therefore, reperfusion time could be reduced by patient education and management of the community healthcare system.
Introduction A potential measure of the post-ischemic milieu in myocardium subject to acute ischemic injury is the so-called myocardial salvage index (SI), which relates the final infarct size to the initial myocardium at risk (MaR). Cardiac magnetic resonance (CMR) has previously been shown to enable determination of both infarct size, using late gadolinium enhancement, and MaR, using T2-weighted imaging. This technique could thus potentially be used to identify inflammatory responses that could be targeted to accomplish cardioprotection. The aim of this study was to relate SI, as determined by CMR, to the inflammatory response in patients with acute myocardial infarction. Methods Fifteen patients with first-time ST-elevation myocardial infarction were included in the study. All patients underwent primary PCI for an acute occlusion of one branch of the left coronary artery. Final infarct size and MaR were determined by CMR performed 1 week after the acute event. The ischemic time was defined as the time from pain onset to opening of the occluded vessel. Blood samples were taken for assessment of the inflammatory response. Inflammatory cells were analyzed by flow cytometry on a BD FACSAria; parameters were gated against control antibodies using a fluorescence-minus-one strategy. Cytokine patterns were analyzed with Bio-Rad Bio-Plex multiplex protein analysis technology. Results The SI did not correlate with MaR (P = 0.2720, R^2 = 0.09191). The population was divided into the lower half of the SI (representing the most hostile milieu; SI 23 to 57%) and the upper half of the SI (representing the friendliest post-infarction milieu; SI 71 to 95%). The patients' profile of adaptive inflammatory response was characterized by flow cytometry. The two groups did not differ with regard to their T-regulatory (CD25+FoxP3+, P = 0.7203) or NK-cell (CD3-CD56+) responses. …
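For reference, the salvage index relating final infarct size (IS) to myocardium at risk is conventionally defined as below; this is the standard definition in the CMR literature, not a formula quoted from the abstract:

```latex
\mathrm{SI} = \frac{\mathrm{MaR} - \mathrm{IS}}{\mathrm{MaR}} \times 100\%
```

On this definition, an SI of 95% means nearly all of the at-risk myocardium was salvaged, while the "hostile" lower half of the cohort (SI 23 to 57%) lost a substantial fraction of it.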
Introduction Elevated cardiac troponin levels are common in ICU patients even in the absence of acute coronary syndromes and may be predictive of mortality. The recently introduced high-sensitivity cardiac troponin T (HS cTnT) assay has resulted in increased detection of elevated cTnT in ICU patients [1]. The aim of this study was to determine the prevalence of elevated cTnT using the HS assay and its relationship with mortality. Methods A retrospective observational study was performed on all ICU admissions over a 12-month period. Data were obtained from the clinical information system (ICIP; Philips) and the ICU audit database (AcuBase). Data collected included patient demographics, peak cTnT value, APACHE II score, requirement for organ support and mortality. The primary outcome measure was hospital mortality. Data were analysed using SPSS v.17.0. cTnT levels were divided into categories for analysis: normal (<14 ng/l) and elevated, with the elevated category further subdivided into quartiles. Univariate analysis was performed between potential risk factors and mortality, followed by multivariate regression analysis to ascertain independent predictors of mortality. Results There were 417 admissions to the ICU during the study period, 89 of whom were excluded because of an absent cTnT value, leaving 328 patients in the analysis. cTnT was elevated in 85% of patients. ICU mortality was 19% and hospital mortality was 28%. Hospital mortality per cTnT category was: <14 ng/l, 2%; 14 to 38 ng/l, 19%; 39 to 90 ng/l, 26%; 91 to 252 ng/l, 39%; >252 ng/l, 43%. On univariate analysis, cTnT level, age, ventilation and APACHE II score were significantly associated with mortality. cTnT levels remained significant in multivariate regression independent of age and ventilation, but did not reach significance (P = 0.06) in a multivariate analysis that included the APACHE II score. Conclusion In 85% of general ICU patients, troponin measured by the HS cTnT assay was elevated. cTnT levels were significantly associated with mortality and were predictive of mortality independent of age and mechanical ventilation, but not independently of the APACHE II score. There was a high correlation between troponin levels and APACHE II scores.

Introduction In view of its robust consequences following cardiac surgery, acute kidney injury (AKI) remains a major concern. Rhabdomyolysis (RML) following cardiac surgery and its relation to AKI need to be investigated. We aimed to study the prevalence of RML following cardiac surgery and the perioperative risk factors that may expedite its occurrence. Methods All patients undergoing cardiac surgery in our hospital over a 1-year period were enrolled in a prospective descriptive study measuring the occurrence of RML and its association with AKI; all patients underwent serial assessment of serum creatine kinase (CK) and serum myoglobin. Serial renal function, prior statin treatment, cardiac injury, length of ventilation, and lengths of stay in the ICU and hospital were monitored. …

Conclusion ICU and hospital mortality rates were high, at 40% and 56% respectively. Increasing age did not correlate with mortality. However, invasive ventilation and the time between hospital and ICU admission were associated with higher mortality. This may be due to a delay in diagnosis or surgical intervention. We suggest considering early intervention in patients aged over 80. A functional outcome measure at discharge may be a more clinically relevant endpoint.

Introduction The aim of this study was to clarify the risk factors for acute renal impairment (ARI) in patients with severe acute pancreatitis (SAP). …

Introduction Controversy surrounds the empirical use of antibiotics in severe acute pancreatitis (SAP). There are concerns that widespread antibiotic therapy in the absence of documented infection may lead to selection of drug-resistant organisms [1]. The aim of this study was to review the profile of pancreatic fluid isolates in patients with SAP admitted to the ICU. Methods Data were reviewed for 38 patients admitted to the ICU over a 5-year period. We evaluated organisms cultured from pancreatic specimens, as well as the prevalence of drug-resistant organisms in this group of patients. Results Aspirates of pancreatic material for culture were obtained in 55% of patients (n = 21). The mean time from ICU admission to acquisition of samples for culture was 15.5 days. Fluid was sterile in 67% (n = 14) of initial samples. Gram-positive organisms were cultured from 43% (n = 9) of samples, Gram-negative organisms from 5% (n = 1) and yeasts from 5% (n = 1). Antibiotic therapy had been administered in 95% of patients prior to samples being obtained for culture. On review of all samples received from patients (including nonpancreatic specimens), vancomycin-resistant enterococci (VRE) were isolated in 13 patients.
Linezolid-resistant enterococci (LRE) were isolated in six patients, five of whom had VRE isolated prior to the culture of LRE. Extended-spectrum beta-lactamase-producing organisms were isolated in two patients, and carbapenem-nonsusceptible Gram-negative organisms in three patients. The mean APACHE II score was 18.5 and overall hospital mortality was 26%. Conclusion In the majority of patients, initial aspirates of pancreatic material were sterile. This may be a result of prior antibiotic use. Where organisms were cultured from initial aspirates, Gram-positive organisms predominated, possibly as a result of prior anti-Gram-negative antibiotic use. Therefore, in patients with ongoing sepsis who are receiving broad-spectrum antibiotic therapy, consideration needs to be given to empiric treatment of Gram-positive infection, in particular drug-resistant organisms such as VRE. Local epidemiology should be taken into account. Rational use of antibiotics, in accordance with best-practice guidelines, may limit the development of drug resistance; however, other risk factors for resistance may exist in this group, and this needs further evaluation.

Introduction Liver transplantation (LT) in patients with acetaminophen-induced acute liver failure (APAP-ALF) often presents significant challenges. The King's College Criteria (KCC) have been validated on admission but not in later phases of illness. The aim was to improve determination of prognosis on and after admission in APAP-ALF patients using classification and regression tree (CART) methodology, which constructs optimal binary splits on independent variables to predict outcome. Methods CART models were applied to US ALFSG registry data for prediction of 21-day spontaneous survival on admission and in the late stage (days 3 to 7). Analyses were carried out using R software (package rpart) for all (n = 803) APAP-ALF patients enrolled between January 1998 and September 2013 with complete outcome data. Training data were used to build the CART trees and test data were used to evaluate prediction accuracy (AC), sensitivity (Sn) and specificity (Sp).
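The study built its trees with the R package rpart; purely as an illustration of the same CART idea (recursive binary splits on predictors), here is a sketch in Python with scikit-learn. The feature names and values are invented, not ALFSG registry variables:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented admission features and 21-day outcomes (1 = spontaneous survival);
# these are NOT ALFSG registry variables, just an illustration of binary splits.
X = [[2.1, 1.8, 1], [8.4, 4.2, 3], [1.9, 1.5, 1], [7.7, 3.9, 4],
     [3.0, 2.0, 2], [9.2, 4.8, 4], [2.5, 1.7, 1], [6.8, 3.5, 3]]
y = [1, 0, 1, 0, 1, 0, 1, 0]
names = ["lactate_mmol_l", "INR", "coma_grade"]

# CART: each node chooses the binary split that best purifies the outcome
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=names))  # prints the split rules
```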
Introduction Extracorporeal membrane oxygenation (ECMO) is increasingly used for the treatment of refractory but potentially reversible respiratory and/or cardiac failure. Data on perioperative support with veno-arterial (V-A) and veno-venous (V-V) ECMO for adult liver transplant recipients are scarce [1,2]. We report our experience of ECMO support in patients with acute liver failure (ALF) as a bridge to transplant, and of postoperative ECMO use following complications after surgery. Methods A retrospective study in a specialist tertiary referral ICU. Patients supported with V-V or V-A ECMO before, during, or after orthotopic liver transplant (OLT) were identified. Results In total, four patients were supported during a 12-month period: two required V-V and two V-A support. Two patients with ALF were bridged to OLT, one with V-V ECMO for refractory respiratory failure and the other with emergency V-A support for treatment of an intraoperative arrest. Both were successfully transplanted but subsequently died on ECMO, of disseminated aspergillosis with haemophagocytic syndrome and of anoxic brain injury, respectively. Two patients received postoperative ECMO support. The first was treated with V-V ECMO for refractory persistent hypoxaemia following OLT for hepato-pulmonary syndrome; the second received emergency V-A support (eCPR) following cardiac tamponade and arrest on postoperative day 2. Both made a full recovery. Conclusion Emergency ECMO support before and after liver transplant is feasible. Despite the poor outcome in patients with ALF, we consider ECMO a valuable option to bridge selected patients to transplant.

Introduction Hemodynamic instability after graft reperfusion during liver transplantation is frequently observed and is termed the post-reperfusion syndrome. Recent studies have shown the existence of a specific heart disease associated with cirrhosis, termed cirrhotic cardiomyopathy (CCM). The aim of this study was to investigate whether CCM influences the development or severity of the post-reperfusion syndrome. Methods Fifty-two consecutive liver transplant patients were included in a retrospective observational study. The variables recorded were: age, etiology of the liver disease, MELD and MELD-Na scores, associated pathologies, the length of the QT interval, and plasma levels of brain natriuretic peptide (BNP). Patients with known renal or heart disease and recipients of organs from extended-criteria donors were excluded from the study. The QT interval was corrected for heart rate using Bazett's formula (QTc). Statistical analysis was performed using SPSS Statistics v.19.1. Results The criteria used to define the post-reperfusion syndrome relied on the hemodynamic changes that occurred at reperfusion. Preoperative echocardiography showed normal systolic and diastolic function at rest in all patients. For the identification of patients at risk of CCM we used two of the supportive criteria from the recent definition of CCM: a prolonged QTc interval and increased BNP levels. The study group included 28 men (53.8%) and 24 women. Mean (± SD) age was 50.5 (± 11.4) years. Mean MELD and MELD-Na scores were, respectively, 15.51 (± 5.43) and 18.9 (± 6.22). The BNP value correlated well with the length of the QTc interval (P = 0.005), and with the MELD and MELD-Na scores (P = 0.025 and P = 0.001). The post-reperfusion syndrome occurred in 63.4% of patients. We could not find a correlation between the post-reperfusion syndrome and BNP levels (P = 0.85) or a prolonged QTc interval (P = 0.38). The post-reperfusion syndrome did not correlate with the severity of the liver disease as assessed by the MELD and MELD-Na scores, and its severity did not correlate with QTc prolongation or BNP levels. Conclusion Reperfusion is a critical time during liver transplantation. The clinical predictors of the post-reperfusion syndrome are still under debate [1]. Our study showed that the post-reperfusion syndrome is not correlated with the severity of the liver disease or with the presence of risk factors indicating CCM.
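Bazett's correction, used in the preceding abstract, normalizes the measured QT interval to the RR interval; in its standard form (both intervals in seconds):

```latex
\mathrm{QTc} = \frac{\mathrm{QT}}{\sqrt{\mathrm{RR}}}
```

So, for example, a QT of 0.40 s at a heart rate of 75 bpm (RR = 0.80 s) gives a QTc of about 0.45 s (447 ms).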
Introduction Combined heart-liver transplantation (CHLT) is an uncommonly performed procedure for patients with coexisting cardiac and liver disease [1]. The purpose of this study was to examine and describe the perioperative management of patients undergoing CHLT. Methods A retrospective review was performed of patients undergoing CHLT at our institution from 1999 to 2013. Results Twenty-seven CHLTs were performed, 4/27 with simultaneous kidney transplantation. Familial amyloidosis was the indication for 21 CHLTs (78%), and 12 of these explanted livers were used for domino transplantations. Nineteen patients (70%) were receiving inotropic infusions at the time of organ availability. The median preoperative MELD score was 12, and elevations in preoperative international normalized ratio were due to warfarin in all but one patient. Liver transplantation immediately preceded cardiac transplantation in 2/27 cases, to reduce high-titer donor-specific antibodies. Venovenous bypass was utilized in 14 operations (52%) performed with the caval interposition liver transplantation approach; cardiopulmonary bypass was used during liver transplantation in two cases (7%); and no bypass was used in 11 operations (41%) performed with a caval-sparing (piggyback) surgical technique. Postoperatively, the median durations of mechanical ventilation, ICU stay, and hospital stay until discharge were 1 day, 5.5 days, and 15 days, respectively. Transfusions in the first 48 hours following CHLT were not substantial in the majority of patients. One patient died within 30 days of CHLT. Conclusion CHLT is a life-saving operation that is performed with relatively low mortality and can be successfully performed in select patients with congenital heart disease. Patients undergoing CHLT at our institution had relatively preserved hepatic function but limited cardiac function, often requiring inotropic support. Cardiac transplantation typically precedes liver transplantation during CHLT given the decreased ischemic tolerance of the cardiac graft [2]. However, liver transplantation prior to cardiac transplantation may serve to mitigate high-titer donor-specific antibodies. The various operative approaches described above may be successfully utilized for the liver transplantation portion of these procedures. We attribute the favorable outcomes and perioperative courses to the multidisciplinary approach to care that CHLT patients receive at our institution.

Introduction The balance between coagulation and fibrinolysis is a prominent factor in the pathophysiology of sepsis, but this mechanism remains poorly understood. We aimed to determine whether collapse of this balance during the first day of sepsis correlates with progression of organ dysfunction and subsequent death. Methods This study included all patients with sepsis admitted to a tertiary referral hospital in Japan. Global coagulation tests and hemostatic molecular markers such as fibrin/fibrinogen degradation products (FDP), D-dimer, thrombin-antithrombin complex (TAT) and plasmin-alpha2 plasmin inhibitor complex (PIC) were measured within 12 hours after admission, and SOFA and APACHE II scores and in-hospital mortality were then evaluated. Patients were divided into three groups based on the levels of TAT/PIC and FDP/D-dimer, and differences in clinical outcome between groups were assessed by chi-square analysis and ANOVA. Results We enrolled 101 patients; 87 survived and 14 died. Mortality was significantly higher in the high TAT/PIC group (0%, 19% and 24% for the low, middle and high TAT/PIC groups, respectively; P = 0.011). In addition, SOFA and APACHE II scores were significantly higher in the low FDP/D-dimer group (APACHE II 22.3, 18.9 and 15.3, P <0.01; SOFA 8.6, 6.5 and 5.1, P <0.01, for the low, middle and high FDP/D-dimer groups, respectively). See Figure 1.
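The group comparison above amounts to a chi-square test on a mortality-by-tertile contingency table. A minimal sketch with scipy, using counts that only approximate the reported rates (illustrative, not the study data):

```python
from scipy.stats import chi2_contingency

# Deaths vs. survivors per TAT/PIC tertile (illustrative counts, not study data)
table = [[0, 33],   # low tertile
         [6, 26],   # middle tertile
         [8, 28]]   # high tertile
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, P = {p:.3f}")
```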
Conclusion We demonstrated that the balance between coagulation and fibrinolysis, assessed with the FDP/D-dimer and TAT/PIC ratios, correlated with disease severity and clinical outcomes in sepsis, suggesting that an impaired balance of the hemostatic system might play a pivotal role in the progression of sepsis pathophysiology.

Introduction Hemodynamic disorders in critically ill patients are often connected with bacterial load. Bacterial load is usually associated with bacteremia, LPS, high levels of IL-6 and PCT, and also with aromatic microbial metabolites [1-3], and so forth. In our opinion, microbial metabolites can participate in the hemodynamic disorders of critically ill patients, particularly through their influence on NO production [4] and intestinal permeability. Methods In a prospective study we observed critically ill patients on the day of admission to a polyvalent ICU; severe cardiac pathology was excluded. The levels of phenylpropionic, phenyllactic, p-OH-phenyllactic and p-OH-phenylacetic acids and of total phenylcarboxylic acids (PhCAs) were measured in blood serum using gas chromatography (GC-FID). The levels of PCT and NT-proBNP were measured using an Elecsys 2010. Patients with hypotension (on vasopressor support) (group A) were compared with patients without hypotension (group B). Results We studied 50 ICU patients with different diseases: pneumonia (n = 15), severe kidney failure (n = 13), abdominal surgical pathology (n = 10), alcoholic cirrhosis (n = 5), and soft-tissue infection (n = 7). In group A (24/50) the median PhCA level was 17.8 (IQR 11.4 to 30.0) μmol/l, and in group B (26/50) it was 7.2 (IQR 3.7 to 13.2) μmol/l, P = 0.003 (t test). In group A, all patients (with or without documented infection) had symptoms of infection manifestation [5], and 20/24 (83.3%) of them died. In group B, symptoms of infection manifestation were revealed in 12/26 (46%) of cases, and mortality was significantly lower, 3/26 (11.5%) (P <0.05). Overall mortality was 23/50 (46%). The profile of PhCAs differed between groups A and B. Conclusion The total level of PhCAs in critically ill patients with hypotension was considerably higher than in hemodynamically stable patients. The participation of a microbial factor in the pathogenesis of hemodynamic disorders in the presence of systemic inflammation may be validated by the load of microbial metabolites (PhCAs).

Introduction Coagulation abnormalities are common in severe sepsis or septic shock [1]. Methods A prospective observational cohort study of 100 patients above 18 years of age diagnosed with severe sepsis or septic shock on admission. The first blood sample collected on admission was analyzed. Data were collected with a predesigned pro forma. Patients with a previous history of any coagulation disorder were excluded. Results Univariate analysis showed significant correlation of APACHE II score, platelet count, PT, aPTT, fibrinogen and D-dimer with mortality in patients with severe sepsis or septic shock. Multivariate analysis showed APACHE II >20 (P = 0.001), fibrinogen <2 (P = 0.019) and D-dimer >1 (P = 0.06) to be independent predictors of mortality in severe sepsis or septic shock. See Table 1.

Introduction Systemic inflammation caused by infection or trauma often leads to adverse outcome in critically ill patients.
Binding of ligands to the receptor for advanced glycation end products (RAGE) activates several pathways, including the nuclear factor-kappa B pathway, which generates inflammatory cytokines, proteases and oxidative stress. RAGE activation has been suggested to link the amplification and perpetuation of inflammation to subsequent organ damage and adverse outcome in sepsis, acute lung injury and myocardial dysfunction [1]. The soluble receptor, sRAGE, is thought to act as a decoy, thus protecting against further RAGE activation. High mobility group box 1 (HMGB1) is a nuclear protein that is released during cellular stress and damage. S100A12 is a neutrophil-derived protein that acts as a proinflammatory danger signal. Both are ligands for RAGE. We hypothesized that excessive RAGE activation is linked to adverse outcome in critically ill patients and that different patterns of RAGE activation and inflammation may be present in patients according to the underlying pathology. Methods We measured sRAGE, HMGB1 and S100A12 serum levels on admission, on day 7 and on the last day in the ICU in 405 critically ill surgical patients who needed intensive care for at least 7 days, and in 69 matched healthy controls. We assessed the relation of these levels to clinical complications and outcome, in comparison with C-reactive protein (CRP) as a routinely measured clinical parameter of inflammation. Results On ICU admission, levels of sRAGE, HMGB1, S100A12 and CRP were higher than healthy levels. HMGB1, S100A12 and CRP remained elevated throughout the ICU stay, but sRAGE decreased to levels lower than in healthy volunteers by day 7. sRAGE and CRP showed distinct time profiles during the ICU stay in patients undergoing cardiac versus other surgery and in patients with versus without sepsis on admission. Elevated sRAGE on admission, unlike CRP, was associated with the need for renal replacement therapy, liver dysfunction, circulatory failure and mortality. Except for mortality, these associations remained in multivariate logistic regression analysis correcting for baseline risk factors. Conclusion Critical illness alters several components of the RAGE axis. Elevated sRAGE levels on admission to the ICU were associated with adverse outcome, independent of baseline pathology.

Introduction Endotoxin was measured with the endotoxin activity assay (EAA), a newly developed rapid assay of endotoxin. Blood endotoxin levels (EA levels) of 314 patients admitted to our university hospital ICU were measured within 24 hours of admission, and their correlation with disease severity and outcome was examined. Methods The study is a single-center retrospective analysis of critically ill patients admitted to our university hospital ICU from November 2006 to March 2012. All patients whose EA and procalcitonin levels were measured and whose disease severity criteria were recorded were enrolled. A total of 314 patients were analyzed. Results The mean ± SD EA level of all ICU-admitted patients (n = 314) was 0.39 ± 0.25, and that of healthy volunteers (n = 61) was 0.10 ± 0.09. The mean APACHE II score at admission increased in parallel with increasing EA levels. …

There were no significant differences in baseline demographic and clinical data apart from blood pressures, which were lower in the sepsis group. Serum leptin on day 2 only was higher in the sepsis group (44.2 ± 17.7 μg/l vs. 31.1 ± 2.1 μg/l, P = 0.008), with no difference on days 0 and 4 of admission.
We detected a serum leptin level of 38.05 μg/l on day 2 to be 93% sensitive and 100% specific for the diagnosis of sepsis. All three serum CRP levels were higher in the sepsis group than in the SIRS group (61.2 ± 9 mg/l vs. 48.9 ± 7.1 mg/l, P <0.001 on day 0; 71.5 ± 9.6 mg/l and 196.8 ± 39.8 mg/l in the sepsis group vs. 56.9 ± 8 mg/l and 73.7 ± 32.5 mg/l in the SIRS group for days 2 and 4, respectively; P <0.001 for both). We found a CRP of 67.5 mg/l on day 2 to have 87% sensitivity and 93% specificity for the diagnosis of sepsis. Conclusion We concluded that although serum leptin may not be beneficial in early differentiation between sepsis and non-infectious SIRS on admission, it may be highly specific on the second day.

Plasma CCP on day 0 had a good capacity for the diagnosis of PA NP: CCP on day 0 ≤17.5 ng/ml yielded a sensitivity of 86.5% and a specificity of 66.7% (AUC 0.74; 95% CI 0.630 to 0.829; P = 0.0001; Figure 2). Conclusion A plasma CCP level ≤17.5 ng/ml is a sensitive and specific candidate diagnostic biomarker of PA NP.

Introduction Systemic inflammation is a generalized response to internal or external inflammatory stimuli, often resulting in multiple organ failure. Early diagnosis of systemic inflammation would therefore be of great therapeutic and prognostic importance. Various inflammation biomarkers have been used in clinical and experimental practice, but a definitive diagnostic tool for the early detection of systemic inflammation remains to be identified. The neurotransmitter acetylcholine (ACh) has been shown to play an important role in the inflammatory response. Serum cholinesterase (butyrylcholinesterase (BChE)) is synthesized in the liver and acts as the major ACh-hydrolyzing enzyme in plasma. Hence, BChE activity has been widely used as a biomarker for liver function. However, the role of this enzyme in the inflammatory pathomechanism is not yet fully understood. Here, we describe a strong correlation between BChE activity and the systemic inflammatory response. Methods In this study we measured BChE activity in healthy subjects and in critically ill patients clinically diagnosed with systemic inflammation or sepsis. Furthermore, we measured the levels of routine inflammation biomarkers and liver function parameters in the blood of critically ill patients.

Introduction Bloodstream infection (BSI) is associated with a reduction in circulating lymphocytes. Lymphopenia has been proposed as an early marker of BSI in pyrexial adults in the emergency department (ED) setting [1]. The aim of this study was to compare lymphocyte

The values of C-reactive protein (CRP) and procalcitonin (PCT) were investigated to determine their effects on postoperative complications in patients with or without systemic inflammatory response syndrome (SIRS). In 183 patients, in a prospective observational study, serum CRP and PCT values were collected every day from postoperative day 1 through day 5. The definition of SIRS includes two or more of the following: temperature >38 or <36°C; heart rate >90 beats/minute; respiratory rate >20/minute; arterial carbon dioxide pressure <32 mmHg; white blood cell count >12,000/mm³.
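Since the SIRS definition above is essentially a small counting rule, a minimal sketch may help make it concrete. The snippet below is illustrative only: it reproduces the criteria exactly as listed in the abstract (the printed list appears truncated after the leukocyte criterion; the conventional definition also counts leukopenia and immature band forms), it groups tachypnoea and hypocapnia as a single criterion, as in the standard definition, and the function name and example values are invented.

```python
# Illustrative sketch of the SIRS counting rule quoted above; not from the study.

def sirs_criteria_met(temp_c, hr, rr, paco2_mmhg, wbc_per_mm3):
    """Return (count, is_sirs) for the criteria listed in the abstract."""
    criteria = [
        temp_c > 38 or temp_c < 36,   # temperature >38 or <36 degrees C
        hr > 90,                      # heart rate >90 beats/minute
        rr > 20 or paco2_mmhg < 32,   # tachypnoea or hypocapnia (one criterion)
        wbc_per_mm3 > 12_000,         # leukocytosis (source list truncated here)
    ]
    count = sum(criteria)
    return count, count >= 2          # SIRS = two or more criteria

if __name__ == "__main__":
    n, sirs = sirs_criteria_met(temp_c=38.6, hr=104, rr=18,
                                paco2_mmhg=35, wbc_per_mm3=13_500)
    print(f"criteria met: {n}, SIRS: {sirs}")  # criteria met: 3, SIRS: True
```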
Introduction The occurrence of sepsis-induced immune suppression is associated with multiple organ dysfunction, although the exact role of T-cell malfunction is obscure. We investigated the impact of sepsis on the adaptive immune system and monitored T-cell receptor (TCR) diversity. Methods TCR diversity was analyzed in peripheral blood mononuclear cells (PBMCs) isolated from septic shock patients at three time points (days 1, 3 and 7 after diagnosis of septic shock). TCR diversity was measured in genomic DNA isolated from PBMCs using the ImmunTraCkeR test (ImmunID Technologies, Grenoble, France) [1]. Multi-N-plex PCR was performed using a primer specific to a V gene family and several primers specific to J segments. The signal is measured as a function of the fluorescence intensity of the reference marker. Rearrangement validation and map generation were detected and analyzed using the Constel'ID software (ImmunID Technologies). HLA-DR expression on CD14 cells was measured by flow cytometry. Results TCR diversity was markedly decreased in septic patients at day 1 compared with healthy volunteers. A recovery of TCR diversity was observed at days 3 and 7, except in patients who died. HLA-DR expression was significantly decreased in septic patients at day 1. The total lymphocyte count was reduced in septic patients at day 1 but recovered at days 3 and 7, except in patients who died.

Introduction Sepsis is associated with immune hyporesponsiveness, but the immunological processes behind this are ill defined. Methods This study quantified differences in plasma concentrations of cytokines between septic patients with faecal peritonitis, age-matched and gender-matched surgical patients (without sepsis), and age-matched and gender-matched healthy participants. In addition, cytokine levels were measured in supernatant from peripheral blood mononuclear cells stimulated with anti-CD3 and anti-CD3+anti-CD28, incubated for 4 days. Cytokine concentrations of IL-1β, IL-5, IL-6, IL-8, IL-10, IL-13, IL-17A, IFNγ and TNFα were determined by multiplex cytometric bead array. Results Plasma levels of IFNγ and IL-13 were lower in septic patients than in healthy participants. In contrast, plasma levels of IL-6 (see Figure 1) and IL-8 were increased in septic patients compared with both surgical patients and healthy participants. Plasma levels of IL-10 were significantly higher only in comparison with surgical patients. Following incubation with anti-CD3 and anti-CD3+anti-CD28, concentrations of IL-1β, IL-5, IL-6 (see Figure 1), IL-13, IL-17A, IFNγ and TNFα were markedly decreased in samples from septic patients. In addition, stimulation with anti-CD3+anti-CD28 resulted in lower production of IL-10 in septic patients. Lower concentrations of IL-8 were detected in septic patient samples stimulated with anti-CD3 alone. Cytokine levels of IL-12p70 remained unaffected across all groups and stimuli. Conclusion We demonstrated a proinflammatory cytokine profile in blood from septic patients, preceding a pan-downregulation of all assessed cytokines following in vitro T-cell stimulation. To our knowledge, this study is the first to perform an immune functional assay across these three groups.

Introduction Activated protein C (APC) deficiency is prevalent in severe sepsis and septic shock patients. The aim of the study was to relate the anticoagulation activity evaluated by APC with other coagulation parameters, adjusted for 28-day mortality. Methods A cohort study of 150 critically ill adults. Age, sex, sources of infection and coagulation markers within 24 hours of severe sepsis or septic shock onset, defined according to Surviving Sepsis Campaign (SSC) criteria, were studied. We analyzed APC activity using a hemostasis laboratory analyzer (BCS® XP; Siemens).
A descriptive and comparative statistical analysis was performed using SPSS version 15.0 (SPSS Inc., Chicago, IL, USA). Results We analyzed 150 consecutive episodes of severe sepsis (16%) or septic shock (84%) admitted to the ICU. The median age of the study sample was 64 years (interquartile range (IQR) 22.3 years; male: 60%). The main sources of infection were: respiratory tract 38%, intra-abdominal

Introduction Flow-cytometric analysis is still restricted to cancer and immunocompromised patients. There are no clinical studies that evaluate the immunological changes in traumatic brain injury (TBI) patients. The objective of this study was to determine whether patients with severe TBI (GCS <9) manifest early (<48 hours after injury) signs of immunosuppression and whether this condition increases the incidence of infection during the ICU stay. Methods We retrospectively analyzed data from 54 patients, including 10 patients with isolated brain injury and 44 patients with brain and extracranial injuries. The flow-cytometric analysis was performed within 48 hours of trauma. Collected data are shown in Table 1. Results Preliminary analysis is limited to descriptive statistics, which show an immediate immunosuppression after TBI, as established by reduction of CD4+ T lymphocytes and CD8+ T lymphocytes. Significant data are collected in Table 2. Conclusion In severe TBI patients, an immunosuppressed state develops early. It remains to be established whether this condition affects the course and prognosis of ICU patients.

Increased mortality was found in the low and high Cl:Na ratio groups (P = 0.081 and P = 0.046) (Figure 1). Conclusion Stewart's strong ion theory provides an assessment of the etiology of acid-base disturbances. This evaluation can be performed more easily and quickly with the Cl:Na ratio. This study demonstrates that a disturbed Cl:Na ratio is associated with increased mortality in sepsis and severe sepsis patient groups.

Introduction This study demonstrates that a low level of plasmalogens (Pls) is an important marker of peroxisomal dysfunction. In glycerol Pls the primary -OH group is substituted not by an acyl group (fatty acid), as in diacylphospholipids, but by an aldehydogenic alkenyl group (fatty aldehyde) in the form of a vinyl ether [1]. It is known that several organ systems are disrupted in diseases connected with peroxisomal dysfunction, as in the case of multiple organ failure. Methods The objects of study were 18 people with multiple organ failure (35.6 ± 8.7 years) of various etiologies. The blood of 16 healthy volunteers served as controls.

Introduction Acute alcohol exposure suppresses the proinflammatory response, which may be related to increased susceptibility to infections [1]. The purpose of the study was to investigate the effect of acute exposure to alcohol on TNFα production capacity and TNFα receptors (TNFRs) in an ex vivo model of whole-blood stimulation with lipopolysaccharide (LPS). Methods Whole blood was taken from healthy volunteers, placed in tubes containing EDTA and immediately transferred to the laboratory. Heparinized blood samples were diluted 1:10 in RPMI 1640 culture medium (100 μl whole blood added to 900 μl RPMI 1640). Samples were preincubated with 0, 5, 12.5, 25, 50, 100 and 200 mM alcohol (EtOH) for 10 minutes at room temperature. After incubation, 500 pg LPS was added to each sample for 4 hours at 37°C. At the end of the process, samples were centrifuged (1,800 rpm, 5 minutes, room temperature). Culture supernatants were collected and stored at -70°C until measurement.
TNFα and TNFR levels were determined in culture supernatants by ELISA [2]. Results We studied 24 healthy male volunteers aged 36.5 ± 1.4 years (mean ± SEM). TNFα was not detected in samples treated without alcohol in the absence of LPS stimulation (control) or in the presence of alcohol alone (data not shown). TNFα production was significantly decreased at a dose of 25 mM alcohol after LPS stimulation (P <0.0001) compared with LPS-challenged samples (Figure 1A). Alcohol had no effect on TNFR I production when incubated with or without LPS (data not shown). Alcohol at lower doses (<50 mM) seemed to decrease TNFR II levels, but an increase in TNFR II levels was observed at higher doses (>50 mM), which was statistically significant at 100 and 200 mM alcohol after LPS stimulation ex vivo (P <0.001) (Figure 1B). Conclusion Our observations indicate a suppression of the proinflammatory response, but also a differential effect of alcohol on TNFR II production by whole blood in the presence of LPS challenge, depending on the degree of alcohol intoxication.

Introduction Neutrophils play a central role in eliminating bacterial pathogens, but may also contribute to end-organ damage. Interleukin-8 (IL-8), a key modulator of neutrophil function, signals through the neutrophil-specific surface receptors CXCR-1 and CXCR-2. Expression of these surface receptors can be altered by perfusion through an extracorporeal device. Extracorporeal methods of immune modulation have shown promise for the treatment of sepsis; however, achieving an appropriate response is a major challenge [1]. In this study a mechanistic mathematical model was used to evaluate and deploy an extracorporeal sepsis treatment that modulates CXCR-1/2 levels.

Conclusion Prolactin seems to be a stress hormone, probably related to the severity of illness, increased along with IL-6 and eHSP90 levels in SS. Also in these patients, cortisol correlates with lactate and eHSP90, but in contrast to SIRS the iHSP72 and iHSP90 immune response is depressed.

Introduction Organ injury is a hallmark of sepsis, in particular acute kidney injury (AKI). Yet the mechanisms involved in sepsis-induced AKI are not well understood. Energy prioritization is an important cell defense mechanism, and we therefore hypothesized that exogenous activation of AMP-activated protein kinase (AMPK), a master regulator of cellular energy metabolism, protects against sepsis-induced AKI. Methods Sixty C57BL/6 male mice, 6 to 8 weeks of age, weighing 20 to 25 g, were divided into six groups: 1, cecal ligation and puncture (CLP); 2, CLP+AICAR (AI, an AMPK activator, 100 mg/kg 24 hours before CLP); 3, CLP+compound C (CoC, an AMPK inhibitor, 30 mg/kg); 4, sham; 5, sham+AI; 6, sham+CoC. Blood/tissue samples were collected 8 hours after CLP. Renal function (creatinine (Cr, mg/dl), BUN (mg/dl) and cystatin C (CysC, ng/ml)), cytokine expression (ELISA), endothelial activation (ICAM-1 expression), neutrophil adhesion (PMN fluorescence, anti-CD11b mAb-tagged PMNs) and vascular leak (Evans blue) were assessed. The effect of AI given 4 hours before (n = 12) and 2 hours after (n = 11) CLP was also evaluated.

Methods Ninety-seven C57Bl/6 male mice were subjected to multitrauma by crush of the femur and chemical pneumothorax with turpentine. Mice surviving 72 hours after the injuries were challenged intravenously with one 7 log10 log-phase inoculum of P. aeruginosa and survival was recorded.
In separate experiments, mice were sacrificed post injury; splenocytes were isolated and stimulated with 10 ng/ml LPS, and cytokines were measured in supernatants by enzyme immunoassay. Quantitative cultures of the right lung, kidney and liver were performed. The same procedures were done for sham-operated mice and for mice subjected only to femur crush or to pneumothorax. Results Initial experiments with 21 mice showed that the overall death rate for this model of multitrauma was 66.7%, with most deaths occurring in the first 48 hours. In the second set of experiments, 12 mice remaining alive 72 hours post injury were challenged with P. aeruginosa; mortality was 37.5% compared with 75% of 12 non-

Introduction The purpose of this study was to evaluate whether patients with sepsis exposed to clopidogrel have lower mortality and fewer days on mechanical ventilation compared with patients not exposed to clopidogrel. A recent post-hoc analysis of the PLATO trial suggests that the antiplatelet agent ticagrelor provided a significant

Introduction The aim of this study was to determine the correlation between regional tissue oxygenation (SrO2) measured by near-infrared spectroscopy (NIRS), central venous oxygen saturation (SvO2) and levels of serum lactate (SL) in patients with severe sepsis. Methods An observational pilot study was performed in the ICU of a medium-sized teaching hospital. Adult patients admitted with severe sepsis were included and three NIRS electrodes were attached: on the left side of the forehead (LF), the right side of the forehead (RF) and the right forearm (RA).

Methods This was a post-hoc, subgroup analysis of a multicenter retrospective cohort study [2] conducted in three tertiary referral hospitals in Japan. All patients with sepsis-induced DIC who required ventilator management were included. We stratified all patients by disease severity, as defined by APACHE II and SOFA scores, into three strata. Intervention effects estimated as hazard ratios were analyzed by Cox regression analysis adjusted with a propensity model to detect subgroup heterogeneity of the effects of rhTM on in-hospital mortality. Results In total, 162 patients with sepsis-induced DIC were eligible; 68 patients received rhTM and 94 did not. After adjusting for imbalances, rhTM administration was significantly associated with reduced mortality only in patients in the stratum II group (APACHE II 22 to 27) (adjusted hazard ratio 0.20; 95% confidence interval 0.05 to 0.74; P = 0.016), while it was not significant in stratum I and stratum III (Figure 1). A similar tendency was observed in the analysis by SOFA score (stratum I (SOFA ≤10), P = 0.368; stratum II (SOFA 11 to 12), P = 0.012; stratum III (SOFA ≥13), P = 0.673).

Conclusion In a group of oncology patients admitted to the ICU with neutropenia and severe sepsis/septic shock, we found an in-hospital mortality of less than 50%. This is similar to the general population.

Results Of the 83 ICUs that were contacted, 69 (83%) responded to the survey. Only 7% of ICUs still perform daily routine CXRs for all patients, while 65% of ICUs report never performing CXRs on a routine basis. A daily meeting with a radiologist is established in the majority of ICUs and is judged to be important or even essential. The therapeutic efficacy of routine CXRs was assumed by intensivists to be lower than 10% or to be between 10 and 20%.
The efficacy of on-demand CXRs was assumed to be between 10 and 60%. There is consensus among intensivists on performing a routine CXR after endotracheal intubation, chest tube placement or central venous catheterization. Conclusion The strategy of daily routine CXRs for critically ill patients has developed from a common practice (63%) in 2006 [2] to a rare practice (7%) nowadays. Intensivists still assume the value of routine CXRs to be higher than the efficacy reported in the literature. This might be due to the clinical value of negative findings, which has not been studied before.

Introduction We investigated the efficacy and safety of chest radiographs (CXRs) performed on specified indications only, directly after cardiac surgery. CXRs in the ICU are frequently obtained routinely for postoperative cardiosurgical patients, despite the fact that the diagnostic and therapeutic efficacy of these CXRs is now known to be low [1]. Routine CXRs may only be beneficial for certain indications, and the discussion regarding these indications and the safety of abandoning routine CXRs is still continuing [2]. Methods We prospectively included all patients who underwent major cardiac surgery in the year 2012. A direct postoperative CXR was performed only for certain specified indications. An on-demand CXR could be obtained during the postoperative period according to other specified indications. For all patients who did not have a CXR taken before the morning of the first postoperative day, a control CXR was then performed. All CXR findings were noted and classified, including whether or not they led to an intervention. Diagnostic and therapeutic efficacy values were calculated. Results A total of 1,351 patients were included, who mainly underwent coronary artery bypass grafting, valve surgery or a combination of both. Eighteen per cent of patients underwent minimally invasive cardiac surgery. The diagnostic efficacy for major abnormalities was clearly higher for the postoperative and on-demand CXRs, performed on indication, than for the next-morning routine CXR (6.7% and 6.9% vs. 2.9%) (P = 0.002). The therapeutic efficacy was also clearly higher for the postoperative and on-demand CXRs (2.9% and 4.1%), while the need for intervention after the morning control CXR was minimal (0.6%) (P = 0.002).

Early warning scores (EWS) producing an escalated response to an increasing score are used effectively within the Trust, yet mortality is high in patients admitted with a diagnosis of pneumonia. The CURB-65 score may not be an effective risk stratification tool for predicting mortality in older patients, nor for guiding intensive care admission [1]. Jarvis and colleagues developed an EWS based on laboratory blood tests and used it in conjunction with physiological EWS to effectively risk-stratify all medical patients [2]. Methods Of 2,158 patients admitted with pneumonia during the 12 months from August 2012, data were collected for 1,598 who had received the required blood tests. Data included dates of admission, discharge and death if appropriate, gender, and admission haemoglobin (g/dl), white cell count (10⁹/l), sodium (mmol/l), potassium (mmol/l), urea (mmol/l), creatinine (mmol/l) and albumin (g/l). A composite EWS was calculated and measured against outcome. Results Of 1,598 patients, 538 died during this admission. It is uncertain whether death was due to pneumonia, as only the admission diagnosis is recorded, but overall mortality was 35%.
Analysis of the data showed a strongly positive relationship between increasing EWS and increased risk of mortality, with a correlation coefficient of 0.97. Conclusion These observational results suggest that an EWS based on laboratory tests can be used to risk-stratify patients with pneumonia, and could be used to treat those with higher scores more aggressively earlier in their illness. Further analysis is required to determine whether age is a contributing factor and how this may modify the EWS. We need to determine whether laboratory-based EWS risk stratification can be used in isolation, or whether it contributes sufficiently to an existing physiological EWS that a combined system would improve outcome in our patients.

Introduction Previous studies reported that video-assisted thoracoscopic surgery (VATS) is more beneficial than open thoracotomy for pulmonary lobectomy in the late postoperative period [1,2]. However, the early postoperative period, when the rate of complications is highest, has scarcely been examined. Our study aimed to compare the residual pulmonary function of pulmonary lobectomy patients after VATS and after the standard thoracotomy approach in the immediate postoperative period. Conclusion Preoperative lung function prediction severely overestimates the real lung capacity of lobectomy patients in the immediate postoperative period. VATS lobectomy seems more beneficial from the point of view of early postoperative lung function. VATS lobectomy should be considered in patients with poor preoperative spirometry results to ensure a better postoperative outcome.

Introduction The aim of our study was to evaluate the lung ultrasound (LUS) reaeration score (ReS) as a predictive tool for non-invasive ventilation (NIV) efficacy in general wards for acute respiratory failure (ARF) treatment. Even if ICUs are considered the safest place to perform NIV, a shortage of intensive care beds has led to NIV application outside the ICU. With appropriate patient selection, treatment-timing choice, adequate monitoring and staff training, NIV application in general wards can allow one to safely treat patients at an early stage with better cost-effectiveness [1]. Few data assessing the right tool to evaluate NIV treatment efficacy are available. Methods We present preliminary data of an ongoing prospective observational study. Sixteen patients undergoing NIV treatment outside the ICU for ARF of any origin were evaluated with LUS at three times: before NIV application (T0), and at 5 minutes (T5) and 60 minutes (T60) of NIV treatment. The US scan was performed in six regions for each hemithorax. LUS patterns were defined as: consolidation (C); multiple coalescent B-lines (B+); multiple irregularly spaced B-lines (B); and normal aeration (A). A LUS-ReS [2] was calculated by detecting changes in the US pattern when comparing the T0 to T5 and T0 to T60 assessments. Outcome was defined as NIV failure in the case of tracheal intubation or death within 1 week from ARF onset, and otherwise as NIV success.

Methods Physicians certified by the German Ultrasound Society (DEGUM) for ultrasonography in surgery, anaesthesia or medicine were invited to participate in an online survey. Frequency of exposure to patients with suspected pneumothorax, frequency of LUS use, assessment of the diagnostic accuracy of LUS for ruling out or ruling in pneumothorax, and preferences regarding technical aspects were enquired about. Results Eighty-nine physicians responded. Average exposure to pneumothorax cases was one per week. Fifty-five per cent of respondents used LUS 'always' or 'frequently'.
Thirty-four per cent of physicians rated LUS as 'always accurate', and a further 54% as 'accurate in a majority of cases', for ruling out pneumothorax. Twenty-one per cent rated LUS as 'always accurate' and 69% as 'accurate in a majority of cases' for ruling in pneumothorax. Physicians reporting frequent exposure to pneumothorax patients used LUS in a higher proportion of cases ('high caseload sonographers') and were more confident in ruling out pneumothorax (Figure 1). In total, 16 different combinations of transducers, probe orientations and scanning modes were reported. Linear transducers, sagittal probe orientation, and B-mode and M-mode scanning were most often selected (38%). Conclusion Physicians' use of LUS in the diagnosis of pneumothorax was modest. Assessment of diagnostic accuracy gave markedly lower scores than reported in clinical trials [1]. The correlation between frequency of exposure, likelihood of LUS usage and confidence in diagnostic accuracy warrants further research into the nature of the learning curve. Considerable variation regarding technical aspects of LUS reflects the ambiguity of current recommendations [2]. More research is required to establish the most efficient way of performing LUS for suspected pneumothorax, and efforts need to be made to promote its use in these scenarios.

Introduction Smoke inhalation injury (SII) may progress to pulmonary edema, pneumonia and acute respiratory distress syndrome. SII may cause bronchial mucosal edema; we hypothesized that narrowing of the bronchial lumen due to bronchial wall edema correlates with respiratory deterioration in SII patients. Methods We prospectively studied 42 patients with a diagnosis of SII, according to visualized bronchoscopic findings at admission, and 15 control subjects. A thoracic high-resolution computed tomography (HRCT) scan was obtained within a few hours of admission to our hospital. Airway wall dimensions were calculated using a validated method. The images were viewed on a workstation at a magnification of ×5, and measurements of the overall (D) and internal (L) diameters of the bronchi were made using electronic calipers, with wall thickness (T) derived from these measurements (T = (D − L) / 2). The luminal area (Ai, mm²) and total airway wall area (Ao) were calculated from L and D, respectively, using the circle-area formula A = πr². We used both the ratio of airway wall thickness to total diameter (T/D ratio) and the percentage luminal area (LA% = (Ai / (Ao + Ai)) × 100). Results The mean age of the patients was 59 years, and 32 of the patients were men. The mean (SD) diameter of the bronchi measured in SII patients was 3.9 (1.5) mm (range 0.9 to 9.0 mm). There were statistically significant positive associations between wall thickening (expressed as the T/D ratio) and luminal narrowing (expressed as LA%) and both the development of pneumonia (T/D ratio: R² = 0.56, P <0.01 and LA%: R² = 0.19, P = 0.005) and days of mechanical ventilation (T/D ratio: R² = 0.37, P <0.0001 and LA%: R² = 0.32, P <0.001, respectively). No statistically significant associations were identified between the T/D ratio or LA% and the initial P/F ratio, infusion volume in the initial 24 hours, ICU length of stay, or outcome. The mean (SD) T/D ratio and LA% were 0.25 (0.04) and 25.9% (7.6) for patients with SII and 0.35 (0.04) and 44.7% (5.6) for controls. Conclusion We have shown with the use of HRCT scanning on admission that patients with SII have airway wall thickening compared with normal controls. Furthermore, airflow obstruction due to bronchial wall edema was related to the development of pneumonia and the duration of mechanical ventilation in SII patients.
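As an aside, the airway-dimension indices described in the Methods above reduce to a few lines of arithmetic. The sketch below is a hypothetical illustration, not the authors' analysis code; in particular, treating Ao as the wall area (total cross-section minus lumen) is an assumption made so that LA% = Ai/(Ao + Ai) behaves as a percentage luminal area, and the example diameters are invented.

```python
import math

# Illustrative sketch of the T/D ratio and percentage luminal area (LA%)
# described in the abstract above. Ao-as-wall-area is an assumption.

def airway_indices(d_mm: float, l_mm: float) -> dict:
    """d_mm: overall bronchial diameter D; l_mm: internal (luminal) diameter L."""
    t = (d_mm - l_mm) / 2.0                   # wall thickness T = (D - L) / 2
    ai = math.pi * (l_mm / 2.0) ** 2          # luminal area Ai = pi * r^2
    total = math.pi * (d_mm / 2.0) ** 2       # total airway cross-sectional area
    ao = total - ai                           # airway wall area Ao (assumed)
    return {
        "T_mm": t,
        "T_over_D": t / d_mm,                 # T/D ratio
        "LA_percent": 100.0 * ai / (ao + ai), # percentage luminal area
    }

print(airway_indices(d_mm=3.9, l_mm=2.6))
# For this invented bronchus: T = 0.65 mm, T/D ~ 0.17, LA% ~ 44.4
```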
Introduction A semi-upright position (45°) in ventilated patients is recommended to prevent ventilator-associated pneumonia (VAP) and is one of the first steps in progressive early mobility. We studied ventilated intensive care patients in a semi-upright position compared with a supine position to explore whether oxygenation and ventilation improved. Methods We retrospectively studied 60 patients in a mixed medical, surgical and neurological ICU between 2003 and 2007 [1-3]. In this study the effects of the 45° position on peripheral oxygen saturation (SpO2) and end-tidal carbon dioxide (ETCO2) were measured. Body position was changed with a Total Care® bed (Hill-Rom) after results for the supine position (10°) were obtained. Half an hour after the body position was changed, measurements were taken, including SpO2 and ETCO2. Conclusion We demonstrated a significant increase in oxygen saturation and a significant decrease in end-tidal CO2 when the head of the bed was elevated during mechanical ventilation. We believe positional therapy in intensive care patients is very important. The semi-upright position is an easy, effective and safe treatment in ICU patients. This position is effective within the VAP prevention bundle, is the first step in progressive early mobility, and is also effective for oxygenation and ventilation in mechanically ventilated patients. This clinical benefit of head-of-bed elevation >30° should make it a standard of care in mechanically ventilated patients. Since 2009 the semi-upright position has been a standard of care in our hospital.

Introduction The effect of sitting in an armchair on mechanically ventilated patients has not been studied sufficiently. We studied the respiratory pattern, mechanics and work of breathing in a group of patients ready for weaning, both while reclining in bed and after sitting in an armchair. Methods Thirteen patients who had needed mechanical ventilation for 18 (1 to 60) days were studied during volume assist-control mechanical ventilation and spontaneous breathing (O2 T-piece, CPAP or PSV) in both positions. Airway and esophageal pressures and flow were recorded for later analysis. Passive respiratory mechanics were measured by multiple linear regression methods, respiratory drive by the esophageal pressure drop at 0.1 seconds (P0.1), and respiratory effort using the pressure-time product (PTP). Results On controlled mechanical ventilation, respiratory system and chest wall elastance were significantly higher in the sitting than in the reclining position (Ers 39 ± 24 vs. 33 ± 25 cmH2O/l and 10 ± 3 vs. 7 ± 3 cmH2O/l), while respiratory resistances were similar (15 ± 3 vs. 14 ± 5 cmH2O/l/second) in both positions.

In each matched pair, the active-arm patient is judged the winner or the loser, or the pair is a draw. The pair is first compared on mortality; if there is no difference, then ventilator-free days (VFD) are compared. If the outcomes are the same for both endpoints, a draw results. Active versus placebo groups will then be compared using the win ratio, defined as the number of pairs in which the active group was the winner divided by the number of pairs that did not result in a draw. We examined the sample size and power characteristics of the win ratio endpoint using trial simulations. Results Assuming a 15% 28-day mortality rate in the active arm and 20% in the control arm, to have 80% power with a two-tailed 0.05-level test for mortality would require 906 subjects per arm. Under the same assumptions, with a difference in mean VFDs of 3 favoring the active arm (with a common SD of 6), 130 subjects per arm provides 80% power. In approximately 32% of simulations, the win ratio result for each pair was determined by mortality. These simulation results assumed that results within pairs were uncorrelated. If a positive correlation for each endpoint within pairs is assumed, the power for the win ratio endpoint increases. Conclusion The win ratio method is both clinically meaningful and straightforward to explain. This method could provide a new approach to powering both superiority and non-inferiority trials of novel antibiotics. In particular, this method allows for well-powered phase 2 trials, and potentially decreases the required size of phase 3 trials.
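The pairwise logic described in this abstract is compact enough to state as code. The sketch below is a minimal illustration under the abstract's own definition (wins divided by non-drawn pairs; note that the win ratio is elsewhere often defined as wins divided by losses); the function names and patient data are fabricated.

```python
# Illustrative sketch of the hierarchical pairwise comparison described above.

def compare_pair(active, placebo):
    """active/placebo: dicts with 'died' (bool) and 'vfd' (ventilator-free days).
    Returns 'win', 'loss' or 'draw' from the active arm's perspective."""
    if active["died"] != placebo["died"]:
        return "loss" if active["died"] else "win"    # mortality decides first
    if active["vfd"] != placebo["vfd"]:
        return "win" if active["vfd"] > placebo["vfd"] else "loss"
    return "draw"                                     # tied on both endpoints

def win_ratio(pairs):
    """Win ratio as defined in the abstract: wins / non-drawn pairs."""
    results = [compare_pair(a, p) for a, p in pairs]
    wins = results.count("win")
    non_draws = len(results) - results.count("draw")
    return wins / non_draws if non_draws else float("nan")

pairs = [  # fabricated matched active-placebo pairs
    ({"died": False, "vfd": 20}, {"died": True,  "vfd": 0}),   # win (mortality)
    ({"died": False, "vfd": 15}, {"died": False, "vfd": 18}),  # loss (VFD)
    ({"died": False, "vfd": 10}, {"died": False, "vfd": 10}),  # draw
]
print(win_ratio(pairs))  # 0.5
```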
Introduction The primary objective was to assess the impact of failure to obtain a sputum culture (SC) among patients requiring intubation for pneumonia. For patients admitted to an ICU with severe pneumonia, guidelines recommend obtaining a lower respiratory tract sample for culture. Our experience suggested this is rarely ordered from the emergency department (ED). Methods We retrospectively reviewed the charts of all patients admitted through the ED with a diagnosis of pneumonia requiring intubation in the first 24 hours, between January 2011 and November 2012. Patients were classified as SC collected or not collected. We recorded demographic data, SC results, antibiotic choice and de-escalation, ventilator-free days (up to day 14), and ICU and hospital mortality. Inferential statistics were performed using SPSS version 20.0, with P <0.05 considered significant. Results Of the 50 patients we reviewed, 43 (86%) were intubated in the ED, 45 had SC ordered (only eight (18%) by ED physicians), and 37 (74%) had SC collected. There was no difference in age, gender or severity of illness, as measured by APACHE score, between the two groups. ICU mortality was lower in the SC collected group (24% vs. 69%, P = 0.007), as was hospital mortality (30% vs. 77%, P = 0.007), and antibiotics were de-escalated more often (89% vs. 8%, P <0.001). Patients with SC collected showed a trend toward more ventilator-free days (6.5 vs. 0, P = 0.053). Conclusion Sputum cultures were rarely ordered by ED physicians; when SC was not obtained in intubated patients with pneumonia, ICU and hospital mortality were higher, there was less antibiotic de-escalation, and there was a trend toward fewer ventilator-free days. Efforts to improve the collection of sputum cultures in these patients are warranted.

Introduction The aim was to determine the nonventilatory factors that affect the success of noninvasive mechanical ventilation (NIMV) in patients with hypercapnic respiratory failure. Methods A total of 41 patients, followed for at least 96 hours in the ICU between January 2010 and November 2012 with a diagnosis of hypercapnic respiratory failure, were included in this prospective cohort study. Patients with a ≥10 mmHg decrease in PaCO2 in the first 72 hours were classed as successes (Group 1) and those without this decrease as failures (Group 2). Among patients with similar NIMV application characteristics, the effects of age, APACHE II score, infection, bronchospasm (daily respiratory function tests were performed with portable spirometry), heart failure, thyroid dysfunction and physiologic dead-space ventilation (VD/VT) on success were evaluated. In the statistical analysis, the t test, Mann-Whitney U test and regression analysis were used.
Introduction Average volume-assured pressure support (AVAPS) has been developed to ensure a more fixed tidal volume along with the convenience and advantages of pressure support ventilation. In this study we compared AVAPS with BIPAP. Methods Approval was obtained from the hospital's ethics committee for the study. Thirty-three patients over 18 years of age with acute respiratory failure due to medical or surgical causes were included in the study. This study was conducted with the Philips V60 ventilator, which includes both BIPAP and AVAPS modes. The implementation protocol for non-invasive ventilation (NIV) included, firstly, a 2-hour BIPAP application (Period Bi) and then, without interruption, a 2-hour AVAPS application (Period AV). After measuring the basal blood gases, patients were informed of how NIV would be practiced and what function it would have. In BIPAP mode, the ventilator parameters were adjusted as follows: EPAP 5 to 7 cmH2O, IPAP 15 to 20 cmH2O. Patient comfort was analyzed with a scale ranging from 0 to 2 (0: compatible, 1: medium-compatible, 2: noncompatible). During BIPAP ventilation, arterial blood gases (pH, pO2, pCO2 and SpO2), the comfort scale and hemodynamic data were recorded at the 30th minute, first hour and second hour. After the patient had been monitored for 2 hours in BIPAP mode, the mode was changed to AVAPS by setting the required rates without removing the mask. The settings for AVAPS were adjusted as follows: EPAP 5 to 7 cmH2O, Pmin to Pmax 10 to 25 cmH2O, tidal volume 6 to 8 ml/kg. As in BIPAP mode, we analyzed and recorded arterial blood gases, the comfort scale and hemodynamic parameters at the 30th minute, first hour and second hour. In cases of agitation preventing NIV, patients were sedated with dexmedetomidine. Results When patients were analyzed according to body mass index (BMI), the pH and pCO2 values of patients with BMI ≤30 showed a greater improvement at all three measurements with AVAPS than with BIPAP (P <0.001). When patient compliance was examined, the number of patients regarded as comfortable in the BIPAP period was 20 (66.7%), whereas this figure was 25 (83.3%) for AVAPS. Conclusion Patient comfort was higher and the need for sedation was lower with AVAPS. According to the results obtained from this study, AVAPS had positive effects on pH, blood gas values and patient comfort; it can therefore be used with confidence in clinical practice.

Oxygenation index outperforms the P/F ratio for mortality prediction (K Davies, C Bourdeaux, T Peiris, T Gould)

Introduction The P/F ratio is widely used clinically, and as part of research, to categorise the severity of respiratory failure [1]. However, no account is taken of an important determinant of oxygenation: mean airway pressure (MAP). The oxygenation index (OI) incorporates the MAP and has been suggested as a more accurate means of determining the severity of respiratory failure [2]. In addition, the optimal time for this assessment is unclear. We sought to answer these questions by analysing a large database of patient and ventilator data. Methods The ICU of the Bristol Royal Infirmary has used an electronic clinical information system (CIS) since 2008, with every hour of care available for analysis as a result. Ventilated patients were identified, the P/F ratio and OI were calculated, and the worst values of each were determined for four time periods (the first 12, 24, 36 and 48 hours of ventilation).
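A short sketch of the two indices being compared may be useful here. The abstract does not spell out its formulas, so the standard definitions are assumed: P/F = PaO2/FiO2 and OI = 100 × FiO2 × MAP / PaO2; the hourly values below are invented, and "worst" is taken to mean the lowest P/F and the highest OI within a window.

```python
# Minimal sketch of the P/F ratio and oxygenation index, assuming the
# standard definitions (not stated explicitly in the abstract).

def pf_ratio(pao2_mmhg: float, fio2: float) -> float:
    return pao2_mmhg / fio2

def oxygenation_index(pao2_mmhg: float, fio2: float, map_cmh2o: float) -> float:
    # MAP = mean airway pressure in cmH2O, PaO2 in mmHg, FiO2 as a fraction
    return 100.0 * fio2 * map_cmh2o / pao2_mmhg

# "Worst" values over a time window: lowest P/F, highest OI.
hourly = [  # (PaO2 mmHg, FiO2, MAP cmH2O) -- invented example data
    (75, 0.60, 14), (82, 0.55, 13), (68, 0.70, 16),
]
worst_pf = min(pf_ratio(p, f) for p, f, _ in hourly)
worst_oi = max(oxygenation_index(p, f, m) for p, f, m in hourly)
print(f"worst P/F = {worst_pf:.0f}, worst OI = {worst_oi:.1f}")
# worst P/F = 97, worst OI = 16.5
```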
Logistic regression analysis was used to create models to predict unit and hospital mortality. Results Data for over 150,000 hours of care in 4,886 patients were available for analysis. Excluding nonventilated patients and those transferred, ventilated, from another ICU, 2,156 patients provided data. In comparison with survivors, nonsurvivors were older, with higher OI and 24-hour SOFA scores and lower P/F ratios. The optimal time for calculation of both the OI and the P/F ratio for mortality prediction was the first 12 hours of ventilation. The models using the worst OI were better predictors of ICU and hospital mortality than those using the worst P/F ratio (area under the receiver operating characteristic curve 0.840 vs. 0.822). Conclusion Our analysis suggests that the OI is a more sensitive descriptor of the severity of respiratory failure than the P/F ratio and that this calculation should be performed using data from the first 12 hours of ventilation.

Introduction Weaning from mechanical ventilation (MV) accounts for about 40 to 50% of the total duration of MV. Moreover, there is no single reliable parameter indicating that a patient will tolerate extubation safely. The rapid shallow breathing index (RSBI) is relatively the best predictive parameter for the initial assessment of readiness for discontinuation of MV support. However, the RSBI is validated during T-tube ventilation, and in clinical practice it is not always possible to perform this assessment. In this study, we aimed to determine the MV mode and pressure combinations that predict an RSBI closest to the values calculated during spontaneous ventilation, and so to estimate patients' readiness for weaning before a T-tube trial. Methods Measurements during spontaneous ventilation were performed in all patients before the T-tube trial with the COSMOPLUS (Novometrix) device, which provides both capnography and respiratory monitoring; the other measurements were performed with the ventilators. Results The mean age of the study group was 73 ± 10 years; 11 patients were female and the mean APACHE II score was 19 ± 6. The RSBI did not differ significantly between the spontaneous mode and the other combinations, but the best correlation with the spontaneous mode was found with PS:5 PEEP:0 (P = 0.0001, r = 0.719), and the worst with the PS:0 PEEP:5 combination. The RSBI calculated in each combination showed no predictive value for weaning success. The respiratory rate (f) was higher in the SBT failure group than in the SBT success group. When measured at the PS:0 PEEP:5 and PS:5 PEEP:0 combinations, the threshold value of f was found to be 27/minute (P = 0.03). Conclusion Although there was a correlation between the RSBI measured on the T-tube and the RSBI measured in different mode and pressure combinations, especially the PS:5 PEEP:0 combination, a threshold value for RSBI to predict SBT success cannot be determined during MV.

Previous studies showed that noninvasive neurally adjusted ventilatory assist (nNAVA) improves patient-ventilator interaction and synchrony. More recently we described a new setting for nNAVA (nNAVA15), able to reduce the peak electrical activity of the diaphragm (EAdipeak) and dyspnea (assessed by a visual analogue scale, VASd) compared with both nPSV and nNAVA in patients undergoing NIV through a helmet, by improving the pressurization rate. We therefore designed a physiological study to evaluate and compare the effects of nNAVA15, nPSV and nNAVA on VASd, EAdipeak, pressurization rate and arterial blood gases (ABGs).
Methods Fourteen patients undergoing noninvasive ventilation because of acute respiratory failure underwent three randomized 30-minute trials: nPSV (inspiratory support above positive end-expiratory pressure (PEEP) ≥8 cmH2O, fastest rate of pressurization); nNAVA (NAVA level set to achieve an EAdipeak comparable with that during nPSV); and nNAVA15 (NAVA level at 15 cmH2O/μV with the maximum inspiratory airway pressure (Paw) set at the value corresponding to PEEP + inspiratory support during nPSV). The inspiratory oxygen fraction and PEEP remained unmodified throughout the study period. The last minute of each trial was analyzed. Paw-time products of the initial 200 ms from the onset of ventilator pressurization (PTP200), of the initial 300 and 500 ms from the onset of the EAdi swing indexed to the ideal PTP (PTP300i and PTP500i, respectively), and of the triggering area (PTPt) were computed. ABGs and VASd were assessed at the end of each trial. Conclusion Compared with nPSV and nNAVA, nNAVA15 through a mask reduces VASd, assuring an optimal pressurization rate and triggering performance, without affecting the breathing pattern, neural drive or ABGs.

Introduction Neurally adjusted ventilatory assist (NAVA) has so far been used in minimally sedated intensive care patients. NAVA has not been applied to patients in the operating room, and the effect of different sedatives/anesthetics on the electrical activity of the diaphragm (Edi) has not yet been studied. The aim of our study was to compare the effects of sevoflurane and propofol on the Edi signal and breathing pattern during sedation and anesthesia, alone and in combination with remifentanil. Methods A randomized cross-over study comparing sevoflurane and propofol sedation and anesthesia in 10 juvenile pigs. Remifentanil 0.1 μg/kg/minute was added after a period of anesthetic agent administration. The animals were ventilated with NAVA at a fixed level throughout the study. Respiratory variables were measured during the last 5 minutes of each 15-minute exposure. Results The Edi signal and spontaneous breathing were preserved with both anesthetics. Breathing variability, expressed as the coefficient of variation (SD/mean) of the tidal volume (CVvt), was high with both drugs. CVvt was greater with propofol than with sevoflurane (32 vs. 18% during sedation and 23 vs. 14% during anesthesia). The frequency of sighs was higher with propofol than with sevoflurane, both during sedation (29 vs. 12 sighs/hour) and during anesthesia (21 vs. 1 sighs/hour). Conclusion NAVA can be applied during propofol and sevoflurane anesthesia in pigs, with well-preserved Edi and spontaneous breathing. The natural variability is maintained with NAVA even during anesthesia. In contrast to sevoflurane, propofol sedation and anesthesia are associated with a high frequency of sighs and post-sigh apneas, probably due to a centrally induced mechanism. Our data warrant studies of NAVA in humans undergoing anesthesia and surgery when neuromuscular blockade is not required.

In this single-center, parallel-group trial, we randomly assigned adult patients presenting signs of deficient gas exchange (PaO2/FiO2 <250 at a PEEP of 5 cmH2O) in the immediate postoperative period to either intensive alveolar recruitment or a standard protocol, both using low tidal volume ventilation (6 ml/kg ibw).
Our hypothesis was that an aggressive alveolar recruitment protocol would translate into better lung compliance, better gas exchange, fewer pulmonary complications and a reduced length of hospital stay when compared with the control group. Results A total of 320 patients were enrolled in the study, 163 patients in the standard protocol group and 157 in the intensive alveolar recruitment group. Patients in the interventional group presented a lower incidence of pneumonia than patients in the control group (5 (3.3%) vs. 19 (22%), P = 0.004). The length of hospital stay was shorter among patients receiving intensive alveolar recruitment than among those receiving standard care (10.9 (9.9 to 11.9) vs. 12.4 (11.3 to 13.6) days; P = 0.045). There was no difference between groups in extrapulmonary complications or mortality. Conclusion In this trial, an intensive alveolar recruitment protocol associated with a protective mechanical ventilation strategy reduced pulmonary complications and the length of hospital stay in patients undergoing cardiac surgery (NCT01502332).

Introduction We used transthoracic endoscopy [1] to continuously record images of subpleural alveoli during recruitment manoeuvres and at different steady plateau pressures between the manoeuvres. Methods We investigated two groups of 13 rats each. Animals were ventilated with ZEEP or a PEEP of 5 mbar, an FiO2 of 1.0 and a tidal volume of 10 ml/kg. A double low-flow manoeuvre was designed, consisting of two consecutive low-flow manoeuvres with a peak pressure of 30 mbar, interrupted by a 5-second plateau phase at different pressures (2, 4, 8 and 12 mbar). Alveolar size at the peak pressures and during the plateau levels was analyzed from the recorded videos frame by frame: the outlines of 10 alveoli were marked manually and the outlined areas were calculated [2]. Compliance of tidal breaths before and after the manoeuvres was calculated as two-point compliance. Results In both groups, analysis of the alveolar area revealed that alveolar size at the second peak of the manoeuvre did not differ significantly from that at the first peak (100.97% in the ZEEP group, 102.37% in the PEEP group). During the plateau phases there was a slight increase in alveolar size at higher plateau pressures (slope of linear regression at a plateau of 4 mbar: 0.1%/500 ms for the ZEEP group, 0.18%/500 ms for the PEEP group; at a plateau of 8 mbar: 0.42%/500 ms for the ZEEP group, 0.565%/500 ms for the PEEP group). After the manoeuvres, compliance increased to 137.73% in the ZEEP group and 119.91% in the PEEP group. Conclusion In the healthy lung, once recruited, alveoli stay stable in size over wide pressure ranges. Further recruitment manoeuvres do not lead to a further increase of alveolar size, but do increase compliance. During plateau phases, alveolar size increases depending on pressure. This leads to the assumption that recruitment is not only pressure dependent but also time dependent.

Introduction Lung-protective mechanical ventilation requires positive end-expiratory pressure (PEEP) and tidal volume (VT) to be chosen with regard to the individual state of the lung. The shape of the intratidal compliance-volume profile might reflect the state of the lung (atelectatic, open, overdistended) and could therefore be classified into shape categories that translate into PEEP suggestions [1]. Intratidal resistance-volume profiles might reflect intratidal opening and collapse of the lung [2].
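To make the idea of slope-based shape classification concrete, the toy sketch below fits a line to an intratidal compliance-volume profile and maps the sign of the normalized slope to a PEEP suggestion. This is a plausible reading of the cited approach rather than the authors' published six-category scheme; the function, threshold and profile data are all fabricated.

```python
import numpy as np

# Toy sketch: classify an intratidal C(V) profile by its slope sign.
# The mapping from slope to PEEP advice is an assumption, not the
# published classification scheme.

def classify_cv_profile(volume_l, compliance_ml_cmh2o, tol=0.05):
    """Fit a line to C(V); the normalized slope picks the category."""
    slope, _ = np.polyfit(volume_l, compliance_ml_cmh2o, 1)
    rel_slope = slope * (volume_l[-1] - volume_l[0]) / np.mean(compliance_ml_cmh2o)
    if rel_slope > tol:    # compliance rising within the breath
        return "increasing C(V): intratidal recruitment -> consider raising PEEP"
    if rel_slope < -tol:   # compliance falling within the breath
        return "decreasing C(V): overdistension -> consider lowering PEEP/VT"
    return "horizontal C(V): breath on the linear part -> keep settings"

v = np.linspace(0.0, 0.5, 20)   # volume above end-expiratory level, litres
c = 40 + 20 * v                 # fabricated rising profile, ml/cmH2O
print(classify_cv_profile(v, c))
```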
Using respiratory data from an animal study, we suggest a classification into resistance shape categories based on the slope of the R(V) profiles. Methods Fifteen pigs with lavage-induced lung damage were ventilated at two PEEP levels (0 and 12 mbar) and three tidal volumes (8, 12 and 16 ml/kg bodyweight). Compliance (C(V)) and resistance (R(V)) profiles for each individual animal and ventilation setting were calculated from respiratory data using the gliding-SLICE method [3]. C(V) profiles were associated with one of the six suggested shape categories. The dependency of the mean R(V) slope of all animals on the ventilation setting was used as the basis for classification into resistance shape categories. Resistance shape categories were compared with compliance shape categories for each individual animal to test whether similar PEEP suggestions result from both methods. Results Small PEEP and VT were typically associated with increasing C(V), and decreasing C(V) corresponded to large PEEP and VT. A classification of each C(V) profile into one of six compliance shape categories was possible. The shapes of the R(V) profiles of individual animals were remarkably similar. The R(V) slope was typically largest for the PEEP and VT setting at which derecruitment was likely, and smallest where overdistension was likely. Based on this, a classification scheme was defined.

It is possible that lung inhomogeneities act as stress raisers within the lung parenchyma, locally multiplying pressures. In a healthy lung the pleural surface, vessels and bronchi are detected as natural lung inhomogeneities. We studied the development of VILI with CT scanning to determine where the first lung lesions developed. Methods Piglets were sedated, orotracheally intubated and instrumented with arterial and central venous catheters and a urinary catheter. The whole study was performed in the animal CT scan facility, which was equipped as an ICU, and a CT scan was performed every 3 hours or whenever respiratory parameters (plateau/peak pressure) changed. We defined as new lesions the appearance of poorly inflated/noninflated lung regions not present on the previous CT image. We selected the first CT scan in which a relevant number of new lesions (>15) appeared and manually delineated the lesions; the lesions were classified as close or not close to the pleural surface. Results Six piglets were studied (22 ± 8 kg), ventilated with a TV of 750 ± 71 ml (41 ± 1 ml/kg) up to the development of VILI, defined radiologically as infiltrates present in all pulmonary fields on the CT scan plus the development of lung edema. In the first CT scan where lesions appeared, a median of 28 lesions (IQR 22 to 30) were present. Of these lesions, 18 (17 to 22) (72%) were located near the pleura and nine (6 to 11) (28%) near vessels/bronchi. See Figure 1. Conclusion In an experimental model of VILI, the first lung lesions appear below the pleural surface. Multiple, non-mutually exclusive explanations are possible: the pleural surface acts as a stress raiser; the mechanical friction of the lung against the ribs at very high tidal volume leads to parenchymal injury; and the lung skeleton is a fan-like structure starting from the hilum and going to the pleural surface, leading to increased stress/strain in the subpleural regions.

Introduction During mechanical ventilation, some of the energy delivered to the respiratory system (RS) is dissipated within it, while some is recovered during expiration.
The amount of unrecovered energy represents mechanical work done on the RS by the ventilator and may be related to the development of ventilator-induced lung injury (VILI). The unrecovered energy is measured as the hysteresis area of the pressure-volume (PV) curve under static and dynamic conditions. We explored how and where this energy is dissipated inside the RS. Methods In five piglets (weight 21 ± 2 kg) under general anesthesia, we recorded PV curves to quantify the dynamic dissipated energy (DE) at increasing tidal volumes (TV) (150, 300, 450, 600, 750 and 900 ml) and at increasing respiratory rates (RR) (3, 6, 9, 12 and 15/minute). We then recorded PV curves for the same TVs inflated with a super-syringe (100 ml), to quantify the static DE. We also quantified the airway DE by connecting the postmortem isolated tracheobronchial tree to the ventilator and recording PV curves at the respiratory settings previously described.

Introduction CT-scan quantitative analysis (qCT) represents the gold standard for assessing lung aeration and recruitment in ARDS patients. Lung ultrasound (LUS) has been proposed as a bedside, nonirradiating alternative to assess lung recruitability, identifying patients who may benefit from higher PEEP levels. We compared the two methods in the assessment of PEEP-induced changes in lung aeration. Methods LUS and whole-lung CT scans were performed on sedated, paralyzed, mechanically ventilated ARDS patients at PEEP 5 and 15 cmH2O. LUS was performed considering six areas for each lung, with a comprehensive scan of the intercostal spaces in each area. We assigned to each area an aeration score [1]: 0 (normal lung), 1 (≥3 noncoalescent B-lines), 2 (≥3 coalescent B-lines), 3 (consolidation). A cumulative LUS score (LUSS, ranging from 0 to 36 for the two lungs) was obtained as the sum of all areas' individual scores, each area's score being the average of all pertaining LUS findings. LUS recruiters upon PEEP increase from 5 to 15 cmH2O were defined by the switch of at least three areas to well aerated (area score 0). LUS-based assessment of lung aeration and lung recruitability was compared with qCT findings. Results A LUSS ≤20 corresponded to 34 ± 13% of nonaerated tissue at qCT; a LUSS >20 (n = 6) to 48 ± 18% (P <0.05). A good linear correlation was found between the reduction at LUS of consolidated areas (area score 3) and the reduction of qCT nonaerated volume (R² = 0.66), and between the reduction at LUS of poorly aerated areas (area scores 1 to 2) and the reduction of qCT poorly aerated volume (R² = 0.74). A change at LUS of at least three areas to well aerated (LUS recruiters, n = 4) corresponded to a qCT increase in well-aerated lung of 788 ± 262 g versus 431 ± 35 g in the LUS nonrecruiter group (n = 3) (P <0.05). Conclusion These preliminary data suggest that LUS could be an accurate tool to assess lung aeration and recruitment at the bedside, avoiding the risks and workload related to the use of CT scanning.

Introduction Growing evidence suggests that, as long as total lung capacity is not exceeded, dynamic (that is, tidal volume, VT) lung deformation is more injurious than static (that is, positive end-expiratory pressure, PEEP) deformation [1]. Because the lung behaves like a viscoelastic body [2], hysteresis may play a role in the development of ventilator-induced lung injury. The aim of the study was to investigate the effects of increasing VT or PEEP on lung hysteresis.
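Because the dissipated energy in these studies is measured as the hysteresis area of the PV curve, one concrete way to compute it from sampled data is the shoelace formula for the area enclosed by the loop. The sketch below is illustrative only: the loop coordinates are fabricated, and with pressure in cmH2O and volume in litres the area comes out in cmH2O·l (roughly 0.098 J each).

```python
# Illustrative sketch: hysteresis (enclosed) area of a sampled PV loop,
# computed with the shoelace formula. Fabricated data, not from the studies.

def hysteresis_area(pressure_cmh2o, volume_l):
    """Enclosed area of a closed PV loop (points in acquisition order)."""
    n = len(pressure_cmh2o)
    area2 = 0.0
    for i in range(n):
        j = (i + 1) % n  # wrap around to close the loop
        area2 += pressure_cmh2o[i] * volume_l[j] - pressure_cmh2o[j] * volume_l[i]
    return abs(area2) / 2.0

# Inspiratory limb at higher pressure than the expiratory limb for each volume
insp = [(5, 0.0), (12, 0.15), (18, 0.3), (25, 0.45)]
exp_ = [(20, 0.45), (12, 0.3), (8, 0.15), (5, 0.0)]
p, v = zip(*(insp + exp_))
print(f"dissipated energy ~ {hysteresis_area(p, v):.2f} cmH2O*l")  # ~1.88
```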
Methods In eight healthy piglets we measured total hysteresis and the peak inspiratory pressure (Ppeak) while randomly increasing VT (with no PEEP) or PEEP (with fixed VT). P1 was extrapolated from the drop in airway pressure during an end-inspiratory pause [3]. Hysteresis attributable to the lung parenchyma was computed as: total hysteresis − ((Ppeak − P1) × VT). Results The main findings are shown in Figure 1. P values refer to one-way repeated-measures analysis of variance. Conclusion Lung hysteresis increases with VT, but not with PEEP. Further studies are needed to prospectively evaluate the role of lung hysteresis in the pathogenesis of ventilator-induced lung injury.

Introduction Mechanical ventilation (MV) aims to enhance blood oxygenation and to remove carbon dioxide. However, excessive hyperinflation by MV may cause lung injury. Methods Eighteen cats (4 ± 1 kg) were anesthetized with propofol (loading dose of 6 mg/kg and constant-rate infusion of 0.5 mg/kg/minute) and neuromuscular blockade was achieved with rocuronium at 1 mg/kg/minute. Their lungs were initially ventilated mechanically at an FiO2 of 40% with a peak inspiratory pressure (Ppeak) of 5 cmH2O for 20 minutes; Ppeak was then increased in 5-cmH2O increments every 5 minutes up to 15 cmH2O. Following that, Ppeak was decreased by 2 cmH2O every 5 minutes until a Ppeak of 5 cmH2O was reached. The ventilator maintained the respiratory rate and inspiratory time at 15 breaths/minute and 1 second, respectively. Between the Ppeak increments, we applied a 4-second pause for a 5-mm computed tomography (CT) scan of the thorax. The radiographic attenuation (in Hounsfield units, HU) was classified as over-insufflation (−1,000 to −900 HU), normal insufflation (−900 to −500 HU) or atelectasis (−500 to −100 HU). We split the lungs into three proportional gravitational zones (I, II and III) from apex to base. Results The three zones presented increasing over-insufflated areas and decreasing areas of normal insufflation as Ppeak rose from 5 to 15 cmH2O. At 5 cmH2O, the areas of over-insufflation and normal insufflation in zones I, II and III were 13% and 36%; 4% and 22%; and 0.7% and 15%, respectively. At 15 cmH2O, the areas of over-insufflation and normal insufflation in zones I, II and III were 74% and 55%; 81% and 57%; and 82% and 71%, respectively. Conclusion The higher proportion of overly distended pulmonary areas at high Ppeak may increase the risk of lung injury. The lowest Ppeak (5 cmH2O) showed the least potential for lung injury, with larger areas of normal insufflation and smaller areas of over-insufflation in all gravitational zones.
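The attenuation-based classification used in the preceding abstract amounts to binning voxel values into HU ranges. The sketch below assumes the conventional negative Hounsfield ranges (the minus signs were evidently lost in the printed text) and uses fabricated voxel values; it is not the authors' image-analysis pipeline.

```python
from collections import Counter

# Illustrative sketch: bin CT voxels into the aeration categories used above,
# assuming the conventional negative HU ranges.

def classify_voxel(hu: float) -> str:
    if -1000 <= hu < -900:
        return "over-insufflation"
    if -900 <= hu < -500:
        return "normal insufflation"
    if -500 <= hu < -100:
        return "atelectasis"
    return "unclassified"

voxels = [-950, -720, -880, -300, -450, -120, -980]  # fabricated HU values
counts = Counter(classify_voxel(hu) for hu in voxels)
total = sum(counts.values())
for label, n in counts.items():
    print(f"{label}: {100 * n / total:.0f}% of voxels")
```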
At admission and at 3 to 5 days we studied lymphocyte subpopulations by one-parameter immunophenotyping using FITC-labeled CD3, CD4, CD8, CD14, CD19, CD34, CD56, CD69, CD71 and CD95 monoclonal antibodies (Immunotech Beckman Coulter, USA), and the relative content of lymphocytes in early and late apoptosis using FITC-labeled Annexin V and propidium iodide (PI) labeled with PE (Saltag, USA), with results read on a Beckman Coulter Epics XL cytometer (USA). The statistical power of the study was 80% (α ≤0.05). Results In Group I, relative to Group II, at 3 to 5 days we registered an increase in mature monocytes (CD14), 23.1 ± 0.8% (P <0.05); a reduction in the relative content of CD69, 3.8 ± 0.21%; and reductions in lymphocyte apoptosis: Annexin V-FITC+/PI−, 7.12 ± 0.46%, and Annexin V-FITC+/PI+, 0.79 ± 0.07% (P <0.001). The duration of mechanical ventilation was 4.1 ± 1.4 days (P <0.05). All patients survived. None of the patients showed clinical or laboratory evidence of adverse effects of inhaled nitric oxide. In Group II seven newborns died, and the duration of mechanical ventilation in survivors was 18 ± 3.4 days. Conclusion Inhaled nitric oxide activates monocyte-macrophage immunity, stabilizes T-lymphocyte apoptosis, and reduces mortality and the duration of mechanical ventilation in newborns with respiratory diseases on mechanical ventilation. Introduction Sepsis-induced diaphragm dysfunction (SIDD) has been widely described in the literature as a condition affecting the diaphragm muscle, characterized by loss of contractile function and associated with a high mortality, assessed at around 54% [1]. Previous studies have investigated the pathogenesis of ventilator-induced diaphragmatic dysfunction (VIDD), its lipid metabolic alterations [2] and microcirculatory processes. This study was designed to investigate the effects of LPS-induced endotoxemia on the diaphragm muscle in rabbits undergoing two different modes of mechanical ventilation. Methods A prospective randomized animal study in 25 invasively monitored and mechanically ventilated New Zealand White rabbits. The rabbits were randomized to control (n = 5), controlled mechanical ventilation (CMV) (n = 5), pressure support ventilation (PSV) (n = 5), or CMV or PSV with LPS-induced endotoxemia (CMV-LPS and PSV-LPS, respectively; n = 5 each). Endotoxemia was induced by LPS injection in the CMV-LPS and PSV-LPS groups. Rabbits were anesthetized and ventilated for 24 hours, except for the control group (30 minutes). A catheter able to detect the electrical activity of the diaphragm was placed to evaluate diaphragm contractility at baseline and after 6, 12 and 24 hours. After 24 hours we evaluated: the diaphragm microcirculation, assessed by sidestream dark-field videomicroscopy; the mitochondrial membrane potential; lipid accumulation; and the diaphragm muscle fiber structure. Results In endotoxemic animals, after 24 hours, diaphragm contractility and fiber structure, the microcirculation, the mitochondrial membrane potential and lipid accumulation were severely compromised, but not in the CMV and PSV groups. Moreover, a slight but significant increase in lipid accumulation was observed in the CMV and PSV groups in comparison with control (P <0.05). Conclusion In endotoxemic rabbits, the impaired microcirculation resulted in increased lipid accumulation and in disturbance of the mitochondrial membrane potential and contractility of the diaphragm.
No microvascular alterations were observed in ventilated non-endotoxemic animals. Moreover, diaphragm contractility dysfunction was more pronounced in endotoxemic animals. Conclusion This study adds to the literature on the use of HFOV in H1N1 patients. We managed to replicate some of the existing evidence with respect to population age [1,2] and the incidence of mortality. Whilst P/F ratios improved on initiation of HFOV, these patients subsequently had long critical care and hospital stays. There is uncertainty regarding the use of HFOV in ARDS, but it may be a valuable treatment for H1N1 patients. The ventilation strategies employed and their subsequent consequences for H1N1 patients require further evaluation. Introduction The aim was to study EIT as a monitoring tool for tidal ventilation (TV) redistribution after switching from volume-controlled ventilation (VCV) to airway pressure release ventilation (APRV) in patients with severe ARDS. Methods Six patients with severe ARDS and Pplat ≥30 cmH2O were included in the study. Patients ventilated with the ARDSnet strategy were subjected to EIT analysis. Regional TV distribution was monitored by an EIT system (PulmoVista 500®; Dräger Medical GmbH, Lübeck, Germany), dividing the lung field into four same-size regions of interest. Introduction Levels of cell-free DNA (cfDNA) increase and have prognostic value in sepsis [2]. In the present study, the effect of tidal volume and PEEP on arterial and transorgan levels of cfDNA was investigated in a porcine postoperative sepsis model. Methods Two groups of anaesthetised pigs were ventilated with either protective ventilation (VT 6 ml/kg, PEEP 10 cmH2O; n = 20) or control settings (VT 10 ml/kg, PEEP 5 cmH2O; n = 10) for 7 hours. An artery, the hepatic vein, the portal vein and the jugular bulb were catheterized. Continuous endotoxin infusion at 0.25 μg/kg/hour for 5 hours was started after 2 hours of laparotomy that simulated a surgical procedure. Results The group receiving protective ventilation showed lower levels of cfDNA in arterial blood compared with controls (P = 0.02). Transhepatic levels of cfDNA were higher than trans-splanchnic levels during the experiment (P = 0.02), but this effect was attenuated in the group receiving protective ventilation. No difference between the groups was detected in blood samples from the jugular bulb. Conclusion In experimental postoperative sepsis, protective ventilation suppresses arterial levels of cfDNA. The liver seems to be a significant contributor to systemic cfDNA levels, an effect that is suppressed during protective ventilation. Introduction HFOV is a promising rescue therapy for refractory hypoxia in severe ARDS. Methods This is a retrospective comparative study. We retrieved data for all patients with H1N1 influenza-related severe ARDS treated in the ICU between October 2009 and April 2013. Our ICU had only one HFOV machine during the pandemic. Patients were divided into two groups: the HFOV group (received HFOV at first eligibility) and the conventional lung protective ventilation (CLPV) group (did not receive HFOV at first eligibility owing to nonavailability of HFOV). Eligibility criteria for rescue therapy by HFOV were: P/F ratio ≤100; PEEP needed above 12 cmH2O; Pplat ≥30 cmH2O on CLPV. There was no selection or omission bias in HFOV application: HFOV was applied to the first eligible patient.
Patient demographic data, laboratory parameters, hemodynamic variables, and oxygenation and ventilator settings were recorded while on CLPV at first HFOV eligibility in all patients. Results The 43 patients who met the rescue therapy criteria were grouped into the HFOV group (24 patients) and the CLPV group (19 patients) according to the modality of ventilation received after first satisfying the HFOV eligibility criteria. The groups were compared using Fisher's exact test for qualitative variables and ANOVA for quantitative variables. Introduction The importance of protective lung ventilation in reducing mortality in adult respiratory distress syndrome (ARDS) patients is well described [1]. However, suboptimal compliance with the recommended tidal volumes has been reported [2]. We therefore wished to assess our compliance in adhering to protective lung ventilation in patients with, or at risk of developing, ARDS. Methods A retrospective review was done of all mechanically ventilated patients in the ICU of Tallaght Hospital over a 6-month period (February to July 2013). Hourly tidal volumes were recorded. Conclusion Compliance with protective lung ventilation in our ICU is suboptimal. This may be due to the lack of education and guidelines in the unit regarding protective lung ventilation. Moreover, accurate recording of patient height and determination of predicted body weight should be documented to facilitate accurate tidal volume calculation and protective lung ventilation (the arithmetic is sketched below). Introduction During mechanical ventilation, expiration occurs passively and is determined by the recoil forces of the respiratory system. In an experimental study in pigs we found that linearization of the expiratory flow via FLow-controlled EXpiration (FLEX) [1] is lung protective [2]. Utilizing electrical impedance tomography (EIT), we aimed to investigate the mechanisms underlying the lung-protective effects of FLEX. Methods All experiments were approved by the local animal welfare committee. Twelve pigs with oleic acid-induced lung injury were ventilated in the volume-controlled mode (VCV). In six animals, expiratory flow was linearized via FLEX. PEEP was set to achieve a similar mean airway pressure in the control group (n = 6) and the FLEX group (n = 6). Using EIT, the local distribution of ventilation was measured and alveolar derecruitment during the no-flow phase in late expiration was quantified. Results During ventilation with FLEX, the no-flow phase in late expiration was reduced by 50% compared with passive expiration. Derecruitment during the no-flow phase was clearly reduced by FLEX compared with VCV. Furthermore, intratidal ventilation was more homogeneously distributed during ventilation with FLEX than during conventional passive expiration. Conclusion In comparison with conventional VCV with passive expiration, the no-flow phase in late expiration is reduced, and so is the time the lung persists at the lowest pressure level (PEEP) during the breath. The reduced low-pressure time is associated with reduced end-tidal derecruitment. In a lung mechanically stabilized and recruited by sustained airway pressure throughout the expiratory phase, the distribution of ventilation is more homogeneous. These mechanisms of alveolar recruitment maintenance can explain the lung-protective effects of FLEX.
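As flagged in the protective-ventilation compliance abstract above, the 6 ml/kg tidal volume target rests on predicted body weight (PBW). The sketch below uses the widely published ARDSNet PBW equations; the function names and the worked example are illustrative only, not taken from that audit.

```python
# Sketch of the ARDSNet predicted-body-weight formulas behind a 6 ml/kg
# lung-protective tidal volume target:
#   male:   PBW (kg) = 50.0 + 0.91 * (height_cm - 152.4)
#   female: PBW (kg) = 45.5 + 0.91 * (height_cm - 152.4)

def predicted_body_weight(height_cm, male):
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

def protective_tidal_volume(height_cm, male, ml_per_kg=6.0):
    """Target tidal volume (ml) for lung-protective ventilation."""
    return ml_per_kg * predicted_body_weight(height_cm, male)

# A 170 cm man has a PBW of ~66 kg, so ~396 ml at 6 ml/kg.
print(round(protective_tidal_volume(170, male=True)))
```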
Introduction Numerous parameters have been suggested for predicting weaning from mechanical ventilation; however, these parameters have had limited success in predicting weaning outcome. The aim of this study was to assess the value of peak flow rates (the spontaneous peak inspiratory flow rate (SPIF) and spontaneous peak expiratory flow rate (SPEF)) measured during a spontaneous breathing trial (SBT) in predicting weaning outcome. Results are summarized in Figure 1. Conclusion A positive fluid balance (FB) in the 48 hours preceding the SBT predicted weaning failure (WF) in COPD individuals. We recognize that no intervention was performed to accelerate the weaning process. Brain natriuretic peptide levels were not available. Introduction The rapid shallow breathing index (RSBI) is considered a good parameter to predict mechanical ventilator liberation. We hypothesized that the RSBI provides no benefit when clinical readiness criteria are met. Methods Adults with acute respiratory failure who required MV for more than 24 hours, excluding COPD patients, were assessed daily with a liberation protocol (Figure 1). During the RSBI step, the RSBI was recorded and blinded to the researcher. The liberation process was continued regardless of the RSBI result. The primary outcome was the success rate of mechanical ventilator liberation with or without the RSBI. Results We analysed 120 cases, with clinical characteristics as presented in Table 1. There was no statistically significant difference between using clinical readiness alone and using clinical readiness plus the RSBI (92% vs. 89%, P = 0.43). Conclusion The inclusion of the RSBI (sketched below) in our standard mechanical ventilator liberation protocol for patients who met the clinical readiness criteria did not significantly increase the success rate of mechanical ventilator liberation. Introduction VAP is confirmed by positive microbiology in approximately one-third of patients with suspected VAP [2], implying that there is scope to improve antibiotic stewardship. In a single-centre study, bronchoalveolar lavage fluid (BALF) inflammatory mediators (in particular interleukin-1 beta (IL-1β)) [3] and neutrophil proteases [4] demonstrated potential as biomarkers to exclude VAP. We aimed to validate these findings in a multicentre study. Methods We conducted a prospective, multicentre observational study of 167 patients with clinically suspected VAP from 12 ICUs across the UK. VAP was confirmed by growth of a potential pathogen in BALF at >10^4 colony-forming units/ml. IL-1β, IL-8, matrix metalloproteinase-8 (MMP-8), MMP-9 and human neutrophil elastase (HNE) were measured in BALF by cytometric bead array. IL-6, IL-8, MMP-8, MMP-9 and HNE were measured in serum. Patients were dichotomised into VAP and non-VAP groups. Conclusion This study demonstrates that IL-1β effectively excludes VAP when validated in a multicentre study. The performance is further improved by the addition of IL-8, and the combination could form a rapid diagnostic assay to exclude VAP. Biomarker analysis appears to have the potential to improve antibiotic stewardship, and this concept should be formally tested in the setting of a randomised controlled trial. Introduction Ventilator-associated pneumonia (VAP) is a frequently occurring nosocomial infection in ICU patients and has been associated with increased morbidity, prolonged duration of ventilation and ICU stay, and increased healthcare costs. It has been shown that early diagnosis of VAP and immediate initiation of appropriate antibiotics is associated with reduced morbidity and mortality.
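The RSBI used in the liberation protocol above is simply the ratio of respiratory rate to tidal volume. A minimal sketch follows, assuming rate in breaths/minute and tidal volume in ml; the <105 breaths/minute/l cutoff is the conventional Yang-Tobin threshold, an assumption here rather than a value reported by that abstract.

```python
# Minimal sketch of the rapid shallow breathing index (RSBI):
# respiratory rate divided by tidal volume in litres. Lower is better;
# 105 breaths/minute/l is the classic Yang-Tobin cutoff (an assumption,
# not a threshold stated in the abstract above).

def rsbi(rate_bpm, tidal_volume_ml):
    """RSBI in breaths/minute/litre."""
    return rate_bpm / (tidal_volume_ml / 1000.0)

def favours_liberation(rate_bpm, tidal_volume_ml, cutoff=105.0):
    return rsbi(rate_bpm, tidal_volume_ml) < cutoff

print(rsbi(25, 400))                # 62.5 -> favourable
print(favours_liberation(35, 250))  # RSBI 140.0 -> False
```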
The aim of this study was to evaluate the ability of a screening test based on the clinical pulmonary infection score (CPIS) to identify and treat patients with VAP. Methods The files of all patients between 18 and 80 years old admitted to the ICU and supported by mechanical ventilation for longer than 48 hours were evaluated retrospectively. Demographic data, duration of mechanical ventilation, duration of ICU stay and outcome (survival or death) were recorded. The CPIS was calculated after 48 hours for the diagnosis of VAP. Intubated patients with a CPIS >5 were classified as VAP(+) and those with a CPIS ≤5 as VAP(−). The diagnosis of VAP was confirmed bacteriologically by culture of endotracheal aspirate. Statistical evaluations were done according to the results on the day of intubation and on days 2, 3, 5, 8 and 10 after intubation. APACHE II scores and CRP levels were also recorded on the same days. Results The duration of mechanical ventilation and the mortality rate were significantly higher in the VAP(+) patients. CPIS levels in the VAP(+) patients were significantly higher than in the VAP(−) patients on the days after the diagnosis, and were also higher in the VAP(+) patients on the day of diagnosis. On the same day, the parameters making up the CPIS (body temperature, leukocyte number, tracheal secretions, PaO2/FiO2 and the presence of infiltrates on the chest radiograph) were significantly higher in VAP(+) patients (P <0.05). ROC curves were constructed for the CPIS in the diagnosis of VAP, and the cutoff point had a sensitivity of 97.44% and a specificity of 100%. Conclusion We conclude that using the CPIS (an illustrative calculator is sketched below) for the early diagnosis and treatment of VAP, regarding patients with a CPIS >5 as VAP(+), is a useful guide for addressing the problems associated with VAP in ICU patients. Methods A retrospective analysis of 17 patients (male/female ratio 6/11; median age 35 (range 16 to 63) years) who underwent arteriovenous or venovenous interventional lung assist (iLA; Novalung, Germany) support as bridging to primary lung transplantation (LTX) (n = 11) or re-LTX (n = 6) between 2005 and 2013. The underlying diagnosis was bronchiolitis obliterans syndrome III in the re-LTX patients (n = 6), cystic fibrosis (n = 5), idiopathic pulmonary fibrosis (n = 2), emphysema (n = 1), adult respiratory distress syndrome (n = 1), hemosiderosis (n = 1) and chronic obstructive lung disease (n = 1). The type of iLA was arteriovenous in 10 and venovenous (iLA active) in seven patients. The median bridging time was 14 (1 to 58) days. The type of transplantation was bilateral LTX (n = 6), size-reduced bilateral LTX (n = 5), lobar bilateral LTX (n = 4) or right single LTX with contralateral pneumonectomy (n = 1). Results Hypercapnia was effectively corrected in all patients within the first 12 hours of iLA therapy: PaCO2 levels declined from 145 (70 to 198) to 60 (36 to 99) mmHg, P <0.0001. iLA was initiated during non-invasive ventilation in three patients, of whom one was intubated prior to LTX. All other patients (n = 14) were placed on iLA while on invasive MV. Of those, three patients were extubated and remained on iLA until LTX, one patient was weaned from iLA and remained on MV until LTX, and one patient was weaned from both iLA and MV prior to LTX.
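Returning to the CPIS abstract above: the score sums simple bedside items. The sketch below implements one commonly published simplified point scheme (0 to 2 points per item); the exact scheme the authors used is not stated, so treat this as an illustration of the idea, not their instrument.

```python
# Illustrative CPIS calculator (one common simplified scheme, 0-2 points
# per item; published variants differ, and the abstract above does not
# specify which version was used).

def cpis(temp_c, wbc_k, secretions, pf_ratio, ards, infiltrate):
    score = 0
    # Temperature (deg C)
    if 38.5 <= temp_c <= 38.9:
        score += 1
    elif temp_c >= 39.0 or temp_c <= 36.0:
        score += 2
    # Leukocytes (x10^3/mm3): abnormal if <4 or >11
    if wbc_k < 4 or wbc_k > 11:
        score += 1
    # Tracheal secretions
    score += {"none": 0, "non-purulent": 1, "purulent": 2}[secretions]
    # Oxygenation: PaO2/FiO2 <= 240 without ARDS scores 2
    if pf_ratio <= 240 and not ards:
        score += 2
    # Chest radiograph infiltrate
    score += {"none": 0, "diffuse": 1, "localized": 2}[infiltrate]
    return score

# Febrile patient, purulent secretions, P/F 180, localized infiltrate:
# 2 + 1 + 2 + 2 + 2 = 9, i.e. CPIS > 5, VAP(+) in the study's terms.
print(cpis(39.2, 14, "purulent", 180, ards=False, infiltrate="localized"))
```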
Five patients were switched to extracorporeal membrane oxygenation (venovenous n = 2, venoarterial n = 3) after 5 (1 to 30) days on iLA support. One patient died prior to LTX due to septic multiorgan failure (SMOF). All others (n = 16; 94%) were successfully transplanted. Of these, two patients died in the ICU due to SMOF. The remaining 14 patients (82%) survived to hospital discharge and were alive at a median follow-up of 20 (1 to 63) months. Conclusion In patients with life-threatening hypercapnia, bridging to LTX with iLA is feasible and results in favorable short-term and long-term outcomes. Introduction The primary aim of this study was to identify, in adult intensive care patients, whether there is a difference in the acute physiological response when a patient is sat out in a chair compared with being placed in a chair position using an electric bed. The secondary aim was to observe the functional outcome of these patients [1]. Methods The study was conducted in an adult tertiary referral ICU over a 3-month period. Patients who met predetermined inclusion/exclusion criteria were allocated to either sitting in a chair or sitting up in an electric bed. Heart rate, respiratory rate, tidal volume and mean arterial pressure were obtained for all patients when supine in bed, and at 1 minute and 1 hour in the new position. Arterial blood gases were obtained at 1 hour in the new position. A functional outcome measure, the Chelsea Critical Care Physical Assessment Tool (CPAx), was also recorded on the day of admission, on the day of sitting out, on the day of discharge from intensive care and at ward level. All data were analysed using Student's t test. Results Sixteen subjects were recruited. There was a significant increase in PaO2 (13.6 ± 2.35 kPa, P = 0.01) and a decrease in PaCO2 (4.82 ± 1.27 kPa, P = 0.02) in the chair group at 1 hour after sitting out compared with baseline (10.9 ± 2.44 kPa; 5.41 ± 1.32 kPa). There was also a significant increase in tidal volume in the chair group after 1 minute of sitting out (403 ± 118 ml) compared with baseline (314 ± 105 ml). There was no difference in the electric bed group for any physiological parameter. The chair group had a better CPAx score on discharge from intensive care (chair group 24; electric bed group 13) and on discharge from hospital (chair group 39; electric bed group 16). There were no adverse cardiovascular responses to either position. Conclusion Sitting suitable critically ill patients out in a chair is safe and can significantly improve arterial blood gas measurements and tidal volume when compared with sitting the patient up in an electric bed. Introduction Sputum is essential for the protection of the respiratory tract but also plays a significant role in the pathophysiology of lung disease [1]. This is evident in critical care, where high sputum loads contribute to respiratory failure [2]. The quantity of sputum produced by a patient can impact on key decisions such as weaning, extubation and discharge. We undertook a survey to establish whether there was a consensus on how we quantify sputum on our intensive therapy unit (ITU). Methods We conducted a multidisciplinary team questionnaire on our 28-bed tertiary ITU. Staff were asked how they quantified sputum load in intubated patients. They were also asked to rate statements on a five-point scale pertaining to sputum characteristics.
The results were analysed in Excel 2010. Results One hundred members of staff completed the sputum production in intensive therapy (SPIT) questionnaire (21% doctors, 71% nurses, 8% physiotherapists). Sputum load was deemed important or essential by more than 95% of respondents when making decisions to extubate or decannulate. The quantification of sputum was inconsistent: 39% of respondents counted the frequency of suctioning, 24% measured the quantity of sputum in the suction tubing, and 25% used another method. An effective cough, consistency and colour were felt to be more important features of sputum than blood staining. Conclusion Our results showed a very high level of agreement on the importance of knowing the sputum load for decisions to extubate, decannulate or discharge from the ITU. In contrast, there was little consensus on how sputum load should be quantified in ventilated patients. This lack of a standard approach may contribute to uncertainty in the clinical decision-making process. We have developed an objective sputum scoring system; components identified as important by our survey, such as suction frequency, sputum consistency and colour, are included (an illustrative sketch follows below). We have recognised the benefits of the standardised Bristol stool chart in facilitating communication and believe the same can be achieved for sputum load in ventilated patients. Introduction Bilevel non-invasive ventilation (NIV) is an established therapy in chronic obstructive pulmonary disease (COPD) and cardiogenic pulmonary oedema, but evidence for its use in other acute respiratory conditions is less robust. Reported ICU mortality after NIV treatment of pneumonia ranges between 18 and 33% [1,2], compared with 10% in exacerbations of COPD [3]. We aimed to study the outcomes of patients with acute respiratory failure of mixed aetiology treated with NIV in our critical care unit and to compare our findings with those already published. Methods Data were collected retrospectively on patients admitted to our critical care unit with acute respiratory failure requiring NIV over a 3-year period, using the Metavision electronic patient record system. Patients with a primary surgical problem and those who received continuous positive airway pressure as a primary intervention were excluded. We recorded: primary respiratory diagnosis causing respiratory failure; patient demographics; serial arterial blood gas results; success of NIV as defined by the British Thoracic Society (BTS) [4]; and mortality statistics. Conclusion NIV is used to treat acute respiratory failure due to a wide range of aetiologies in our unit, with mortality rates comparable with large published series [1,2,5]. A successful response to NIV in the first 6 hours is associated with a reduction in 1-year mortality compared with nonresponders. Introduction We studied the effect of nasal high flow (NHF) on postoperative respiratory failure after extubation in our general surgical ICU. Recently, some studies have reported that NHF improves oxygenation and reduces respiratory rate; however, the usefulness of NHF in the general surgical ICU has not been fully determined. Methods A prospective observational study was conducted in our general surgical ICU to investigate the effect of NHF on respiratory parameters in patients with postoperative respiratory failure.
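Picking up the SPIT survey's proposal above: the abstract does not publish its chart, so the bands and weights below are purely illustrative placeholders showing how suction frequency, consistency and colour could be combined into a single sputum-load score.

```python
# Purely illustrative sputum score: the SPIT abstract above names the
# components (suction frequency, consistency, colour) but not the bands
# or weights, so every threshold here is a made-up placeholder.

def sputum_score(suctions_per_shift, consistency, colour):
    freq = 0 if suctions_per_shift <= 2 else (1 if suctions_per_shift <= 5 else 2)
    texture = {"thin": 0, "thick": 1, "tenacious": 2}[consistency]
    shade = {"clear": 0, "white": 0, "yellow": 1, "green": 2}[colour]
    return freq + texture + shade

print(sputum_score(6, "tenacious", "green"))  # 2 + 2 + 2 = 6: high load
```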
Patients admitted to the ICU for postoperative respiratory failure (defined as an oxygen saturation measured by pulse oximetry <96% and/or a respiratory rate >24 breaths/minute while receiving more than 6 l/minute of oxygen through a facemask) were eligible for this study. Before, and at 1 and 6 hours after, NHF treatment we collected PaO2, PaCO2, respiratory rate, heart rate and blood pressure. Introduction Use of extracorporeal life support (ECLS) in trauma casualties is limited by concerns regarding hemorrhage, particularly in the presence of traumatic brain injury (TBI). We report the use of ECMO/interventional lung assist (iLA) as salvage therapy in 13 trauma patients. A high-flow technique without anticoagulation was used in cases with coagulopathy or severe TBI. Methods Data were collected on all adult trauma cases referred to one center for ECMO/iLA treatment due to severe hypoxemic respiratory failure. Thirteen consecutive cases are reported. The type of assistance was chosen based on a flowchart. Type of study: therapeutic, level of evidence IV. We analyzed patient data, injury data, blood gases before connection, methods of assistance, coagulation studies, complications, survival and neurological outcome. Results The 13 casualties had an average Injury Severity Score of 50.3 ± 10.5 (age 27.7 ± 8.6 years, 69.2% male) and were supported for 9.9 ± 4.8 days on ECMO (n = 7) or 7.16 ± 5.9 days on iLA (n = 6). All suffered severe chest injuries, including one cardiac perforation. Most were coagulopathic prior to initiation of ECMO/iLA support. Among the seven patients with TBI, four had active intracranial hemorrhage. Only 30% of the patients received continuous anticoagulation during the first 24 hours of support, without clotting of the system or diagnosis of a thromboembolic event. Complications directly related to support therapy were not lethal; these included hemorrhage from a cannulation site (n = 1), accidental removal of a cannula (n = 1) and pressure sores (n = 3). Deaths occurred due to septic (n = 3) and cardiogenic (n = 1) shock. Survival rates were 57% and 83% on ECMO and iLA, respectively. Follow-up of survivors detected no neurological deterioration. Conclusion ECMO/iLA therapy can be used as rescue therapy in adult trauma cases with severe hypoxemic respiratory failure, even in the presence of coagulopathy, bleeding and/or brain injury. The benefits of oxygenation and circulatory support must be weighed individually against the risk of hemorrhage. Further research should determine whether ECMO therapy also confers a survival benefit. Introduction While complement protein deficiencies are associated with severe and recurrent pulmonary infections, excessive complement activation plays a role in the pathogenesis of lung injury. We hypothesized that inhibition of the complement system by repetitive treatment with nebulized plasma-derived human C1-esterase inhibitor (C1-INH) reduces pulmonary complement activation and subsequently attenuates lung injury and lung inflammation in a model of severe Streptococcus pneumoniae pneumonia. Methods Thirty-two male rats were intratracheally challenged with S. pneumoniae to induce pneumonia. Rats were repeatedly exposed to nebulized C1-INH or saline, 30 minutes before induction of pneumonia and every 6 hours thereafter. Rats were sacrificed 20 or 40 hours after inoculation to investigate early and late effects.
BALF and lung tissue were obtained for measurement of complement activation (C4b/c in BALF), lung injury (total protein levels in BALF) and inflammation (IL-6 levels in lung tissue). Results Pneumonia was characterized by bilateral macroscopic infiltrates, bacterial outgrowth in the lung and clinical signs of illness. Pneumonia was associated with pulmonary complement activation. In rats treated with nebulized C1-INH, a functional fraction of C1-INH was detectable in BALF. However, C1-INH treatment did not affect these parameters. Introduction Management of acute respiratory disease must avoid ventilator-induced lung injury along with the rise of PaCO2 and respiratory acidosis [1]. Efforts are being made to find devices that assist protective ventilation and are able to remove arterial CO2 and correct acidosis [2]. An extracorporeal CO2 removal device driven by the widely used Prismaflex® platform, called PrismaLung®, was tested in vivo. Methods Five hypercapnic ventilated pigs were equipped with the PrismaLung® system, designed to remove CO2 from the bloodstream through a decarboxylation membrane mounted on a renal replacement device without any hemofilter. Experiments examined the potential for blood decarboxylation by the gas-exchanger membrane with different sets of parameters (blood flow rate: 200, 300 and 400 ml/minute; sweep gas flow: 2, 5, 10 and 50 l/minute; FiO2: 21 and 100%). Statistical analysis was performed with the Student t test. Results The extracorporeal device achieved efficient CO2 removal rates at FiO2 1 (Figure 1) and 0.21 (Figure 2), ranging from 40 to 60 ml/minute. Efficiency increased with blood and sweep gas flows. Arterial CO2 tension and pH were significantly modified after 10 minutes. Introduction Referral for ECMO has been demonstrated to reduce mortality in severe hypoxic respiratory failure [1-3]. The number of patients who undergo ECMO is still small and the service depends on timely referral from regional ICUs. There is evidence that intensivists' views on the role of ECMO are mixed [4]. The purpose of this study was to determine whether there are variations in the geographical distribution of patients who receive ECMO. Methods NHS England provided the home primary care trust (PCT) of all adult patients referred for ECMO for potentially reversible respiratory failure from 2008 to 2012. The referrals from each PCT were indexed to the population of each area to produce a referral rate per 1,000,000 people. Results See Figure 1. ECMO services have expanded rapidly in the last 5 years in England following the publication of evidence for its efficacy and concerns regarding an influenza pandemic. The referral rates for ECMO for severe hypoxic respiratory failure vary greatly around the country, from 88 per 1,000,000 population in Leicester City to no referrals in 32 PCTs. Possible explanations include: the distribution of swine flu around the country, referring doctors' beliefs about the efficacy of ECMO, local access to high-frequency oscillation ventilation and a possible reluctance of teaching hospitals to refer to specialist centres. Further investigation to account for this variation appears indicated. Conclusion The referral rates for ECMO vary greatly around the country. Introduction Using micro-computed tomography (MicroCT), we assessed the effectiveness of a cleaning closed-suctioning system (CSS) in removing secretions from the endotracheal tube (ETT) lumen.
Biofilm growing within the ETT soon after intubation increases the patient's risk of developing ventilator-associated pneumonia, and new cleaning devices have been designed to keep the ETT clear of secretions [1]. Methods In a bench test, we injected a water-based gel into unused ETTs to evaluate the effectiveness of MicroCT scanning (SkyScan 1172; Bruker, Belgium) in measuring secretions. In six critically ill patients, a cleaning CSS (Airway Medix Closed Suction system; Biovo, Tel Aviv) was used three times a day to keep the ETT clean. After extubation, we measured the volume of ETT secretions by MicroCT scanning over a length of 20 cm from the ETT tip. We also collected ETTs from 11 patients treated with a standard CSS as controls, and evaluated ETT microbial colonization. Results The volume of gel measured by MicroCT strongly correlated with the volume of injected gel (P <0.001, R2 = 0.99). At extubation, a lower amount of secretions was measured in the ETTs treated with the cleaning CSS compared with controls (0.031 ± 0.029 vs. 0.350 ± 0.417 mm3, P = 0.028), corresponding to a smaller occupation of the cross-sectional area (on average 0.3 ± 0.4 vs. 3.8 ± 4.5%, respectively; P = 0.030). Microbial colonization tended to be reduced in the ETTs treated with the cleaning CSS (total bacterial load 1.3 ± 1.7 vs. 3.6 ± 2.7 log(CFU/ml), P = 0.08). Conclusion MicroCT scanning showed high precision and accuracy in measuring the volume of secretions in bench tests and can thus be used to evaluate the effectiveness of interventions or devices intended to reduce ETT biofilm accumulation. In a small nonrandomized population of critically ill patients, the use of an ETT cleaning device appeared effective in reducing the volume of secretions present in the ETT at extubation. Introduction Today's healthcare environment has forced providers to constantly evaluate the materials used and to find less expensive alternatives. Recently, low-cost endotracheal tubes (ETTs) have been introduced to the market. The aim of this study was to test these tubes (Portex AirCare and Cardinal Health ETTs) against endotracheal tubes with known performance (Covidien Hi-Lo, pre ISO standard, and Covidien TaperGuard, post ISO standard). Methods We used the required test setups according to the ISO standard for cuff sealing performance. The tubes were tested against each other in sizes 7.0, 7.5 and 8.0 mm. Kink performance was tested in the routine manner. Results The leak test showed that the Portex AirCare ETT leaked significantly more than both the Hi-Lo (P <0.05) and TaperGuard (P <0.001) ETTs. The standard deviation of the leak rate for the AirCare was large, suggesting varying seal performance between tubes from different lots. The kink test showed no difference among the tubes. Conclusion Although tubes may look the same, there can be differences in performance. This study demonstrated that the new, cheaper ETTs had a greater cuff leak than the two comparator tubes. We can only speculate whether the differences found in this study are a result of cost cutting. However, the great variability in sealing performance between ETTs of the same size from different lots would indicate that manufacturing controls may be less stringent. Although we cannot demonstrate that the higher incidence of leak will result in adverse patient outcomes, one can surmise that the possibility exists.
Thus, selection of endotracheal tubes should not be based on purchase price alone but should take into account documented performance in standardized tests. Methods A retrospective review was performed using data from the last 5 years. All obese patients (OPs) were from the ICU of our university hospital. Only OPs with a BMI >30 kg/m2 who underwent percutaneous dilatational tracheostomy (PDT) were selected. All OPs needed prolonged mechanical ventilation. A total of 67 OPs were identified, with 60 PDTs placed using the Ciaglia Blue Rhino (CBR) Introducer Kit and seven PDTs with the UniPerc PDT Kit. All PDTs were performed by dedicated staff, including residents. Evaluation of the clinical, anatomical and physiopathological features of the OPs, together with ultrasound (US) scanning of the neck, preceded the procedure. At the beginning of the procedure we placed a 5 mm ID orotracheal tube by tube exchange with video fibreoptic bronchoscopy (video-FBS) assistance, as already described [3]. An 8 to 9 mm ID wire-reinforced silicone tracheostomy tube (rTT) with adjustable flange was chosen instead of a standard PVC or silastic TT (sTT) in all OPs treated with CBR, because of the anatomical particularities of OPs and because of external traction by the weight of the tubing attached to the TT. An extra-long TT (eTT) was chosen where the pretracheal tissue was too thick for a regular-sized TT in 16 OPs with BMI >40 kg/m2 treated with CBR. In three OPs treated with CBR we needed to change the rTT to an eTT because of tube dislodgement and subocclusion, diagnosed by X-ray and video-FBS. The UniPerc technique was chosen for OPs with BMI >40 kg/m2. Results No major complications (abortion of the procedure, >50 ml bleeding, TT misplacement, death) were observed. We had only minor complications (<50 ml bleeding: 3%; ring fracture: 2%; difficult insertion: 21%, only with the CBR rTT, because of the step between the tip of the rTT and its introducer). UniPerc eTT placement was always easy. Conclusion In our experience, the data do not support previous studies suggesting an increased risk of complications in OPs [1,2]. We know that the sTT may not be effective in OPs. The use of US, video-FBS assistance [3] and an rTT with an adjustable flange allows safe and effective adaptation to the anatomical particularities of OPs, avoiding the associated risks. Introduction High-flow oxygen therapy (HFOT) delivered through nasal cannulas can improve oxygenation compared with low-flow oxygen devices. It has been shown that nasal HFOT can generate a positive airway pressure, which increases linearly with the gas flow rate. Data on the use of HFOT delivered through a tracheostomy are scarce. The aim of the present randomized, controlled, crossover trial was to assess the effects of HFOT delivered through a tracheostomy on arterial blood gases and endotracheal pressure in critically ill patients. Methods Tracheostomized patients underwent HFOT at three gas flow rates (10 l/minute, 30 l/minute and 50 l/minute), randomly applied for 20-minute periods. At the end of each period, arterial blood gases, respiratory rate and endotracheal pressure (Ptrach) were measured. Ptrach was recorded over the last 3 minutes of each study period: the maximum expiratory pressure (MEPtrach) and mean expiratory pressure were measured and averaged over all respiratory cycles during a 1-minute recording with stable breathing. FiO2 was kept constant throughout the study. Results Seventeen tracheostomized patients were enrolled (SAPS II 52 ± 10, PaO2 96 ± 27 mmHg, PaCO2 33 ± 10 mmHg).
Increasing the gas flow rate from 10 l/minute to 30 l/minute was associated with an increase in PaO2/FiO2 that did not improve further at 50 l/minute (259 ± 66, 317 ± 79 and 325 ± 76, respectively; P <0.001). The same trend was observed for PaO2 (89 ± 19 mmHg, 109 ± 26 mmHg and 113 ± 29 mmHg, respectively; P <0.001) and SaO2 (96 ± 3%, 98 ± 2% and 98 ± 2%, respectively; P <0.001). PaCO2 (32 ± 8 mmHg on average) and respiratory rate (27 ± 7 breaths/minute on average) did not change with the different gas flow rates. MEPtrach (0.96 ± 0.43 cmH2O, 1.32 ± 0.4 cmH2O and 1.89 ± 0.5 cmH2O at 10 l/minute, 30 l/minute and 50 l/minute, respectively; P <0.01) and mean expiratory pressure (0.54 ± 0.27 cmH2O, 0.91 ± 0.29 cmH2O and 1.36 ± 0.35 cmH2O at 10 l/minute, 30 l/minute and 50 l/minute, respectively; P <0.01) increased with flow. Changes in PaO2/FiO2 were not correlated with changes in expiratory pressures. Conclusion When HFOT is delivered through a tracheostomy at increasing gas flow rates, oxygenation increases up to 30 l/minute, while CO2 clearance and the respiratory rate do not vary. Tracheal expiratory pressure increases with flow, but the changes are small and probably of limited clinical relevance. Changes in oxygenation are not related to variations in tracheal expiratory pressure. HFOT through a tracheostomy therefore has different effects from HFOT through a nasal interface. Introduction The 4th National Audit Project, conducted by the Royal College of Anaesthetists in 2011, reviewed major airway complications in the UK [1]. It highlighted tracheostomy tube displacement as a significant risk in critical care patients. Prior to 2011, at the Royal London Hospital ICU, the Tracoe Twist (marketed by Kapitex) was the default choice of tracheostomy tube. However, over a 5-year period there were a significant number of critical incidents related to tracheostomy tube displacement or blockage. On investigation of these incidents, root-cause analysis identified inadequate length of the Tracoe Twist as a major contributing factor. As a consequence, there was also an increasing requirement for adjustable flange tubes. This was particularly apparent in obese patients. Our objective was to develop a new tracheostomy tube that would reduce the number of incidents related to inadequate length. An additional aim was to maximise the inner diameter for a given external diameter, thus reducing airway resistance and potentially aiding weaning. Methods We formed a consultation committee consisting of consultants in critical care, a consultant surgeon and a consultant anaesthetist with a specialist interest in head and neck anaesthesia and tracheostomy care. We collaborated with the Kapitex design team to develop a new tracheostomy tube which addressed some of the perceived deficits of existing devices. We developed a prototype tracheostomy tube with an increased length for a given internal diameter. We also increased the mobility of the flange, in order to reduce pressure areas and assist with fixation. A third modification was to ensure a maximal internal diameter for a given external diameter. Results We piloted its use in 20 patients. No adverse events were observed and the clinical impression was that the increased length was beneficial to the patients. Conclusion Following the success of the pilot study, this new tracheostomy tube was developed and then marketed by Kapitex. This tube was named the Tracoe Twist Plus.
We have now used this tube for 2 years. We have since performed an extensive retrospective audit of tracheostomy usage and of the effect of the introduction of the novel Tracoe Twist Plus: the complication rate was reduced in terms of both displacement and obstruction. Introduction In ICU patients with a tracheostomy, MV with a deflated cuff can be provided safely and comfortably by use of a BiPAP Vision®. Air leakage to the upper airway enables speech [2]. By adding a Passy-Muir® speaking valve as a second step, the quality of speech and cough improves. The ability to speak provides an important improvement in communication. Methods The aim of this study was to compare weaning from MV by gradually decreasing the level of support during cuff-deflated ventilation with use of a BiPAP Vision® and a Passy-Muir® speaking valve, or by trials of spontaneous breathing with use of a speaking valve, both for progressively longer periods of time. We examined differences in the ability to speak, the duration of the weaning period, the occurrence of delirium and the frequency of tracheal suctioning. We performed a single-centre retrospective and prospective observational study in a 22-bed mixed ICU during 1 year. Data were collected using the patient data management system. Baseline criteria were age, gender, APACHE IV score, ICU length of stay and duration of MV before placement of the tracheostomy. Results Ten patients were included, five in the BiPAP group and five in the spontaneous group. There were no significant differences in the baseline criteria. On the second day after tracheostomy, three out of five patients in the BiPAP group were able to speak, compared with one in the spontaneous group. A difference in speaking ability remained until day 9 (see Figure 1). At the first time of speaking, the BiPAP group had a higher PEEP level (10 vs. 7.5 cmH2O) and a higher SOFA score (6.2 vs. 4.6) than the spontaneous group. There was no significant difference in delirium, duration of weaning or tracheal suctioning between the groups. Conclusion Cuff-deflated MV in ICU patients enables speaking during ventilator dependence. With this technique the ability to speak started in an earlier phase of weaning than with weaning by spontaneous breathing trials and a speaking valve. Results In vitro evaluation showed that the double-lumen endotracheal tube (DLET) had the lowest K2 (7 = 11.33; 7.5 = 8.74; 8 = 7.57; 7f = 6.13; 7.5f = 10.52; 8f = 12.28; F4 = 130.0; F5 = 11.12; DLET = 5.25 cmH2O/l/minute). During in vivo evaluation, percutaneous tracheostomy (PT) was performed with the conventional ETT plus fibreoptic bronchoscope (FOB) or with the DLET, in five patients per group (age 69 ± 13 vs. 71 ± 16; SAPS II 56 ± 14 vs. 52 ± 20; GCS 3 vs. 4). Gas exchange before and after the procedure did not differ between the groups, but the Δ values of pH, PaO2 and PaCO2 measured before and after the procedure were (ETT+FOB versus DLET): ΔpH −0.05 ± 0.05 versus 0.01 ± 0.02, P = 0.04; ΔPaO2 −112.6 ± 112.6 versus 41.6 ± 25.3, P = 0.01; ΔPaCO2 14.5 ± 10.8 versus 0.6 ± 1.1, P = 0.02; ΔHCO3 0.5 ± 2 versus −0.04 ± 0.3, P = 0.6. Conclusion The DLET provided adequate airway patency and minimal obstruction thanks to the lower channel exclusively dedicated to the patient's ventilation. Gas exchange during PT with the DLET remained stable, without any variation in oxygenation or carbon dioxide levels, despite identical mechanical ventilation settings.
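A note on the K2 values quoted in the DLET comparison above: tube pressure-flow coefficients of this kind are typically obtained by fitting bench measurements of pressure drop against flow, for example with a Rohrer-type fit ΔP = K1·Q + K2·Q². The sketch below shows such a fit on made-up data; it is an illustration of the general method, not the abstract's actual protocol or units.

```python
# Illustrative Rohrer fit dP = K1*Q + K2*Q^2 for a tube's pressure-flow
# relationship. The (flow, pressure-drop) samples are synthetic; the DLET
# abstract above reports only the resulting coefficients, not its raw data.

import numpy as np

def rohrer_fit(flow_l_s, dp_cmh2o):
    """Least-squares estimate of (K1, K2) in dP = K1*Q + K2*Q^2."""
    q = np.asarray(flow_l_s, dtype=float)
    dp = np.asarray(dp_cmh2o, dtype=float)
    design = np.column_stack([q, q**2])  # no intercept: dP(0) = 0
    (k1, k2), *_ = np.linalg.lstsq(design, dp, rcond=None)
    return k1, k2

flow = [0.2, 0.4, 0.6, 0.8, 1.0]  # l/s
dp = [0.5, 1.2, 2.1, 3.2, 4.5]    # cmH2O (synthetic)
k1, k2 = rohrer_fit(flow, dp)
print(f"K1 ~ {k1:.2f} cmH2O/(l/s), K2 ~ {k2:.2f} cmH2O/(l/s)^2")
```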
Introduction Respiratory weaning in ICUs can be a lengthy and expensive process [1], but may be facilitated by the use of tracheostomies. Discharging patients with tracheostomies to general wards improves ICU bed availability but raises potential patient safety issues, as demonstrated by the increased mortality compared with patients decannulated before discharge from the ICU [2]. We investigated how often ICUs in the UK discharge patients with tracheostomies to wards, which wards these are, and whether systems are in place to ensure adequate safety on discharge. Methods We telephoned 217 ICUs in the UK. Nursing staff answered a series of questions regarding the discharge of patients with tracheostomies to the wards and their follow-up. Results We obtained information from 203 ICUs. A total of 201 units used tracheostomies for respiratory weaning. In total, 151 routinely discharged patients to wards with tracheostomies, 11 never did and 39 did occasionally. Five discharged to the high dependency unit only, 60 to respiratory wards only, 70 to specialist wards and 15 to any or most wards. Eighty-five out of 190 units discharged patients with tracheostomy cuffs both up and down, 72 discharged with the cuff down or cuffless, and 16 with the cuff 'usually down'. A total of 141 hospitals had routine follow-up for tracheostomy patients from critical care outreach or other services. Critical care outreach was available 24 hours a day in 65 hospitals. Conclusion The vast majority of ICUs in the UK perform tracheostomies for respiratory weaning and many routinely discharge patients to the wards prior to decannulation. Routine follow-up is usually available, but cover may only be available during the day. Patients may go to a specialist ward with trained nurses, but this is not always the case. Patients are often discharged to wards with their tracheostomy cuff up, raising major safety issues if their tracheostomy tubes block and nurses are not trained for such emergencies. Twenty-four-hour critical care outreach cover may improve patient safety, but further research and the production of guidelines are needed to facilitate the safe discharge of patients with tracheostomies from the ICU to the wards. Introduction Percutaneous dilatational tracheostomy (PDT) is the standard airway access in critically ill patients who require prolonged mechanical ventilation. However, patients with severe coagulopathy or thrombocytopenia might have an increased risk of periprocedural bleeding. Methods We retrospectively reviewed the records of all patients who underwent PDT (using the Ciaglia technique with bronchoscopic guidance) on our cardiothoracic ICU between January 2004 and February 2013. Patients were stratified into two groups: no coagulopathy (group 1), and coagulopathy/thrombocytopenia, defined as an international normalized ratio >1.5, a partial thromboplastin time >50 seconds and/or a platelet count <50 × 10^9/l (group 2). Results From a total of 1,001 patients (46% male, mean age 68.1 years) who underwent PDT, we identified 441 patients (44.1%) with severe coagulopathy (group 2). There were no procedure-related deaths. Major procedure-related complications included severe bleeding (requiring transfusion and/or surgery) in two patients in each group (one laceration of the brachiocephalic trunk, one venous bleeding, two bleedings from a thyroid vessel), injury of the membranous wall of the trachea in two patients in group 2, as well as a pneumothorax and a device failure in group 1.
The incidence of moderate periprocedural bleeding was comparable between the two groups (n = 43 (9.75%) vs. n = 41 (7.3%), P = NS). Conclusion Periprocedural bleeding complications during and after PDT are rare, even in patients with severe coagulopathy, and thus PDT can be safely performed in these patients. Introduction Percutaneous tracheostomy has become an established procedure in the airway management of critically ill patients. Repeat PDT is considered a (relative) contraindication as a result of distorted anatomy. Methods A retrospective review was conducted of all repeat bedside percutaneous dilatational tracheostomies (Ciaglia technique with direct bronchoscopic guidance) performed on our cardiothoracic critical care unit from January 2004 to February 2013. Results From a total of 1,001 patients undergoing PDT, we identified 36 patients with repeat PDT. The patients' previous tracheostomies dated back between 5 days and 2.7 years (mean 122 days). Mean age was 60.3 ± 14.3 years and 42% of the patients were female. The mean time from intubation to PDT was 3.7 ± 3.9 days. There were no deaths associated with PDT, but there was one major procedure-related complication: a periprocedural laceration of the brachiocephalic trunk. After emergency surgery and surgical tracheotomy (ST), the patient recovered completely. In all other patients, no conversion to ST, no loss of airway, no paratracheal insertion and no accidental tracheal extubation were observed. No pneumothorax, pneumomediastinum, hypotension, hypoxemia or arrhythmias were recorded. Mild bleeding was observed in 13 patients (36%) and moderate bleeding in only one patient (2.8%). A tracheal ring fracture occurred in six patients (16.7%). Fourteen patients (38.9%) were weaned successfully from the respirator and the tracheostomy could be removed. Thirteen patients (36.1%) were transferred to another ICU with the tracheostomy in place. The functional and cosmetic outcomes of PDT were excellent. Conclusion Repeat PDT is a safe procedure in experienced hands and should not generally be considered a contraindication. However, special attention should be paid to the anatomical situation, with its increased risk of vascular complications. Introduction The tracheostomy is an ancient technique for which a percutaneous approach has more recently been developed. Percutaneous tracheostomies (PCT) have been shown to be safer than surgical techniques and to reduce infection, cost and other complications [1-3]. Ventilator-associated pneumonia (VAP) is a serious complication resulting from the use of endotracheal tubes (ETTs) and tracheostomies. Changes in the design of these tubes by the addition of a subglottic suction port have been shown to improve VAP rates in mechanically ventilated patients [4,5]. A large meta-analysis showed that subglottic drainage reduced the number of days of mechanical ventilation required and the number of days spent on the ICU [4]. Methods We contacted all ICUs in the UK by telephone and spoke to the nurse-in-charge to ascertain their normal practice with regard to PCT and subglottic suction use. Results We contacted a total of 246 general ICUs, from 72% of which we received a response. The average number of beds per ICU among responding units was 11. Ninety-eight per cent of the ICUs that we questioned used PCT, with an average of 11 beds per unit; the other 2% (three units), which did not use PCT, had five beds per unit on average.
The proportion of ICUs that employed subglottic suction ports on their ETTs was 43% (average 11 beds per unit), whilst 57% did not (also 11 beds per unit on average). Regarding subglottic suction ports on PCT, 38% of ICUs utilised these tubes whilst 62% did not. Among the ICUs that used subglottic suction ports on their tracheostomy tubes, the average was 12 beds per unit; among those that did not, it was 10 beds per unit. Conclusion Significant differences in practice exist for PCT and for subglottic suction ports on tubes. The size of the ICUs in these groups is variable: larger units are more likely than smaller units to use PCT, whereas for subglottic suction ports on ETTs and tracheostomy tubes the size of the ICU does not necessarily dictate their use. We propose that all ICUs review their policy on the use of PCT and subglottic suction-assisted tubes to help improve surgical complications, cost, VAP rates and ICU stays. Introduction Over 5,000 tracheostomies are performed in the UK per year [1]. The 4th National Audit Project identified significant morbidity and mortality associated with tracheostomy care [1]. The National Tracheostomy Safety Project (NTSP) 2013 manual highlighted the need for: local policy; an appropriate care environment; immediate availability of emergency equipment; trained staff and local training programmes; and bed-head signs and emergency algorithms for tracheostomy patients [2]. Following these guidelines, we asked: how are adult tracheostomy patients managed after discharge from the intensive care/high dependency unit (ICU/HDU) throughout the UK? Methods In November 2013, 200 adult ICU/HDUs throughout the UK were contacted to take part in a telephone survey. Data were collected on tracheostomy weaning, post-ICU/HDU care, safety guidelines, emergency protocols and training for clinicians and nurses. Results Of the 200 adult ICU/HDUs contacted, 134 took part in the survey. Of these, 44% have a tracheostomy weaning protocol, 69% initiate weaning whilst the patient is mechanically ventilated, and 92% use a speaking valve in their weaning process. Also, 87% allow tracheostomy patients oral nutritional intake, and in 59% of these units the decision involves speech and language therapy. Post ICU/HDU, 67% of units discharge to specialised wards, 22% to nonspecialised wards and 4% to dedicated step-down units, while 6% do not step down their tracheostomy patients. A critical care outreach team reviews the patients in 73% of the hospitals surveyed. Furthermore, only 11% of the hospitals have a consultant-led tracheostomy ward round and 17% have a tracheostomy multidisciplinary team (MDT). Within these hospitals, 57% have their own tracheostomy safety guidelines and 70% have emergency tracheostomy management protocols. On the wards, 34% have tracheostomy bed-head information signs, 93% have emergency bedside tracheostomy equipment, 89% have a tracheostomy training programme, and 50% have an MDT approach to decannulation. Conclusion There is wide variation in the post-ICU/HDU management of tracheostomy patients throughout the UK.
Although there are well-established UK national guidelines for the management of tracheostomy patients, outside the ICU/HDU environment there is a lack of full implementation of the NTSP recommendations, increasing the risk of tracheostomy-related morbidity and mortality. Introduction Ventilator-associated pneumonia (VAP) is the most common nosocomial infection among ventilated patients and is associated with increased mortality and morbidity [1]. Oral chlorhexidine has been used to decontaminate the airway in critically ill patients, as studies suggest a risk reduction in VAP [2]. Chlorhexidine reacts with soaps in toothpaste to form inactive, insoluble salts [3]. A minimum delay of 30 minutes between tooth brushing and the subsequent application of chlorhexidine is therefore recommended [4]. Methods A telephone questionnaire was conducted of all ICUs in the UK to assess current oral decontamination procedures with regard to chlorhexidine use and the timing of tooth brushing with toothpaste. Results Sixty-five per cent of ICUs in the UK responded to our survey (n = 157). Ninety-seven per cent (n = 152) used chlorhexidine and 96% (n = 150) used it as part of a ventilator care bundle. Forty-six per cent (n = 70) used a gel, 32% (n = 48) a mouthwash and 23% (n = 34) both preparations. The frequency of chlorhexidine application varied between ICUs: 15 (9.9%) applied it 4-hourly, 91 (59.9%) 6-hourly, 20 (13.2%) 8-hourly, 19 (12.5%) 12-hourly and seven (4.6%) at variable times. Ninety-seven per cent (n = 152) brushed patients' teeth: 86% (n = 130) used toothpaste, 3% (n = 5) used chlorhexidine gel and 11% (n = 17) used both. Ninety-seven per cent (n = 147) of ICUs using chlorhexidine also brushed patients' teeth with toothpaste. Forty-eight per cent (n = 70) administered chlorhexidine within 30 minutes of toothpaste application (Table 1). Conclusion Chlorhexidine is being used too soon after the application of toothpaste in 48% of ICUs in the UK. This results in attenuation of its effect and may remove its beneficial risk reduction in VAP. Awareness of this interaction should be emphasised. Introduction We aimed to assess the effect of continuous drainage of subglottic secretions on the prevention of ventilator-associated pneumonia (VAP) in patients requiring prolonged mechanical ventilation (more than 48 hours) in the ICU, in a prospective, randomized, controlled study. Methods Our study was performed with ethics committee approval and written informed consent from the patients' relatives, between April 2011 and February 2012, in our 14-bed ICU. Fifty-four patients whose mechanical ventilation requirement was expected to exceed 72 hours were included. Patients were randomly divided into two groups: a group using a conventional intubation tube (Group C) and a group using an intubation tube allowing aspiration of subglottic secretions (Group S). In Group S, continuous subglottic aspiration was applied at constant pressure with a special device. In both groups, the cuff pressure was maintained at a constant 20 to 30 cmH2O using a digital cuff pressure device [1]. Results In Group C, VAP developed in 10 (35.7%) of 28 patients; in Group S, VAP developed in five (21.7%) of 23 patients. When the two groups were compared according to the development of VAP, no statistically significant difference was found. Introduction Subglottic secretion drainage (SSD) has been shown to reduce the incidence of ventilator-associated pneumonia (VAP) [1].
We reviewed current UK practice and practicalities surrounding implementation of SSD using a survey. Methods We constructed a survey of 10 questions using SurveyMonkey and circulated it via an email link to the Linkmen of the UK Intensive Care Society. Responses were received between August and November 2013. Results We received 77 responses. The majority were from doctors (88%) and the rest from nurses. Of respondents, 63% worked in district general hospitals, 28% in teaching hospitals and the rest in specialist units. Overall, 54% of respondents worked in units using SSD. From these responses, the types of patients receiving SSD are summarised in Table 1. One hundred per cent of units used intermittent SSD. Seventy-one per cent of respondents reported that SSD tubes were stored only on their ICU, with 26% reporting availability in acute areas and the rest hospital wide. Twenty-eight per cent of respondents indicated it was unit policy to reintubate patients to facilitate SSD. More than 90% of units had a ventilator care bundle and regularly measured cuff pressures. Overall, 83% of those surveyed thought SSD was beneficial in the prevention of VAP. Conclusion Despite specific recommendations from the UK Department of Health [2], only about one-half of respondents work in ICUs where SSD has been adopted. Most studies show benefit in patients expected to be ventilated for longer than 72 hours [1], but most units used SSD in all intubated patients. The reintubation rate to facilitate SSD was also reasonably high, despite a lack of evidence to support this practice. In the vast majority of hospitals, SSD endotracheal tubes are stored only on the ICU, so the need for reintubation may result from a lack of appropriate tubes at the point of first intubation. Introduction Emulsified perfluorocarbons (PFC) are synthetic fluorocarbons that can carry 50 times more oxygen than human plasma. Their properties may be advantageous in applications requiring preservation of tissue viability in oxygen-deprived states, which makes them a potential candidate for combat and civilian prehospital resuscitation. Our hypothesis was that an intravenous dose of PFC increases vital organ tissue oxygenation, improves survival and reduces or prevents the development of ventilator-associated ARDS. Here we report data from the second part (ARDS only) of a multiphase swine study investigating the benefits of PFC in treating hemorrhagic shock and preventing ARDS. Methods Anesthetized Yorkshire swine were randomized (n = 6/group) to receive a bolus of the PFC Oxycyte™ either 45 minutes before (PFC-B) or after (PFC-A) induction of ARDS, or nothing as a control (NON). ARDS was induced via intravenous oleic acid infusion (time 0 (T0)) over 30 minutes. Animals were monitored for physiological and hematological parameters. They were euthanized at T180 minutes and a full necropsy and histopathological analysis was performed. Results Survival was 100% in the NON group, 80% in the PFC-A group and 20% in the PFC-B group. Mean arterial pressure (MAP) and mean pulmonary artery pressure (MPAP) were significantly increased during infusion of PFC and during ARDS in the PFC-B group, while cardiac output (CO) was significantly reduced. In the PFC-A group, MAP and MPAP increased and CO decreased during ARDS induction, but not during PFC infusion. These changes were significant in comparison with the NON group. Oxygen delivery and consumption in the PFC-A group were significantly increased.
Histopathological analysis is currently being performed. Interim analysis showed a trend towards reduced alveolar damage in PFC-A animals. Conclusion Administration of PFC before induction of ARDS was detrimental, while giving PFC after ARDS improved oxygen delivery and increased oxygen consumption. Although survival in this group was lower than in the NON control group (80% vs. 100%, not significant), a reduction in alveolar damage was observed. This might improve long-term outcome after ARDS. Based on these data we will continue to the final phase of this project and evaluate the capacity of PFC to prevent ARDS in combination with hemorrhagic shock (HS). Introduction Venovenous extracorporeal membrane oxygenation (VV ECMO) is a treatment option for acute respiratory distress syndrome (ARDS) to minimize ventilator-induced lung injury, including life-threatening pneumothorax. The purpose of our study was to investigate the safety and efficacy of VV ECMO for preventing pneumothorax in ARDS patients complicated with emphysematous/cystic changes in the lung. Methods We retrospectively analyzed data of ARDS patients complicated with emphysematous/cystic changes in the lung who were admitted to our ICU from 2006 through 2012. We divided the subjects into two groups: patients treated with VV ECMO (ECMO group) and those treated only by conventional ventilator management (non-ECMO group). Correlations between age, sex, underlying disease, PaO2/FIO2 ratio on admission, duration of ICU stay, survival and incidence of pneumothorax were evaluated. Results Forty-one patients were included in this study (ECMO and non-ECMO group, 21 and 20 patients, respectively). There were no significant differences between the ECMO and non-ECMO groups as regards age, sex, underlying disease, PaO2/FIO2 ratio, duration of ICU stay, and survival. In the ECMO group, the mean duration of ECMO use was 17 ± 13 days, and bleeding due to anticoagulation was observed in five patients. The mean airway pressure in the ECMO group was significantly lower than in the non-ECMO group (12 ± 6 cmH2O vs. 22 ± 6 cmH2O; P <0.0001). The incidence of pneumothorax was also significantly lower in the ECMO group than in the non-ECMO group (10% vs. 45%; P = 0.015). In Kaplan-Meier analysis, the proportion of pneumothorax-free patients was significantly higher in the ECMO group (P = 0.014; this type of analysis is sketched below). In multivariate analysis, conventional ventilator management, presence of interstitial pneumonia and the duration of intubation were independent risk factors for pneumothorax (hazard ratio (HR) 18.0, P = 0.010; HR 33.3, P = 0.025; HR 1.05, P = 0.041, respectively). Conclusion Although the survival rate was not statistically different, the use of ECMO for ARDS patients complicated with emphysematous/cystic changes in the lung markedly reduced the incidence of pneumothorax. The underlying molecular mechanisms for the association between aging and a higher susceptibility to develop ARDS are poorly understood. The pulmonary renin-angiotensin system (RAS), with a lung-protective (angiotensin-converting enzyme (ACE)2-angiotensin (Ang)-1,7-Mas receptor) axis and a lung-injurious (ACE-Ang II-Ang II receptor (AT1)) axis, has been implicated in the pathogenesis of ARDS and changes with age. We hypothesized that injurious ventilation has an age-dependent effect on the pulmonary RAS in LPS-challenged rats that is associated with increased lung injury.
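A minimal sketch of the Kaplan-Meier comparison of pneumothorax-free patients referenced in the ECMO abstract above; the durations and events below are hypothetical and the lifelines package is assumed, since the abstract does not state the software used:

```python
# Kaplan-Meier comparison of pneumothorax-free time, as in the ECMO abstract
# above (hypothetical data; assumes the `lifelines` package).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical: days to pneumothorax (or censoring) in ECMO vs non-ECMO patients
ecmo = pd.DataFrame({"days": [20, 35, 15, 40, 28], "pneumothorax": [0, 0, 1, 0, 0]})
conv = pd.DataFrame({"days": [10, 8, 25, 12, 18], "pneumothorax": [1, 1, 0, 1, 1]})

kmf = KaplanMeierFitter()
kmf.fit(ecmo["days"], event_observed=ecmo["pneumothorax"], label="ECMO")
print(kmf.survival_function_)   # proportion remaining pneumothorax-free over time

result = logrank_test(ecmo["days"], conv["days"],
                      event_observed_A=ecmo["pneumothorax"],
                      event_observed_B=conv["pneumothorax"])
print(result.p_value)           # log-rank comparison of the two groups
```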
Methods Infant (~1/2 month), juvenile (~1 month), adult (~4 months) and older (~19 months) Wistar rats were challenged with intratracheal LPS and injuriously ventilated using tidal volumes of 15 ml/kg for 4 hours. Lung injury was assessed by wet-to-dry ratio and changes in P/F ratio. Levels of inflammatory mediators were measured in bronchoalveolar lavage fluid; mRNA expression of key genes of the pulmonary RAS was determined in lung homogenates. Results LPS-challenged and ventilated older rats showed higher mortality, a larger change in wet-to-dry ratio and a larger decline in P/F ratio compared with the other age groups. Increases in neutrophil influx and inflammatory mediators were age dependent, with higher levels with increasing age. Compared with controls, ventilated LPS-challenged rats showed a decrease in mRNA expression of ACE, ACE2, AT1 and Mas receptor in all age groups except infants. The relative decrease in the mRNA expression of the Mas receptor was most extensive in older rats, thereby shifting the balance towards the lung-injurious axis in this age group. Conclusion The effects of injurious ventilation on lung injury are age dependent in a model of LPS-challenged rats, and are associated with a more pronounced imbalance of the pulmonary RAS at the expense of the lung-protective axis with increasing age. Introduction ALI has been associated with increased Th17 (IL-6, IL-8, IL-9, IL-17) and Th1 (TNFα, IL-15, IL-12p70) cytokines [1, 2]. However, the exact role of T-helper (Th) cells in the ALI model remains unknown. We hypothesized that there might be Th imbalance within the lung in the early phase of LPS-induced ALI. This study aimed to assess the role of the lung Th polarization response in ALI mice. Methods C57BL/6 mice were randomly divided into two groups: control group and ALI group. ALI animals received 2 mg/kg LPS. Lung wet weight/body weight (LW/BW) was recorded to assess lung injury. Pathological changes were examined under an optical microscope. The mRNA expression levels of T-bet, GATA-3 and RORγt were determined by quantitative real-time reverse transcriptase-polymerase chain reaction. Meanwhile, levels of IL-6, IFNγ, IL-4 and IL-17 in lung homogenates were assessed by enzyme-linked immunosorbent assay. Results An increase in LW/BW was induced in ALI mice. Histologically, widespread alveolar wall thickening caused by edema, severe hemorrhage in the interstitium and alveolus, and marked and diffuse interstitial infiltration with inflammatory cells were observed in the ALI group. Meanwhile, the levels of IL-6 in lung tissue were significantly enhanced in the LPS-induced ALI mice. The mRNA expression of T-bet and RORγt was upregulated in ALI mice at 24 hours and 48 hours relative to normal mice (P <0.05 vs. Con). There was no significant difference in the expression of GATA-3 among groups at 24 hours and 48 hours. Meanwhile, the levels of IFNγ, IL-17 and IL-6 in lung tissue were significantly enhanced at 24 hours and 48 hours in the LPS-induced ALI mice. In addition, the levels of IL-4 in lung tissue were significantly enhanced at 48 hours. The expression of T-bet mRNA and RORγt mRNA had a strong correlation with the IL-6 concentration. However, there was no significant correlation of GATA-3 with the IL-6 concentration. In addition, there was a significant correlation of IFNγ, IL-4 and IL-17 with the IL-6 level in LPS-induced ALI at 24 hours and 48 hours. Conclusion ALI provokes a Th1 and Th17 polarization response.
Th1 and Th17 cells may participate in the early inflammatory response in ALI. Introduction Dendritic cells (DC) may play an important role in acute lung injury (ALI) [1]. CD80 is a crucial co-stimulatory molecule expressed on the surface of DCs. However, little is known about its expression in ALI. The purpose of this study was to observe the expression of CD80 on circulating, lung and splenic dendritic cells (DC) in ALI mice. Methods Twelve C57BL/6 mice were randomly divided into two groups: control group and ALI group. Blood, lungs and spleens were harvested at 6 hours after LPS or PBS administration. The level of CD80 on DC was assessed by flow cytometry (FCM). The IL-6 level in the lung was measured by enzyme-linked immunosorbent assay. Lung wet weight/body weight (LW/BW) was recorded to assess lung injury. Meanwhile, pathological changes were examined under an optical microscope. Results LPS-induced ALI resulted in a significant increase in lung W/D ratio. Histologically, widespread alveolar wall thickening caused by edema, severe hemorrhage in the interstitium and alveolus, and marked and diffuse interstitial infiltration with inflammatory cells were observed in the ALI group. Meanwhile, the levels of IL-6 in lung tissue were significantly enhanced in the LPS-induced ALI mice. FCM analysis showed that in the control group the level of CD80 on circulating DC was (3.3 ± 1.5)%, CD80 expression on lung DC was (3.6 ± 1.2)%, and expression of CD80 on splenic DC was (9.0 ± 3.6)%, the latter significantly higher than on circulating and lung DC (P <0.05). In ALI mice, the level of CD80 on peripheral blood DC was (5.1 ± 2.1)%; the CD80 level on lung DC was (9.6 ± 2.5)%, significantly higher than on peripheral blood DC (P <0.05); and the level of CD80 on splenic DC was (25.2 ± 4.7)%, significantly higher than on peripheral blood and lung DC (P <0.05). The CD80 level on lung and splenic DC in ALI mice was significantly higher than in control mice (P <0.05 vs. Con). Conclusion There is a dynamic pattern in the expression of CD80 on DC populations in normal and ALI mice. Elevated expression of CD80 on DC seems to play an important role in the pathogenesis of ALI. Acknowledgements Supported by the Research Projects CPSFG 2013M542578, JSPSFG 1301005A, SYS201251 and 2013NJZS50. Introduction Two recent RCTs (OSCAR and OSCILLATE [1, 2]) showed that high-frequency oscillatory ventilation (HFOV) had no positive impact on mortality. We present our experience over 5 years. Methods Adult ARDS patients who received HFOV from 2008 to 2012 were included. Demographics, illness severity and outcomes were collected retrospectively. Results A total of 118 patients were included; 56.8% were male and the mean age was 54.8 years. RRT use was 45% during admission. Vasoactive agent use and the neuromuscular blockade infusion rate were 81.9% and 29.7% pre HFOV, respectively. The 28-day and 6-month mortality rates were 61.9% and 70.3%. A total of 60.1% had less than 48 hours of conventional ventilation (CV) pre HFOV; the 6-month mortality was 64.8% for this group. Patients who had over 48 hours of CV pre HFOV had a 6-month mortality of 76.6%. See Table 1. Conclusion Mortality rates were higher than in recent trials [1, 2]. Our patients represent a more critically unwell group with lower PF ratios pre HFOV and high vasoactive and RRT use.
HFOV may still have a role in the treatment of these very sick patients who are refractory to conventional ventilation. Introduction Losartan, an antagonist of the angiotensin II (Ang II) type 1 receptor, is a potential therapeutic drug for acute lung injury (ALI). Recent reports suggest that losartan inhibits the T-helper (Th)-1 immune response and ultimately attenuates inflammation in several angiotensin II-mediated inflammatory diseases [1, 2]. However, the possible protective mechanisms of losartan in ALI remain poorly understood. This study aimed to assess the effect of losartan on the lung Th polarization response in ALI. Methods C57BL/6 mice were randomly divided into three groups: control group, ALI group and ALI + losartan group. ALI animals received 2 mg/kg LPS; ALI + losartan animals received 2 mg/kg LPS and 15 mg/kg losartan 30 minutes before intratracheal injection of LPS. Pathological changes were examined under an optical microscope. The mRNA expression levels of T-bet, GATA-3 and RORγt were determined by quantitative real-time reverse transcriptase-polymerase chain reaction. Results The increase in LW/BW induced by LPS was partly prevented by pretreatment with losartan. Histologically, losartan effectively attenuated the LPS-induced lung hemorrhage and leukocyte infiltration in the interstitium and alveolus. Meanwhile, the levels of IL-6 in lung tissue were significantly enhanced in the LPS-induced ALI mice; with pretreatment with losartan, the level of IL-6 in lungs markedly decreased. The mRNA expression of T-bet and RORγt was upregulated in ALI mice at 24 hours and 48 hours relative to normal mice (P <0.05 vs. Con). There was no significant difference in the expression of GATA-3 among groups at 24 hours and 48 hours. Of note, pretreatment of ALI mice with losartan resulted in significantly reduced mRNA expression of T-bet at 24 hours and 48 hours and of RORγt at 48 hours (P <0.05 vs. ALI). Meanwhile, the levels of IFNγ, IL-4, IL-17 and IL-6 in lung tissue were significantly enhanced at 24 hours and 48 hours in the LPS-induced ALI mice. In addition, both IFNγ and IL-17 in lung tissue at 24 hours and 48 hours decreased significantly in losartan-pretreated mice compared with the ALI mice. With pretreatment with losartan, the level of IL-4 in lungs was not changed. Conclusion The Ang II-induced Th1 and Th17 polarization response could upregulate the inflammatory response and induce lung injury, and losartan may be a promising substance for clinical use in LPS-induced ALI. Introduction The prevalence of infections by multidrug-resistant (MDR) organisms has increased over the last decades, with implications not only for the overall level of therapeutic success, but also for the selective pressure exerted by the use of broad-spectrum antibiotics to defeat increasingly resistant agents, thereby creating a vicious cycle [1, 2]. The aim of this study is to describe risk factors associated with infection by MDR organisms among septic patients. Methods A retrospective cohort study including all adult patients with microbiologically documented sepsis admitted to the emergency room of a tertiary care, university hospital between 1 July 2011 and 30 June 2012. Results During the study period, 162 patients were admitted to the emergency room with severe sepsis; 79 (49%) had microbiological documentation and were included in this study. The mean (SD) age was 71 (15) years. Categorical variables were compared using the Fisher test.
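The bivariable screening used in this abstract (Fisher test comparisons, with odds ratios and confidence intervals as detailed just below) can be illustrated with a minimal sketch; the 2×2 counts are hypothetical and scipy is assumed, since the abstract does not state the software used:

```python
# Hypothetical illustration of the bivariable screening described here:
# Fisher's exact test on a 2x2 table and an odds ratio with a Wald 95% CI.
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = MDR / non-MDR, columns = exposed / unexposed
table = np.array([[20, 15],
                  [10, 34]])

odds_ratio, p_value = fisher_exact(table)

# Wald 95% CI for the odds ratio, computed on the log scale
log_or = np.log(odds_ratio)
se = np.sqrt((1.0 / table).sum())
ci_low, ci_high = np.exp(log_or - 1.96 * se), np.exp(log_or + 1.96 * se)

print(f"OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), P = {p_value:.3f}")
# Variables with P < 0.20 in such bivariable analyses would then be carried
# forward into the multivariable logistic regression described below.
```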
Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated to evaluate the strength of any association. Continuous variables were compared using the Student t test or the Wilcoxon test as appropriate. Multivariable analysis was performed using multiple logistic regression; variables with P <0.20 in bivariable analyses were considered for inclusion in the multivariable model. Introduction Clostridium difficile infection is becoming more common worldwide. Critically ill patients are at particularly high risk for this disease due to multiple risk factors in this population. Accurate diagnosis is essential for patient management, infection control and epidemiology. There are a variety of methods to detect the presence of toxigenic C. difficile in stools [1, 2]. The aim of this study was to evaluate the incidence of C. difficile infection in our ICU. Laboratory results of C. difficile toxin detection performed by the methods available in our institution are presented. Methods During the last year, all stool specimens received in the microbiology department from patients hospitalized in the 30-bed, multidisciplinary ICU of a tertiary-care hospital were evaluated. Specimens were ordered by physicians in the presence of clinical features compatible with C. difficile-associated infection. Each specimen was subjected to diagnostic tests for C. difficile infection, including toxin enzyme immunoassays for C. difficile toxins A and B detection (DUO Toxin A&B; VEDA.LAB, France) and glutamate dehydrogenase (GDH) for cell wall antigen detection (C. DIFF Quik Chek Complete®; USA). Results During the study period, 335 stool specimens were evaluated. Results obtained with the two-stage immunoassay tests are shown in Figure 1. All infected patients were treated with metronidazole or vancomycin. Following a course of therapy, 2% of the infected patients had recurrence or relapse. The crude mortality rate was 17%. Conclusion GDH antigen was positive in 12% of the stool specimens received from ICU patients with suspected C. difficile-associated infections. The majority of these specimens (51.4%) produced both C. difficile toxins A and B, whereas toxin B alone was produced in 31.4% and toxin A alone in the remaining 17.2%. Results Thirty-seven arterial lines returned no growth (77.08%). Seven cultures grew organisms likely to be contaminants (14.58%). Four cultures grew significant organisms (a yield of 8.33%). There were two cases with documented clinical signs of catheter-related local infection (CRLI) at the arterial line puncture site. In one case of CRLI the primary source of infection was felt to be remote from the arterial line; the second represented a local infection with organisms that are typically skin commensals. Of the four cultures likely to represent invasive pathogens, three had clinical suspicion that the primary source was a site remote from the arterial line. In two of these cases this was confirmed by growing the same organism at an alternative site more likely to be the source of infection. Conclusion These results suggest that CRLI rates for arterial lines are low, at 1.003 per 1,000 arterial line-days. However, there is a significant bacterial colonisation rate among the arterial lines sampled. Three arterial lines (6.25%) grew organisms that could represent an important potential source of ongoing bacteraemia. There is evidence to suggest that the risk of infection and colonisation of arterial lines may be similar to that of central lines [1].
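Rates such as the 1.003 CRLI per 1,000 arterial line-days quoted above (and the CR-BSI per 1,000 dwell-days used in the next abstract) are incidence densities. A minimal worked sketch follows; the line-day denominator is back-calculated from the quoted rate and the two documented CRLI cases, so it is illustrative only:

```python
# Incidence density: events per 1,000 device-days.
def rate_per_1000_days(events: int, device_days: int) -> float:
    """Events per 1,000 device-days (e.g. CRLI per 1,000 arterial line-days)."""
    return 1000.0 * events / device_days

# The two documented CRLI cases over ~1,994 line-days reproduce the quoted rate.
print(rate_per_1000_days(2, 1994))  # ~1.003 per 1,000 arterial line-days
```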
Further prospective work is needed to assess the impact of catheter care bundles on colonisation and infection rates of arterial lines in the ICU. Introduction Central venous catheters (CVCs) are essential for the delivery of medications and fluids in the ICU patient; however, they carry a substantial infection risk. Evidence from a collaborative cohort study suggests bundled interventions can provide a sustained decrease in infection [1]. Methods Since December 2009, data have been collected daily on the number of patients with one or more CVCs. All positive blood cultures are reviewed monthly against predefined criteria to judge whether they are genuine bacteraemic episodes and thus classified as laboratory-confirmed bloodstream infections. A monthly rate of CR-BSI per 1,000 dwell-days is calculated from these data. Since July 2010, a number of interventions aimed at reducing CR-BSI have been introduced to the ICU. Results Data are presented on 15,644 CVC dwell-days over 47 months. Table 1 presents data for three full years and one part year* (January to November 2013). Despite an increase in bed-days per year, there has been a sustained reduction in infection rates and a reduction in dwell-days. Conclusion CR-BSI rates of 1.5/1,000 dwell-days in the first year were similar to the post-intervention rate of 1.4 in the Pronovost study [1]. Subsequent rates have reduced, suggesting we are outperforming the secular trend [2]. The proportion of dwell-days to bed-days was reduced, which may suggest a reduction in the duration and/or quantity of CVC placements. Introduction The prognosis and mortality of patients with candidemia are highly adverse. The aim of this study was to identify risk factors for the development of candidemia in postoperative patients. Methods From 1 July 2010 to 30 June 2013 all postoperative patients (n = 588) admitted to the multivalent ICU of our hospital were enrolled in this study. We recorded age, sex, length of stay in the ICU, APACHE II score upon admission to the ICU, adjusted mortality score, underlying conditions, recent operations, invasive therapeutic procedures and prior usage of antimicrobial agents. Initial bivariable statistical comparisons were conducted using the χ2 test for categorical data and the Student t test or Wilcoxon test for continuous data. Relative risks (RRs) and their 95% confidence intervals (CIs) were calculated. Statistical significance was set at P <0.05. To identify patient characteristics associated with candidemia we used multivariable logistic regression, in which we also included the independent risk factors reported in the recent medical literature. Results of the logistic regression analysis are reported as adjusted odds ratios (ORs) with 95% CIs. Results Analysis of patient factors (age and birth weight) and place of birth (the NICU where the baby was born) showed that only the place of hospitalisation had a significant effect on the ECO infection risk. The highest levels of resistance among all ECO isolates were observed against ampicillin (88.8%) and amoxicillin/clavulanic acid (62.2%). ECO isolates showed very different pulsotypes and dominant epidemic clones were not detected. Cluster analysis based on PFGE of the 90 isolates showed 71 unique types, some of which were less than 70% similar, suggesting a genotypically variable population.
Isolates with identical pulsotypes were usually derived from the same patient (as in the case of 11 isolates) or were isolated from different patients of the same NICU in the same period of time (in the case of seven isolates). The location of the NICU and the site of isolation did not appear to be correlated in the cluster analysis. Conclusion Unfortunately, the presented data indicate that antibiotic prophylaxis in the presence of symptoms such as chorioamnionitis and PROM did not help to reduce the risk of ECO infection in the group of examined infants. In addition, multivariate analysis demonstrated only one significant risk factor for ECO infection among infants with a birth weight <1,500 g: the impact of the NICU. The epidemiology of ECO infections clearly indicated that the observed cases of infection have no connection with horizontal transmission (thus no proof of a link between the observed ECO infections and possible negligence in hand hygiene or excessive congestion in NICUs). This is confirmed by the fact that no epidemic clones were observed (DEC-2011/01/D/N27/00104). Introduction Healthcare-associated infections (HAI) in ICU patients are related to intubation, mechanical ventilation, and central venous and/or urinary catheters. The incidence of HAI is too high, and antibiotic strategies suffer from resistance to the common classes of antibacterial agents [1]. Methods A complex program including staff teaching on the basic approaches of hand hygiene, a microbiological passport of the ICU departments, and detection of the sources of pathogens polluting the treatment area of the ICU was implemented at the Lugansk Regional Clinical Hospital. The incidence of HAI was studied in polytrauma patients from 1999 to 2003 (before the implementation of the program) and from 2008 to 2012 (after its implementation). Results Before the implementation of the infection control program, the incidence of respiratory tract infections in polytrauma patients staying in ICUs was 57.4%. Urinary infections occurred in 51.9% of patients. Surgical site infections were found in 32.8% of patients. A combination of infections was detected in 33.8% of patients. The incidence of multiple-drug-resistant bacterial colonies was 5.8%. In contrast, recent data obtained after the implementation of the infection control program showed that respiratory infections occurred in 29.3% of patients, catheter-related urinary tract infections were detected in 32.8%, and surgical site infections were detected in 8.6% of polytrauma patients. On the other hand, the incidence of multiple-drug-resistant bacterial colonies increased significantly, up to 14.2% (P <0.001). Other types of HAI changed nonsignificantly. Conclusion In the era of total antibacterial resistance, education strategies and organisational approaches (for example, infection control, antibiotic susceptibility surveillance and hand hygiene) are becoming more potent and effective than pharmacological innovations. Polytrauma patients, as one of the most severe categories suffering from HAI, demand a high level of compliance with infection control approaches from doctors as well as from ICU staff and visitors. Introduction The use of contact precautions is recommended to reduce the transmission of these pathogens.
However, there is little research regarding the relationship between the rate of patients with culture-positive findings for Acinetobacter baumannii and the consumption of hand disinfectant. The objective of this study was therefore to evaluate trends in nosocomial bacterial detection, including A. baumannii, and the use of hand disinfectant in our ICU. Methods A single-center, retrospective, observational study was carried out. The results of all cultures (sputum, urine, blood, and so forth) were used to examine trends in the detection of microorganisms in our ICU over the study period. Results No patients with culture-positive findings for multidrug-resistant A. baumannii were identified in this study. The rates of patients with culture-positive findings for multidrug-resistant Pseudomonas aeruginosa (0 vs. 0.05 per 1,000 patient-days, P = 0.99) and P. aeruginosa (6.96 vs. 7.21 per 1,000 patient-days, P = 0.7) were not significantly different between the early period group and the late period group. The rates of patients with culture-positive findings for MRSA (6.96 vs. 8.94 per 1,000 patient-days, P <0.05) and A. baumannii (4.98 vs. 6.51 per 1,000 patient-days, P <0.05) were significantly higher in the late period group than in the early period group. Introduction Observational studies have reported an association with worse clinical outcomes [1], but the effect of antifungal therapy in these patients remains unclear. We designed this pilot study to assess the feasibility of a larger trial and to evaluate inflammatory profiles and clinical outcomes in these patients. Methods We conducted a double-blind, placebo-controlled, multicenter, pilot randomized trial of antifungal therapy in critically ill patients with a clinical suspicion of ventilator-associated pneumonia and airway secretion specimens positive for Candida spp. We also included an observational group without Candida spp. in their airway secretions. We measured the recruitment rate, inflammatory profiles over time and clinical outcomes. Results We recruited 60 patients into the randomized trial and 29 patients into the observational study. Recruitment was halted before the end of the study because of difficulty in recruiting patients. Markers of inflammation and all clinical outcomes were comparable between the placebo and antifungal treatment groups at baseline and over time. At baseline, TNFα levels were higher in the VAP with Candida group compared with the observational group (mean ± SD). Methods Interindividual variability in PK parameters was ascribed to an exponential model [1] according to the equation $\theta_j = \theta_p \times \exp(\eta_j)$, where $\theta_j$ is the estimate of a pharmacokinetic parameter in the jth patient, $\theta_p$ is the typical population PK parameter value ($k_a$, $CL/F$, $V/F$), and $\eta_j$ is a random variable from a normal distribution with zero mean and variance $\omega^2$. Residual variability was estimated using additive and additive-proportional error models, $C_{ij} = \hat{C}_{ij} + \varepsilon_{\mathrm{add}}$ and $C_{ij} = \hat{C}_{ij}(1 + \varepsilon_p) + \varepsilon_{\mathrm{add}}$, where $C_{ij}$ and $\hat{C}_{ij}$ are the observed and predicted concentrations for the jth patient at time i, respectively, and $\varepsilon$ is the error, a random variable from a normal distribution with zero mean and variance $\sigma^2$. Bayesian estimates were obtained and the pharmacokinetic parameters Cmax, Tmax and AUC0-24 were calculated (a numerical sketch follows below). Results The Cmax of PZA was above the recommended concentration (>20 mg/l). For RIF the Cmax was below the recommended level (>8 mg/l), and the Cmax of INH was below the recommended level (>3 mg/l). See Table 1.
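As a purely illustrative companion to the model above, the following is a minimal sketch of a one-compartment oral model with exponential interindividual variability; the population values, variance and dose are hypothetical and are not taken from the study:

```python
# Minimal sketch: one-compartment first-order absorption model with
# exponential interindividual variability, theta_j = theta_p * exp(eta_j).
# All parameter values and the dose below are hypothetical.
import numpy as np

def conc(t, dose, ka, cl_f, v_f):
    """Concentration (mg/l) for a one-compartment oral model."""
    ke = cl_f / v_f                        # elimination rate constant (1/h)
    return dose * ka / (v_f * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(0)
theta_p = {"ka": 1.5, "cl_f": 3.5, "v_f": 40.0}  # hypothetical population values
omega = 0.3                                      # hypothetical sqrt(omega^2)

# Individual parameters: theta_j = theta_p * exp(eta_j), eta_j ~ N(0, omega^2)
theta_j = {k: v * np.exp(rng.normal(0.0, omega)) for k, v in theta_p.items()}

t = np.linspace(0.0, 24.0, 481)                  # 0-24 h time grid
c = conc(t, dose=600.0, **theta_j)               # hypothetical 600 mg oral dose

cmax, tmax = c.max(), t[c.argmax()]
auc_0_24 = float(((c[1:] + c[:-1]) / 2 * np.diff(t)).sum())  # trapezoidal AUC
print(f"Cmax = {cmax:.1f} mg/l at Tmax = {tmax:.2f} h; AUC0-24 = {auc_0_24:.0f} mg*h/l")
```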
Conclusion Large interindividual pharmacokinetic variability and concentrations below the recommended levels for RIF and ISO. We need to monitor drugs and to re-evaluate the doses. Introduction Staphylococcus epidermidis (SE) is the most often isolated species of coagulase-negative staphylococci, which are recognized as one of the main causes of ICU infections [1] . In this study we aimed to study the resistance profi le of SE clinical isolates against last-line antibiotics (vancomycin (VA), teicoplanin (TEC), linezolid (LZ) and daptomycin (DA)) for treating CNS infections, during an 8-year period. Methods From January 2005 until December 2012 we examined 518 nonduplicated SE isolates recovered from blood cultures of 421 patients hospitalized in a surgical ICU of our hospital. Species identifi cation and susceptibility testing were performed using the automated VITEK II system (Biomerieux). Additionally we used the E-test method (Biomerieux, ABI-Biodisk) in order to confi rm some isolation resistances against TEC and LZ found by the VITEK II system and to estimate the MIC levels of DA and VA. Mueller-Hinton agar adjusted to contain physiologic levels of free calcium ions (50 μg/ml) was used when testing DA susceptibility. Isolates with MIC >4 mg/l were considered resistant to TEC and LZ and those with MIC <1 mg/l and MIC <4 mg/l susceptible to DA and VA, respectively. Results The percentage resistance rate of the examined SE isolates is shown in Table 1 . Methicillin resistance was observed with an overall prevalence of approximately 84.6%. All of the resistant isolates to TEC and LZ were also resistant to methicillin. The MIC values of VA were lower than 2 mg/l (Table 1) . Conclusion The examined SE isolates present a scattered resistance to TEC and they show a remarkable continuing increase of resistance to LZ. These fi ndings enforced the necessity to take the appropriate measures in the ICU environment and during the clinical practice to limit the dissemination and the amplifi cation of these resistances. DA and VA possess an excellent in vitro activity against SE isolates and they could be very good alternative solutions for treating ICU infections caused by this species. Introduction Patient-to-patient transmission enables vancomycinresistant enterococci (VRE) outbreaks. Outbreak management is expensive and time consuming, and therefore the possibility of VRE eradication is desirable. Since vancomycin is scarcely absorbed in the gastrointestinal tract, treatment with vancomycin per os may result in very high gastrointestinal concentrations (many times the minimum inhibiting concentration (MIC)). The purpose of this study is to measure in vivo gastrointestinal concentrations of vancomycin in patients that are treated with a standard dose orally, and to investigate in vitro whether vancomycin is able to kill VRE at concentrations up to 2,000 times the MIC. Methods The faecal vancomycin concentration was measured in eight patients who suff ered a Clostridium diffi cile infection and were treated with 4×500 mg vancomycin orally per day. In vitro, a (1:2) dilution series of vancomycin (range 6,250 to 0.4 μg/ml) was created and 1 ml vancomycin solution was then added to 1 ml standardized inoculum. One vancomycin-susceptible enterococcus isolate (VSE, MIC = 3 μg/ ml) and two VRE isolates (MIC = 16 μg/ml) were studied. After 1, 7 and 14 days incubation at 35°C, growth was defi ned as macroscopic visible turbidity. 
To test for surviving bacteria, all inocula were cultured on sheep blood agar plates, which were read after 24-hour incubation at 35°C. E-tests to measure MIC were performed on relevant samples. The in vitro experiment was performed twice. Results The faecal vancomycin concentration in patients treated orally with vancomycin was 8,000 μg/ml on average. VSE growth at day 14 was detected at up to 1.5 μg/ml vancomycin, whereas VRE growth was detected at up to 98 μg/ml. The MIC of the VRE isolates growing at 98 μg/ml vancomycin was increased (≥256 μg/ml). For both VSE and VRE, surviving bacteria were detected at very high concentrations of vancomycin (>98 μg/ml); the MIC of these survivors was not increased. Conclusion Oral treatment with vancomycin results in extremely high faecal concentrations. At these high concentrations, VRE bacteria are killed in vitro; however, a minority of the VRE are able to survive. Vancomycin thus seems unsuitable for eradication. However, high concentrations of vancomycin dramatically reduce the bacterial VRE load. Oral treatment with vancomycin may therefore help to terminate VRE outbreaks: a dramatic reduction in the bacterial load of the colonised patient will minimise the risk of patient-to-patient transmission. Introduction The optimal duration of antibiotic treatment in critically ill patients remains a subject of debate. In our multidisciplinary ICU, a short course of antibiotic monotherapy (5 to 7 days) is generally used as a strategy to treat bacteraemia, unless specifically indicated otherwise (for example, endocarditis, osteomyelitis). We aimed to determine the impact of this strategy on antibiotic resistance patterns and patient outcomes compared with a similar exercise we conducted in 2000 [1]. Methods We conducted a retrospective study of all patients with bacteraemia or fungaemia (community-acquired, hospital-acquired and ICU-acquired) treated in our university hospital ICU over a 6-month period (December 2012 to May 2013). We compared this against data from blood culture-positive patients admitted between February and July 2000. Information was collected on bacteraemia episodes, causative pathogens, antimicrobial resistance patterns, antibiotic use and duration, and patient outcomes. Notably, our ICU admits many immunosuppressed patients (for example, haemato-oncology). Results Table 1 presents demographics and the incidence of bacteraemia. Antimicrobial resistance remained low in the 2013 cohort, with few multi-resistant Gram-negative organisms, few fungaemia episodes and a marked decrease in methicillin-resistant Staphylococcus aureus (MRSA) (Figure 1). The number of relapses and breakthrough bacteraemias remained low. Conclusion A strategy of short-course antibiotic monotherapy is associated with low breakthrough and relapse rates and a low rate of antibiotic resistance. Community-acquired bacteraemia: 57%; 6 (5 to 6) vs. 65%; 5 (3 to 5). Hospital-acquired bacteraemia: 78%; 6 (5 to 8) vs. 63%; 5 (4 to 7). ICU-acquired bacteraemia: 80%; 5 (5 to 7) vs. 62%; 4 (3 to 6). Introduction The administration of timely and appropriate antibiotic therapy is a well-known prognostic factor among severe sepsis patients [1-4]. The purpose of this study is to describe the magnitude of the impact of early and appropriate empirical antibiotic therapy on hospital mortality.
Conclusion Contrary to what has been described previously, early and appropriate empirical antibiotic therapy was not associated with better prognosis. The most probable explanation is the higher compliance found with the current recommendations, reinforcing the need for periodic audits and feedback to the team. Introduction Candida spp. are increasingly isolated in the critically ill, but the clinical significance of this is hard to establish [1]. Candida spp. colonization has been suggested as a risk factor for ventilator-associated pneumonia (VAP) [2]. The efficacy and safety of inhalational amphotericin B (AB) are unknown [3]. The hypothesis was that inhalational AB deoxycholate is a safe and effective treatment for Candida spp. colonization of the respiratory tract and thereby prevents VAP and a prolonged need for mechanical ventilation. Results Administration of inhaled tobramycin (IT) as an adjunct to systemic antibiotics was associated with a decrease in systemic inflammation and signs of acute respiratory insufficiency 2.3 ± 1.2 days after treatment onset (vs. 6.3 ± 1.5 days in group 2, P = 0.03). A decrease of the microbial titer to 10^3 to 10^4 CFU/ml was detected in both groups by days 5 to 7, but it was reliable in 80% of the patients of group 1 (P <0.02). It is noteworthy that 21% of group 1 patients were in vitro resistant to tobramycin, yet it was clinically effective, probably due to a local superconcentration. Treatment with IT was associated with an increase in the sensitivity of microbes to antibiotics to which they were previously resistant (32% of patients), probably due to IT effects on biofilms. De-escalation of antibiotic therapy was possible in group 1 by day 5 in 42% of patients. Treatment with IT made it possible to wean 40% of patients by day 5.3 ± 1.8 after treatment cessation (vs. 35% and 11.2 ± 1.3 days in group 2, P = 0.02). Hearing loss and tinnitus were detected in only three patients of group 1. There were no cases of bronchospasm. Mortality was 12% (n = 3) in group 1 and 16% (n = 4) in group 2 (P >0.05), and was not related to progression of NP [1]. Conclusion Administration of IT as an adjunct to systemic antibiotics is effective in the treatment of NP caused by multiresistant Gram-negative bacteria in sepsis. Introduction Appropriate antibiotic prophylaxis plays a crucial role in preventing sternal wound infection after cardiac surgery [1]. In institutions with a high prevalence of methicillin-resistant Staphylococci, vancomycin prophylaxis is recommended either as monotherapy or as an adjuvant agent [2]. In our study, we assessed sternal wound infection rates before and after the introduction of a vancomycin prophylaxis protocol. Methods Twenty-six of a total of 227 consecutive cardiac surgical patients, between July and December 2012, developed sternal wound infection (Group A). All of the patients received standard empirical antibiotic prophylaxis. From January to July 2013, 308 patients underwent cardiac surgery (Group B). In this group, we applied a more restricted antibiotic protocol, considering the resistance patterns and the results of microbiological tests of Group A. We also evaluated the results of MIC susceptibility testing of five antibiotics: oxacillin, linezolid, daptomycin, teicoplanin and vancomycin. In the new protocol the first vancomycin dose was given 1 hour before sternal incision, followed by three additional doses (48 hours duration).
Thirty-three of these yielded significant results (6.1%) but only nine (1.7%) were deemed clinically useful. A total of 102 cultures were taken in the first 24 hours in the ICU, of which 17 were positive (16.7%) and six were useful (5.9%). Fifty-three were taken over the next 24 hours, of which two were positive and two were useful (3.8%). Forty-four cultures were taken over the following 24 hours, of which two were positive (4.5%) and one was useful (2.3%). A total of 343 cultures were taken subsequent to this in the ICU, of which 12 were positive (3.5%) and three were deemed useful (0.9%). Of 22 blood cultures taken in the 24 hours post ICU, one was positive and deemed useful (4.5% yield). Conclusion These results demonstrate overall low clinical utility of blood cultures, and specifically that the utility of ICU cultures is significantly lower than that of pre-ICU cultures (1.7% vs. 6.7%; P = 0.0001). The yield of positive blood cultures and their clinical utility also decrease during the ICU stay. This may reflect appropriate empirical antibiotics and a lower bacteraemia burden in later illness. Given the low clinical yield and a lack of established sensitive or specific triggers [1], we suggest further work in the ICU setting to maximise utility whilst minimising harm. Introduction The Surviving Sepsis Campaign (SSC) has developed guidelines to promote evidence-based management of patients with severe sepsis [1]. Improvements in bundle compliance have been demonstrated over time, but compliance remains below 40%. Sepsis has been studied in acute and critical care environments, but little research has focused on management in level 1 wards. An initial audit of patients with sepsis who were referred to the GSTT critical care outreach team revealed very low overall compliance with the SSC 3-hour bundle. A novel quality improvement campaign was instituted with the aim of improving bundle compliance. Methods A retrospective cohort study in a university hospital was performed. Patients on level 1 wards with severe sepsis registered in the adult critical care response team (CCRT) database in November 2012 were identified (Cohort A). Physiological observations, track and trigger scores, compliance with the 3-hour bundle elements (measured lactate, blood cultures before antibiotics, fluid challenge, early antibiotics), antimicrobial stewardship and 28-day mortality were recorded. Following this, a quality improvement project was initiated: central to this was an electronic 'SEPSIS' order set, containing appropriate investigations and a step-by-step management guide for use on level 1 wards. A 'viral' print and social media campaign was also undertaken. Compliance with the SSC early care bundle was re-examined in two cohorts of patients in July 2013: patients referred to the CCRT as before (Cohort B) and patients who had the electronic order set activated (Cohort C). Results The mean age of all patients studied (n = 79) was 66.5 years. Fifty-three per cent of the patients were male. Thirty-one per cent were in septic shock at the time of sepsis identification. Overall SSC bundle compliance was 6.6% (Cohort A), 24% (Cohort B) and 45.5% (Cohort C). Improvements in other bundle parameters were also seen, including blood cultures (54%, 72%, 91%), antibiotic administration (50%, 69%, 76%) and fluid administration in septic shock (50%, 42%, 75%) in Cohorts A, B and C respectively. Conclusion Baseline compliance with the SSC 3-hour bundle on level 1 wards was very low.
An electronic sepsis order set was associated with marked improvement. Novel quality improvement methodology may be important to achieve optimal compliance with evidence-based guidelines, and an electronic sepsis order set is recommended. Introduction Lung and cardiac operations cause significant changes in fluid balance and thus carry a high incidence of postoperative acute kidney injury (AKI). The renal resistive index (RRI), calculated from the pattern of renal artery flow, is an indicator of renal arterial flow and resistance. In this work, we aimed to evaluate the utility of the RRI for the early prediction of postoperative kidney injury in major lung and cardiac operations. Methods Twenty-two patients who underwent lung or cardiac surgery were included in the study. After the kidneys were localized by ultrasonography, the regions of best blood flow were identified using color Doppler, and the arterial waveforms of these regions were obtained and optimized by Doppler. The measurements taken from three different regions were averaged. The RRI was calculated preoperatively and at the first and 24th postoperative hours. Results A significant correlation was established between the RRI and postoperative creatinine levels (P <0.01). RRI values peaked on the first postoperative day, whereas creatinine levels peaked on the third postoperative day. Although there was no correlation between postoperative creatinine level and duration of hospital stay, a significant relationship was detected between duration of ICU stay and creatinine levels (P <0.01). When the cases were divided into two groups with RRI less than (n = 13) or greater than (n = 9) 0.7, significant differences were present with regard to age and creatinine levels. Conclusion The RRI, which is used to evaluate renal arterial flow, is directly related to increasing renal vascular resistance in AKI. The usefulness of the RRI for prediction of AKI has been shown both clinically following renal allografting and experimentally in acute tubular necrosis models [1, 2]. It has also been asserted in septic patients that the RRI is a better marker for predicting the development of AKI than cystatin C, one of the currently popular markers [3]. Our results show that the RRI could be a simple, non-invasive and useful technique for early diagnosis of AKI in patients undergoing major operations such as lung and cardiac surgery. Introduction We assessed the effect of fluid balance (FB) on acute kidney injury (AKI) classification/prognosis in cardiac surgical patients by comparing patients classified with AKI before and after adjusting the creatinine (used to classify AKI) for FB. Fluid accumulation is associated with negative outcomes, including the development of AKI, in critically ill patients [1]. Cardiac surgical patients commonly receive large volumes of fluid postoperatively and could be at risk of the harmful effects of fluid accumulation. Furthermore, fluid accumulation may influence the serum creatinine concentration and mask AKI [2]. Methods We performed a retrospective analysis of prospectively collected data on all cardiac surgical patients admitted to St Vincent's Hospital ICU, Melbourne, Australia from 1 July 2004 to 30 June 2012.
AKI Network creatinine criteria were used to classify AKI in the usual manner and then using FB-adjusted creatinine (FB at 18 hours, assuming total body water to be 60% of body weight). FB (total i.v. input minus (total urine output + chest drain losses)) was calculated for 18 hours post surgery, as most patients were in the ICU for this period. Results The proportion of patients classified with AKI increased from 27.7% to 37.2% (n = 2,171) after adjusting creatinine for FB. Patients were categorised into four groups based on the presence or absence of AKI before and after adjustment for FB: group A, no AKI before or after adjustment; group B, no AKI before/AKI after; group C, AKI before/no AKI after; and group D, AKI before and after. Group B (n = 209) had an in-hospital mortality rate similar to patients in group D (n = 599) (3.4% vs. 4.3%, P = 0.53) and greater than those in group A (n = 1,333) (3.4% vs. 1.6%, P = 0.07). Group B also had an ICU mortality rate similar to patients in group D (2.9% vs. 2.7%, P = 0.88) and significantly greater than those in group A (2.9% vs. 0.7%, P = 0.003). The need for renal replacement therapy (RRT) in group B was as high as for patients in group D (7.7% vs. 12.4%, P = 0.06) and was significantly greater than for those in group A (7.7% vs. 1.6%, P <0.001). Thus, hospital and ICU mortality and use of RRT in patients classified with AKI only after adjustment for FB were similar to those of patients with AKI before and after adjustment, and were notably higher than those of patients without AKI. Conclusion Lack of adjustment for FB post cardiac surgery may mask the presence of AKI that is associated with increased risk of death and RRT, which could hinder optimal treatment. Introduction Acute kidney injury (AKI) complicates over 50% of ICU admissions and is associated with significantly increased mortality, length of stay, and costs across a broad spectrum of conditions [1]. Methods We performed a single-centre, retrospective analysis of AKI diagnosis in patients with ICU admissions of 5 days or more who survived to hospital discharge between 2009 and 2011. We examined the relationship between hospital length of stay, AKI diagnosis, demographics and clinical characteristics in a multivariable Cox-hazard analysis. Results We identified 700 cases, with a 66% incidence of AKI. AKI was associated with older age, greater initial illness severity and longer ICU and hospital length of stay in univariate analysis (Table 1). In the Cox-hazard analysis, only AKI category and ICU length of stay were significantly associated with a lower probability of discharge over time (Figure 1). AKI-1 was associated with a hazard ratio for hospital discharge of 0.66 (0.55 to 0.79), AKI-2 with 0.55 (0.42 to 0.71) and AKI-3 with 0.54 (0.44 to 0.66) (this type of analysis is sketched below). Introduction Acute kidney injury (AKI) is a significant complication following cardiac surgery, associated with increased morbidity, hospital stay and mortality. Although the estimated incidence of AKI following cardiac surgery is 30%, few studies have identified the incidence of AKI following cardiac surgery as defined by the Kidney Disease: Improving Global Outcomes (KDIGO) group [1]. We conducted a prospective observational study at our institution to identify the incidence and staging of AKI according to the KDIGO definition. We also aimed to identify the factors that predispose adult patients to developing AKI post cardiac surgery.
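A minimal sketch of a Cox proportional-hazards analysis of time to hospital discharge, as in the 700-case AKI analysis above; the data frame values are hypothetical and the lifelines package is assumed, since the abstract does not state the software used:

```python
# Cox proportional-hazards model of time to hospital discharge with AKI stage
# as a covariate (hypothetical data; assumes the `lifelines` package).
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: length of stay (days), discharge event, AKI stage 0-3.
df = pd.DataFrame({
    "los_days":   [12, 30, 8, 45, 20, 15, 60, 25, 9, 33],
    "discharged": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
    "aki_stage":  [0, 2, 0, 3, 1, 0, 3, 2, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="los_days", event_col="discharged")
cph.print_summary()
# A hazard ratio < 1 for aki_stage means a lower probability of discharge
# over time, i.e. a longer hospital stay -- the direction reported above.
```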
Methods A prospective analysis was performed on 103 adult patients admitted to the ICU post cardiac surgery from September to October 2013. Data for perioperative risk factors and renal biochemical markers were collected up to the sixth postoperative day and are expressed as mean (SD). Results Ordered logistic regression was used to analyse the data. Thirty-three per cent of cardiac surgery patients at our institution developed AKI. Factors such as poor left ventricular (LV) function, low preoperative haematocrit and low preoperative glomerular filtration rate (GFR) were significant risk factors for developing postoperative AKI following cardiac surgery (P = 0.03, P = 0.04 and P <0.01 respectively). See Table 1. Conclusion Our results suggest that we should focus on LV function and GFR as predictors of developing AKI following cardiac surgery. Strategies to increase preoperative haematocrit should be investigated to reduce the incidence and severity of postoperative AKI. Introduction Critically ill patients cared for in ICUs often require radiological investigation using iodinated contrast agents. Contrast-induced nephropathy (CIN), a form of acute kidney injury (AKI), is a complication following the use of such contrast. Although CIN has been thoroughly studied in some populations (for example, those undergoing coronary angiography), it has not been investigated in large numbers of ICU patients [1]. Introduction AKI is an inflammatory condition [1]. Our aim was to describe the immune phenotype in human AKI. Methods We enrolled patients with: AKI grade II/III (defined by KDIGO criteria) and systemic inflammatory response syndrome (SIRS) without sepsis; SIRS without AKI; and AKI II/III without SIRS. A healthy control population was used for baseline comparison. Serial blood samples were taken on days 0, 2 and 7. Cells were separated using Percoll gradients and phenotyped using flow cytometry. Results From 24 day-0 samples, statistically significant differences were identified between SIRS, AKI, AKI + SIRS and healthy controls amongst CD8+ cytotoxic T cells, CD45−CD25+++ regulatory T cells, and CD45−CD25++ cytokine-secreting non-regulatory T cells (Table 1). The percentage of CD69-positive neutrophils was significantly increased across all three groups relative to controls, with little variation between AKI, SIRS and AKI + SIRS patients. Introduction Approximately 50% of acute kidney injury (AKI) is associated with sepsis. Neutrophil gelatinase-associated lipocalin (NGAL) and cystatin C are the two most widely used biomarkers for AKI. However, these two markers are also affected by the systemic inflammatory response, and their diagnostic value in sepsis-induced AKI is disputed [1, 2]. Unlike clinical AKI, animal models can be used to explore a single etiology. The purpose of this study is to examine the relationship between inflammatory mediators and biomarkers for AKI in a sepsis model in rats. Methods Sepsis was induced by cecal ligation and puncture (CLP) in 60 adult SD rats, which were then observed for AKI and survival. Blood and urine samples were collected at baseline and at 18, 22 and 48 hours after CLP. AKI severity was assessed by RIFLE criteria (creatinine only). The associations between plasma IL-6 and plasma NGAL, plasma cystatin C, urine NGAL and urine cystatin C were analyzed. The area under the receiver operating characteristic curve (AUC) was used to evaluate the capability of the different biomarkers to discriminate severe AKI (RIFLE-I or RIFLE-F) from no AKI (including RIFLE-R).
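A minimal sketch of the AUC evaluation described in these Methods; the labels and biomarker values are hypothetical and scikit-learn is assumed, since the abstract does not state the software used:

```python
# AUC for discriminating severe AKI from no AKI, as described in the Methods
# above (hypothetical biomarker values; assumes scikit-learn).
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = severe AKI (RIFLE-I/F), 0 = no AKI (including RIFLE-R); hypothetical labels
severe_aki = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1])
# Hypothetical urine NGAL values at 22 hours for the same ten animals
urine_ngal = np.array([310, 250, 410, 380, 290, 330, 620, 300, 550, 400])

auc = roc_auc_score(severe_aki, urine_ngal)
print(f"AUC = {auc:.2f}")  # probability a severe-AKI animal scores higher
```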
Results The changes in plasma NGAL, plasma cystatin C, urine NGAL and urine cystatin C over time were similar to the changes in plasma IL-6. However, only plasma NGAL levels were closely correlated with levels of plasma IL-6 (R² = 0.36, P <0.05). The analysis of plasma cystatin C, urine NGAL and urine cystatin C at 22 hours for severe AKI showed AUCs of 0.78, 0.71 and 0.75 respectively (all P <0.05), whereas the AUC for plasma NGAL was 0.62 (P = 0.11). There were no significant differences in plasma NGAL at 22 hours between severe AKI and no AKI (2,143.32 vs. 2,077.02 U/ml, P = 0.21). Conclusion In this animal model of CLP sepsis, plasma NGAL levels were affected by the systemic inflammatory response and did not discriminate for AKI. Urine NGAL, plasma cystatin C and urine cystatin C were able to differentiate severe AKI from no AKI in CLP sepsis. Introduction The diagnosis of AKI conventionally relies on serum creatinine [1]. However, serum creatinine is an insensitive and nonspecific biomarker [2]. This study was designed to investigate whether a correlation exists between urinary oxygen tension (UOT) and early markers of AKI. The aim was to evaluate whether UOT could provide warning signs of an insufficient renal oxygen supply, which can lead to postoperative AKI. Methods Fourteen subjects undergoing cardiac surgery with cardiopulmonary bypass (CPB) were included in this prospective clinical pilot study. UOT was measured perioperatively in all patients, both in the operating room (before, during and after CPB) and in the ICU. Biomarkers of AKI in blood and urine were measured preoperatively and postoperatively at 3, 6, 12 and 24 hours after the initiation of CPB. These included serum creatinine and the early urinary biomarkers kidney injury molecule-1 (KIM-1), neutrophil gelatinase-associated lipocalin (NGAL) and cystatin C. Student's t tests and Mann-Whitney tests were used to compare continuous variables. Results There was a significant decrease in UOT between the start of CPB (138.44 ± 22.19 mmHg) and the lowest UOT during CPB (107.70 ± 23.28 mmHg) (P = 0.001). Dividing the subjects into two groups according to the Acute Kidney Injury Network (AKIN) classification, no significant differences were found in mean UOT between the group of patients with normal kidney function (n = 7) and the group with AKIN stage 1 or 2 (n = 7). For KIM-1, a significant difference between the two groups was found at 3 hours (P = 0.041) after the initiation of CPB. Further, for NGAL a significant increase in biomarker concentrations compared with the preoperative value was observed in the group with AKIN stage 1 or 2 at all postoperative time points (3 hours (P = 0.013), 6 hours (P = 0.003), 12 hours (P = 0.009) and 24 hours (P = 0.003)). In contrast, no significant increase in urinary NGAL was measured in the group with normal kidney function. Conclusion This pilot study was not able to demonstrate any association between perioperatively measured UOT and markers of postoperative AKI. Additional laboratory and clinical studies will be necessary to further define the relationship between UOT and new biomarkers of AKI. Introduction Acute kidney injury (AKI) in surgical critically ill patients is an independent risk factor for early mortality. Two novel urine biomarkers, insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), may help to detect clinically silent episodes of AKI in the golden hours prior to irreversible damage of the kidney.
We evaluated the early predictive value of these biomarkers for AKI, moderate and severe AKI, early requirement of renal replacement therapy (RRT), and ICU mortality, with a cutoff value of IGFBP7/TIMP-2 >0.3.

Methods Four to six hours after admission to the surgical ICU, urine biomarkers were prospectively evaluated in all patients with present exposures and susceptibilities for AKI according to the KDIGO recommendation. The incidence and severity of AKI (KDIGO 2012) and requirement of RRT were assessed over 48 hours after admission. In addition, ICU mortality and variables such as norepinephrine dose, mean arterial pressure, hemoglobin level, cumulative fluid balance and urine production were noted at the time of biomarker evaluation (4 to 6 hours) and for the first 24 hours after admission. The predictive values of the biomarkers were better for early prediction than clinical parameters such as urine output within the first 6 hours after admission to the ICU.

We recently reported a 728-patient multicenter study (Sapphire) in which a biomarker combination of tissue inhibitor of metalloproteinases-2 (TIMP-2) and insulin-like growth factor binding protein 7 (IGFBP7) was validated for risk stratification for moderate or severe acute kidney injury (AKI), KDIGO stage 2 and 3 [1].

Methods We subsequently selected two clinical cutoff values for the TIMP-2 × IGFBP7 combination from the Sapphire dataset, one (0.3) with high sensitivity (89%) (specificity = 50%) and one (2.0) with high specificity (95%) (sensitivity = 42%) for the development of AKI KDIGO stage 2 and 3 within 12 hours of study enrolment (a sketch of how such cutoff operating characteristics are computed follows below). We examined the timing of change in TIMP-2 × IGFBP7 relative to change in creatinine using the sign test.

Results TIMP-2 × IGFBP7 results were available for 178 patients who developed AKI stage 2 or 3. The median TIMP-2 × IGFBP7 result was significantly greater than the cutoff value of 0.3 from 24 hours before to 24 hours after AKI stage 2 or 3 (P <0.01) (Figure 1). Conversely, median serum creatinine was not different from baseline prior to the development of AKI stage 2 or 3.

Conclusion The TIMP-2 × IGFBP7 biomarker combination identifies patients who ultimately develop moderate or severe AKI 24 hours earlier than serum creatinine.

Introduction Computed tomography with contrast medium is a common diagnostic tool in emergency and critical medicine. Contrast-induced nephropathy (CIN) is one of the leading causes of hospital-acquired acute kidney injury. A focus on renal tubule protection may offer hope of improving the effectiveness of current strategies.

Methods We used HK-2 human renal proximal tubule cells to evaluate the therapeutic potential of resveratrol, a polyphenol phytoalexin [1].

In animal models, a single high dose of EPO has been shown to ameliorate reperfusion injury after ischemia [2]. However, recent human studies have shown conflicting results. We aimed to study the effect of a single high dose of EPO given preoperatively on renal function after coronary artery bypass grafting (CABG) in patients with preoperative impaired renal function.

Methods This single-centre, randomized, double-blind, placebo-controlled study included 75 patients scheduled for CABG with pre-existing renal impairment (estimated glomerular filtration rate based on p-cystatin C <60 ml/minute and >15 ml/minute). The patients received either a single high dose of EPO (400 IU/kg) or placebo preoperatively.
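Returning to the Sapphire cutoff selection above: the sensitivity/specificity pair quoted for each cutoff can be derived from a simple threshold rule, as in this hedged sketch with entirely hypothetical data (not the Sapphire dataset):

```python
# Hypothetical sketch of sensitivity/specificity at two biomarker cutoffs.
import numpy as np

rng = np.random.default_rng(1)
test_value = rng.lognormal(mean=-1.0, sigma=1.0, size=500)   # stand-in [TIMP-2]x[IGFBP7]
aki_2_3 = (test_value + rng.normal(0, 0.5, 500)) > 0.8       # hypothetical outcome labels

for cutoff in (0.3, 2.0):
    positive = test_value > cutoff
    sens = positive[aki_2_3].mean()      # true-positive rate among AKI stage 2/3
    spec = (~positive[~aki_2_3]).mean()  # true-negative rate among non-AKI
    print(f"cutoff {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

Raising the cutoff trades sensitivity for specificity, which is why two operating points were chosen.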
The primary endpoint was renal protection evaluated by p-cystatin C on the third postoperative day compared with the preoperative values. The incidence of acute kidney injury and changes in other renal biomarkers were among the secondary endpoints.

Results There was no significant difference in p-cystatin C levels on the third postoperative day (2.1 ± 0.8 mg/l for the study group and 1.9 ± 0.5 mg/l for the control group, P = 0.51). There were no significant differences in other renal biomarkers or measures between the groups (p-NGAL, p-creatinine, p-urea, and estimated glomerular filtration rate). There were no other differences in outcome variables between the groups.

Conclusion Intravenous administration of a single high dose (400 IU/kg) of EPO did not have a renal protective effect in patients with reduced kidney function undergoing coronary artery bypass surgery.

Introduction The aim of this study is to quantify the impact of using eGFR instead of measured creatinine clearance (Clcr) on the evaluation of recovery from acute kidney injury (AKI).

Introduction Acute kidney injury (AKI) is a serious complication following lung transplantation (LTx) [1-3]. We aimed to describe the incidence and outcomes associated with AKI following LTx.

… (n = 3,207). The fluid accumulation percentage (total urine and chest drain losses subtracted from total i.v. intake (l), divided by weight (kg), × 100) was calculated for 18 hours post surgery, as most patients were in the ICU for this period (a worked example follows below). Acute Kidney Injury Network (AKIN) creatinine criteria were used to classify AKI using creatinine adjusted for fluid balance.

Results Renal replacement therapy was performed on 136 patients in this group (4.2%). The fluid accumulation percentage was associated with an 8% increase in the odds of AKI (OR (CI), 1.08 (1.04 to 1.12)) and a 13% increase in the odds of requiring renal replacement therapy (1.13 (1.05 to 1.21)) for each percent increase in fluid accumulation (l/kg%) after cardiac surgery, after adjusting for variables including APACHE score, cardiac failure, type of surgery, and inotrope use in multivariate analysis.

Conclusion In this relatively homogeneous patient group undergoing cardiac surgery, postoperative percent fluid accumulation at 18 hours was associated with AKI and the need for renal replacement therapy. Whether there is residual confounding due to the indication for fluid use or unmeasured risk factors requires further investigation in controlled trials.

Introduction Renal replacement therapy (RRT) in critical care is associated with increased mortality. It is not known for which patients RRT confers the most benefit, or who will recover function and remain dialysis independent on a long-term basis.

Conclusion The reported incidence of AKI in critical care ranges from 20 to 50%, with the highest rates seen in sepsis [1]. Utilisation of CRRT for AKI is higher at our centre than the described 5% [2], potentially due to close collaboration between critical care and nephrology [3] or relatively lenient ICU admission criteria. Increasing mortality was seen with age, APACHE II score and delay in initiation of RRT. Prospective analysis is required to identify biomarkers for AKI and risk factors for mortality: dynamic monitoring of haemodynamic responders (MAP, vasopressor requirement <24 hours), percentage creatinine decrease [4], severity-of-illness scores and urine output.
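Returning to the fluid accumulation abstract above: the percentage it defines is a simple arithmetic quantity. A minimal sketch with hypothetical inputs, in the units stated in that abstract:

```python
# Sketch of the fluid accumulation percentage defined above; inputs are hypothetical.
def fluid_accumulation_pct(iv_intake_l: float, urine_l: float,
                           chest_drain_l: float, weight_kg: float) -> float:
    """Percent fluid accumulation over the observation window (l/kg%)."""
    return (iv_intake_l - (urine_l + chest_drain_l)) / weight_kg * 100

# Example: 6 l in, 2.5 l urine, 0.5 l drain losses, 80 kg patient -> 3.75%
print(fluid_accumulation_pct(6.0, 2.5, 0.5, 80.0))
```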
Results of current RCTs are awaited, which may provide more information on mode of clearance, flow rates and early versus standard initiation of RRT to more accurately prognosticate patients' outcomes.

The very elderly (>80 years) UK population is increasing, and along with it the utilisation of critical care resources [1]. The use of chronological age as a bar to treatment is unethical, yet a discrepancy of opinion remains between the disinclined and the elderly-advocate intensivist [2]. Acute kidney injury requiring dialysis is a frequent complication of critical illness and is associated with high morbidity and mortality. An increased risk is seen with age, owing to the high prevalence of risk factors and chronic kidney disease in the elderly [3].

Methods Very elderly patients admitted to our tertiary referral ITU were included in a 2-year (December 2011 to November 2013) retrospective cohort analysis. The outcome of those receiving RRT was reviewed.

Results There were 2,297 admissions to the ITU, of which 323 (14%) were classified as very elderly, with a mean APACHE II score of 17.3. RRT was utilised in 12.4% (n = 40) of the very elderly patients, whose mean APACHE II score was 25. Forty-five per cent of AKI was due to sepsis and 25% to emergency surgery. Mortality among very elderly ITU patients was 30.1%; in those who received RRT it was 42% at ITU discharge and 67% at hospital discharge. Recovery of renal function was seen in 19 patients at ITU discharge, of whom 11 survived to hospital discharge. Two patients required ongoing IHD in the community.

Conclusion Current predictions estimate that the very elderly UK population will almost double by 2030 [4]. The biggest uptake in acceptance of chronic RRT has been in the elderly [5], in whom no difference in health-related quality of life from the general dialysis population has been shown [6]. Our experience suggests that RRT has a role in the critically ill elderly patient, with 33% surviving to hospital discharge. The challenge is identifying those most likely to benefit from a cohort with multiple comorbidities, against the available resources, rising demand for critical care in an aging population and increasing expectations [7].

Data are presented in Table 1 as the median and interquartile range.

Introduction The aim of this study is to analyze the functional alterations that may influence the final result when using citrate versus heparin anticoagulation in critically ill patients [1].

Methods We performed a retrospective, analytical study including patients exclusively submitted to citrate or heparin anticoagulation through the years 2011 and 2012. Demographic data were included, with SAPS II, SOFA and RIFLE scores and mortality rate. For the functional analysis we considered the timing of initiation, duration, loss of dose and loss of creatinine clearance. We analyzed dialytrauma, considering the variation of: potassium, total and ionized calcium, magnesium, sodium, phosphorus, pH, lactate, bicarbonate, platelets, albumin, creatinine and urea. Data are presented as means and standard deviations. To assess the influence of dose and clearance losses on mortality, we used logistic regression.

Results The study included 44 patients in the citrate group versus 61 in the heparin group.
We found no statistically significant differences for: age (P = 0.06); SAPS II (P = 0.28); SOFA (P = 0.19); timing of initiation of the technique (P = 0.61), with 34.8% versus 47.5% of patients in R (RIFLE), 27.8% versus 18% in I (RIFLE) and 16% versus 21% in F (RIFLE); duration of the technique (P = 0.74); and length of stay. Although we noticed a greater loss of dose and of absolute creatinine clearance in the citrate group, this did not reach statistical significance (P = 0.18 and P = 0.13). The mortality found for the citrate and heparin groups was 60.4% and 39.4% respectively. Statistically significant differences related to dialytrauma emerged for K+ (P = 0.03), Ca2+ (P = 0.02), Na+ (P = 0.004), platelets (P = 0.002), pH (P = 0.02) and bicarbonate (P = 0.0001).

Conclusion We may say that there are functional differences that must be taken into account. Despite not reaching statistical significance in this sample, losses of dose and creatinine clearance showed a direct relation with mortality.

… and mean aortic cross-clamping of 98 minutes (range 25 to 190). We measured MyG, procalcitonin (PCT), and creatinine (sCr) at ICU admission and, if serum MyG was higher than 600 mg/dl, the patient was treated with CVVHD (EMIC2 membrane, citrate anticoagulation) within 12 hours of ICU admission, for 72 hours and at a dose of 2,000 ml/hour. Biochemical assays were obtained at 12, 24, and 72 hours and at ICU discharge.

Results The pretreatment MyG median value was 10,789 ng/ml; it was reduced by an average of 92.8% during CVVHD (see Figure 1) and remained low at ICU discharge (median value 114 ng/ml). sCr remained stable (average value over time 0.94 mg/dl) during CVVHD; PCT also decreased over time, with a reduction rate of 78% (from 5.35 ± 4.39 mg/dl to 1.23 ± 1.09 mg/dl at the end of CVVHD). Finally, six patients survived at 90 days.

Conclusion This small experience confirms that serum MyG is likely to increase in post-cardiac surgical high-risk patients and suggests a beneficial effect of CRRT with EMIC2 membranes and citrate on serum MyG, potentially preventing AKI. Larger studies are advisable for confirmation.

Introduction Rhabdomyolysis is characterized by breakdown of striated muscle due to a great number of causes. Acute kidney injury (AKI) is a common complication, a consequence of high concentrations of circulating myoglobin (Mb). The degree of AKI can vary but often requires dialysis, a condition which drastically worsens the ICU stay and prognosis. Since Mb overconcentration represents the cause of AKI, one aim of therapy should be its removal, to prevent further kidney damage and to allow faster renal recovery. Both intermittent hemodialysis and high-volume CVVHF are poorly effective in removing Mb, while small-protein leakage membranes seem promising in this setting. The aim of our study was therefore to measure the efficacy of Mb removal of a new high cutoff membrane (EMIC2; Fresenius, cutoff value 40 kDa) for continuous renal replacement therapies (CRRT) in the ICU setting.

Methods We report the results of EMIC2-based treatments in seven patients (four male/three female) with different causes of rhabdomyolysis (trauma, sepsis, limb ischemia). Five patients had classic dialysis indications (persistent anuria), while in two patients treatment was started prophylactically. CRRT was delivered in CVVHD mode with the EMIC2 dialyzer and with loco-regional trisodium citrate anticoagulation.
Mb plasma levels were assessed every 12 hours, while the removal rate and total body and dialyzer clearances were estimated by kinetic modeling as previously described [1]. Clinical data were also collected, and both overall and renal patient survival are reported.

Results The median Mb value at CRRT start was 6,971 ng/ml (range 4,679 to 48,011 ng/ml). CRRT was delivered with an average blood flow rate of 143 ± 45 ml/minute and a dialysate flow rate of 2,134 ± 1,334 ml/hour. These operating conditions allowed treatment to be stopped on average after 75 ± 47 hours (median 54 hours), with a Mb reduction of 82.2% (range 99.4 to 44.4%). Overall median Mb removal per treatment was 59 mg (range 33 to 279 mg), achieved mainly during the first 24 hours of treatment (54 mg, range 20 to 187 mg). Only two patients had residual renal function, which in one case was measured to account for only 7.45 mg of Mb removal during the entire treatment. Six patients survived and recovered renal function, with no dialysis need at present follow-up. One patient died during the ICU stay.

Conclusion Our data demonstrate high performance of the EMIC2 membrane in Mb removal and confirm theoretical models indicating that CRRT with a high cutoff membrane can achieve major Mb removal within 24 hours, with great superiority over all other available techniques.

Introduction Removing middle molecular weight substances, including cytokines and albumin-bound toxins, could be effective for patients with acute liver failure (ALF). We have developed a new system, plasma filtration with dialysis (plasma diafiltration (PDF)) [1,2], and assessed its efficacy in a multicenter analysis.

Methods A subgroup analysis of an observational study conducted in the ICUs of six hospitals. In PDF, simple plasma exchange is performed using a selective membrane plasma separator (Evacure EC-2A; Kawasumi Laboratories Inc., Tokyo, Japan), which has a sieving coefficient of 0.3 for albumin, while the dialysate flows outside the hollow fibers. The flow rates of the blood, dialysate, substitute and additional substitute were 80 to 100 ml/minute, 600 ml/hour, 0 to 450 ml/hour (according to the rate of water elimination) and 150 ml/hour, respectively. As the substitute from the additional fluid line, we added 1,200 ml (150 ml/hour) of fresh frozen plasma followed by 50 ml of 25% albumin, considering the loss of albumin by diffusion. As an anticoagulant, nafamostat mesilate (Torii Pharmaceutical Co. Ltd, Tokyo, Japan) was used at a rate of 15 to 25 mg/hour.

Results The multicenter study ran from October 2005 to August 2011. We performed PDF on 65 patients with ALF (severe sepsis, 22; post operation, 15; fulminant hepatitis, 11; alcoholic hepatitis, 3; graft-versus-host disease, 4; and others, 10). The serum total bilirubin, plasma PT-INR and model for end-stage liver disease (MELD) score before the PDF procedure were 15.0 ± 8.15 mg/dl (average ± SD), 2.3 ± 1.5 and 35.8 ± 9.3, respectively. PDF was performed as 9.2 ± 13.2 sessions per patient and the overall 28-day survival rate was 68.5%. We stratified patients into three categories defined by the MELD score: 15 patients (23%) with a score of 20 to 29, 30 (46%) with a score of 30 to 39 and 19 (29%) with a score over 40.

Introduction Acute liver failure (ALF) is a critical illness with high mortality.
Plasma diafiltration (PDF) is a blood purification therapy in which simple plasma exchange is performed using a selective membrane plasma separator while the dialysate flows outside the hollow fibers. While several studies have demonstrated that PDF is a useful blood purification therapy for patients with ALF, PDF is often difficult to employ in ALF patients complicated by multiple organ failure, especially those with unstable hemodynamics. Furthermore, it is likely to recur immediately after PDF therapy. We developed continuous PDF (CPDF) as a new concept in PDF therapy, and in this study assessed its efficacy and safety in ALF patients compared with conventional plasma exchange (PE) plus continuous hemodiafiltration (CHDF).

Methods Ten ALF patients (male/female = 6/4, age 47 ± 14) received CPDF therapy. The primary outcomes were altered liver function, measured by the model for end-stage liver disease (MELD) score, total bilirubin and prothrombin time international normalized ratio (PT-INR), 5 days after CPDF therapy. Secondary outcomes included Sequential Organ Failure Assessment (SOFA) scores 5 days after CPDF therapy, and the survival rate 14 days after this therapy.

Results The MELD score (34.5 to 28.0; P = 0.005), total bilirubin (10.9 to 7.25 mg/dl; P = 0.048), PT-INR (1.89 to 1.31; P = 0.084), and SOFA score (10.0 to 7.5; P <0.039) were improved 5 days after CPDF therapy. Nine patients were alive and one patient died due to acute pancreatitis complicated by ALF. The efficacy of CPDF therapy in maintaining liver and renal function was not inferior to PE plus CHDF therapy. Parameters of renal function such as creatinine were also improved 5 days after CPDF therapy. Circulation parameters such as mean arterial pressure and heart rate were maintained without inotropic or vasopressor support during the CPDF treatment period. The oxygenation index (PaO2/FiO2), as a measure of pulmonary function, tended to increase after this treatment. We were able to employ this treatment without any adverse events, such as infections or unstable hemodynamics.

Conclusion In the present study, CPDF therapy safely supported liver function and generally improved the condition of critically ill patients with ALF.

Introduction It has been demonstrated that blood purification performed by means of venovenous hemodialysis with high cutoff membranes (HCO-CVVHD) may modulate the host inflammatory response in septic patients with acute kidney injury (AKI), potentially limiting organ dysfunction. Improvements in hemodynamics and respiratory function have been described during HCO-CVVHD treatment [1]. The Sepsis in Florence sTudy (SIFT) was designed to evaluate changes in inflammatory biomarkers and tissue oxygenation/perfusion indexes in septic ICU patients with AKI during HCO-CVVHD.

Methods Patients with microbiologically confirmed severe sepsis/septic shock and AKI (RIFLE criteria F or more) treated with HCO-CVVHD, started within 12 hours of diagnosis, were prospectively included in the study. The cumulative vasopressor index (CVI), C-reactive protein (CRP) levels, serum lactate concentration (Lac) and central venous oxygen saturation (ScvO2) were measured before (T0h) and at 24 hours and 48 hours after HCO-CVVHD initiation. Data are expressed as the median (range). The Mann-Whitney U test was applied to detect differences in CVI, CRP, Lac and ScvO2 at the three time points (statistical significance for P <0.05).
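As a hedged illustration of the Mann-Whitney U comparison named in the Methods above (values are hypothetical stand-ins, not the SIFT data):

```python
# Illustrative Mann-Whitney U test between two time points, e.g. lactate T0h vs. T48h.
from scipy.stats import mannwhitneyu

lactate_t0 = [5.1, 3.0, 9.5, 4.2, 6.8, 5.5, 3.9, 7.1]   # mmol/l, hypothetical
lactate_t48 = [1.6, 1.0, 4.6, 2.0, 1.8, 2.4, 1.2, 3.1]  # mmol/l, hypothetical

stat, p = mannwhitneyu(lactate_t0, lactate_t48, alternative="two-sided")
print(f"U = {stat}, P = {p:.4f}")  # compare against the 0.05 significance level
```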
Results In 16 ICUs, a total of 16 patients (six cardiac surgery, four abdominal surgery and six medical) met the inclusion criteria and were enrolled in the study. A significant reduction in CRP levels was observed over time: from 263 (216 to 358) mg/dl at T0h to 153 (56 to 186) mg/dl at T48h (P <0.05). ScvO2 significantly increased from 45 (40 to 55)% at T0h to 75 (68 to 77)% at T48h (P <0.05). Finally, serum lactate decreased from 5.1 (3.0 to 9.5) mmol/l at T0h to 1.6 (1.0 to 4.6) mmol/l at T48h (P <0.05). Conversely, CVI did not reduce significantly over time (8.2 (4 to 9) at T0h vs. 4.5 (4 to 8) at T48h, P >0.05).

Conclusion Our preliminary data show that patients with sepsis-related AKI may benefit from early treatment with HCO-CVVHD. The modulation of proinflammatory and anti-inflammatory mediators, as previously demonstrated [1], may improve microcirculation, tissue perfusion and cellular oxygenation. Although promising, our results must be confirmed at the end of the study with larger observations. Finally, a subgroup analysis is mandatory in order to explore different behaviors of tissue perfusion indexes in different patient populations.

Introduction Little information is available regarding ciprofloxacin pharmacokinetics and pharmacodynamics in sepsis patients receiving sustained low-efficiency dialysis (SLED). This study determined the pharmacokinetics and simulated pharmacodynamics of ciprofloxacin in ICU patients during SLED.

Methods This was a prospective evaluation of ciprofloxacin pharmacokinetics in patients with sepsis who were >18 years of age, had a urine output <200 ml/day and were receiving SLED for at least 8 hours. Following informed consent, plasma samples were collected at baseline and at 1, 2, 4, and 8 hours after a ciprofloxacin 400 mg i.v. dose during SLED, and post-SLED at the same times. Dialysate samples were collected at 4-hour intervals during SLED. Pharmacokinetic parameters were determined using WinNonlin and compared between the two periods. Simulated pharmacodynamic parameters were determined for Pseudomonas aeruginosa using an MIC of 2.

Results A total of seven patients (four male, three female, age 56.9 ± 7.6, APACHE II 26.8 ± 2.4) were enrolled. Ciprofloxacin was cleared relatively rapidly, with a half-life of 6.9 hours and an elimination rate constant (Ke) of 0.108/hour during SLED, compared with 11.9 hours and 0.057/hour post-SLED (P <0.05; the relation between these two quantities is sketched below). Simulated pharmacodynamics demonstrated inadequate coverage for P. aeruginosa during SLED, with a Cmax/MIC ratio of 5.7 ± 1.2 and AUC/MIC of 77.5 ± 22.3.

Conclusion Ciprofloxacin is rapidly cleared during SLED, similar to clearance during normal renal function, which may result in inadequate pharmacodynamic coverage for some pathogens.

Introduction Antibiotic dosing for patients with acute renal failure receiving continuous renal replacement therapy (CRRT) is a clinical challenge. The aim of this study was to investigate the pharmacokinetics of meropenem (M) during CRRT.

Methods A prospective, multicenter study was conducted at seven hospitals. Fifteen critically ill patients undergoing either continuous venovenous hemofiltration (CVVHF) or hemodiafiltration (CVVHDF) were included. Serum and ultrafiltrate (UF) levels of M were determined by liquid chromatography. Blood samples were drawn 24 hours after starting CRRT at 08:00 a.m., 09:00 a.m., 10:00 a.m., 01:00 p.m., 06:00 p.m., 08:00 p.m. and 08:00 a.m. of the following day.
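As a hedged aside on the ciprofloxacin kinetics above: under a simple one-compartment assumption, the half-life and the elimination rate constant are linked by t1/2 = ln 2 / Ke. The noncompartmental WinNonlin estimates reported need not match this exactly, but the relation is a useful consistency check:

```python
# One-compartment consistency check: t1/2 = ln(2) / Ke.
import math

for label, ke in (("during SLED", 0.108), ("post-SLED", 0.057)):
    print(f"{label}: Ke = {ke}/h -> t1/2 ~ {math.log(2) / ke:.1f} h")
# during SLED: ~6.4 h (reported 6.9 h); post-SLED: ~12.2 h (reported 11.9 h)
```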
CRRT clearance (Cl), the total amount of M in the UF (MUF), the percentage of the dose extracted by CRRT (EF) and the AUC (ng·hour/ml) were calculated.

Results Nine patients were treated with CVVHDF and six with CVVHF. M (0.5 to 2 g) was administered every 6 to 12 hours by i.v. infusion over 15 minutes. Data (mean and SD) concerning the dialysate flow rate (DF; ml/hour), blood flow rate (ml/minute) and the average UF rate (ml/kg/hour) for the CRRT techniques are shown in Table 1.

Introduction Acute kidney injury (AKI) in the critically ill is an independent risk factor for adverse outcome [1]. Previously, it was suggested that high-volume haemofiltration (HVHF) may confer a mortality benefit and lead to a reduction in organ failure compared with standard ultrafiltration rates (UF) [2]. This was not confirmed by a recent investigation [3]. It has also not been determined whether ideal …

Introduction Sepsis-induced immunosuppression has long been considered a factor in the late mortality of sepsis patients, but little is known about the immunity of immunocompetent cells and the effect of polymyxin B-immobilized fiber hemoperfusion therapy (PMX-DHP) on sepsis-induced immunosuppression. The present study was designed to evaluate the effect of PMX-DHP on recovery from sepsis-related immunodeficiency.

Methods Patients with septic shock who were treated with PMX-DHP were enrolled in this study. Study 1: (1) numbers of peripheral lymphocytes and CD4+ T cells, especially regulatory T cells (Tregs), and serum cytokine levels were examined to evaluate the effects of PMX-DHP in septic shock patients. (2) Peripheral blood mononuclear cells (PBMCs) from these patients were examined to evaluate inflammatory cytokine production before and after PMX-DHP. The obtained PBMCs were stimulated with interleukin (IL)-2 and IL-12, anti-CD3 antibody, or lipopolysaccharide for 24 hours, and tumor necrosis factor alpha and interferon-gamma (IFNγ) production in the culture supernatants was measured using enzyme-linked immunosorbent assay. Study 2: whole blood from patients with sepsis was incubated for 2 hours with a polymyxin B-immobilized filter for small animals (cut into small pieces) (PMX group), or treated with 200 μg polymyxin B for 2 hours (PLB group), or left untreated (sepsis group). IFNγ production by PBMCs was compared among the three groups.

Results Study 1: (1) the number of CD4+ T cells was lower and the percentage of Tregs among CD4+ T cells was higher in septic shock patients compared with those without shock. A significant increase in the number of CD4+ T cells, a significant decrease in the percentage of Tregs in the CD4+ T-cell population, and a significant decrease in serum IL-10 levels were observed 24 hours after PMX-DHP in septic shock patients who survived compared with those who did not. (2) IFNγ production by PBMCs was significantly lower in patients with sepsis than in healthy volunteers. IFNγ production by IL-2-stimulated and IL-12-stimulated PBMCs significantly increased after PMX-DHP therapy. Study 2: IFNγ production by PBMCs in patients with sepsis increased significantly in the PMX and PLB groups compared with the sepsis group.

Conclusion PMX-DHP directly decreased the number and percentage of Tregs among peripheral blood circulating CD4+ T cells in patients with septic shock. PMX-DHP improved IFNγ production by natural killer (NK)/NKT cells in patients with septic shock. Therefore, PMX-DHP could improve sepsis-related immunosuppression.
The Actigraph device is a sleep watch that has been shown to have accuracy equivalent to polysomnography and has previously been used during critical illness to demonstrate sleep disruption [3]. Our objective was to assess patients' long-term sleep quality using the Actigraph device.

Methods Study patients were selected from a 24-bed multidisciplinary ICU. Thirteen patients who were ≥18 years old, stayed longer than 4 days in the ICU and did not have an acute brain injury were followed up at 2 months post hospital discharge. The Actigraph device was given to patients to take home and worn for 72 hours. Previously validated algorithms were used to analyze sleep and wake cycles [4]. Additionally, patients completed the Pittsburgh Sleep Quality Index (PSQI) as a measure of subjective sleep quality.

Results Sixty-two per cent of patients at 2 months post hospital discharge reported poor sleep quality as per the PSQI. The Actigraph results showed that patients' average total sleep time was 6.15 hours, with a sleep efficiency of 78%. The mean time to fall asleep was 12 minutes. Patients had an average of 11 awakenings per night and were awake for an average of 7 minutes during the awakenings. No associations were found between patients' perceived sleep quality and total sleep, sleep efficiency or sleep disruptions. Patients' severity of illness, as measured by the APACHE II score, was statistically associated with lower total sleep time (β = −12.6, P = 0.019), reduced sleep efficiency (β = −1.18, P = 0.042) and a higher number of sleep disruptions (β = 0.64, P = 0.023). The number of days ventilated and ICU and hospital length of stay were not statistically associated with the Actigraph sleep parameters.

Conclusion Survivors of critical illness have high levels of sleep dysfunction as measured by actigraphy. Patients' severity of illness while critically ill appears to increase the level of long-term sleep dysfunction experienced. There is discordance between objective and subjective measures of sleep quality, as shown previously [5]. Objective measures of sleep quality are needed in a larger number of patients to confirm these findings.

Clonidine may be used off-label to augment sedation and delirium treatment. To our knowledge there have been no publications evaluating the efficacy and safety of clonidine in ventilated critically ill adults. A survey in German ICUs showed that clonidine was used for sedation in 62% of units [2]. We undertook an enquiry to investigate the off-label use of clonidine in Dutch ICUs.

Propofol infusion syndrome (PRIS) is a life-threatening complication [1]. An audit done in 2011 highlighted that lipid profile and electrocardiograms (ECGs) were rarely monitored. We recommended regular monitoring of these parameters when propofol sedation is used for over 3 days, and that propofol-sparing agents be considered in these patients at risk of developing PRIS.

Methods In patients who required propofol sedation for 4 days or more, we prospectively monitored: the frequency of performing a lipid profile and 12-lead ECG; and the frequency of co-administration of a propofol-sparing agent.

Providers rated sedation satisfaction, cough, and the ability to perform the intended procedure using a 100 mm visual analog scale at the end of the procedure, blind to BIS (0 unsatisfied to 100 satisfied). Patients were surveyed at 1 hour and 24 hours regarding overall sedation, symptoms, and procedure recall (unpleasant recall 1, no recall 4).
Group differences were considered statistically significant at P <0.05.

Results Twenty-six procedures were monitored, 20 with conscious sedation (CS) and six endobronchial ultrasound procedures with deep sedation (DS). There was no difference with respect to age or gender. The mean doses of midazolam and fentanyl were 5 mg and 85 μg, respectively. BIS values were lower at all predefined points of the procedure for DS cases versus CS. Physicians were more satisfied with sedation and the lack of cough with DS, but there was no significant difference in patient satisfaction between the two groups with regard to overall sedation, procedure-related symptoms or willingness to undergo repeat bronchoscopy. Patients with no recall had lower nadir BIS scores (46 vs. 71, P = 0.03) and lower scores at procedure end (76 vs. 94, P = 0.04) compared with those with any recall. There was no difference in the doses of midazolam or fentanyl in CS cases despite statistically significant differences in patient recall and BIS scores. Junior fellows reported greater satisfaction with sedation, were less bothered by cough and more often felt able to perform the intended procedure compared with senior fellows.

Conclusion Deep sedation resulted in greater physician satisfaction with procedural conditions as well as lower BIS scores, but no significant difference in patient satisfaction compared with conscious sedation. Patients with no recall of the procedure had lower BIS scores at the nadir and end of the procedure. BIS may be a novel tool to monitor the depth of procedural sedation, allowing proceduralists to better navigate the narrow window between adequate sedation and dangerous oversedation.

We used the Withdrawal Assessment Tool-1 (WAT-1) [1] to evaluate children during weaning from analgesics and sedatives. A patient is diagnosed with withdrawal syndrome when the score is 3 or more. We compared subjects who ever had a score over 3 with those with lower scores, and assessed the risk factors for and outcome of withdrawal syndrome between the two groups.

We identified at least 25 epidural hematomas reported to date from the following countries: Belgium (n = 1), Brazil (n = 1), France (n = 1), Germany (n = 2), India (n = 2), Italy (n = 1), Japan (n = 2), Korea (n = 1), Malaysia (n = 1), Norway (n = 2), Russia (n = 3), Sweden (n = 1), the UK (n = 3), and the USA (n = 4). Even if, from the public health point of view, the benefits seem to encourage the use of epidural analgesia in cardiac surgery, with a possible reduction in perioperative mortality, this topic merits further investigation, and the decision to insert an epidural catheter should be discussed with the patient considering both local experience and the risk of legal dispute in case of medical complications.

Introduction It has been established that early enteral nutrition in critically ill patients improves overall outcome and mortality. In our unit, feeding protocols based on the ESPEN recommendations were established and have been implemented for the last 2 years. The purpose of this study was to evaluate the compliance of our septic patients' nutritional management with these feeding protocols.

Methods A prospective study was conducted in a 24-bed mixed ICU over a period of 18 months. Eighty-three patients ≥18 years were included in the study. All patients were dependent on mechanical ventilation and met the CCM criteria for sepsis upon admission to the ICU. APACHE II score, SOFA score, weight, BMI and nutritional status were recorded.
Patients were initiated on enteral feeding based on the established feeding protocol within 48 hours of admission. The feeding status of the patient was recorded on the start day (D0), day 3 (D3) and day 7 (D7). Factors affecting the feeding process and its progression were also recorded.

Results The mean patient age was 71.4 ± 12.2 years. Length of stay (LOS) in the ICU was 9 to 21 days. Based on BMI, 18% of the patients were malnourished upon admission. APACHE II was 26 ± 7.8 and SOFA was 9.2 ± 4.6. The mortality rate was 42.5%. Enteral nutrition started early in 64 (77.1%) of the patients (D0); on day 3 (D3) 29 (45.31%) patients met their caloric goals and on day 7 (D7) only 18 (28.1%) patients achieved their caloric goals. Discontinuation of enteral feeding was mainly due to procedures, whereas a late start and/or decreased hourly intake were due to GI complications, GI intolerance, excessive diarrhoea and hemodynamic instability. There was no association between compliance with the feeding protocol and LOS, nutritional status, severity or disease progression.

Conclusion Although the initiation of early enteral feeding seems adequate for a good number of septic patients on D0, it remains inadequate for a significant percentage of those patients on D3 and is even worse on D7. Caloric goal achievement was better on D3 but very suboptimal on D7. There was no association, however, between nutritional status and compliance with the feeding protocols. It is therefore mandatory to follow the nutritional therapy of the septic patient daily and not rely only on the feeding protocols.

Conclusion The level of vitamin C was markedly decreased. Replacement of vitamin C should be considered for homeless people who visit the emergency department after alcohol ingestion.

Introduction Phosphate is essential for cell and bone function [1]. In critical illness, hypophosphatemia is common and practice is often to correct even mild derangement [2]. Phosphate supplementation has significant cost and risk, including hypotension and hypocalcaemia. We investigated whether changing the frequency of routine serum phosphate testing had an effect on the detected incidence of abnormal plasma levels and on phosphate prescription.

Methods This was a service improvement project using observational, anonymous data. We collected data on serum phosphate levels in a 33-bed ITU over two 6-month periods, before and after introduction of a new testing regime (phases 1 and 2).

Conclusion Reducing testing from daily to three times weekly was not associated with a significant change in mean phosphate levels, nor with the detection of abnormally high/low phosphate levels. Daily testing, however, was associated with higher rates of phosphate prescription. It is known that there is significant diurnal variation in serum phosphate [2], and we speculate that mild hypophosphataemia self-corrects without intervention. Treating mild hypophosphatemia may therefore not be indicated.

Introduction Direct ion-selective electrode measurement without dilution is the most effective method for determining the concentration of the ionized fraction of sodium [1]. We tested the hypothesis that the difference between indirect and direct sodium assays would be related to the plasma albumin concentration or the total protein concentration.
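A minimal sketch of the correlation analysis this hypothesis implies, using hypothetical paired measurements (not the study data):

```python
# Hypothetical illustration: correlate the indirect-minus-direct sodium
# difference with plasma albumin, as the hypothesis above suggests.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
albumin = rng.normal(28, 6, 101)  # g/l, hypothetical, 101 paired samples
# Assumed weak inverse relationship plus noise (lab minus ABG difference, mmol/l)
na_diff = 1.5 - 0.02 * (albumin - 28) + rng.normal(0, 1.2, 101)

r, p = pearsonr(albumin, na_diff)
print(f"r = {r:.2f}, r^2 = {r**2:.2f}, P = {p:.3f}")
```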
Methods A retrospective observational study was conducted in which plasma sodium concentrations from 101 paired venous and arterial samples from patients admitted to the ICU were analyzed on arterial blood gas (ABG) analyzers (direct ion-selective electrode) and on the central laboratory auto-analyzers (indirect ion-selective electrode), respectively. A paired t test was performed comparing the central laboratory and ABG measurements. Correlation and regression analyses were performed between total protein concentration, albumin and the differences between the central laboratory and ABG sodium assays.

Results The central laboratory sodium measurement was, on average, 1.46 mmol/l higher than the ICU assay (1.18 to 1.74 mmol/l, P <0.001). Bland-Altman analysis of the central laboratory result minus the ICU sodium measurement gave limits of agreement of 1.3 to −4.2 mmol/l. The correlations between the assay differences and total protein concentration and albumin were, respectively, r = 0.24 (P = 0.01) and r = 0.20 (P = 0.04). The difference in plasma sodium concentration between the assays increased as the plasma albumin or total protein concentration decreased (respectively: r² = 0.04 and r² = 0.06).

Conclusion The difference between indirect and direct sodium assays was found to be statistically related to the plasma albumin concentration and the total protein concentration. Although the relationship was weak, the total protein concentration should be considered when measuring sodium by indirect ion-selective electrode.

Introduction Water-electrolyte disturbances are among the most common complications of acute brain injury of various origins, threatening the life of the patient and requiring timely correction. In this work we studied the structure of water-electrolyte complications in patients with acute brain injury in neurological intensive care.

Methods We analyzed 259 cases of water-electrolyte disturbance that developed in patients treated in the Department of Intensive Care of the Russian Polenov Neurosurgical Institute from 2001 to 2012. Patients were between 16 and 55 years old. A total of 142 patients were operated on for brain tumor, 72 of them of basal-supratentorial localization; eight had severe brain trauma; 62 had hemorrhagic stroke; and one had herpes encephalitis. We excluded from this study patients with heart or renal failure receiving diuretics. We measured BP, HR, CVP, hourly and daily urine output, plasma K and Na levels and brain natriuretic peptide (BNP) one to four times a day, and K and Na levels in spot and 24-hour urine samples. All patients were receiving dexamethasone at a dose of between 8 and 32 mg/day as anti-edema therapy, and thus levels of ACTH and cortisol were not investigated.

Underlying this is an ultradian rhythm of discrete pulses [1], a result of the feedforward:feedback interactions between cortisol and ACTH [2]. These pulses are critical for normal function; pulsatile and constant infusions yield different transcriptional responses [3], and patients on optimal (nonpulsatile) glucocorticoid replacement have twice the age-related mortality of the general population [4]. We have now characterised the ultradian rhythm and pituitary-adrenal interaction of patients undergoing coronary artery bypass grafting (CABG).

Methods Twenty male patients presenting for elective CABG (on-pump and off-pump) were recruited.
Blood samples were taken for 24 hours from placement of the first venous access. Cortisol was sampled every 10 minutes, ACTH was sampled every hour and cortisol binding globulin (CBG) was sampled at baseline, at the end of the operation and at the end of the 24-hour period.

Results Cortisol and ACTH were pulsatile throughout the perioperative period and the cortisol-ACTH interaction persisted (Figure 1). The sensitivity of this interaction (calculated as the ratio of cortisol to ACTH pulse amplitude) changed at about 8 hours post surgery, such that the adrenal sensitivity to ACTH increased.

Conclusion Both cortisol and ACTH remain pulsatile during and after cardiac surgery and the pituitary-adrenal interaction persists, although the sensitivity of the adrenal glands changes throughout the perioperative period. Our study shows that endogenous glucocorticoid levels reach very high oscillating levels following cardiac surgery, which not only invalidates the interpretation of point measures of adrenal function to diagnose adrenal insufficiency but also demonstrates that constant infusions of hydrocortisone are unphysiological.

Introduction Many if not most critically ill patients are treated with insulin during their stay in the ICU [1]. Intensive monitoring of the blood glucose level is a prerequisite for efficient and safe insulin titration in these patients [2]. Current continuous glucose measurement techniques rely on subcutaneous glucose measurements [3] or measurements in blood [4]. We hypothesized that changes in volatile organic compound (VOC) concentrations in exhaled breath reflect changes in the blood glucose level. Changes in VOC concentrations can be analyzed continuously using a so-called electronic nose (eNose) [5]. Our aim was to investigate exhaled breath analysis to predict changes in glucose levels in intubated ICU patients.

Methods Exhaled breath was analyzed in 15 intubated ICU patients who were monitored with a subcutaneous CGM device. eNose results were compared with subcutaneous glucose measurements and linear regression models were built, including subject-specific models and whole-sample models. The models were validated using temporal validation, training each model on the first 75% of measurements and prospectively testing on the last 25% (this scheme is sketched below). Performance of the models was measured using the R² value, Clarke error grids (CEG) and rate-error grid analysis (R-EGA).

Results Changes in VOC concentrations were associated with changes in subcutaneous glucose levels. R² performance had a mean value of 0.67 (0.34 to 0.98) for subject-specific models, and a mean value of 0.70 (0.52 to 0.96) for the whole-sample model. However, when externally validating the models, the predictive performance dropped to a mean R² of 0.19 (0.00 to 0.70) for subject-specific models, and 0.04 for the whole-sample model. Point accuracy in the CEG was mostly good, with >99% in zones A and B; trend accuracy, as visualized with R-EGA, was low.

Conclusion Exhaled breath prediction of glucose levels seems promising. However, the performance of the current models is too low for use in daily practice.

Introduction Regardless of the ongoing debate on optimum target ranges, glycaemia control (GC) remains an important therapeutic goal in critically ill patients. Dozens of different insulin protocols with different complexity, effectiveness, blood glucose (BG) variability and safety have been developed for ICUs.
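Returning to the eNose models above: the temporal validation scheme (train on the first 75% of measurements, test on the last 25%) can be sketched as follows, with hypothetical stand-in data for the VOC features and glucose values:

```python
# Hypothetical sketch of the 75/25 temporal validation of a linear glucose model.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
voc = rng.normal(0, 1, (200, 5))  # 5 hypothetical VOC channels over 200 samples
glucose = voc @ np.array([1.0, -0.5, 0.3, 0.0, 0.2]) + rng.normal(0, 0.8, 200) + 8

split = int(0.75 * len(glucose))  # temporal split: no shuffling, test on later data
model = LinearRegression().fit(voc[:split], glucose[:split])
print("validation R^2:", r2_score(glucose[split:], model.predict(voc[split:])))
```

Testing only on later, unseen measurements is what exposes the drop in predictive performance reported above.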
Although comparison of existing protocols is difficult due to significant differences in processes and outcome measures, computerized clinical decision support systems generally achieve better GC, with consistently lower hypoglycaemia rates, than paper-based protocols [1]. The enhanced Model Predictive Control (eMPC) algorithm, developed by the CLINICIP group, is an effective, clinically proven protocol which has been successfully tested at multiple institutions on medical and surgical patients with different nutritional protocols [2,3]. The eMPC models the behaviour of glucose and insulin in ICU patients, with a variable sampling interval based on the accuracy of the BG prediction. It has been integrated into the B.Braun Space GlucoseControl (SGC) system, which allows direct data communication between the pumps and the Space Control unit with the incorporated eMPC algorithm. Although SGC is already in clinical use in dozens of ICUs worldwide, there are only a few published experiences with its use [4]. The primary objective of this multicentre European noninterventional study was to evaluate the performance (efficiency) of SGC under routine conditions in adult ICU patients requiring BG control. The primary endpoint was the percentage of time within the target range; secondary outcome measures were the frequency of hypoglycaemic episodes and BG measurement intervals.

Methods Patients in this trial were assigned to the target range 4.4 to 8.3 mmol/l. Nutritional management (enteral, parenteral or both) was carried out at the discretion of each centre.

Results Seventeen centres from nine European countries included a total of 508 patients. During the study a total of 29,575 BG values were entered into the SGC systems and the same number of recommendations were rendered. The mean time-in-target was 77.5 ± 20.9%. The mean proposed next measurement interval was 2.0 ± 0.5 hours. Only four episodes of hypoglycaemia <2.2 mmol/l occurred (0.01% of measurements).

Conclusion SGC is a safe and very efficient system to control BG in ICU patients.

Introduction Glycemic control in the ICU has been shown to reduce morbidity, mortality and length of stay. However, current methods of blood glucose (BG) monitoring are invasive, intermittent and labor-intensive. Continuous glucose monitoring (CGM) has the potential to improve the safety and efficacy of BG control. The performance of a noninvasive, transdermal CGM system (Symphony CGM; Echo Therapeutics, Philadelphia, PA, USA) was evaluated in post-surgical ICU patients.

Methods Adult surgical patients with a planned ICU admission of ≥24 hours at four medical centers were consented. Following admission to the ICU, the skin of an upper arm was cleaned and a 6 mm diameter site was prepared with controlled micro-abrasion using the Symphony CGM system. A transdermal CGM sensor containing glucose oxidase was applied. Following a 1-hour warm-up period, a calibration was performed. Blood samples were obtained from a radial artery catheter approximately every hour and centrifuged to plasma, and glucose was measured using a YSI 2300 STAT Plus Glucose Analyzer (reference BG). A maximum of 30 reference BG samples were collected for each patient. Samples were collected as frequently as every 15 minutes for trend analysis. The CGM was prospectively calibrated every 4 hours. All treatment decisions were based on BG alone. Safety was assessed by visual inspection of the site using a dermatological scale following sensor removal. A study was defined as evaluable for CGM sessions >16 hours.
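The accuracy metric reported in the results that follow, the mean absolute relative difference (MARD) between CGM readings and reference BG, reduces to a one-line computation; a sketch with hypothetical values:

```python
# Hypothetical MARD computation; values are illustrative, not study data.
import numpy as np

reference = np.array([110.0, 145.0, 92.0, 180.0, 128.0])   # mg/dl, reference (YSI) BG
cgm       = np.array([118.0, 132.0, 101.0, 171.0, 140.0])  # mg/dl, paired CGM readings

mard = np.mean(np.abs(cgm - reference) / reference) * 100
print(f"MARD = {mard:.1f}%")
```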
Results Thirty-two subjects completed the study. Additional subjects were not considered evaluable due to early discharge from the ICU, failure or early removal of the radial artery catheter, or administration of intravenous acetaminophen. The study cohort was 19% female and 28% diabetic, 56% had undergone cardiac surgery, and the mean age was 65 ± 13 years. The overall mean absolute relative difference between CGM and reference BG was 12.5%. Continuous glucose error-grid analysis, which assesses point and trend accuracy, showed 98.2% of readings in the A zone (clinically accurate) and 1.2% in the B zone (benign errors). Glucose values ranged from 49 to 324 mg/dl. No device-related or study-related adverse events were reported.

Conclusion The Symphony CGM system demonstrated clinically relevant accuracy and excellent safety in a variety of patients and ICU environments. Future studies are needed to determine whether Symphony CGM can be used to direct therapy and improve BG control in this patient population.

Introduction The aim of this study was to determine the impact of preadmission or first 24-hour blood glucose (BG) measurements in UK ICUs on mortality.

Methods The Intensive Care National Audit & Research Centre case-mix programme database on adult admissions to general, neuroscience and cardiothoracic critical care units was used for the analysis. Within the database, the highest and lowest blood glucose (BG) values measured during the first 24 hours from admission were recorded. The BG control range was defined as ≥4.0 to ≤9.9 mmol/l. Other BG levels were defined as: very low, ≤2.2 mmol/l (≤40 mg/dl); low, >2.2 to ≤3.9 mmol/l (40 to 70 mg/dl); high, ≥10.0 to <11.1 mmol/l (180 to 200 mg/dl); and very high, ≥11.1 mmol/l (200 mg/dl).

Results There was an increased incidence of mortality in those patients with at least one BG measurement below 3.9 mmol/l (70 mg/dl) compared with those without (Table 1). There was a link between BG levels and length of stay (LoS) for surviving patients, with the longest hospital stays (critical care and total hospital) experienced by those with BG levels below 2.2 mmol/l (40 mg/dl).

Conclusion There is a strong association between BG levels during admission and mortality and LoS outcomes. Although it is not possible to establish causation from our dataset, we present results from the largest single dataset of critical care unit patients [1,2].

Introduction While there is ongoing discussion of the optimal range for glycemic control in hospital intensive care, recent publications from Mackenzie and colleagues [1] and Krinsley [2-4] suggest not only that mean BG should be considered, but also that glucose variability and complexity may be equally important. This has increased the need for continuous systems which can provide early warnings of hypoglycemia and effectively measure variability. GlySure Ltd has developed an intravascular glucose monitoring system to simplify the application of hospital protocols for tight glycemic control (TGC) at the point of care. Experience with the original research-based instrumentation [5] has now been incorporated into a combined pre-production monitor and autocalibration unit. We have now completed a 34-patient trial using this device and present the data collected from this study.

Methods The study used GlySure sterile, single-use sensors and a 5-lumen 9.5-Fr CVC device, allowing the fluorescence-based optical sensor to be placed into the patient's right internal jugular vein.
The screen data were blinded to the bedside staff. Data from the monitor were later compared with sample measurements from the Yellow Springs Instruments (YSI) glucose analyzer. Accuracy was measured using the mean absolute relative difference (MARD).

Results The device met the primary safety and effectiveness endpoints of the trial. The 456 sample values recorded by the monitor based on 8-hour calibrations were correlated with samples taken from the YSI, and the MARD for the study was 9.40%. The analysis showed that 89.23% of the data fell within the A zone of the Clarke error grid, with the rest falling within the B zone.

Conclusion The results demonstrate a good correlation with the accepted standard of blood glucose determination in ICU practice. Early detection of glycemic excursions can provide carers with the opportunity for early intervention and thus help achieve the elusive target of TGC around the chosen target range.

First clinical study data from therapeutic use of a novel continuous …

Introduction The aim of this study was to determine the safety and efficacy of treating patients using a novel intravenous continuous glucose monitoring (CGM) system (GlucoClear™; Edwards Lifesciences). The practical benefit of controlling blood glucose in the critically ill remains contentious, largely because of the lack of tools to adequately measure, and therefore manage, levels in real time. A system able to provide instant, constantly calibrated, accurate values would be a major bonus to ICU care, and we report here the first clinical use of a novel CGM system directly used to manage postoperative glycaemia.

Methods All consecutive consenting adult patients undergoing cardiac surgery involving cardiopulmonary bypass and requiring postoperative insulin (>98% of patients) were enrolled in the study, with a target enrolment of 100. Blood glucose was measured via a dedicated peripheral intravenous catheter, with values reported every 5 minutes. The primary outcome was the number of data points in the target glycaemic control range (4.4 to 8.0 mmol/l) using a dynamic insulin protocol. Secondary endpoints included mean glucose levels, time in range and the number of hypoglycaemic and hyperglycaemic episodes.

Results For the first 45 patients, the mean age was 67.7 years (male 52.8%), 54% had undergone valve surgery with or without CABG, and the majority (67%) had no history of diabetes. The CGM sensor was typically sited in the forearm or hand (90.5%) and was resited on 7.5% of occasions. Median monitoring time in the ICU was 28.4 hours. The mean glycaemic value was 6.8 mmol/l. Performance notably improved with experience, and the study also highlighted that even previously validated dynamic algorithms will need to be refined to maximise the benefit of CGM. Overall, the evidence from the first applied clinical use of this novel CGM showed that safe, tight glycaemic control can be achieved. Further investigations are required to demonstrate a reduction in morbidity and mortality using CGM, and we plan to embark on further studies to address this.

Impact of corticosteroid administration in septic shock on glycemic variability. L Mirea, R Ungureanu, D Pavelescu, IC Grintescu, C Dumitrache, D Mirea, I Grintescu

Introduction The purpose of this study was to assess the relation between glycemic control and the severity of sepsis in a cohort of patients with vasopressor-dependent septic shock treated with corticosteroids [1,2].
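The study below summarises glycemic variability as the within-patient standard deviation (SD) of BG values, with 20 mg/dl as the threshold of interest. A minimal sketch with hypothetical series:

```python
# Hypothetical per-patient BG variability index: sample SD of BG values (mg/dl).
import numpy as np

patients = {  # hypothetical BG series, mg/dl
    "pt01": [140, 162, 118, 155, 171],
    "pt02": [128, 131, 126, 133, 130],
}
for pid, bg in patients.items():
    sd = np.std(bg, ddof=1)  # sample standard deviation
    print(f"{pid}: SD = {sd:.1f} mg/dl -> {'high' if sd > 20 else 'low'} variability")
```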
Methods In a prospective, controlled study, 134 patients with septic shock were randomized into three study groups: group A (n = 43), 200 mg/day hydrocortisone hemisuccinate in four daily doses; group B (n = 47), the same dose of hydrocortisone hemisuccinate administered continuously; group C (n = 44), no hydrocortisone hemisuccinate. Patients with diabetes mellitus were excluded. The duration of hydrocortisone treatment was a maximum of 7 days. The target blood glucose (BG) level was below 180 mg/dl. BG values were analyzed by calculating mean daily values, the standard deviation (SD) of BG values as an index of glycemic variability, and insulin doses. The local ethics committee approved the study.

Results There were no differences between the three groups at the beginning of the study regarding demographic data and clinical characteristics, including BG value. BG levels were strongly correlated with the severity of septic shock as estimated by the APACHE II score (r = +0.241; P = 0.005) or Simplified Acute Physiology Score II (r = 0.280; P = 0.001) (Pearson correlation). The risk of death was significantly increased if the SD of BG was more than 20 mg/dl (67.7% vs. 20.8%, P <0.001). A total of 94.4% of deceased patients in group A registered an SD of BG of more than 20 mg/dl, versus 89.5% in group B and 40% in group C (P = 0.006). In total, 53.5% of patients in group A needed insulin therapy versus 25.5% in group B and 27.3% in group C. The insulin dose was 30.28 ± 6.65 IU/day in group A, 37.85 ± 11.95 IU/day in group B, and 14.28 ± 5.76 IU/day in group C (P >0.05).

Conclusion BG variability is more strongly associated with mortality than the BG mean daily value or insulin dose. SD levels above 20 mg/dl were associated with a significantly higher mortality rate relative to SD levels below 20 mg/dl.

Introduction Tuberculous meningitis (TBM) is the least common extrapulmonary manifestation of tuberculosis. Although the UK incidence of TBM is relatively low, it carries high mortality and morbidity [1]. Neurological deterioration continues to be an important reason for ICU admission [2]. Little is known about the outcomes of TBM patients requiring ICU admission. Our aim was to evaluate patient demographics, TBM clinical data and the necessity for organ support, and whether these can be associated with outcome.

Methods A retrospective study at a tertiary centre of patients with TBM admitted to our ICU between 2000 and 2012. Data were retrieved on demographics, microbiology, radiology and pharmacological findings, and on the type and level of organ support. APACHE II and SOFA scores were calculated. Patients were stratified into two groups: CSF PCR+ve and CSF PCR-ve.

Results Eight patients (six male:two female) were evaluated. The total mean age was 43.9 ± 8.09 years. A reduction in GCS was the main reason for ICU admission. All patients received ≥3 anti-TBM drugs and steroids. Two CSF PCR+ve patients had primary drug resistance to isoniazid. There was also a longer mean delay in the onset of anti-TBM treatment in CSF PCR+ve patients (4 ± 2.88 days). Higher APACHE II and SOFA scores on admission (means 23 and 8.5, respectively) were associated with a positive CSF PCR result. Increased requirements for mechanical ventilation (100%), tracheostomy (50%), inotropes (75%), neurosurgical intervention (predominantly CSF drainage) (100%) and enteral feeding (50%) were all significant for this group. Mortality and long-term neurological morbidity were substantially higher for PCR+ve patients (75% and 25%).
In contrast, the majority of culture-negative patients survived (75%) and experienced good recoveries at follow-up. Overall mortality was 50%. Conclusion This is the second documented case analysis of TBM and ICU admission in adult patients [1, 2]. The findings show that poor outcomes are associated with positive CSF PCR results. Factors linked to poor outcomes include delays in initiating anti-TBM treatment, neurosurgical interventions, and an increased requirement for multiorgan support. Earlier drug susceptibility testing would be preferable, particularly for patients with positive CSF PCR results. Given the potential severity of TBM, a high index of clinical suspicion remains critical to optimizing outcomes. Introduction The aim was to evaluate the role of intrathecal lactate as an early predictor of spinal cord injury during thoracoabdominal aortic aneurysmectomy. Forty-four consecutive patients were scheduled to undergo thoracoabdominal aortic aneurysmectomy. Two patients had a type B dissecting aneurysm; the other 42 patients had degenerative aneurysms. Methods During surgery, samples of cerebrospinal fluid and arterial blood were simultaneously withdrawn to evaluate lactate concentration. Samples were collected at five fixed times during and after surgery: T1 (beginning of the intervention), T2 (15 minutes after aortic cross-clamping), T3 (just before unclamping), T4 (end of surgery), and T5 (4 hours after the end of surgery). Results Mean lactate levels in cerebrospinal fluid rose steadily from the beginning of the intervention until after surgery (T1 = 1.83 mmol/l, T2 = 2.10 mmol/l, T3 = 2.72 mmol/l, T4 = 3.70 mmol/l, T5 = 4.31 mmol/l). Seven patients developed spinal cord injury; two of them had delayed injury occurring 24 hours after the end of surgery; the remaining five had early onset. In this group of five patients, preoperative cerebrospinal fluid lactate levels were significantly higher (P = 0.04) than those of the other 40 patients (2.12 ± 0.35 vs. 1.79 ± 0.29 mmol/l). Conclusion The preoperative cerebrospinal lactate concentration is elevated in patients who develop early-onset spinal cord injury after thoracoabdominal aortic aneurysmectomy. This may allow better stratification of these patients, suggesting a more aggressive strategy of spinal cord function preservation and possibly guaranteeing them a better outcome. Introduction Thousands of new spinal cord injuries occur each year in North America [1], with over 50% occurring at the cervical level [2]. Cervical spinal cord injuries (CSI) carry a particular risk of mechanical ventilation (MV), pulmonary complications and increased length of hospital stay. A few small cohort studies have looked at predictors of MV [3-6], and to our knowledge there are no studies addressing factors associated with prolonged MV. The purpose of this study was to compare known clinical predictors of MV and to determine predictors of prolonged MV. Methods We conducted a retrospective chart review of consecutive CSI patients admitted between 1 January 2005 and 1 March 2009. We recorded data related to the injury, the duration of MV, respiratory complications, ICU and hospital length of stay, and patient outcomes. A review of the literature identified known predictors (ASIA level, ISS, level of injury, and so forth). Univariate and multivariate logistic regression were used to identify predictors of MV and prolonged MV. Results Of the 208 patients, 82% were male and the mean age was 51 years. Hospital mortality was 8.7%.
Main causes of injury were motor vehicle accidents (39.7%) and falls (43.2%). Injuries below the C4 level represented 51.5% of the population. A complete loss of motor function (ASIA level A and B) was found in 34.9% of patients. The mean and median ISS was 20.7. In total, 78 patients required MV (37.5%) and 30 patients required prolonged MV (14.4%). After multivariate analysis, four predictors of MV were identified: pneumonia (OR = 52.83), ISS score >22 (OR = 4.09), age (OR = 1.02) and level C1 to C4 (OR = 2.34); and two predictors of prolonged MV: ASIA score A and B (OR = 5.57) and pneumonia (OR = 8.76). Conclusion In our study ISS, cervical level and age were associated with MV but not with the need for prolonged MV, whereas pneumonia was an independent risk factor for both. Pneumonia is a potentially preventable risk factor to which specific strategies can be applied to improve patient outcomes. …comparing the microcirculation of patients diagnosed as brain dead with that of healthy volunteers. However, conjunctival flow can be present even when cerebral flow is completely absent, making it difficult to use conjunctival flow as a substitute for brain flow. Introduction Previously we developed a model to predict increased ICP, 30 minutes in advance, using the dynamic characteristics of routinely monitored minute-by-minute ICP and mean arterial blood pressure (MAP) signals [1]. The model was developed using data from the Brain-IT database [2]. Here we present external validation results of this model on a more recent cohort of adult TBI patients from the AVERT-IT project [3]. Methods A retrospective analysis of physiological data collected at minute resolution from 43 adult patients from the AVERT-IT project. A total of 67 episodes of ICP above 30 mmHg lasting at least 10 minutes were identified in this cohort. Four-hour time series of ICP and MAP anteceding each episode by 30 minutes were analyzed. Additional time series not preceding elevated ICP episodes were used for validation. Results Table 1 summarizes the main findings. Performance of the model in the original study [1] is reproduced in the first column. The model retains identical performance for all criteria in the cohort of more recent TBI patients of the AVERT-IT database. Conclusion The obtained external validation results, on a previously unseen cohort of adult TBI patients, confirm the robustness of the model in accurately predicting future increased ICP events 30 minutes in advance. The general applicability of the model is probably due to its sparseness, as it uses only two routinely monitored signals as input, namely ICP and MAP. These results are a large step forward in our work toward an early warning system for elevated ICP that can be used worldwide. Introduction In patients with aneurysmal subarachnoid hemorrhage (SAH), hypervolemic therapy may result in fluid overload, which may be associated with adverse clinical outcomes [1, 2]. We hypothesized that a goal-directed transpulmonary thermodilution (TPT) monitoring protocol aiming for normovolemia might decrease fluid intake while sustaining adequate volume status in poor-grade SAH patients. Methods Following the introduction of the hemodynamic protocol in 2011, 26 consecutive patients with SAH were included until 2013. Using TPT (PiCCO; Pulsion), cardiac output (CO), global end-diastolic volume index (GEDVI) and extravascular lung water index (EVLWI) were determined. Fluid administration was targeted at fluid unresponsiveness.
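Returning to the logistic-regression analysis of MV predictors above, the sketch below shows how such a multivariable model can be fitted and its odds ratios extracted. The data frame and variable names are hypothetical stand-ins, not the study dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 208
# Synthetic cohort loosely mimicking the variables reported above.
df = pd.DataFrame({
    "prolonged_mv": rng.integers(0, 2, n),
    "pneumonia": rng.integers(0, 2, n),
    "asia_ab": rng.integers(0, 2, n),   # complete motor loss (ASIA A/B)
    "iss_gt22": rng.integers(0, 2, n),  # ISS score > 22
    "age": rng.normal(51, 15, n),
})

# Multivariable logistic regression; exponentiated coefficients are ORs.
model = smf.logit("prolonged_mv ~ pneumonia + asia_ab + iss_gt22 + age",
                  data=df).fit(disp=False)
print(np.exp(model.params))  # odds ratios per predictor
```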
Indications for starting the protocol were: hypotension (in spite of fluids), pulmonary edema or cardiac stunning, daily fluid balance ≤-1 l, and delayed cerebral ischemia (DCI) with progressive symptoms. Data were collected on fluid intake and output up to 3 days before and 3 days after the start of TPT. We assessed the course of fluid input and output and hemodynamic parameters before and after the start of the protocol with the generalized estimating equation. Results Mean age was 55 ± 16 years, and median Glasgow Coma Scale on admission was 8 (IQR 6 to 13). TPT was started at a median of 1 day after ICU admission (IQR 0 to 4). DCI developed in 70% and the in-hospital death rate was 45%. Compared with the days preceding the protocol (days -3 to -1), TPT (days 0 to 3) was associated with decreased fluid intake (with day 3 as reference; day -1: +1.14 ± 0.31 l, P <0.001; day 0: +0.68 ± 0.29 l, P = 0.019; day 1: +0.73 ± 0.31 l, P = 0.02), increased fluid output (day -2: -0.91 ± 0.46 l, P <0.05 compared with day 3), and consequently a strong decrease in fluid balance (day -3: +2.10 ± 0.56 l, P <0.001; day -2: +2.66 ± 1.14 l, P = 0.02; day -1: +1.41 ± 0.37 l, P <0.001). The decreased fluid intake and fluid balances did not result in decreased CO, GEDVI or EVLWI on days 1 to 3 compared with day 0 (mean PCCI 3.5 l/minute/m², mean GEDVI 761 ml/m², mean EVLWI 10.8 ml/kg). Conclusion Our data suggest that in poor-grade SAH patients goal-directed fluid management with TPT is feasible and can decrease fluid intake and increase diuresis without adverse effects on cardiac output or preload parameters. Future research should assess the effect of such a protocol on clinical outcomes. Introduction The receptor for advanced glycation end products (RAGE) is a multiligand receptor of the immunoglobulin superfamily that has been implicated in multiple neuronal and inflammatory stress processes. In the present study, we investigated changes in RAGE immunoreactivity and its protein levels in the gerbil hippocampus (CA1 to CA3 regions) after 5 minutes of transient global cerebral ischemia. Methods The ischemic hippocampus was stained with cresyl violet (CV), NeuN antibody (a neuron-specific soluble nuclear antigen) and Fluoro-Jade B (a marker of neuronal degeneration). Results Five days after ischemia-reperfusion, delayed neuronal death occurred in the stratum pyramidale (SP) of the CA1 region. RAGE immunoreactivity was not detected in any of the CA1 to CA3 regions in the sham group. RAGE immunoreactivity was detected only in the CA1 region from 3 days post ischemia, and it was newly expressed in astrocytes, not in neurons. In addition, the level of RAGE protein was highest at 5 days post ischemia. In brief, both RAGE immunoreactivity and protein level were distinctively increased in astrocytes in the ischemic CA1 region from 3 days after transient cerebral ischemia. Conclusion These results indicate that the increase in RAGE expression in astrocytes after ischemia may be related to the ischemia-induced activation of astrocytes in the ischemic CA1 region. Introduction The purpose of this study was to investigate the relationship between continuously monitored regional cerebral blood flow (CBF) and microdialysis values in severe subarachnoid hemorrhage and traumatic brain injury patients.
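As an illustration of the generalized estimating equation approach used in the fluid-management study above, here is a sketch with synthetic data; the patient count, day coding and effect sizes are invented for demonstration only:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# Synthetic repeated measures: daily fluid balance (l) for 26 patients on
# days -3..3 around the start of the protocol (all values are invented).
records = [{"patient": p, "day": d,
            "balance": rng.normal(1.5 if d < 0 else 0.3, 0.8)}
           for p in range(26) for d in range(-3, 4)]
df = pd.DataFrame(records)

# GEE with an exchangeable working correlation accounts for repeated
# measurements within each patient; day 3 is the reference level, as in
# the comparisons reported above.
result = smf.gee("balance ~ C(day, Treatment(reference=3))",
                 groups="patient", data=df,
                 cov_struct=sm.cov_struct.Exchangeable()).fit()
print(result.params)
```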
Methods Advanced multimodal neuromonitoring, including measurement of CBF (QFlow, Hemedex) and of brain lactate, pyruvate, lactate/pyruvate ratio, glycerol and glucose values using microdialysis (CMA600, CMA Microdialysis), was performed in 21 patients with severe subarachnoid hemorrhage (n = 17) or traumatic brain injury (n = 4). Thirteen of the patients were successfully discharged from the ICU while eight did not survive. Additional recorded parameters included PbrO2 (Licox, GMS), ICP, CPP, MABP, CVP, local brain temperature, body core temperature, PCO2, and blood glucose, among others. The cerebral monitoring probes were inserted via a bolt (ICP, PbrO2, microdialysis) and an additional burr hole (CBF). All probes were positioned in the penumbra, and their location was verified by brain CT. The PbrO2 arm of this study is still underway and its significance will be reported later. Results The final data are currently under statistical evaluation, which will be completed by the time of presentation. However, there is an indication of a link between brain glucose levels and CBF values, although the CBF-PbrO2 correlation, the second part of this study, remains unclear. This may be due to fluctuation of brain glucose because of brain ischemia, hyperemia, hypermetabolism or hypometabolism. So far we have been able to establish a correlation between CBF and the lactate/pyruvate ratio only at persistently low CBF values. Conclusion This will be the final report of a study in human patients with severe subarachnoid hemorrhage and traumatic brain injury. The results indicate correlations of varying significance in the pooled data still under statistical analysis. We hope that the outcome of our study will be able to answer questions regarding the pathophysiology of severe brain injury and guide us in the titration of therapy, as needed by each individual patient [1-4]. Introduction Several reports indicate the potential usefulness of monitoring brain metabolic parameters and their correlation with systemic values [1-4]. We aimed to establish the differences and correlations in pCO2, lactate, serum sodium and C-reactive protein (CRP) between arterial and jugular venous bulb blood. Methods An observational study. Between 1 January and 31 October 2013 we included neurocritical patients (NCP) with multimodal neuromonitoring (MMN). Daily samples of arterial blood and jugular venous bulb blood were obtained for measurement of pCO2, lactate, serum sodium and CRP. Results There were 45 NCP, six (13%) with MMN (five men). Mean age was 37 ± 11 years (35 to 61). Diagnoses: two TBI, two SAH, one stroke, one lupus encephalitis. APACHE II was 27 ± 6.5 (25 to 39). Glasgow Coma Scale at admission was 14 ± 4 (4 to 14). pCO2 (mmHg): arterial 41 ± 6.3 versus jugular 45 ± 7.4 (r = 0.7, P = 0.007). Lactate (mg/dl): arterial 11 ± 5.6 versus jugular 13.5 ± 3.9 (r = 0.7, P = 0.9). Sodium (mEq/dl): arterial 141 ± 4.5 versus jugular 141 ± 4.4 (r = 0.8, P = 0.15). CRP (mg/dl): arterial 8 ± 7.4 versus jugular 17 ± 11.6 (r = 0.9, P <0.001). The correlation and trend curves are shown in Figure 1. Conclusion A good correlation was observed between arterial and jugular bulb values for the different variables. CRP and pCO2 values were persistently and significantly higher in the jugular bulb, particularly CRP. Further studies are required to define the interpretation and potential usefulness of these differences.
Results Wernicke's area activation was observed in nine patients (eight patients with TBI, one with hypoxia). During subsequent follow-up, which lasted from 3 to 12 months, seven patients with an activated Wernicke's area showed consciousness expansion up to the minimally conscious state (further consciousness expansion was seen in two patients). Two other patients with an activated Wernicke's area did not show any signs of consciousness. Two patients revealed significant activity in Broca's area. Conclusion According to the first results of the study, one can conclude that behind the outwardly similar clinical symptoms of patients in VS lies a group that is diverse in the organization of brain functions. fMRI makes it possible to reveal the first signs of cognitive activity, that is, a response to the linguistic content of speech addressed to the patient, which cannot be detected during routine neurological examination. Introduction A patient is declared brain dead (BD) when physicians determine permanent loss of brain functions. Unfortunately, criteria for defining BD vary across countries [1]. We therefore decided to survey BD diagnostic modalities in Europe in order to describe the differences. Methods A multiple-choice questionnaire was developed on an online platform [2]. A direct link was sent to national representatives of the European Society of Intensive Care Medicine and to members of its NeuroIntensive Care section. Thirty-three countries were contacted. Answers were reviewed, and in cases of discrepancies or missing data, participants were contacted for further clarification. Descriptive statistics were applied. Results Twenty-eight participants returned the questionnaire (85%). Every country has either a specific law (93%) or guidelines issued by the scientific society (89%). Clinical examination, essential to the diagnosis, is the only requirement in 50% of countries. Coma, apnea, and absence of corneal and cough reflexes are always necessary. Blood pressure and electrolytes are checked in 64% as mandatory prerequisites. The apnea test is legally defined in 86% of countries. Eighty-two percent of countries require achievement of a target paCO2 level, while the Netherlands' law specifies a target apnea duration. The number of physicians (median 2, range 1 to 4), number of clinical examinations (median 2, range 1 to 3), and minimum observation time (median 3 hours, range 0 to 12) are variable requisites across countries. In 50% of nations, additional tests are required. Hypothermia (4%), anoxic injury (7%), inability to complete the clinical examination (61%), toxic drug levels (57%), and an inconclusive apnea test (54%) are legal indications to perform additional tests. Cerebral blood flow investigation is mandatory in 18% of countries, while it is either optional or used only in selected cases in 82%. Conventional angiography is still the preferred method (50%), followed by transcranial Doppler (43%), angio-CT (39%), and CT perfusion and angio-MR (11%). EEG is always (21%) or optionally (14%) recorded. Russia and Croatia evaluate both EEG and cerebral blood flow (7%). Conclusion There are still areas of uncertainty and disparities in brain death diagnosis in European countries. This predisposes to misdiagnosis and confusion for both clinicians and families. Measures to promote uniformity of brain death procedures and clinical practice are therefore desirable. Introduction ICU-acquired weakness is a frequent complication of critical illness.
It is unclear whether it is a marker or a mediator of poor outcomes. We aimed to determine the acute and long-term outcomes and costs of ICU-acquired weakness among long-stay (≥8 days) ICU patients and to assess the impact of recovery from weakness at ICU discharge. Methods Data were prospectively collected during an RCT (ClinicalTrials.gov: NCT00512122) [1, 2]. The impact of weakness (MRC sum <48) on outcomes and costs was analyzed with 1:1 propensity score matching for baseline characteristics, illness severity and risk factor exposure prior to assessment. Among weak patients, the impact of persisting weakness at ICU discharge on the risk of death after 1 year was examined with multivariable Cox proportional-hazard analysis. Results A total of 227 of the 405 (56%) long-stay assessable ICU patients were weak; 122 weak patients were matched to 122 not-weak patients. As compared with matched not-weak patients, weak patients had a lower likelihood at any time of live weaning from mechanical ventilation (HR: 0.709 (0.548 to 0.918), P = 0.009), live ICU discharge (HR: 0.738 (0.570 to 0.955), P = 0.021) and live hospital discharge (HR: 0.682 (0.521 to 0.893), P = 0.005). In-hospital costs per patient (+30.5%, €+5,443/patient, P = 0.04) and 1-year mortality (30.6% vs. 17.2%, P = 0.015) were also higher. The 105/227 (46%) weak patients not matchable to not-weak patients had an even worse prognosis and higher costs. At any time within the first year following ICU admission, compared with patients who recovered from weakness and adjusted for potential confounders, those with persistent weakness and an MRC sum between 36 and 47 at ICU discharge had a higher likelihood of death (HR: 1.937, 95% CI: 1.048 to 3.581, P = 0.035). This likelihood of late death was even higher for patients with a more severe degree of persistent weakness (MRC sum <36) (HR: 1.815, 95% CI: 3.693 to 7.514, P <0.001). Conclusion Patients with ICU-acquired weakness had worse acute morbidity outcomes, consumed more resources and had higher mortality after 1 year than patients without weakness. Persistence of weakness at ICU discharge further increased late mortality. Introduction Cerebral near-infrared spectroscopy (NIRS) represents an exciting prospect for the non-invasive monitoring of cerebral tissue oxygenation in traumatic brain injury (TBI). Earlier attempts at clinical application of cerebral NIRS demonstrated that further work was needed [1]. The basic layout of the probe portion of these devices consists of a light source and a light detector, arranged at calculated distances and in calculated configurations in order to observe the target tissue. We aimed to determine which commercially available NIRS probe offers the most sensitive source/detector layout for observing tissue oxygenation at the optimal depth for TBI. Methods The optimal depth for grey matter target tissue (the grey-white matter junction) was ascertained by reviewing a series of brain CT scans of patients who had sustained a TBI. Average measurements were derived from these scans, identifying the target depth of the grey matter strip from the point of probe placement. Currently there are five commercially available cerebral NIRS systems; Table 1 presents the variety of configurations offered by each device. Source-detector layouts were modelled and simulated using the NIRFAST® computational light modelling software (developed at the University of Birmingham) [2].
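As a sketch of the 1:1 propensity score matching used in the weakness study above: a propensity score is estimated from baseline covariates, then each weak patient is paired with the closest not-weak patient. The covariates and group labels below are synthetic, and the greedy nearest-neighbour rule is one common choice, not necessarily the authors' exact algorithm:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 405
X = rng.normal(size=(n, 3))    # synthetic baseline covariates
weak = rng.integers(0, 2, n)   # "exposure": ICU-acquired weakness

# Propensity score: probability of being weak given baseline covariates.
ps = LogisticRegression().fit(X, weak).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = np.where(weak == 1)[0]
controls = list(np.where(weak == 0)[0])
pairs = []
for t in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[t] - ps[c]))
    pairs.append((t, j))
    controls.remove(j)  # match without replacement
print(f"{len(pairs)} matched pairs")
```

Outcome contrasts (here, the hazard ratios reported above) are then estimated within the matched sample.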
The novel approach of this modelling system makes no extrapolated assumptions based on subtraction. Results We reviewed 32 trauma series CT brain images, and the average depth to grey matter was derived as 21.37 mm (range 16.59 to 31.03 mm, SD 2.33) from the surface (Figure 1). The spectrum of sensitivity of the five probes was modelled (Figure 2). As is apparent there, the FORE-SIGHT (CAS Medical, CT, USA) probe currently offers the greatest sensitivity at our derived target depth. Conclusion Based on our computational modelling, the source-detector layout of the FORE-SIGHT NIRS device by CAS Medical provides the greatest sensitivity at depth for the purposes of cortical monitoring in TBI. Introduction The profound impact of TBI is felt not only by the individuals who suffer the injury but also by their caregivers and society as a whole. In this study we observed the long-term outcome of patients with severe traumatic brain injury admitted to our ICU. Methods This study includes all patients (n = 160) with severe head trauma (GCS <9) admitted to the ICU of the emergency department of a tertiary referral center (Careggi Teaching Hospital, Florence, Italy) from 2009 to 2012. All patients undergo a clinical assessment after 1 year, which is routine post-intensive-care follow-up. As an objective index of the ability to function after injury, the Glasgow Outcome Scale (GOS) is used. The neurological evaluation to determine patient outcome by GOS is performed in two different ways. Eligible patients are contacted by telephone 1 year after ICU discharge by a nurse of the intensive care staff and invited to attend the follow-up clinic, where an intensivist evaluates the patient and determines the GOS. For patients still hospitalized in rehabilitation departments at 6 months, the GOS assessment is performed by a doctor of that facility and reported by telephone. Results The ICU and hospital mortality were 33.7% (n = 54) and 36.9% (n = 59), respectively. Mortality at 1 year was 44.4% (n = 71). The results of the neurological follow-up at 1 year were: GOS 2, 5.6% (n = 9); GOS 3, 10% (n = 16); GOS 4, 13.1% (n = 21); GOS 5, 26.9% (n = 43). Conclusion In agreement with other studies, our data confirm that severe traumatic brain injury is associated with high mortality at 1 year. One-half of the survivors have some level of disability. Introduction Recent studies have shown that 1,25-dihydroxyvitamin D3 (vitamin D) deficiency may negatively affect the clinical course of traumatic brain injury (TBI) [1]. This problem is particularly important in older patients, considering a 50% prevalence of vitamin D deficiency [2]. Data from the Third National Health and Nutrition Examination Survey [3] document that more than 60% of Caucasians are affected by vitamin D deficiency [4], so that patients with TBI of any age are theoretically at risk of an unfavorable outcome [2]. The objective of this preliminary study was to determine whether low levels of vitamin D at ICU admission (<24 hours) negatively affect the neurological recovery of patients with TBI. Methods We retrospectively analyzed the data of 46 patients with TBI (65% severe, 9.5% moderate, 28.5% mild), either isolated or associated with other extracranial lesions. The sampling of vitamin D was carried out within 24 hours of ICU admission.
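A common rule of thumb in NIRS holds that the mean interrogation depth is roughly one-third to one-half of the source-detector separation. The sketch below applies the optimistic one-half rule to some illustrative separations against the 21.37 mm target depth derived above; this is a back-of-envelope check only, not a substitute for the light-transport modelling used in the study, and the separations listed are not the actual probe geometries:

```python
# Rule-of-thumb check: effective depth ~ separation / 2 (optimistic bound).
separations_mm = [25, 30, 40, 50]   # illustrative source-detector distances
target_depth_mm = 21.37             # mean depth to grey matter derived above

for s in separations_mm:
    approx_depth = s / 2.0
    print(f"separation {s} mm -> ~{approx_depth:.1f} mm depth, "
          f"reaches target: {approx_depth >= target_depth_mm}")
```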
We recorded the GCS at presentation (GCSin) and at discharge (GCSout), and compared their difference (GCSdiff) with vitamin D levels. Patients who died in the ICU were assigned GCSout = 0. See Table 1. Results In agreement with other studies [5], our data confirm the presence of vitamin D deficiency (Table 1); however, they do not demonstrate a statistically significant correlation on univariate regression (R = 0.04; P = 0.786) between vitamin D level and ICU outcome. There was no correlation when stratifying patients by age, TBI class, Injury Severity Score or BMI. Conclusion Vitamin D deficiency is highly prevalent in our TBI cases but does not seem to affect neurological recovery at ICU discharge; however, these preliminary results are open to several criticisms and need to be confirmed by prospective studies. Introduction Severe traumatic brain injury (TBI) is a major cause of death in people between 19 and 45 years old. Gastrointestinal dysfunction is the most common complication, due to mucosal ischemia, motility disorders and disruption of the gut barrier, with severe consequences: malnutrition, weight loss and a high risk of infections [1]. Maintaining intestinal barrier function is therefore a systematic, multifaceted undertaking. Selected new probiotics, with their capacity to bind and neutralize toxins, to interfere with pathogen adherence and, through immunomodulatory properties, to help clear infection, could improve the recovery of critically ill patients [2]. Our aim was to assess the effects of a new probiotic in an early enteral regimen on the clinical outcome of severe TBI patients, in terms of VAP incidence, tolerance to enteral nutrition, duration of mechanical ventilation, and mortality rate. Methods A prospective randomized 1-year study of 64 patients 19 to 78 years old allocated to receive for 10 days either an early enteral diet plus a new probiotic (Bioent; Lactobacillus bulgaricus 10 trillion CFU/cp + activated charcoal) every 6 hours (Group A) or the same formula without probiotics (Group B). The diets were isocaloric and isonitrogenous, and there were no differences between groups in gender, age, and nutritional status. We assessed the VAP incidence, duration of mechanical ventilation, tolerance to enteral nutrition, length of ICU stay, duration of diarrhea episodes, and mortality rate. ANOVA and t tests were carried out; P <0.05 was considered significant. Results The infection rate was higher in group B, the duration of mechanical ventilation was shorter in group A, and the patients in group A received 91.7% of total caloric needs by the enteral route versus 74.68% in group B. There was a significant difference in the number of diarrhea episodes, and the ICU length of stay was significantly lower in group A; there was no significant difference in mortality rate between the groups. See Figure 1. Conclusion Early administration of new probiotics to severe TBI patients could have beneficial effects in terms of reduced GI dysfunction, VAP incidence and length of ICU stay. A moderately strong positive correlation between the two models was noted (Spearman's rho = 0.566, P <0.001). The IMPACT and the APACHE II were found to identify slightly different groups of patients who eventually do not survive (Figure 1). Conclusion The IMPACT and APACHE II models showed equal performance for 6-month mortality prediction. A moderately strong, positive correlation, with some major discrepancies between the models, was found.
Thus, features of both the IMPACT and APACHE II models are valuable for optimal outcome prediction in patients with TBI treated in the ICU. Introduction This study examined the demographic profiles of severe TBI in the local context, with implications for the management of severe TBI, particularly the utilisation of critical care resources. Results A total of 780 patients were admitted with TBI during the study period, of which 365 patients (46.8%) sustained severe TBI. The majority (75.3%) of the severe TBI patients were male. There was a bimodal preponderance of severe TBI cases in young adults (age 21 to 40) and older people (age >61). Motor vehicle accidents (48.8%) and falls from <2 m (35.1%) were the main mechanisms of injury. Invasive monitoring was frequently employed in these patients with severe TBI: arterial blood pressure monitoring in 298 patients (81.6%), central venous pressure monitoring in 219 patients (60.0%), and intracranial pressure monitoring in 173 patients (47.4%). The use of tiered therapies to control ICP, such as sedation, mild hyperventilation, osmotherapy with mannitol, cerebrospinal fluid drainage, barbiturate coma and decompressive craniectomy, was consistent with international practice. Conclusion Young adults and older people, involved mainly in motor vehicle accidents and falls respectively, were among the high-risk groups for severe TBI. Management of these patients goes beyond the ICU and involves, but is not limited to, social support, emotional motivation and community reintegration [2]. TBI among the high-risk groups is largely preventable. Public awareness and prevention programmes will go some way toward reducing the incidence of TBI amongst these groups. Results We studied 531 patients: age 40.35 ± 19.75 years, APACHE II 17.94 ± 6.97 points, Glasgow Coma Scale at admission 7.53 ± 3.83 points. Cranial tomography at admission showed: diffuse injury type I (10.4%), diffuse injury type II (28.1%), diffuse injury type III (24.5%), diffuse injury type IV (8.3%), evacuated mass lesion (22.6%), and non-evacuated mass lesion (6.2%). Hospital mortality was 28.6%; 171 (32.2%) patients had died by 1 year (6.6% missing) and 181 (34.1%) by 3 years (16.2% missing). Regarding work activities, after 1 year, 28.5% of the 326 patients evaluated had no difficulties with work, 4.6% had difficulties but worked as before, 10.1% worked only part-time or had changed to a job requiring minimum effort, and 56.7% did not work. After 3 years, 41.2% of the 238 patients evaluated had no difficulties with work, 4.6% had difficulties but worked as before, 12.6% worked only part-time or had changed to a job requiring minimum effort, and 41.6% did not work. The evolution between 1 and 3 years by the McNemar test was statistically significant (P <0.001). A total of 173 patients were in a similar situation, only one had deteriorated, and in 62 (26.05%) patients the work activity evaluation had improved. Conclusion One year after ICU admission with TBI, more than 50% of patients had difficulties with work. After 3 years the number of patients who work had increased, although approximately 40% of the surviving patients did not work. Introduction Antiseizure prophylaxis is recommended only for the prevention of early post-traumatic seizures (PTS) in the guidelines for the management of severe traumatic brain injury (TBI) by the Brain Trauma Foundation. Phenytoin is recommended to reduce the incidence of early PTS.
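For the paired comparison of work status at 1 versus 3 years above, the McNemar test operates on the discordant pairs. A minimal sketch; the 2×2 arrangement below is a hypothetical illustration loosely shaped by the reported counts (one deteriorated, 62 improved), not the authors' actual table:

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired table: rows = working / not working at 1 year,
# columns = working / not working at 3 years, for the same survivors.
table = np.array([[98, 1],
                  [62, 77]])

result = mcnemar(table, exact=False, correction=True)
print(f"McNemar chi2 = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```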
Early enteral nutrition has recently shown theoretical advantages in preventing bacterial translocation by maintaining normal turnover of the gut mucosa, and it is commonly used for TBI patients. Our hypothesis was that enteral administration of antiepileptic agents is also useful for early PTS prophylaxis. Methods This retrospective observational study included all adult patients admitted to our tertiary academic medico-surgical ICU with TBI from September 2011 to August 2012. Patients with a history of epilepsy were excluded. Clinical data were collected from electronic medical records. The baseline characteristics collected were age, gender, diagnosis, antiepileptic agents, timing of start and adverse effects of those agents, and methods of administration. A total of 120 students without prior CPR training were recruited from an upper secondary school, during regular class hours, and randomly assigned to the t-CPR group (n = 60) versus the v-CPR group (n = 60). The Resusci®Anne SkillReporter™ manikin was used to evaluate CPR performance. Data were transferred from the manikin into a computerized database using the Laerdal SkillReporting System V.2.2.1 software. Further analysis was based on audio-recordings and video-recordings. Primary outcome measures were the results of the Cardiff evaluation test; the secondary measures were global scoring of a complete 7-minute period of CPR. Results The mean chest compression rate increased significantly in the v-CPR group as compared with t-CPR (110 ± 16 vs. 86 ± 28; P <0.0001), while depth remained constant (48 ± 13 mm vs. 47 ± 16 mm, P = NS). Hand positioning was correct in 91.7% of cases with v-CPR, but only in 68% with t-CPR (P = 0.001). The hands-off period was almost nonexistent in the v-CPR group (0 vs. 7 seconds; P = 0.0016), but the median no-flow time was significantly greater in the v-CPR group (146 vs. 122 seconds, P <0.0001). As a consequence, global evaluation of CPR performance revealed a significant improvement in the v-CPR group score as compared with the t-CPR group (6 vs. 5, P <0.001). Conclusion Video-assisted CPR using this original algorithm allows bystanders to reach compression rates and depths close to international guidelines and to reduce hands-off events during CPR. Initial anticoagulation strategy for extracorporeal cardiopulmonary resuscitation patients Y Iwashita, M Matsuduki, M Yukimitsu, K Yokoyama, T Nakata, A Yamamoto, K Ishikura, H Imai Mie University Hospital, Tsu, Mie, Japan Critical Care 2014, 18(Suppl 1):P488 (doi: 10.1186/cc13678) Introduction Extracorporeal cardiopulmonary resuscitation (ECPR) is increasingly being used in emergency and critical care medicine in Japan. Although a major complication of this procedure is bleeding, the optimal heparin dose and activated coagulation time (ACT) remain unknown. Methods We retrospectively evaluated the initial heparin doses, ACT values, and complications of patients who received ECPR between February 2011 and November 2013 at the Emergency and Critical Care Center, Mie University Hospital, Japan. Results ECPR was performed in 45 patients, and the ACT was evaluated in 32 patients. All patients were administered 3,000 U unfractionated heparin at the time of priming of the circuit. Patients in whom cannulation took longer received an additional 2,000 to 3,000 U unfractionated heparin. The average heparin dose administered was 53.6 U/kg body weight.
The average ACT was 231.3 seconds. In 17 of the 32 patients, the ACT exceeded 200 seconds. Three patients experienced fatal bleeding in the chest wall, which could not be stabilized by conservative treatment. One patient developed a cerebral infarction. There were no significant differences between the patients with and without fatal bleeding with regard to heparin dose, ACT, or duration of CPR. Conclusion According to the Extracorporeal Life Support Organization guidelines, the target ACT should be around 1.5 times the normal ACT. However, it is difficult to obtain the normal ACT in emergency situations. Many of our patients' ACTs exceeded 200 seconds, and three patients experienced fatal bleeding that was possibly due to chest compression. Post-cardiac arrest patients often experience coagulopathy due to either the cardiac arrest itself or hypothermia therapy. Therefore, an anticoagulation protocol distinct from the pulmonary extracorporeal membrane oxygenation protocol is required for ECPR patients. We evaluated our anticoagulation protocol for ECPR and observed that our patients' ACTs frequently exceeded the target value and that some patients experienced fatal bleeding. The anticoagulation protocol for post-CPR patients may need to be reconsidered. Introduction Out-of-hospital cardiac arrest (OOHCA) causes 60,000 UK and 300,000 US deaths each year. Survival to hospital discharge in the developed world has historically been 7 to 10%, with obvious cognitive impairment in 10% of survivors. Primary percutaneous coronary intervention (PPCI) and targeted temperature management (TTM) (or at least hyperthermia avoidance) have been shown to improve survival in comatose patients after OOHCA. There is no reliable method to predict poor outcome on presentation. We aimed to identify factors associated with poor outcome in our single-centre regional referral OOHCA population. Methods We performed a pragmatic single-centre retrospective review, over 18 months commencing 1 January 2011, of all patients admitted to our regional OOHCA centre ICU following successful resuscitation from OOHCA. In keeping with guidelines, all patients were assessed for suitability for PPCI and TTM. A good outcome was defined as a Pittsburgh Cerebral Performance Category (CPC) score of 1 to 2 (independence, mild impairment) on hospital discharge. A poor outcome was defined as death or CPC 3 to 4 (moderate to severe impairment, coma) on hospital discharge. CPC scoring was determined on hospital discharge. (Table: presenting base deficit (mmol/l): -7.9 ± 6.2, -13.8 ± 5.9, -11.5 ± 6.7; P <0.001. Presenting lactate (mmol/l): 4.3 ± 3.0, 8.3 ± 4.5, 6.8 ± 4.4; P <0.0005.) Introduction Lack of standardized care contributes to low survival in admitted out-of-hospital cardiac arrest (OHCA) patients. The objective of our study was to implement a Post Arrest Consult Team (PACT) and improve the quality of care for admitted OHCA patients. Methods We conducted a prospective cohort study with concurrent controls from February 2011 to February 2013 in a network of 29 Toronto-area hospitals. The PACT was implemented in two hospitals and functioned as a consult service with a nurse and physician on call 24/7. Patients from other network hospitals acted as concurrent controls. The PACT focused on four key processes of care: targeted temperature management (TTM); coronary angiography; avoidance of premature withdrawal of life-sustaining therapy (WLST <72 hours) on the basis of neuroprognostication; and electrophysiology assessment.
We included nontraumatic OHCA patients who were >18 years old, had survived at least 6 hours, and were comatose. We excluded patients with do-not-resuscitate orders and those with intracranial or other severe bleeding. We used generalized linear mixed models to assess whether PACT implementation was associated with higher odds of achieving each of the four targeted processes of care, while adjusting for secular trends unrelated to the intervention. Results The primary analysis included 162 patients from the two intervention hospitals and 892 from the 27 control hospitals. Thirty-two percent of the patients were female and the mean age was 65.3 ± 16.5 years. Almost one-half (46%) of patients had a shockable initial cardiac arrest rhythm, 41% had bystander CPR, and 5% had an AED applied. PACT did not improve the use of TTM (ratio of ORs = 1.03, 95% CI = 0.89 to 1.20), angiography for patients without ST-elevation myocardial infarction (ratio of ORs = 1.10, 95% CI = 0.87 to 1.40), or electrophysiology assessment (ratio of ORs = 1.06, 95% CI = 0.81 to 1.38) as compared with concurrent control hospitals. Patients in the intervention group were less likely to have life support withdrawn within 72 hours on the basis of neuroprognosis compared with patients in the concurrent control group (ratio of ORs = 0.62, 95% CI = 0.39 to 0.98). Conclusion PACT was associated with reduced WLST <72 hours on the basis of neuroprognostication but did not improve other important post-cardiac arrest processes of care. Further work is underway to identify factors that influenced implementation. This will guide future consideration of the PACT model in other settings. Introduction This retrospective audit evaluated adult patients who suffered in-hospital cardiac arrest (IHCA) against the recent National Confidential Enquiry into Patient Outcome and Death (NCEPOD) report [1]. It looked specifically at the recognition of the acutely unwell, the interventions made, the decisions taken from admission through to the post-arrest period, and the outcomes following cardiopulmonary resuscitation (CPR). The audit aims to guide future improvements in preventing cardiac arrest and enhancing end-of-life care decision-making. Methods Medical notes of adult patients suffering IHCA over a 1-year period were identified and data were collected using a validated NCEPOD audit tool. These data included patient demographics, initial clerking and consultant review, patient care during the 48 hours prior to cardiac arrest, the resuscitation status of the patient, the resuscitation attempt, the post-cardiac arrest care, and survival to discharge rates. Results Medical notes were available for assessment for 69 of the 82 patients identified as having had an IHCA between 1 October 2011 and 30 September 2012. The frequency of IHCA showed no correlation with day of the week or month. Initial clerking was incomplete in history-taking (16% vs. 14% in NCEPOD) and in examination (46% vs. 24% in NCEPOD). The majority of patients were appropriately escalated in a timely fashion (94% vs. 82% in NCEPOD), but first consultant review was delayed beyond 12 hours in 49% of cases (48% in NCEPOD). A total of 81% of patients suffered cardiac arrest more than 24 hours after admission (68% in NCEPOD). Warning signs for cardiac arrest were considered present in 59% of cases (75% in NCEPOD), with a significant proportion going unrecognised (27%) despite multiple medical reviews. Out-of-hours CPR attempts (68% vs.
59% in NCEPOD) seemed to be associated with poorer survival. The survival to discharge rate after in-hospital cardiac arrest was 10.1%. This compares with 14.6% in the NCEPOD data and 20% in larger studies [2]. Ninety per cent of patients had no documentation of resuscitation status (78% in NCEPOD). Conclusion The results from this audit highlight persistent deficiencies in the care pathway for the acutely unwell patient. Improvement will be focused on earlier consultant review and better recognition of warning signs, with appropriate action taken. Furthermore, earlier routine and senior clinician-led discussions on appropriate end-of-life care are vital. Introduction Therapeutic hypothermia (TH) improves the neurologic outcome of patients who survive cardiac arrest but suffer severe secondary neurological damage [1]. In 2010, the use of TH after cardiac arrest was included in the ERC guidelines. The purpose of this study was to investigate the knowledge of medical and nursing staff on the implementation of TH in patients after cardiac arrest [2]. Methods Data were collected by an anonymous questionnaire designed for the purpose of this research, addressed to the medical and nursing staff of Greek hospitals. The questionnaire consisted of questions about knowledge and behavior related to TH induction, target temperature and duration of cooling. Information about potential barriers to the implementation of TH was also collected. Results We obtained 344 questionnaires from 16 hospitals. The study population comprised registered nurses (RN) (63.8%) and doctors (36.2%). The majority of health staff (81.5%) had never implemented TH. A total of 45.8% of respondents stated that the main reasons for not using TH were the lack of information and training about the method, the lack of nursing staff, the lack of available cooling methods, and the time required. The most common methods of application were cold packs and intravenous fluids. Only 30.2% of the doctors and 5.5% of the nurses (P <0.001) actually had the knowledge to implement TH, as demonstrated by correct answers. Of the respondents who stated that they knew the method, only 23.9% answered correctly about the target temperature and the maintenance and rewarming phases. A total of 59.1% of doctors, despite having attended the Advanced Cardiac Life Support seminar, were not able to answer the knowledge questions correctly. Continuous education of health professionals and the existence of a protocol were proposed by 65% of participants as the best ways of increasing knowledge of, and adherence to, the ERC guidelines on TH. Conclusion Therapeutic hypothermia is rarely used in Greek hospitals. The level of knowledge is mainly related to the lack of education and the lack of information about new techniques. Continuing education programs are necessary for the adoption of new therapeutic techniques in healthcare. Results Sixteen patients (average age 57) included 12 men. Twelve patients had a CPC of 1 to 3 on hospital discharge. Sixteen shivering events were monitored with calorimetry among these 12 patients. The average rate of energy expenditure (without shivering) leading up to and following paralytic treatment was 1,425 kcal/hour (± 489 kcal/hour) and 1,386 kcal/hour (± 235 kcal/hour). This was significantly greater than the predicted basal energy expenditure (1,089 ± 222 kcal; P = 0.007 and P <0.001).
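Taking the reported energy-expenditure rates at face value, the excess burden of a single shivering episode can be approximated as the rate difference multiplied by the time spent above baseline (the recognition delay plus the resolution time reported in the results that follow). A purely illustrative sketch; this combination of values is ours, not the authors' calculation:

```python
# Illustrative arithmetic only; rates and durations are taken from the
# abstract but combined here for demonstration, not reproduced analysis.
rate_observed = 1425.0      # reported rate around paralytic treatment
rate_basal = 1089.0         # predicted basal rate
hours_untreated = 57 / 60   # delay from onset to recognition/treatment
hours_resolving = 30 / 60   # time from treatment back to baseline

excess = (rate_observed - rate_basal) * (hours_untreated + hours_resolving)
print(f"approximate excess energy for one episode: {excess:.0f} kcal")
```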
Time from a change in baseline energy expenditure to recognition and treatment of clinical shivering was 57 (± 64) minutes, and time from treatment with neuromuscular blockade back to baseline energy expenditure was 30 (± 20) minutes. This accounts for a total difference of 15,223 kcal (± 10,997 kcal) before treatment and 7,113 kcal (± 3,706 kcal) after treatment for each shivering episode compared with baseline (P = 0.01 and P = 0.003). Conclusion The energy burden of shivering is underestimated by standard nutritional formulas in patients undergoing TTM after cardiac arrest. Subclinical shivering is associated with increased energy expenditure. Clinical recognition occurs long after the increase in metabolic activity, which persists for a significant period of time after treatment. These findings should influence how shivering is monitored and treated during TTM. Temperature management following cardiac arrest: introducing a protocol improves compliance with targets P Creber, G Talling Introduction We audited the achievement of therapeutic hypothermia (TH) before and after the introduction of a cooling protocol. Instituting TH is recommended following the return of spontaneous circulation (ROSC) for many patients who survive a cardiac arrest [1, 2]. The key intervention may be the avoidance of hyperthermia rather than cooling itself [3]. Methods We conducted a chart review of all patients admitted to the Department of Critical Care (DCC) at our hospital following cardiac arrest over 2 years in 2010 to 2012 (Group 1). We recorded compliance with key recommendations produced by the Royal College of Anaesthetists [4], although we defined post-ROSC hyperthermia as >37.2°C rather than >38°C. A TH protocol was designed, and personnel in the emergency department and DCC were educated in its use. Recommended practice was the infusion of cold i.v. normal saline (1 to 2 l) followed by the use of an intravascular cooling device (Alsius CoolGard™). Data collection was then undertaken after introduction of the protocol for all patients admitted to the DCC following cardiac arrest in November 2012 to 2013 (Group 2). Results Forty-three patients were admitted in Group 1 and 28 in Group 2. Of these, 42% in both groups followed out-of-hospital (OOH) VF arrests. Cooling was attempted in 88% and 82% of OOH VF patients, respectively. For patients with either in-hospital or non-VF/VT cardiac arrests, the proportions cooled were 16% and 12.5%. Cooling initiation within 1 hour increased from 27% to 50%. Achievement of a target temperature of 32 to 34°C within 4 hours of ROSC was 55% and 50%, respectively. Target maintenance for 12 to 24 hours after ROSC increased from 79% to 100%. Avoidance of hypothermia <31°C for 48 hours after ROSC improved from 95% to 100%. Slow rewarming at 0.25 to 0.5°C/hour to 37°C was achieved in 76% and 90%. Avoidance of temperature >37.2°C for 48 hours after ROSC increased from 84% to 100%. Of the patients cooled, survival with a good neurological outcome was achieved in 52% in Group 1 and 88% in Group 2. Conclusion The institution of a temperature management protocol improved compliance with recommended goals, both in achieving hypothermia and in avoiding hyperthermia. Occurrence of SND was defined as neurological derangement such as irritability, confusion, or delirium within 48 hours after the operation. Results Durations of CPB, aortic clamping, and circulatory arrest were 194 ± 51, 136 ± 53, and 70 ± 17 minutes, respectively. Nine patients underwent ACP (89 ± 26 minutes) and 11 patients underwent RCP (56 ± 25 minutes).
Concentrations of ALac, ScvO2, and P changed at 2, 4, 6 and 10 hours after CPB as follows: ALac 4.0 ± 1.9, 5.1 ± 2.6, 5.3 ± 3.3, 4.0 ± 2.8 mmol/l; ScvO2 71 ± 6.7, 67 ± 10, 67 ± 9.7, 65 ± 11%; and P 2.9 ± 0.9, 2.4 ± 0.7, 2.2 ± 1.0, 1.8 ± 0.8 mg/dl, respectively. Serum P recovered to 2.6 ± 1.4 mg/dl at 18 hours after CPB. The incidences of hypophosphatemia (<2.6 mg/dl) and SND in our series were 18/20 (90%) and 14/20 (70%), respectively. There was a correlation between minimum P and time to confirm M6 on the GCS (r = 0.508, Fisher r-to-z transformation), but no significant correlation with SND. The mortality rate during the first 28 days was 5% (1/20). Introduction Therapeutic temperature modulation (TTM) is widely used in the care setting to improve outcomes of patients with traumatic brain injury (TBI). Through fever prevention, both oxygen utilization and caloric expenditure are reduced, so metabolic efficiency can be maximized [1]. However, patient cooling is not without consequences, and shivering is experienced by more than 70% of patients undergoing TTM. Because shivering triggers an increase in metabolic demand, causing additional oxygen consumption and promoting catabolism, its prevention is ideal [2]. We set out to review the data surrounding the anti-shivering component of a normothermia protocol in the surgical ICU (SICU) of one Minnesota hospital. Methods A retrospective review was conducted of SICU patients managed with a normothermia protocol, with particular attention paid to the anti-shivering portion of the protocol. Serum magnesium (Mg) levels were assessed prior to initiation of TTM, and Bedside Shivering Assessment Scale (BSAS) scores were collected. Results Twenty patients receiving TTM for TBI were evaluated (March to October 2013). One-half of the patients maintained targeted BSAS scores <1 for the full duration of TTM (n = 10 of 20). Serum Mg levels at the initiation of TTM were observed to correlate negatively with the level of shivering, as indicated by the BSAS scoring system (P = 0.02). See Figure 1. Conclusion The literature suggests that the positive impact of TTM on patient outcomes can be maximized with shivering prevention [2]. Current SICU practice provides a similar Mg loading dose for all patients, regardless of baseline Mg levels. In our observed patients, achieving a baseline serum Mg level >2 was associated with lower shivering scores throughout the TTM course. This supports the hypothesis that serum Mg concentrations prior to TTM are important predictors of shivering reduction, and suggests that loading doses of Mg should be tailored to the individual patient to achieve such levels.
Despite this, in our country there are still few centers that regularly apply hypothermia after cardiac arrest (CA). We describe a Chilean experience with endovascular hypothermia after CA. Methods All surviving comatose patients after CA were included and underwent endovascular hypothermia management according to protocol. CoolGard™ internal cooling equipment was used. Variables: delay between CA and hypothermia (34°C), time in hypothermia, complications, and ventilatory and hemodynamic management. Main outcome measures: mortality and neurological follow-up to 6 months. Results Delay (time between CA and achieving the target temperature of 34°C): median 10. Complications of hypothermia: five episodes of hypokalemia (18.5%), three ventricular arrhythmias (11%), and one venous thrombosis (3.7%). Conclusion There was no mortality and there were no severe complications associated with endovascular hypothermia. It is a safe and feasible technique that can be implemented in Latin American critical care units, even though the delay in achieving target hypothermia is very long.

The 28-day survival rates were 73.3%, 40% and 16%, respectively. Conclusion PDF may be a useful blood purification therapy for ALF, but PDF should be performed below a MELD score of 30.

Introduction Difficulties in prediction and early identification of acute kidney injury (AKI) have hindered the ability to develop preventive and therapeutic measures for this syndrome. We tested the hypothesis that a urine test measuring insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), both inducers of G1 cell cycle arrest, a key mechanism implicated in AKI, could predict AKI in cardiac surgery patients. Methods We studied 50 patients at high risk for AKI undergoing cardiac surgery with cardiopulmonary bypass. Serial urine samples were analyzed for [TIMP-2] × [IGFBP7] concentrations. The primary outcome measure was AKI as defined by international consensus criteria following surgery. Furthermore, we investigated whether urine [TIMP-2] × [IGFBP7] could predict renal recovery from AKI prior to hospital discharge. Results Twenty-six patients (52%) developed AKI. Diagnosis based on serum creatinine and/or oliguria did not occur until 1 to 3 days after cardiopulmonary bypass. In contrast, urine concentration of [TIMP-2] × [IGFBP7]

Urine microscopy score combined with albumin creatinine ratio score improves prediction of future acute kidney injury (AKI) and worsening AKI. JJ Dixon, K Lane, W Fleming-Nouri, H Cheema, P Walker. Introduction Patients with AKI have high morbidity and mortality. Diagnosis of AKI may be improved by examination of the urinary sediment with microscopy and by measurement of urine albumin. A urine microscopy score (UMS; 0 to 4 points) of renal tubular epithelial cells and granular casts has previously been developed to aid diagnosis. We have devised a urine albumin:creatinine ratio score (ACRS; 0 to 4 points) and have combined this with the UMS with the aim of stratifying the risk of developing future AKI or worsening AKI (UMS-ACRS; 0 to 8 points); the construction of the combined score is sketched below.
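To make the scoring concrete, here is a minimal Python sketch of the UMS-ACRS combination. Note that the abstract does not report the ACR cut-offs used to award ACRS points, so the band edges in the sketch are hypothetical placeholders, not the authors' thresholds.

```python
# Minimal sketch of the UMS-ACRS combination described above.
# The ACR band edges are HYPOTHETICAL: the abstract does not report
# the cut-offs the authors used to assign ACRS points.

def acrs(acr_mg_mmol: float) -> int:
    """Map a urine albumin:creatinine ratio to 0-4 ACRS points (placeholder bands)."""
    cutoffs = [3.0, 30.0, 70.0, 300.0]  # illustrative band edges, mg/mmol
    return sum(acr_mg_mmol >= c for c in cutoffs)

def ums_acrs(ums: int, acr_mg_mmol: float) -> int:
    """Combine the 0-4 urine microscopy score with the 0-4 ACRS into a 0-8 score."""
    if not 0 <= ums <= 4:
        raise ValueError("UMS must be between 0 and 4")
    return ums + acrs(acr_mg_mmol)

# Example: a UMS of 2 with an ACR of 45 mg/mmol gives a combined score of 4.
print(ums_acrs(2, 45.0))
```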
The aims were to compare UMS-ACRS in critically ill patients with and without AKI at ICU admission, and to determine whether a high UMS-ACRS can predict whether critically ill patients will develop AKI or worsening AKI. Methods Investigators were blinded to diagnosis prior to urine collection. Microscopy was performed on centrifuged urine obtained from 227 consecutive critically ill patients in a general ICU on day 1 of admission. Five photographs were taken and the mean UMS was calculated. An independent reviewer scored the photographs. The urine albumin:creatinine ratio was calculated and the ACRS determined. The UMS was then combined with the ACRS. Results Mean UMS-ACRS ± SD was higher (2.66 ± 1.57) in patients with AKI on ICU admission (n = 106) compared with those without AKI (mean UMS-ACRS = 2.40 ± 1.03; n = 120; unpaired t test, P = 0.14). Patients who developed AKI or worsening AKI (n = 58) had a mean score of 2.79 ± 1.21 versus 2.30 ± 1.14 in those who never developed AKI or who improved (n = 150), P = 0.006. A UMS-ACRS score >2 on admission had a sensitivity of 0.89 for identifying progressive AKI. A UMS-ACRS score of 5 to 8 on admission had a positive predictive value for worsening AKI of 60%, a negative predictive value of 69% and a likelihood ratio of 3.1 for developing AKI or worsening AKI. Conclusion The mean UMS-ACRS is higher in patients with AKI. Combining the UMS with the ACRS improves prediction and stratification of which patients will develop AKI, or progressive AKI, after ICU admission. The clinical implications are that urine microscopy and ACR calculation are safe, inexpensive and non-invasive, and may improve prediction of AKI in critically ill patients, potentially leading to earlier intervention and improved outcomes.

Introduction Acute kidney injury (AKI) is increasingly common in critically ill patients, and many patients with severe kidney injury require continuous renal replacement therapy (CRRT). However, little is known regarding the incidence of, and factors associated with, chronic kidney disease after CRRT in AKI patients. This study aimed to investigate renal outcome and the factors associated with incomplete renal recovery in AKI patients who received CRRT. Methods Between January 2011 and August 2013, 397 patients received CRRT in our ICU. Among them, patients who had normal renal function before AKI and were discharged without maintenance renal replacement therapy (RRT) were included in this study. We examined the incidence of incomplete renal recovery, defined as an estimated glomerular filtration rate (eGFR) <60 ml/minute/1.73 m² during follow-up. Factors that increased the risk of incomplete renal recovery after AKI were investigated with multiple logistic regression. Results Forty-one AKI patients were discharged without further RRT and followed up for a mean of 7 months. Sixteen (39.0%) of the 41 patients incompletely recovered their renal function. Patients with incomplete renal recovery were older and had a longer duration of anuria than patients with complete renal recovery (69.7 ± 7.0 vs. 54.8 ± 16.9 years, P = 0.002; 128.6 ± 192.1 vs. 26.9 ± 66.6 hours, P = 0.019). Multivariate analysis adjusting for sex, initial eGFR, hemoglobin level, diabetes mellitus and hypertension showed that old age and long duration of anuria were independent risk factors for incomplete renal recovery (OR = 1.143, 95% CI = 1.020 to 1.281, P = 0.021 and OR = 1.011, 95% CI = 1.001 to 1.032, P = 0.038, respectively). Conclusion The renal outcome of severe AKI requiring CRRT was poor even in patients with previously normal renal function. Long-term monitoring of renal function is needed, especially in severe AKI patients of older age with a long duration of anuria.
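For readers who wish to reproduce this kind of analysis, the following is a minimal Python sketch (using statsmodels) of a multiple logistic regression yielding odds ratios with 95% CIs, as reported above; the data are simulated and the variable names are illustrative, not the study dataset.

```python
# A sketch of multiple logistic regression producing odds ratios (ORs)
# with 95% CIs, as in the CRRT renal-recovery analysis above.
# The data are SIMULATED and the column names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 41  # same order of magnitude as the cohort above
df = pd.DataFrame({
    "age_years": rng.normal(60, 15, n),
    "anuria_hours": rng.exponential(60, n),
    "male": rng.integers(0, 2, n),
})
# Simulate the binary outcome (incomplete renal recovery) from a logistic model.
logit_p = -6.0 + 0.08 * df["age_years"] + 0.01 * df["anuria_hours"]
df["incomplete_recovery"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["age_years", "anuria_hours", "male"]])
fit = sm.Logit(df["incomplete_recovery"], X).fit(disp=0)
summary = pd.concat([np.exp(fit.params).rename("OR"),
                     np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                    axis=1)
print(summary)  # OR per unit increase in each covariate, with 95% CI
```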
Introduction A high postoperative serum myoglobin (MyG) concentration predicts the incidence of acute kidney injury (AKI) and the need for renal replacement therapy (RRT), as reported in several surgical settings. The incidence of such events in the ICU is reported to be 2 to 5% of all causes of AKI [1], but is even higher after cardiac surgery (40.3%) [1]. MyG is a small protein (17.8 kDa) that can be removed with RRT, typically by convection. New-generation membranes, which remove solutes of sub-albumin molecular weight, can be used in diffusive treatments (CVVHD) with the advantage of limiting albumin loss and combining easily with citrate anticoagulation, which is pivotal in cardiac surgical settings. We assessed the effectiveness of EMIC2 with citrate anticoagulation in AKI prevention in post-cardiac surgical patients. Methods This is a case series of eight patients (mean age 62.7 years, five male, logistic EuroSCORE 15.61) undergoing cardiac surgery with cardiopulmonary bypass (CPB) for 150 minutes

Introduction Sleep deprivation is recognised as an important cause of morbidity after ICU admission, but most centres do not routinely assess their patients' sleep. Considering the invasive nature of, and costs associated with, objective sleep measurements, they are unsuitable for routine use. Subjective measurements offer an easy-to-use and economical alternative, the most well validated of which is the Richards-Campbell Sleep Questionnaire (RCSQ). This can be used to derive an accurate estimation of the sleep efficiency index (SEI), a well-validated measure of sleep. However, there are several concerns regarding patients reporting their own sleep quality and quantity using these questionnaires [1]. Additionally, they cannot be used to assess sleep in sedated or delirious patients. It has been suggested that one way to bypass the drawbacks of patients assessing their own sleep would be to utilise nursing staff [1]. Previous smaller-scale studies have agreed with this suggestion [2]. This study aimed to assess whether staff were able to use the RCSQ to accurately assess their patients' sleep. Methods Fifty-nine patients consented to complete the RCSQ for each night of their ICU admission. Alongside this, the nurses who had cared for these patients were asked to assess their patients' sleep using the RCSQ. These were then matched with their patients' responses. The Bland-Altman method was applied to assess agreement between patient and nurse SEIs, in order to reveal whether nurses could accurately estimate their patients' night sleep. Additionally, Cronbach's alpha was derived to assess internal consistency. Ethical approval was gained prior to the start of the study. Results A total of 126 pairs of RCSQs were gathered. The mean difference between nurse and patient SEIs was -0.9, suggesting no significant trend for nurses to overestimate or underestimate patients' SEIs. In total, 94.2% of nurses' estimations fell within the limits of agreement. The variability of the differences was consistent across the range of averages. Cronbach's alpha was 0.63 between nurse and patient scores, suggesting questionable reliability between the RCSQ pairs. See Figure 1.
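To make the agreement analysis concrete, here is a minimal Python sketch of the Bland-Altman limits of agreement and a two-rater Cronbach's alpha as used above; the paired SEI values are simulated for illustration and are not the study data.

```python
# Bland-Altman agreement and two-rater Cronbach's alpha, mirroring the
# nurse-vs-patient SEI analysis above. The paired values are SIMULATED.
import numpy as np

rng = np.random.default_rng(1)
n_pairs = 126
patient_sei = rng.uniform(20, 100, n_pairs)               # patient-reported SEI
nurse_sei = patient_sei + rng.normal(-0.9, 10, n_pairs)   # nurse estimates

diff = nurse_sei - patient_sei
bias = diff.mean()                                        # mean difference
sd = diff.std(ddof=1)
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd         # limits of agreement
within = np.mean((diff >= lower) & (diff <= upper))
print(f"bias = {bias:.2f}; limits of agreement = {lower:.1f} to {upper:.1f}; "
      f"{100 * within:.1f}% of pairs within the limits")

# Cronbach's alpha treating the two raters as k = 2 "items".
items = np.vstack([patient_sei, nurse_sei])
k = items.shape[0]
alpha = k / (k - 1) * (1 - items.var(axis=1, ddof=1).sum()
                       / items.sum(axis=0).var(ddof=1))
print(f"Cronbach's alpha = {alpha:.2f}")
```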
However, we found no correlation between the PSI and the % delta power, either at baseline (R = 0.32, P = 0.37) or after pain stimulation (R = 0.17, P = 0.63). The % delta power from frontal electrodes (Fp2-Fp1) was well correlated with that obtained from posterior electrodes (P4-Pz; R = 0.42, P <0.0001), and remained unchanged after pain stimulation, confirming a deep sedation level in our patient cohort. Conclusion Using standard EEG in this small cohort of mechanically ventilated ICU patients, a deep sedation level was frequently observed. Our preliminary data suggest that simplified EEG with the SedLine system is less accurate than standard 19-channel EEG for assessing the depth of sedation in the ICU setting.

Introduction Pain is a common problem in mechanically ventilated patients, and assessing pain in the sedated patient is difficult, as patients are often unable to verbalise [1]. Behavioural pain assessment tools have been developed and validated for the assessment of pain in this patient group. A behavioural pain assessment tool was implemented as part of a wider ventilator-associated pneumonia (VAP) prevention programme within a large UK ICU. Methods The Behavioural Pain Scale (BPS) [2] was selected and implemented into daily clinical practice, supported by a tailored education programme and incorporation into the clinical information system (MetaVision). Questionnaires pre and post BPS implementation were used to assess nursing staff opinions on pain assessment and opioid infusion titration. Four months of data were compared pre and post implementation, examining aspects of patient sedation and analgesia exposure and ventilated patient outcome parameters. Results In the pre-implementation and post-implementation questionnaires (response rates of 38% and 37%, respectively, of nurses surveyed), nursing staff reported significantly more confidence in titrating opioids after implementation (3 (2; 3) and 3 (3; 4), respectively; P <0.01), despite no significant difference in their reported confidence in assessing pain. Compliance was good, with a median daily documentation rate of 66% against a standard of once per 8-hour shift. The median BPS score was significantly higher on patient movement (2 (1; 2)) than at rest (1 (1; 2); P <0.001) (1 = no pain, 2 = mild pain). No statistically significant difference was seen in length of stay, duration of mechanical ventilation, VAP rate or median sedation exposure. Pre-implementation median opioid administration in morphine (mg) equivalents was 455.8 (203.1; 1,174.8); post implementation the median increased to 620.3 (218.1; 1,502), but this did not reach statistical significance (P = 0.235). There was no statistically significant change in the prescribing of analgesic adjuncts. The sample size was underpowered to detect significant differences. Conclusion The BPS was successfully implemented into routine nursing practice and significantly improved nurse confidence in opioid infusion titration. Analgesic use was not significantly different between evaluation periods, but the results indicate that further education is needed on anticipating pain and treating it pre-emptively with timely administration of opioids.

Introduction In sepsis, sympathetic nerve activity is increased, which helps maintain arterial pressure in the face of nitric oxide-induced vasodilatation.
Accordingly, we investigated the haemodynamic effects of the centrally acting α-adrenoceptor agonist clonidine in an ovine model of severe sepsis. Methods A prospective interventional blinded crossover study in 12 Merino ewes with cardiac and renal flow probes implanted to continuously measure cardiac output and renal blood flow. Arterial pressure was continuously monitored, and blood and urine samples were taken. After 24 hours of control, sepsis was induced by an intravenous bolus and continuous infusion of live Escherichia coli for 32 hours. After 24 hours of sepsis, animals were randomly and blindly allocated to vehicle infusion or clonidine (1 μg/ml/kg/minute) for 8 hours. The E. coli infusion was then discontinued, gentamycin 150 mg was given i.m., and the animals were followed for 16 hours during recovery. The animals that survived were crossed over to the alternative treatment 2 weeks later. Results Complete data were collected on eight animals per group; three animals died in each group. Hyperdynamic sepsis with hypotension and acute kidney injury of similar degree developed in the two groups. The haemodynamic and renal effects of clonidine are shown in Figure 1. Conclusion In ovine hyperdynamic sepsis, clonidine transiently increased urine output without affecting creatinine clearance.

Conclusion Enteral administration of melatonin was adequate in the early phase of critical illness, with pharmacokinetics similar to published data [2]. The administration of melatonin seems to increase the TAC, with a possibly meaningful role in critically ill patients.

Introduction Blood glucose (BG) control in acute illness improves outcome. However, how best to manage BG levels, or the BG target, has not been clearly elucidated. In this study, a BG target was suggested by analysis of the relationship between the BG profile and the severity of disease. Methods Ninety-six patients were studied. The following parameters were calculated during the first week after ICU admission. (1) Maximum value of the SOFA score (SOFAmax). (2) Mean, standard deviation, maximum, minimum, and difference of BG levels (BGm, BGsd, BGmax, BGmin, and BGd (BGmax - BGmin), respectively). BG levels were measured basically every 6 hours. (3) Correlation between SOFAmax and BG parameters using two-dimensional (correlation coefficient r_t) and linear regression (r_l) analysis. Results Mortality of the patients with SOFAmax 0 to 3, 4 to 5, 6 to 7, 8 to 9, and 10 or more was 0%, 14%, 23%, 40%, and 89%, respectively. (2) r_t and r_l (r_t/r_l) between SOFAmax and BG parameters:

Ranolazine is used as an antianginal agent for the treatment of chronic angina pectoris when angina is not adequately controlled by other agents [1, 2]. Besides its cardiovascular effects, ranolazine improves different neuronal functions, and thus its use has been proposed for the treatment of pain and epileptic disorders [3, 4]. Since astrocytes are involved in neuronal inflammatory processes, and in autoimmune and neurodegenerative diseases [5], we investigated the anti-inflammatory and antioxidant effects of ranolazine in primary cultured astrocytes. Methods We incubated differentiated rat astrocytes in primary culture (10 days of culture) [5] for 24 hours with ranolazine (10⁻⁵, 10⁻⁶, 10⁻⁷ M). We measured the protein expression levels of PPARγ and Cu/Zn-SOD by western blot. The protective effect of ranolazine on cell viability was assayed using the MTT conversion assay.
Finally, to evaluate the effect of ranolazine on the cytokine IL-1β and the mediator TNFα, we used the enzyme-linked immunosorbent assay technique. Results Compared with control cells, treatment with ranolazine induced an increase in anti-inflammatory PPARγ and reduced the proinflammatory mediators IL-1β and TNFα in primary cultured astrocytes. Ranolazine (10⁻⁶ M) also increased the expression of the antioxidant protein Cu/Zn-SOD and caused a significant increase in cell viability. Conclusion Ranolazine decreases the inflammatory mediators IL-1β and TNFα, and increases anti-inflammatory PPARγ as well as the antioxidant Cu/Zn-SOD, in astrocytes in culture. These results suggest that ranolazine could be useful as a neuroprotective drug in pathologies involving inflammatory damage and oxidative processes.

Evaluation of the ocular microcirculation in brain-dead patients: first step towards a new method of multimodal neuromonitoring? T Tamosuitis. Introduction Multimodal neuromonitoring is a part of goal-directed therapy in severe neurosurgical pathology that leads to better understanding, and therefore accurate and timely correction, of disturbances of cerebral perfusion. The aim of our study was to evaluate and compare the microcirculation of the ocular conjunctiva and the sublingual mucosa in brain-dead patients and healthy volunteers. To our knowledge, no previous studies have been performed with this purpose. We hypothesized that direct videomicroscopic evaluation of the conjunctival microcirculation is linked to cerebral blood flow. Methods We evaluated the microcirculation of the ocular conjunctiva and sublingual mucosa in 10 patients diagnosed as brain dead and in 10 healthy volunteers. Brain-death diagnoses were certified by cerebral angiography. Direct in vivo observation of the microcirculation was obtained with sidestream dark-field imaging. Assessment of microcirculatory parameters of convective oxygen transport (microvascular flow index (MFI), proportion of perfused vessels (PPV)) and diffusion distance (perfused vessel density (PVD) and total vessel density (TVD)) was performed according to international criteria. Results All brain-dead patients required vasopressor support to sustain perfusion of donor organs. The MFI of small vessels was significantly lower in brain-dead patients than in healthy controls in the ocular conjunctiva (2.6 (2.4 to 2.8) vs. 3.0 (3.0 to 3.0), P = 0.03) and in the sublingual mucosa (2.8 (2.6 to 2.9) vs. 3.0 (3.0 to 3.0), P = 0.04). TVD and PVD of small vessels were also significantly lower in brain-dead patients than in healthy controls in the ocular conjunctiva (10.

Introduction Stroke is the main cause of deterioration in activities of daily living in Japan. A social problem in Japan is the difference in medical quality between urban and depopulated areas. To address this problem, telemedicine using mobile devices between general physicians and stroke specialists has become important, given the increasing demand for rapid and correct diagnosis in the treatment of acute stroke. We developed a system for rapidly exchanging diagnostic images and clinical information in depopulated areas in order to deliver standard thrombolytic therapy with alteplase for acute ischemic stroke [1, 2]. Methods The system consisted of communication of patient data and imaging between the hospital system and participating staff members in and out of the hospital using mobile devices. The system can transfer clinical data, large volumes of CT and MRI images, and expert opinion in real time.
We developed the system (k-support) in the Kaifu area, a typical depopulated area in Tokushima Prefecture, Japan, between the general physicians of Tokushima Prefectural Kaifu Hospital and specialists in stroke and cardiovascular disease at Tokushima University Hospital, from February 2013. Results The k-support system was used in 102 emergency patients: 65 patients (64%) were classified as having neurological disease, 41 (40%) as stroke, 11 (11%) as head injury, and two (2%) as epilepsy. The strokes comprised ischemia in 35 (34%), hemorrhage in four (4%) and SAH in two (2%). Two ischemic stroke patients were treated with intravenous thrombolysis with alteplase using the k-support system and a 'drip-and-ship' paradigm. One patient given alteplase showed complete recanalization of the middle cerebral artery. The consultations resulted in hospitalization in 42%, transfer in 37% and return home in 20%. Conclusion Before introduction of the k-support system, standard thrombolytic therapy with alteplase for acute ischemic stroke could not be delivered in the Kaifu area, owing to the absence of stroke specialists and the long distance to the nearest stroke center. A mobile-device telemedicine system such as k-support can be used anytime, anywhere and by anyone. The system connects general physicians in depopulated areas with specialists in urban areas, and may become a useful tool for acute patient management, not only in stroke but also in other emergency diseases.

Methods Twenty rats were assigned to one of three groups: a double 250 μl intracisternal injection (ICI) was performed with saline in the CONTROLS group (n = 6) or with autologous arterial blood in the SAH (n = 8) and SAH+TER (n = 6) groups. Treated animals received oral TER 30 mg/kg/day for 5 days following blood injection. Rats were evaluated using a functional isotope imaging technique (high-resolution microSPECT). Brain uptake of three technetium-99m-radiolabeled tracers was evaluated: HMPAO at days (D) 0, 2 and 5 for quantification of cerebral perfusion, DTPA at D3 to study blood-brain barrier (BBB) integrity, and annexin V-128 at D4 to study apoptotic activity. Radioactivity was measured in predefined regions of interest: cerebrum, cerebellum and brainstem. Statistical analysis: one-way ANOVA followed by Student's t test. Results Brain HMPAO perfusion microSPECT (Figure 1)

Introduction After a patient is declared brain dead, the cessation or withdrawal of therapy in Japan is quite difficult because of the legal issues involved. Except in the case of organ donation for transplantation, therapy in brain-dead patients is continued until cardiac arrest. Consequently, we can determine the final cause of death in brain-dead patients. Our hypothesis was that brain-dead patients ultimately die of cardiac failure. Methods From January 2011 to December 2012, brain-dead adult patients in our ICU were investigated. The declaration of brain death was determined by clinical neurological examination, electroencephalography, and the auditory brain-stem response test.
Age, sex, primary diagnosis, Glasgow Coma Scale (GCS) on admission, the number of days from the diagnosis of brain death until cardiac arrest, the final cause of cardiac arrest, urine volume over the last 24 hours (ml), serum potassium levels (mEq/l), PaO2 (mmHg) and

Introduction We investigated whether blood alcohol levels (BAL) had any impact on presentation and outcome in patients with traumatic brain injury (TBI). Methods Forty-six patients with TBI requiring intubation between January 2000 and December 2012 were included. Patients were grouped into BAL-positive (>0.5‰; n = 24) and BAL-negative (<0.5‰; n = 22). Physiological parameters, outcome (survival to hospital discharge (STHD)) and neurological outcomes (Glasgow Outcome Scale (GOS), Cerebral Performance Category (CPC) and Glasgow Coma Scale (GCS)) were analyzed. Differences between groups were analyzed using Student's t test, and results are presented as mean ± SD. Results There were no significant differences in gender or age distribution between the BAL-negative and BAL-positive groups (73% vs. 88% male, P = 0.218; 53 ± 21 vs. 43 ± 17 years, P = 0.098). There were also no differences in initial systolic blood pressure (BP; 144 ± 30 vs. 132 ± 37 mmHg; P = 0.241) or respiratory rate (RR; 13 ± 6 vs. 12 ± 7/minute; P = 0.891), but BAL did affect the initial GCS score (7 ± 3 vs. 5 ± 2; P = 0.022). There was no effect on CPC (3.5 ± 1.9 vs. 2.8 ± 1.8; P = 0.185), GOS (2.4 ± 1.8 vs. 3.0 ± 1.7; P = 0.270), or GCS at discharge from hospital (14 ± 2 vs. 14 ± 2; P = 0.801). There were also no differences in length of hospital stay (LOHS; 15 ± 23 vs. 23 ± 34; P = 0.372) or STHD (1.6 ± 0.5 vs. 1.3 ± 0.5; P = 0.083). Conclusion BAL did not have a significant effect on presenting physiological parameters such as systolic BP and RR. It did, however, affect the presenting GCS score, which was significantly lower in the BAL-positive group. BAL did not seem to have an effect on functional outcome or on mortality measured by STHD. No favorable neuroprotective or deleterious effect was observed, as demonstrated by the absence of significant differences in CPC, GOS or GCS scores at discharge.

Psychiatrists or neurointensivists evaluated delirium using the DSM-IV criteria. Assessment results were blinded and performed independently. Criterion validity of the CAM-ICU and the ICDSC was calculated from 2 × 2 frequency tables using standard definitions of sensitivity, specificity, positive and negative predictive value, and overall accuracy. Because delirium was assessed repeatedly, 95% CIs for binary repeated data were estimated using generalized estimating equations in conjunction with the Huber-White estimator. Inter-rater reliability for the CAM-ICU and ICDSC was assessed with the kappa coefficient. Results During an 8-month period, 61 patients (mean age 56.4 ± 18.5 years, mean APACHE II score 11.4 ± 6.5, mean GCS 13 ± 2) were enrolled. The overall sensitivity and specificity of the CAM-ICU (62% and 74%, respectively) and the ICDSC (64% and 79%, respectively) were similar. The overall kappa for inter-rater reliability of the CAM-ICU and ICDSC was 0.64 and 0.68, respectively. In subgroup analyses, the CAM-ICU and ICDSC showed increased sensitivity and decreased specificity in moderate TBI as well as at deeper levels of sedation (RASS -2 to -3). Conclusion Criterion validity and inter-rater reliability of the CAM-ICU and ICDSC were good. Severity of TBI and depth of sedation influence delirium assessments. Clinicians should be aware of these limitations before using these clinical tools in this population.
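As a compact illustration of the statistics reported above, the following Python sketch computes the standard 2 × 2 validity measures and Cohen's kappa for two raters; the counts and ratings are hypothetical, not the study data.

```python
# Criterion validity from a 2 x 2 table and Cohen's kappa for inter-rater
# reliability, as used for the CAM-ICU/ICDSC above. Counts and ratings
# below are HYPOTHETICAL, not the study data.
def validity(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic-accuracy measures against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

def cohens_kappa(a: list[int], b: list[int]) -> float:
    """Cohen's kappa for two raters' binary (0/1) ratings."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n               # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                        # positive marginals
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)                   # chance agreement
    return (po - pe) / (1 - pe)

print(validity(tp=31, fp=13, fn=19, tn=37))                  # illustrative counts
print(cohens_kappa([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]))  # ~0.33
```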
Introduction Traumatic brain injury (TBI) is a major cause of death and long-term disability worldwide, and understanding its effect on health is critical. Although mortality was the gold standard outcome for years, functional scales and quality of life have been used as the main outcome measures over recent decades because of their usefulness and their aid to decision-making for medical teams, patients, and families. Despite this preference, no consensus exists on the most optimal outcome measure. We systematically reviewed the outcome measures used in randomized controlled trials (RCTs) in patients with severe TBI admitted to an acute care hospital. Methods We searched MEDLINE, EMBASE, Cochrane Central, BIOSIS and the references of eligible trials. RCTs published over the last 7 years (2006 to October 2013) in 18 selected journals (based on impact factor) were considered for inclusion. RCTs performed in adults with severe TBI (GCS ≤8) were eligible. The primary endpoint was the outcome measures used in the RCTs. The secondary outcomes were the timing of assessment and the methodological quality of the trials, assessed using the Cochrane risk of bias tool. Two independent reviewers selected trials and collected data using a standardized form. Results From 5,602 citations retrieved after removal of duplicates, 36 RCTs met the eligibility criteria. The outcome measures most frequently used were neurophysiologic indices (n = 18, 50%), the Glasgow Outcome Scale (n = 17, 47%), mainly at 6 months, nonspecific complications (n = 15, 42%), mortality (n = 12, 33%), and biomarkers (n = 10, 28%). Nine trials reported only physiologic indices and did not present any clinical or functional outcome measures. The methodological quality of the included RCTs was heterogeneous. We observed a low risk of bias for sequence generation (n = 29, 80%), allocation concealment (n = 20, 55%), complete data reporting (n = 30, 83%), selective reporting (n = 36, 100%), and sample size (n = 21, 58%), but a high risk of bias for blinding (n = 20, 55%).

Conclusion A simple prognostic model, based only on age and GCS, displayed fairly good prognostic performance in predicting the 6-month mortality of ICU-treated patients with moderate-to-severe TBI. The use of more complex scoring systems added little to the prognostic performance.

Results Of 65 patients with TBI, 25 (18 men, seven women; mean age 56.7 ± 20.1 years) who were administered antiepileptic agents for PTS prophylaxis were studied. Fifteen cerebral contusions, 10 acute subdural hematomas, nine traumatic subarachnoid hemorrhages, two cerebral infarctions, two cases of pneumocephalus and one traumatic intracerebral hemorrhage were present in the 25 patients. All patients were alive 28 days after the injury. Fourteen patients (56%) received intravenous agents (13 phenytoin, one phenobarbital), while 11 patients (44%) received agents via enteral feeding (four valproate, four carbamazepine, three zonisamide) as PTS prophylaxis. PTS prophylaxis was started an average of 2.6 days after injury for intravenous administration and 2.2 days for enteral administration. Two patients receiving phenytoin showed hepatic dysfunction as an adverse effect, and no patient developed early PTS with either intravenous or enteral administration. Conclusion The present study has some limitations because it is a single-center retrospective analysis.
However, enteral administration of antiepileptic agents could be useful for PTS prophylaxis. Considering cost, adverse effects and the need for serum monitoring, enteral antiepileptic agents are a possible alternative to intravenous phenytoin.

Introduction We aimed to summarize the efficacy of simulation-based education in cardiopulmonary resuscitation and airway management [1]. Methods We searched the MEDLINE, Scopus and EMBASE databases for all peer-reviewed articles enrolling physicians/medical students in a simulation of either cardiopulmonary resuscitation or airway management protocols, compared with no intervention or traditional teaching methods. We categorized the outcomes of the studies into four groups: task success, process skill, time skill, and knowledge. Task success was defined as evaluation of successful completion of the task, process skill as evaluation of the procedure, time skill as the time required to complete the task, and knowledge as the objective assessment of conceptual understanding. When studies investigated more than one outcome, we considered the primary outcome, the overall measure or the most clinically relevant outcome. Results From 8,528 articles, we selected 24 studies (13 randomized controlled studies, eight pre-post studies, three case-control studies) involving 1,149 participants. Compared with no intervention or traditional teaching methods, simulation was associated with a significant mild-to-moderate improvement in all outcomes (Figure 1). The log odds ratio for task success was 2.03 (0.46 to 3.59) in favor of simulation. The pooled effect size was 0.48 (0.11 to 0.84) for process skill, 0.29 (0.13 to 0.73) for time skill, and 0.41 (0 to 0.84) for knowledge. Conclusion Simulation is an effective educational method for improving the performance of physicians/medical students in the application of protocols for cardiopulmonary resuscitation and airway management.

Results Thirty-two resuscitation episodes during transportation by ambulance were analyzed. Median CPR time per episode was 846 seconds (range 126 to 1,833 seconds). In total, the fraction of time without chest compression was 19.5 ± 7.6% (mean ± SD). The reasons for interruption and their shares of total hands-off time were as follows: rhythm analysis/pulse check, 36%; ventilation, 31%; setting up automated chest compression devices, 11%; tracheal intubation/placement of supraglottic airway devices, 8%; intravenous line placement/administration of adrenaline, 4%; rescuer change, 3%; and adjustment of patient position/correction of rescuer posture and others, 7%. Conclusion The fraction of time without chest compression observed in this study was comparable with that found in other studies, despite the difficult circumstances of transportation. The most frequent reasons for hands-off time were rhythm analysis and ventilation, even though the ambulance crews strictly adhered to the guidelines.

Introduction During advanced life support (ALS) it is still impossible to predict return of spontaneous circulation (ROSC) or outcome. Cerebral saturation (rSO2) measurements with near-infrared spectroscopy can be used during cardiac arrest and could play a role in predicting ROSC or neurologic outcome [1]. It is known that an initial rhythm of asystole and a long no-flow/low-flow time are associated with worse outcome.
We measured rSO2 during ALS in out-of-hospital cardiac arrest (OHCA) patients and compared the mean rSO2 during the first minute with the time between the emergency call (EC) and the start of ALS. Methods With IRB approval, rSO2 was prospectively measured during ALS between December 2011 and November 2013 in 51 OHCA cases of presumed cardiac cause. One sensor of the EQUANOX Advance was applied to the right side of the patient's forehead when the medical emergency team arrived. The measurement was continued until death of the patient or arrival at the ICU. Survivors (S) were defined by sustained ROSC longer than 20 minutes. CPR data were collected using the Utstein CPR data registration. The Mann-Whitney test and Pearson chi-square test were used to compare survivor (S) and nonsurvivor (NS) data, and the Pearson test was used to examine correlation. Results Of the 51 patients, 21 were S. Mean age was 70 years (± 15) in the S, of whom 10 (48%) were male, and 70 years (± 17) in the NS (P = 0.916), of whom 23 (77%) were male (P = 0.042). The initial rhythm was asystole in 11 S and 20 NS (P = 0.386), pulseless electrical activity in two S and two NS (P = 1), and ventricular fibrillation in eight S and six NS (P = 0.207). The arrest was witnessed in 16 S and 15 NS (P = 0.083). Lay rescuer BLS was performed in nine S and 17 NS (P = 0.4). A significant difference in the time between EC and start of ALS was observed between S (12 minutes; 8 to 15) and NS (14 minutes; 12 to 17) (P = 0.03). Mean initial rSO2 was 34% (± 23) in S and 24% (± 12.5) in NS (P = 0.077). We observed a negative correlation between mean initial rSO2 and the time between EC and ALS (correlation coefficient -0.243; P = 0.041). Conclusion A tendency towards higher mean initial rSO2 in S compared with NS was observed, together with a negative correlation between mean initial rSO2 and the time between the EC and the start of ALS; a significant difference between S and NS in the time between the EC and the start of ALS was also observed. Larger studies are needed to confirm the possible function of rSO2 as a surrogate for the time between the EC and the start of ALS.

Introduction Induced hypothermia is applied in the ICU and in the operating theater to reduce ischemia-reperfusion injury. It is thought that induced hypothermia may hamper the immune response and therefore carry the risk of acquiring or aggravating an infection. We investigated the effect of hypothermia on the host response by comparing survivors of cardiac arrest in whom body temperature was kept at either 33°C or 36°C. Methods As a substudy of the Target Temperature Management trial [1], in which cardiac arrest patients admitted to the ICU were randomized to maintenance of either 33°C (n = 11) or 36°C (n = 9) for 24 hours, blood was drawn at the start and end of the target temperature phase, as well as after reaching normothermia. The host response was measured via monocyte human leukocyte antigen-DR (HLA-DR) expression and via whole blood stimulation for 24 hours with the TLR ligands lipopolysaccharide (LPS) and lipoteichoic acid (LTA). Plasma levels of interleukin (IL)-1β, IL-1RA, IL-6, IL-8, IL-10, tumor necrosis factor alpha (TNFα), macrophage inflammatory protein (MIP)-1, monocyte chemotactic protein (MCP)-1 and soluble CD40 ligand were determined with ELISA or Luminex. Statistics were by unpaired Mann-Whitney U tests. Results HLA-DR expression was decreased compared with healthy controls, without differences between 33°C and 36°C.
Following whole blood stimulation with LPS, TNFα and IL-6 production was lower after cardiac arrest compared with healthy controls. The 33°C and 36°C groups differed at baseline in TNFα levels after LPS whole blood stimulation. After 24 hours of temperature management, there was no difference in either TNFα or IL-6 production between the groups following TLR ligand stimulation.

Introduction Shivering complicates targeted temperature management (TTM) by increasing metabolism, oxygen consumption, resting energy expenditure, and carbon dioxide production, and is associated with lower brain tissue oxygen levels; all of these may limit the effectiveness of TTM. However, the recognition and measurement of shivering are subjective and ill-defined. The Bedside Shivering Assessment Scale (BSAS) is the only validated tool for describing the intensity of shivering. We hypothesized that the derived electromyography (dEMG) value measured by the bispectral index (BIS) monitor would agree more closely with energy expenditure due to shivering than the BSAS. Methods We measured continuous indirect calorimetry over a 2 to 5 hour time span during targeted temperature management in 12 patients being treated for hypoxic ischemic encephalopathy after cardiac arrest. Patients were excluded if seizing, requiring FiO2 >0.5, exhibiting early spasticity, or not shivering. The BSAS was measured every 15 minutes by a blinded observer, and shivering was treated for a BSAS ≥1, as per institutional protocol. The association of dEMG and BSAS as predictors of resting energy expenditure (REE) was measured using linear regression and Pearson's correlation. Results The study population included 12 patients. Average age was 54 years, nine patients were male, and eight patients had a CPC of 1 to 3 on hospital discharge. There were a total of 182 measurements of BSAS, dEMG, and REE. The correlation with REE was better for dEMG than for the BSAS (0.24 (CI = 0.10 to 0.37) vs. 0.10 (CI = -0.04 to 0.24); P <0.001 vs. 0.14). Each unit increase in dEMG was associated with an increase of 14 kcal in energy expenditure (P = 0.003). Conclusion Continuous dEMG power measured by Covidien BIS monitors is a more accurate and less subjective measure of shivering burden than the intermittent BSAS. Introducing dEMG into clinical practice may improve the recognition of shivering, allow quantification of the metabolic cost of shivering, and serve as a more reliable research tool for diagnostic and treatment strategies for shivering.

Introduction Therapeutic hypothermia (TH) following out-of-hospital cardiac arrest (OHCA) improves neurological outcomes in such patients. During TH, neuromuscular blockade is used to control shivering. In our hospital, we use vecuronium as the neuromuscular blocker. However, the prolonged effects of vecuronium occasionally delay accurate evaluation of patients' neurological function, or extubation. Unfortunately, the factors involved in the prolonged effect of vecuronium during TH remain unclear. Methods We conducted a retrospective cohort study of patients managed with TH following OHCA at our institution from April 2010 to September 2013. We defined a full muscle reaction to train-of-four stimulation (TOF) as the end of the effects of vecuronium; the time from the end of vecuronium administration to full muscle reaction to TOF was evaluated as the outcome. We calculated the adjusted hazard ratio (HR) for the outcome using Cox regression analysis, after adjustment for age, gender, albumin level, estimated glomerular filtration rate, temperature at the end of vecuronium administration, and total amount of vecuronium per kilogram of body weight.
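As an illustration of the analysis described in the Methods above, here is a minimal Python sketch of a Cox regression on simulated data with the same covariate set; it requires the third-party lifelines package, and none of the values are study data.

```python
# Cox proportional hazards analysis of time to full TOF recovery, with the
# covariates the authors adjusted for. Data are SIMULATED; requires the
# third-party `lifelines` package (pip install lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 60
df = pd.DataFrame({
    "hours_to_tof_recovery": rng.exponential(8, n),  # time-to-event outcome
    "recovered": np.ones(n, dtype=int),              # 1 = full TOF response seen
    "age_years": rng.normal(65, 12, n),
    "male": rng.integers(0, 2, n),
    "albumin_g_dl": rng.normal(3.0, 0.5, n),
    "egfr": rng.normal(60, 25, n),
    "temp_end_vecuronium_c": rng.normal(34.5, 0.8, n),
    "vecuronium_mg_per_kg": rng.normal(0.6, 0.2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours_to_tof_recovery", event_col="recovered")
print(np.exp(cph.params_))  # adjusted hazard ratios for TOF recovery
```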
We supposed that there might be a large consumption of high-energy phosphates during the rewarming period, and that this might be associated with awakening and SND. Because of the difficulty of measuring such high-energy phosphates at the bedside, we measured the serum phosphate concentration (P) as a substitute and assessed its influence on awakening and postoperative SND. Methods Twenty patients with a mean age of 68 ± 11 years who underwent open TAR under DHCA, applied at a temperature of 20°C with antegrade cerebral perfusion (ACP) or retrograde cerebral perfusion (RCP), were enrolled. Arterial blood gas lactate concentration (ALac), ScvO2, and P were measured at 2, 4, 6, and 10 hours after weaning from cardiopulmonary bypass (CPB). We also measured the time to confirmation of M6 on the Glasgow Coma Scale (GCS), and the incidence of SND.